WorldWideScience

Sample records for bayesian source separation

  1. Low Complexity Bayesian Single Channel Source Separation

    DEFF Research Database (Denmark)

    Beierholm, Thomas; Pedersen, Brian Dam; Winther, Ole

    2004-01-01

    We propose a simple Bayesian model for performing single channel speech separation using factorized source priors in a sliding window linearly transformed domain. Using a one dimensional mixture of Gaussians to model each band source leads to fast tractable inference for the source signals. Simul...
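The tractable inference this record alludes to can be sketched in a few lines: when each source's transform-domain coefficient is modeled by a one-dimensional mixture of Gaussians, the posterior over component pairs and the MMSE source estimate both have closed forms. The mixture weights, means, and variances below are illustrative placeholders, not the priors fitted in the paper.

```python
import numpy as np

# Illustrative per-band priors for two sources: each band coefficient is
# modeled by a 1-D mixture of Gaussians (weights, means, variances).
w1, mu1, v1 = np.array([0.6, 0.4]), np.array([0.0, 2.0]), np.array([0.5, 1.0])
w2, mu2, v2 = np.array([0.7, 0.3]), np.array([0.0, -1.5]), np.array([0.3, 0.8])

def separate(y):
    """MMSE estimate of source 1 from one mixture coefficient y = s1 + s2."""
    K1, K2 = np.meshgrid(np.arange(len(w1)), np.arange(len(w2)), indexing="ij")
    # The sum of two Gaussians is Gaussian, so p(y | k1, k2) is available in
    # closed form -- this is what keeps the inference fast and tractable.
    m, v = mu1[K1] + mu2[K2], v1[K1] + v2[K2]
    lik = w1[K1] * w2[K2] * np.exp(-0.5 * (y - m) ** 2 / v) / np.sqrt(2 * np.pi * v)
    post = lik / lik.sum()                     # p(k1, k2 | y)
    # Gaussian conditioning: E[s1 | y, k1, k2] = mu1 + v1 / (v1 + v2) * (y - m)
    cond_mean = mu1[K1] + v1[K1] / v * (y - m)
    return float((post * cond_mean).sum())     # mixture of conditional means

s1_hat = separate(1.2)
```

With K components per source, the posterior requires only K² Gaussian evaluations per coefficient, which is what makes the approach low complexity.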

  2. Nonstationary source separation using sequential and variational Bayesian learning.

    Science.gov (United States)

    Chien, Jen-Tzung; Hsieh, Hsin-Lung

    2013-05-01

Independent component analysis (ICA) is a popular approach for blind source separation where the mixing process is assumed to be unchanged with a fixed set of stationary source signals. However, the mixing system and source signals are nonstationary in real-world applications, e.g., the source signals may abruptly appear or disappear, and the sources may be replaced by new ones or even move over time. This paper presents an online learning algorithm for the Gaussian process (GP) and establishes a separation procedure in the presence of nonstationary and temporally correlated mixing coefficients and source signals. In this procedure, we capture the evolved statistics from sequential signals according to online Bayesian learning. The activity of nonstationary sources is reflected by an automatic relevance determination, which is incrementally estimated at each frame and continuously propagated to the next frame. We employ the GP to characterize the temporal structures of time-varying mixing coefficients and source signals. A variational Bayesian inference is developed to approximate the true posterior for estimating the nonstationary ICA parameters and for characterizing the activity of latent sources. The differences between this ICA method and the sequential Monte Carlo ICA are illustrated. In the experiments, the proposed algorithm outperforms the other ICA methods for the separation of audio signals in the presence of different nonstationary scenarios.

  3. Bayesian mixture models for source separation in MEG

    International Nuclear Information System (INIS)

    Calvetti, Daniela; Homa, Laura; Somersalo, Erkki

    2011-01-01

    This paper discusses the problem of imaging electromagnetic brain activity from measurements of the induced magnetic field outside the head. This imaging modality, magnetoencephalography (MEG), is known to be severely ill posed, and in order to obtain useful estimates for the activity map, complementary information needs to be used to regularize the problem. In this paper, a particular emphasis is on finding non-superficial focal sources that induce a magnetic field that may be confused with noise due to external sources and with distributed brain noise. The data are assumed to come from a mixture of a focal source and a spatially distributed possibly virtual source; hence, to differentiate between those two components, the problem is solved within a Bayesian framework, with a mixture model prior encoding the information that different sources may be concurrently active. The mixture model prior combines one density that favors strongly focal sources and another that favors spatially distributed sources, interpreted as clutter in the source estimation. Furthermore, to address the challenge of localizing deep focal sources, a novel depth sounding algorithm is suggested, and it is shown with simulated data that the method is able to distinguish between a signal arising from a deep focal source and a clutter signal. (paper)

  4. Estimation of Input Function from Dynamic PET Brain Data Using Bayesian Blind Source Separation

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej; Šmídl, Václav

    2015-01-01

    Roč. 12, č. 4 (2015), s. 1273-1287 ISSN 1820-0214 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : blind source separation * Variational Bayes method * dynamic PET * input function * deconvolution Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.623, year: 2015 http://library.utia.cas.cz/separaty/2015/AS/tichy-0450509.pdf

  5. Radiation Source Mapping with Bayesian Inverse Methods

    Science.gov (United States)

    Hykes, Joshua Michael

We present a method to map the spectral and spatial distributions of radioactive sources using a small number of detectors. Locating and identifying radioactive materials is important for border monitoring, accounting for special nuclear material in processing facilities, and in clean-up operations. Most methods to analyze these problems make restrictive assumptions about the distribution of the source. In contrast, the source-mapping method presented here allows an arbitrary three-dimensional distribution in space and a flexible group and gamma peak distribution in energy. To apply the method, the system's geometry and materials must be known. A probabilistic Bayesian approach is used to solve the resulting inverse problem (IP) since the system of equations is ill-posed. The probabilistic approach also provides estimates of the confidence in the final source map prediction. A set of adjoint flux, discrete ordinates solutions, obtained in this work by the Denovo code, are required to efficiently compute detector responses from a candidate source distribution. These adjoint fluxes are then used to form the linear model to map the state space to the response space. The test for the method is simultaneously locating a set of 137Cs and 60Co gamma sources in an empty room. This test problem is solved using synthetic measurements generated by a Monte Carlo (MCNP) model and using experimental measurements that we collected for this purpose. With the synthetic data, the predicted source distributions identified the locations of the sources to within tens of centimeters, in a room with an approximately four-by-four meter floor plan. Most of the predicted source intensities were within a factor of ten of their true value. The chi-square value of the predicted source was within a factor of five from the expected value based on the number of measurements employed. With a favorable uniform initial guess, the predicted source map was nearly identical to the true distribution.

  6. Bayesian Separation of Non-Stationary Mixtures of Dependent Gaus

    Data.gov (United States)

    National Aeronautics and Space Administration — In this work, we propose a novel approach to perform Dependent Component Analysis (DCA). DCA can be thought as the separation of latent, dependent sources from their...

  7. Convolutive Blind Source Separation Methods

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Larsen, Jan; Kjems, Ulrik

    2008-01-01

During the past decades, much attention has been given to the separation of mixed sources, in particular for the blind case where both the sources and the mixing process are unknown and only recordings of the mixtures are available. In several situations it is desirable to recover all sources from the recorded mixtures, or at least to segregate a particular source. Furthermore, it may be useful to identify the mixing process itself to reveal information about the physical mixing system. In some simple mixing models each recording consists of a sum of differently weighted source signals. However, in many real-world applications, such as in acoustics, the mixing process is more complex. In such systems, the mixtures are weighted and delayed, and each source contributes to the sum with multiple delays corresponding to the multiple paths by which an acoustic signal propagates to a microphone...

  8. Source reconstruction accuracy of MEG and EEG Bayesian inversion approaches.

    Directory of Open Access Journals (Sweden)

    Paolo Belardinelli

Electro- and magnetoencephalography allow for non-invasive investigation of human brain activation and corresponding networks with high temporal resolution. Still, no correct network detection is possible without reliable source localization. In this paper, we examine four different source localization schemes under a common Variational Bayesian framework. A Bayesian approach to the Minimum Norm Model (MNM), an Empirical Bayesian Beamformer (EBB) and two iterative Bayesian schemes (Automatic Relevance Determination (ARD) and Greedy Search (GS)) are quantitatively compared. While EBB and MNM each use a single empirical prior, ARD and GS employ a library of anatomical priors that define possible source configurations. The localization performance was investigated as a function of (i) the number of sources (one vs. two vs. three), (ii) the signal-to-noise ratio (SNR; 5 levels), and (iii) the temporal correlation of source time courses (for the cases of two or three sources). We also tested whether the use of additional bilateral priors specifying source covariance for ARD and GS algorithms improved performance. Our results show that MNM proves effective only with single source configurations. EBB shows a spatial accuracy of a few millimeters with high SNRs and low correlation between sources. In contrast, ARD and GS are more robust to noise and less affected by temporal correlations between sources. However, the spatial accuracy of ARD and GS is generally limited to the order of one centimeter. We found that the use of correlated covariance priors made no difference to ARD/GS performance.

  9. Bayesian statistics in radionuclide metrology: measurement of a decaying source

    International Nuclear Information System (INIS)

    Bochud, F. O.; Bailat, C.J.; Laedermann, J.P.

    2007-01-01

The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of a yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the short half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation. (authors)
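A toy version of the estimation problem this record describes can be written as a grid posterior over the initial activity and decay constant of a decaying source observed as Poisson counts over a known background. The numbers below (a 64-hour half-life, 30 counts/day background, flat priors) are illustrative assumptions, not the paper's measurement setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated daily counts: decaying source over a known background.
lam_true = np.log(2) / (64 / 24)        # decay constant in 1/day (64 h half-life)
t = np.arange(10.0)                     # ten daily measurements
counts = rng.poisson(30.0 + 200.0 * np.exp(-lam_true * t))

# Grid posterior over (initial activity A0, decay constant lambda),
# flat priors, background assumed known exactly.
A_grid = np.linspace(50, 400, 200)
lam_grid = np.linspace(0.05, 1.0, 200)
A, L = np.meshgrid(A_grid, lam_grid, indexing="ij")
log_post = np.zeros_like(A)
for ti, ci in zip(t, counts):
    mu = 30.0 + A * np.exp(-L * ti)
    log_post += ci * np.log(mu) - mu    # Poisson log-likelihood (no constant)
log_post -= log_post.max()
post = np.exp(log_post)
post /= post.sum()

A_mean = (post * A).sum()               # posterior mean of initial activity
lam_mean = (post * L).sum()             # posterior mean of decay constant
```

The same grid also yields credible intervals directly, which is the coherence advantage the record emphasizes for small numbers of measurements.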

  10. MUSIC REMIXING AND UPMIXING USING SOURCE SEPARATION

    OpenAIRE

    Roma, G; Grais, Emad M; Simpson, AJR; Plumbley, Mark D

    2016-01-01

Current research on audio source separation provides tools to estimate the signals contributed by different instruments in polyphonic music mixtures. Such tools can be already incorporated in music production and post-production workflows. In this paper, we describe recent experiments where audio source separation is applied to remixing and upmixing existing mono and stereo music content.

  11. Removal of micropollutants in source separated sanitation

    NARCIS (Netherlands)

    Butkovskyi, A.

    2015-01-01

    Source separated sanitation is an innovative sanitation method designed for minimizing use of energy and clean drinking water, and maximizing reuse of water, organics and nutrients from waste water. This approach is based on separate collection and treatment of toilet wastewater (black water) and

  12. Transform domain steganography with blind source separation

    Science.gov (United States)

    Jouny, Ismail

    2015-05-01

This paper applies blind source separation or independent component analysis for images that may contain mixtures of text, audio, or other images for steganography purposes. The paper focuses on separating mixtures in the transform domain such as the Fourier domain or the Wavelet domain. The study addresses the effectiveness of steganography when using linear mixtures of multimedia components and the ability of standard blind source separation techniques to discern hidden multimedia messages. Mixing in the space, frequency, and wavelet (scale) domains is compared. Effectiveness is measured using the mean squared error between original and recovered images.
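As a minimal stand-in for the linear-mixing scenario studied here, the sketch below mixes two independent synthetic signals and recovers them blindly with scikit-learn's FastICA; the signals and the mixing matrix are invented for illustration.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 2000
time = np.linspace(0, 8, n)

# Two independent "hidden" signals, standing in for a cover signal and an
# embedded message in some transform domain.
s1 = np.sin(2 * np.pi * time)                  # smooth carrier
s2 = np.sign(np.sin(3 * np.pi * time))         # square-wave "message"
S = np.c_[s1, s2] + 0.02 * rng.standard_normal((n, 2))

# Linear mixing, as in the stego scenario.
A = np.array([[1.0, 0.6], [0.4, 1.0]])
X = S @ A.T

# Blind recovery; components come back in arbitrary order and sign.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)

# Each recovered component should correlate strongly with one true source.
corr = np.abs(np.corrcoef(S.T, S_hat.T)[:2, 2:])
```

The permutation and sign ambiguity visible here is inherent to ICA and is one reason recovered "hidden" components must be matched to candidates after separation.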

  13. Joint Bayesian Component Separation and CMB Power Spectrum Estimation

    Science.gov (United States)

    Eriksen, H. K.; Jewell, J. B.; Dickinson, C.; Banday, A. J.; Gorski, K. M.; Lawrence, C. R.

    2008-01-01

We describe and implement an exact, flexible, and computationally efficient algorithm for joint component separation and CMB power spectrum estimation, building on a Gibbs sampling framework. Two essential new features are (1) conditional sampling of foreground spectral parameters and (2) joint sampling of all amplitude-type degrees of freedom (e.g., CMB, foreground pixel amplitudes, and global template amplitudes) given spectral parameters. Given a parametric model of the foreground signals, we estimate efficiently and accurately the exact joint foreground-CMB posterior distribution and, therefore, all marginal distributions such as the CMB power spectrum or foreground spectral index posteriors. The main limitation of the current implementation is the requirement of identical beam responses at all frequencies, which restricts the analysis to the lowest resolution of a given experiment. We outline a future generalization to multiresolution observations. To verify the method, we analyze simple models and compare the results to analytical predictions. We then analyze a realistic simulation with properties similar to the 3 yr WMAP data, downgraded to a common resolution of 3 deg FWHM. The results from the actual 3 yr WMAP temperature analysis are presented in a companion Letter.

  14. Source / component separation with NMF and scarlet

    Science.gov (United States)

    Melchior, Peter; Moolekamp, Fred; LSST Data Management, WFIRST Preparatory Science

    2018-01-01

Astronomical data are often superpositions of multiple source signals. I will introduce the open-source analysis framework scarlet, based on the Non-negative Matrix Factorization (NMF), that achieves efficient source separation and enables flexible constraints or priors on the shape of the signals and/or the signal amplitude across multiple observations. I will demonstrate scarlet's capabilities of separating multi-component photo-z distributions, AGN jets from host galaxies, and more generally: crowded extragalactic fields in the HSC survey. I will also discuss extensions for joint pixel-level deblending with images from LSST and WFIRST, and for hyperspectral or grism data.
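The NMF step at the core of this kind of separation can be illustrated with scikit-learn on synthetic data: a nonnegative data matrix is factored into per-pixel amplitudes and nonnegative spectral templates. The two templates below are invented stand-ins, not scarlet's actual source model.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)

# Two nonnegative "spectral" templates (hypothetical stand-ins for two blended
# components) observed in 100 pixels with random nonnegative weights.
bands = np.arange(50)
comp1 = np.exp(-0.5 * ((bands - 15) / 4.0) ** 2)   # emission-like bump
comp2 = np.exp(-bands / 20.0)                      # smooth continuum
H_true = np.vstack([comp1, comp2])                 # (2, n_bands) templates
W_true = rng.random((100, 2))                      # per-pixel amplitudes
X = W_true @ H_true + 0.01 * rng.random((100, 50)) # noisy nonnegative data

model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)    # recovered amplitudes
H = model.components_         # recovered nonnegative templates
```

Nonnegativity of both factors is what lets each recovered template be read as a physical component; frameworks like scarlet add spatial shape constraints on top of this factorization.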

  15. Perceptually controlled doping for audio source separation

    Science.gov (United States)

    Mahé, Gaël; Nadalin, Everton Z.; Suyama, Ricardo; Romano, João MT

    2014-12-01

The separation of an underdetermined audio mixture can be performed through sparse component analysis (SCA), which relies, however, on the strong hypothesis that source signals are sparse in some domain. To overcome this difficulty in the case where the original sources are available before the mixing process, informed source separation (ISS) embeds a watermark in the mixture, whose information can aid a subsequent separation. Though powerful, this technique is generally specific to a particular mixing setup and may be compromised by an additional bitrate compression stage. Thus, instead of watermarking, we propose a `doping' method that makes the time-frequency representation of each source more sparse, while preserving its audio quality. This method is based on an iterative decrease of the distance between the distribution of the signal and a target sparse distribution, under a perceptual constraint. We aim to show that the proposed approach is robust to audio coding and that the use of the sparsified signals improves the source separation, in comparison with the original sources. In this work, the analysis is restricted to instantaneous mixtures and focused on voice sources.

  16. Nitrate source apportionment in a subtropical watershed using Bayesian model

    International Nuclear Information System (INIS)

    Yang, Liping; Han, Jiangpei; Xue, Jianlong; Zeng, Lingzao; Shi, Jiachun; Wu, Laosheng; Jiang, Yonghai

    2013-01-01

Nitrate (NO3−) pollution in aquatic systems is a worldwide problem. The temporal distribution pattern and sources of nitrate are of great concern for water quality. The nitrogen (N) cycling processes in a subtropical watershed located in Changxing County, Zhejiang Province, China were greatly influenced by the temporal variations of precipitation and temperature during the study period (September 2011 to July 2012). The highest NO3− concentration in water was in May (wet season, mean ± SD = 17.45 ± 9.50 mg L−1) and the lowest concentration occurred in December (dry season, mean ± SD = 10.54 ± 6.28 mg L−1). Nevertheless, no water sample in the study area exceeded the WHO drinking water limit of 50 mg L−1 NO3−. Four sources of NO3− (atmospheric deposition, AD; soil N, SN; synthetic fertilizer, SF; manure and sewage, M&S) were identified using both hydrochemical characteristics [Cl−, NO3−, HCO3−, SO42−, Ca2+, K+, Mg2+, Na+, dissolved oxygen (DO)] and a dual isotope approach (δ15N–NO3− and δ18O–NO3−). Both chemical and isotopic characteristics indicated that denitrification was not the main N cycling process in the study area. Using a Bayesian model (stable isotope analysis in R, SIAR), the contribution of each source was apportioned. Source apportionment results showed that source contributions differed significantly between the dry and wet season: AD and M&S contributed more in December than in May. In contrast, SN and SF contributed more NO3− to water in May than in December. M&S and SF were the major contributors in December and May, respectively. Moreover, the shortcomings and uncertainties of SIAR were discussed to provide implications for future works. With the assessment of temporal variation and sources of NO3−, better agricultural management practices and sewage disposal programs can be implemented to sustain water quality in subtropical watersheds
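A stripped-down version of a SIAR-style mixing model can be sampled with a random-walk Metropolis chain: source proportions live on the simplex, and the likelihood compares the proportion-weighted source signatures to the observed dual-isotope signature. The source signatures, observation, and noise level below are invented for illustration, and real SIAR additionally models fractionation and residual error.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical (d15N, d18O) signatures of three nitrate sources, in permil.
src = np.array([[ 2.0, 60.0],    # atmospheric deposition (high d18O)
                [ 5.0,  3.0],    # soil N / fertilizer
                [12.0,  8.0]])   # manure and sewage (high d15N)
obs = np.array([6.5, 10.0])      # measured mixture signature
sigma = 1.5                      # assumed measurement standard deviation

def log_post(z):
    p = np.exp(z - z.max()); p /= p.sum()   # softmax maps R^3 -> simplex
    mu = p @ src                            # predicted mixture signature
    return -0.5 * np.sum((obs - mu) ** 2) / sigma**2

# Random-walk Metropolis in the unconstrained parametrization.
z = np.zeros(3)
lp = log_post(z)
samples = []
for it in range(20000):
    z_new = z + 0.3 * rng.standard_normal(3)
    lp_new = log_post(z_new)
    if np.log(rng.random()) < lp_new - lp:  # Metropolis accept/reject
        z, lp = z_new, lp_new
    if it > 5000:                           # discard burn-in
        p = np.exp(z - z.max()); p /= p.sum()
        samples.append(p)

mean_contrib = np.mean(samples, axis=0)     # posterior mean source proportions
```

The spread of the retained samples, not just their mean, is the point of the Bayesian treatment: it quantifies how well the two isotope ratios actually constrain each source's contribution.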

  17. Monaural separation of dependent audio sources based on a generalized Wiener filter

    DEFF Research Database (Denmark)

    Ma, Guilin; Agerkvist, Finn T.; Luther, J.B.

    2007-01-01

This paper presents a two-stage approach for single-channel separation of dependent audio sources. The proposed algorithm is developed in the Bayesian framework and designed for general audio signals. In the first stage of the algorithm, the joint distribution of discrete Fourier transform (DFT)...

  18. Fast Bayesian optimal experimental design for seismic source inversion

    KAUST Repository

    Long, Quan

    2015-07-01

    We develop a fast method for optimally designing experiments in the context of statistical seismic source inversion. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by elastodynamic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the "true" parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem. © 2015 Elsevier B.V.
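Under the linear-Gaussian (Laplace) approximation invoked in this record, the expected information gain has a closed form, which makes comparing candidate receiver layouts cheap. The sketch below uses a random sensitivity matrix as a stand-in for the linearized elastodynamic forward model; all dimensions and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_gain(G, sigma2=0.01, prior_var=1.0):
    """Expected information gain for a linearized Gaussian model y = G p + noise.

    Under the Laplace / linear-Gaussian approximation, the expected KL
    divergence from prior to posterior has the closed form
        EIG = 0.5 * logdet(I + (prior_var / sigma2) * G @ G.T).
    """
    m = G.shape[0]
    _, logdet = np.linalg.slogdet(np.eye(m) + (prior_var / sigma2) * G @ G.T)
    return 0.5 * logdet

# Toy sensitivities of receiver readings to 4 source parameters
# (location, moment components, start time, frequency) -- values invented.
G_all = rng.standard_normal((10, 4))
gain_3 = expected_gain(G_all[:3])    # candidate design with 3 receivers
gain_10 = expected_gain(G_all)       # candidate design with all 10 receivers
```

Because adding receivers adds a positive semidefinite term inside the determinant, the gain is monotone in the design, and the design question becomes where the marginal gain of another receiver stops being worth its cost.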

  19. Fast Bayesian Optimal Experimental Design for Seismic Source Inversion

    KAUST Repository

    Long, Quan

    2016-01-06

    We develop a fast method for optimally designing experiments [1] in the context of statistical seismic source inversion [2]. In particular, we efficiently compute the optimal number and locations of the receivers or seismographs. The seismic source is modeled by a point moment tensor multiplied by a time-dependent function. The parameters include the source location, moment tensor components, and start time and frequency in the time function. The forward problem is modeled by the elastic wave equations. We show that the Hessian of the cost functional, which is usually defined as the square of the weighted L2 norm of the difference between the experimental data and the simulated data, is proportional to the measurement time and the number of receivers. Consequently, the posterior distribution of the parameters, in a Bayesian setting, concentrates around the true parameters, and we can employ Laplace approximation and speed up the estimation of the expected Kullback-Leibler divergence (expected information gain), the optimality criterion in the experimental design procedure. Since the source parameters span several magnitudes, we use a scaling matrix for efficient control of the condition number of the original Hessian matrix. We use a second-order accurate finite difference method to compute the Hessian matrix and either sparse quadrature or Monte Carlo sampling to carry out numerical integration. We demonstrate the efficiency, accuracy, and applicability of our method on a two-dimensional seismic source inversion problem.

  20. A Bayesian Algorithm for Assessing Uncertainty in Radionuclide Source Terms

    Science.gov (United States)

    Robins, Peter

    2015-04-01

    Inferring source term parameters for a radionuclide release is difficult, due to the large uncertainties in forward dispersion modelling as a consequence of imperfect knowledge pertaining to wind vector fields and turbulent diffusion in the Earth's atmosphere. Additional sources of error include the radionuclide measurements obtained from sensors. These measurements may either be subject to random fluctuations or are simple indications that the true, unobserved quantity is below a detection limit. Consequent large reconstruction uncertainties can render a "best" estimate meaningless. A Markov Chain Monte Carlo (MCMC) Bayesian Algorithm is presented that attempts to account for uncertainties in atmospheric transport modelling and radionuclide sensor measurements to quantify uncertainties in radionuclide release source term parameters. Prior probability distributions are created for likely release locations at existing nuclear facilities and seismic events. Likelihood models are constructed using CTBTO adjoint modelling output and probability distributions of sensor response. Samples from the resulting multi-isotope source term parameters posterior probability distribution are generated that can be used to make probabilistic statements about the source term. Examples are given of marginal probability distributions obtained from simulated sensor data. The consequences of errors in numerical weather prediction wind fields are demonstrated with a reconstruction of the Fukushima nuclear reactor accident from International Monitoring System radionuclide particulate sensor data.

  1. Removal of micropollutants in source separated sanitation

    OpenAIRE

    Butkovskyi, A.

    2015-01-01

    Source separated sanitation is an innovative sanitation method designed for minimizing use of energy and clean drinking water, and maximizing reuse of water, organics and nutrients from waste water. This approach is based on separate collection and treatment of toilet wastewater (black water) and the rest of the domestic wastewater (grey water). Different characteristics of wastewater streams facilitate recovery of energy, nutrients and fresh water. To ensure agricultural or ecological reuse ...

  2. Blind Source Separation of Multispectral Astronomical Images

    Science.gov (United States)

    Bijaoui, Albert; Nuzillard, Danielle

Multispectral images allow pixels to be classified, but often with the drawback that each pixel value is the result of a combination of different sources. We examined the ability of Blind Source Separation (BSS) methods to restore the independent sources. We tested different tools on HST images of the Seyfert galaxy 3C120: the Karhunen-Loève expansion based on the diagonalization of the cross-correlation matrix, algorithms which maximize contrast functions, and programs which take into account the cross-correlation between shifted sources. With the last tools we obtained similar decompositions corresponding mainly to real phenomena. BSS can be considered as an interesting exploratory tool for astronomical data mining.

  3. Blind source separation theory and applications

    CERN Document Server

    Yu, Xianchuan; Xu, Jindong

    2013-01-01

A systematic exploration of both classic and contemporary algorithms in blind source separation with practical case studies. The book presents an overview of Blind Source Separation, a relatively new signal processing method. Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. on matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers

  4. Bayesian source term determination with unknown covariance of measurements

    Science.gov (United States)

    Belal, Alkomiet; Tichý, Ondřej; Šmídl, Václav

    2017-04-01

Determination of a source term of release of a hazardous material into the atmosphere is a very important task for emergency response. We are concerned with the problem of estimation of the source term in the conventional linear inverse problem, y = Mx, where the relationship between the vector of observations y and the unknown source term x is described by the source-receptor-sensitivity (SRS) matrix M. Since the system is typically ill-conditioned, the problem is recast as an optimization problem min_x (y - Mx)^T R^{-1} (y - Mx) + x^T B^{-1} x. The first term minimizes the error of the measurements with covariance matrix R, and the second term is a regularization of the source term. Different types of regularization arise for different choices of the matrices R and B; for example, Tikhonov regularization assumes the covariance matrix B to be the identity matrix multiplied by a scalar parameter. In this contribution, we adopt a Bayesian approach to make inference on the unknown source term x as well as the unknown R and B. We assume the prior on x to be Gaussian with zero mean and unknown diagonal covariance matrix B. The covariance matrix of the likelihood, R, is also unknown. We consider two potential choices of the structure of the matrix R: the first is a diagonal matrix, and the second is a locally correlated structure using information on the topology of the measuring network. Since the inference of the model is intractable, an iterative variational Bayes algorithm is used for simultaneous estimation of all model parameters. The practical usefulness of our contribution is demonstrated on an application of the resulting algorithm to real data from the European Tracer Experiment (ETEX). This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases by Inverse Atmospheric Dispersion Modelling (STRADI).
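With R and B held fixed, the penalized least-squares problem in this record has a standard closed-form solution, sketched below on a toy ill-conditioned system. The matrices and noise level are invented; the full method in the record goes further by inferring R and B themselves with variational Bayes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ill-conditioned SRS matrix M, sparse true source term x, noisy data y.
m_obs, n_src = 30, 15
M = rng.standard_normal((m_obs, n_src)) @ np.diag(np.logspace(0, -3, n_src))
x_true = np.zeros(n_src)
x_true[3], x_true[7] = 5.0, 2.0
y = M @ x_true + 0.01 * rng.standard_normal(m_obs)

# For fixed R and B the minimizer (the posterior mean) is
#   x_hat = (M^T R^-1 M + B^-1)^-1 M^T R^-1 y,
# which reduces to Tikhonov regularization when R = I and B^-1 = alpha * I.
R_inv = np.eye(m_obs)            # unit measurement covariance (assumed)
B_inv = 1e-2 * np.eye(n_src)     # isotropic prior precision (assumed)
x_hat = np.linalg.solve(M.T @ R_inv @ M + B_inv, M.T @ R_inv @ y)
```

The point of the Bayesian extension is visible even here: the answer depends strongly on the hand-picked R and B, so treating them as unknowns with their own posteriors removes that arbitrary tuning.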

  5. Direction-of-Arrival Estimation for Coherent Sources via Sparse Bayesian Learning

    Directory of Open Access Journals (Sweden)

    Zhang-Meng Liu

    2014-01-01

A spatial filtering-based relevance vector machine (RVM) is proposed in this paper to separate coherent sources and estimate their directions-of-arrival (DOA), with the filter parameters and DOA estimates initialized and refined via sparse Bayesian learning. The RVM is used to exploit the spatial sparsity of the incident signals and gain improved adaptability to demanding scenarios, such as low signal-to-noise ratio (SNR), limited snapshots, and spatially adjacent sources, and the spatial filters are introduced to enhance global convergence of the original RVM in the case of coherent sources. The proposed method adapts to arbitrary array geometry, and simulation results show that it surpasses the existing methods in DOA estimation performance.

  6. Bayesian Source Attribution of Salmonellosis in South Australia.

    Science.gov (United States)

    Glass, K; Fearnley, E; Hocking, H; Raupach, J; Veitch, M; Ford, L; Kirk, M D

    2016-03-01

    Salmonellosis is a significant cause of foodborne gastroenteritis in Australia, and rates of illness have increased over recent years. We adopt a Bayesian source attribution model to estimate the contribution of different animal reservoirs to illness due to Salmonella spp. in South Australia between 2000 and 2010, together with 95% credible intervals (CrI). We excluded known travel associated cases and those of rare subtypes (fewer than 20 human cases or fewer than 10 isolates from included sources over the 11-year period), and the remaining 76% of cases were classified as sporadic or outbreak associated. Source-related parameters were included to allow for different handling and consumption practices. We attributed 35% (95% CrI: 20-49) of sporadic cases to chicken meat and 37% (95% CrI: 23-53) of sporadic cases to eggs. Of outbreak-related cases, 33% (95% CrI: 20-62) were attributed to chicken meat and 59% (95% CrI: 29-75) to eggs. A comparison of alternative model assumptions indicated that biases due to possible clustering of samples from sources had relatively minor effects on these estimates. Analysis of source-related parameters showed higher risk of illness from contaminated eggs than from contaminated chicken meat, suggesting that consumption and handling practices potentially play a bigger role in illness due to eggs, considering low Salmonella prevalence on eggs. Our results strengthen the evidence that eggs and chicken meat are important vehicles for salmonellosis in South Australia. © 2015 Society for Risk Analysis.

  7. Joint Matrices Decompositions and Blind Source Separation

    Czech Academy of Sciences Publication Activity Database

    Chabriel, G.; Kleinsteuber, M.; Moreau, E.; Shen, H.; Tichavský, Petr; Yeredor, A.

    2014-01-01

    Roč. 31, č. 3 (2014), s. 34-43 ISSN 1053-5888 R&D Projects: GA ČR GA102/09/1278 Institutional support: RVO:67985556 Keywords : joint matrices decomposition * tensor decomposition * blind source separation Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 5.852, year: 2014 http://library.utia.cas.cz/separaty/2014/SI/tichavsky-0427607.pdf

  8. Blind source separation problem in GPS time series

    Science.gov (United States)

    Gualandi, A.; Serpelloni, E.; Belardinelli, M. E.

    2016-04-01

    A critical point in the analysis of ground displacement time series, such as those recorded by space geodetic techniques, is the development of data-driven methods that allow the different sources of deformation to be discerned and characterized in the space and time domains. Multivariate statistics includes several approaches that can be considered data-driven methods. A widely used technique is principal component analysis (PCA), which allows us to reduce the dimensionality of the data space while retaining most of the variance of the dataset. However, PCA does not perform well in finding the solution to the so-called blind source separation (BSS) problem, i.e., in recovering and separating the original sources that generate the observed data. This is mainly due to the fact that PCA minimizes the misfit calculated using an L2 norm (χ²), looking for a new Euclidean space where the projected data are uncorrelated. Independent component analysis (ICA) is a popular technique adopted to approach the BSS problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, we test the use of a modified variational Bayesian ICA (vbICA) method to recover the multiple sources of ground deformation even in the presence of missing data. The vbICA method models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions, allowing for more flexibility in the description of the pdf of the sources with respect to standard ICA and giving a more reliable estimate of them. Here we present its application to synthetic global positioning system (GPS) position time series, generated by simulating deformation near an active fault, including inter-seismic, co-seismic, and post-seismic signals, plus seasonal signals and noise, and an additional time-dependent volcanic source. We evaluate the ability of the PCA and ICA decomposition
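    The distinction drawn above between PCA (decorrelation under an L2 misfit) and ICA (statistical independence) can be made concrete with a toy experiment. The sketch below is illustrative only and is not the vbICA method of the record: it whitens two synthetic mixtures with PCA and then resolves the rotation that PCA leaves undetermined by a brute-force search for maximal non-Gaussianity (absolute excess kurtosis). All signals and the mixing matrix are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Two independent, non-Gaussian sources (hypothetical stand-ins for
# deformation signals): a sawtooth (sub-Gaussian) and Laplace noise
# (super-Gaussian), both standardized to zero mean and unit variance.
t = np.arange(n)
s = np.vstack([np.mod(t, 200) / 200.0 - 0.5,
               rng.laplace(size=n)])
s = (s - s.mean(axis=1, keepdims=True)) / s.std(axis=1, keepdims=True)

A = np.array([[0.8, 0.3], [0.4, 0.9]])             # unknown mixing matrix
x = A @ s                                          # observed mixtures

# PCA step: whiten the observations (decorrelate, unit variance).
w, V = np.linalg.eigh(np.cov(x))
z = np.diag(w ** -0.5) @ V.T @ (x - x.mean(axis=1, keepdims=True))

# ICA step: PCA leaves an orthogonal rotation undetermined; pick the
# rotation maximizing summed absolute excess kurtosis of the components.
def kurt(y):
    return np.mean(y ** 4, axis=1) - 3.0

def rot(th):
    return np.array([[np.cos(th), -np.sin(th)],
                     [np.sin(th), np.cos(th)]])

best_theta = max(np.linspace(0.0, np.pi / 2, 180),
                 key=lambda th: np.abs(kurt(rot(th) @ z)).sum())
y = rot(best_theta) @ z                            # estimated sources

# Each estimated component should match one true source up to sign/order.
corr = np.abs(np.corrcoef(np.vstack([s, y]))[:2, 2:])
print(corr.max(axis=1))                            # both close to 1
```

PCA alone (the whitened `z`) generally does not align with the sources; only the extra non-Gaussianity criterion recovers them, which is the point the abstract makes about the BSS problem.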

  9. Research on Factors Influencing Municipal Household Solid Waste Separate Collection: Bayesian Belief Networks

    Directory of Open Access Journals (Sweden)

    Zhujie Chu

    2016-02-01

    Full Text Available Municipal household solid waste (MHSW) has become a serious problem in China over the course of the last two decades, resulting in significant side effects for the environment. Therefore, effective management of MHSW has attracted wide attention from both researchers and practitioners. Separate collection, the first and crucial step in solving the MHSW problem, however, has not been thoroughly studied to date. An empirical survey was conducted among 387 households in Harbin, China in this study. We use a Bayesian belief network model to determine the factors influencing separate collection. Four types of factors are identified: political, economic, sociocultural and technological, based on the PEST (political, economic, social and technological) analytical method. In addition, we further analyze the influential power of the different factors, based on the network structure and probability changes obtained with the Netica software. Results indicate that the technological dimension has the greatest impact on MHSW separate collection, followed by the political dimension and the economic dimension; the sociocultural dimension impacts MHSW separate collection the least.
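    A Bayesian belief network of the kind used in the study encodes influencing factors as nodes with conditional probability tables and answers queries by probabilistic inference. The miniature network below is a hypothetical two-factor illustration, not the Netica model from the record, and every probability in it is made up: two parent nodes stand in for the technological and political dimensions, and inference by enumeration yields the posterior probability of the technological factor given observed separation behaviour.

```python
from itertools import product

# Hypothetical CPTs: Technological (T) and Political (P) factors
# influence household separate-collection behaviour (S).
p_T = {True: 0.4, False: 0.6}
p_P = {True: 0.5, False: 0.5}
p_S = {(True, True): 0.85, (True, False): 0.6,
       (False, True): 0.4, (False, False): 0.1}   # P(S=1 | T, P)

def posterior_T_given_S(s=True):
    # Enumerate the joint distribution and condition on the evidence S = s.
    num = den = 0.0
    for t, p in product([True, False], repeat=2):
        joint = p_T[t] * p_P[p] * (p_S[(t, p)] if s else 1 - p_S[(t, p)])
        den += joint
        if t:
            num += joint
    return num / den

print(round(posterior_T_given_S(True), 3))  # → 0.659
```

Comparing such posteriors under different evidence settings is, in miniature, how the study ranks the influential power of the PEST dimensions.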

  10. The Bayesian Power Imaging (BPI) method for magnetic source imaging

    OpenAIRE

    Hasson, R.; Swithenby, S. J.

    2001-01-01

    In the biomagnetic inverse problem the main interest is the activation of a region of interest, i.e. the power dissipated in that region. The Bayesian power imaging method (BPI) provides a quantified probability that the activation of a region of interest is above a given threshold. This paper introduces the method and derives the equations used. The method is illustrated in this paper using both experimental and simulated data.

  11. Blind Source Separation For Ion Mobility Spectra

    International Nuclear Information System (INIS)

    Marco, S.; Pomareda, V.; Pardo, A.; Kessler, M.; Goebel, J.; Mueller, G.

    2009-01-01

    Miniaturization is a powerful trend for smart chemical instrumentation in a diversity of applications. It is known that miniaturization in IMS leads to a degradation of the system characteristics. For the present work, we are interested in signal processing solutions to mitigate limitations introduced by the limited drift tube length, which basically involve a loss of chemical selectivity. While blind source separation techniques (BSS) are popular in other domains, their application to smart chemical instrumentation is limited. However, under some conditions, basically linearity, BSS may fully recover the concentration time evolution and the pure spectra with few underlying hypotheses. This is extremely helpful in conditions where unexpected chemical interferents may appear, or unwanted perturbations may pollute the spectra. SIMPLISMA has been advocated by Harrington et al. in several papers. However, more modern methods of BSS for bilinear decomposition with the restriction of positiveness have appeared in the last decade. In order to explore and compare the performances of those methods, a series of experiments was performed.

  12. Restrictive loads powered by separate or by common electrical sources

    Science.gov (United States)

    Appelbaum, J.

    1989-01-01

    In designing a multiple load electrical system, the designer may wish to compare the performance of two setups: a common electrical source powering all loads, or separate electrical sources powering individual loads. Three types of electrical sources powering resistive loads were analyzed for their performance in separate and common source systems: an ideal voltage source, an ideal current source, and a solar cell source. A mathematical proof is given, for each case, indicating the merit of the separate or common source system. The main conclusions are: (1) identical resistive loads powered by ideal voltage sources perform the same in both system setups, (2) nonidentical resistive loads powered by ideal voltage sources perform the same in both system setups, (3) nonidentical resistive loads powered by ideal current sources have higher performance in separate source systems, and (4) nonidentical resistive loads powered by solar cells have higher performance in a common source system for a wide range of load resistances.

  13. Sensitivity of fluvial sediment source apportionment to mixing model assumptions: A Bayesian model comparison.

    Science.gov (United States)

    Cooper, Richard J; Krueger, Tobias; Hiscock, Kevin M; Rawlins, Barry G

    2014-11-01

    Mixing models have become increasingly common tools for apportioning fluvial sediment load to various sediment sources across catchments using a wide variety of Bayesian and frequentist modeling approaches. In this study, we demonstrate how different model setups can impact upon resulting source apportionment estimates in a Bayesian framework via a one-factor-at-a-time (OFAT) sensitivity analysis. We formulate 13 versions of a mixing model, each with different error assumptions and model structural choices, and apply them to sediment geochemistry data from the River Blackwater, Norfolk, UK, to apportion suspended particulate matter (SPM) contributions from three sources (arable topsoils, road verges, and subsurface material) under base flow conditions between August 2012 and August 2013. Whilst all 13 models estimate subsurface sources to be the largest contributor of SPM (median ∼76%), comparison of apportionment estimates reveals varying degrees of sensitivity to changing priors, inclusion of covariance terms, incorporation of time-variant distributions, and methods of proportion characterization. We also demonstrate differences in apportionment results between a full and an empirical Bayesian setup, and between a Bayesian and a frequentist optimization approach. This OFAT sensitivity analysis reveals that mixing model structural choices and error assumptions can significantly impact upon sediment source apportionment results, with estimated median contributions in this study varying by up to 21% between model versions. Users of mixing models are therefore strongly advised to carefully consider and justify their choice of model structure prior to conducting sediment source apportionment investigations. Key points: (i) an OFAT sensitivity analysis of sediment fingerprinting mixing models is conducted; (ii) Bayesian models display high sensitivity to error assumptions and structural choices; (iii) source apportionment results differ between Bayesian and frequentist approaches.
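    The kind of Bayesian mixing model compared in the study can be sketched in a few lines: source tracer signatures combine linearly according to unknown proportions on the simplex, and a random-walk Metropolis sampler draws from the posterior over those proportions. The example below is a minimal illustration with invented signatures, a fixed known noise level, and a logistic-normal prior via a softmax reparameterization; it is not any of the 13 model versions from the record.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical tracer signatures (rows: 3 sources, cols: 4 tracers).
S = np.array([[10.0, 2.0, 5.0, 1.0],
              [ 3.0, 8.0, 1.0, 4.0],
              [ 6.0, 5.0, 9.0, 7.0]])
true_p = np.array([0.2, 0.3, 0.5])
sigma = 0.3
# Ten fluvial samples: mixture of source signatures plus measurement noise.
Y = true_p @ S + rng.normal(0.0, sigma, size=(10, 4))

def log_post(u):
    p = np.exp(u) / np.exp(u).sum()        # softmax -> proportions on simplex
    resid = Y - p @ S
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 - 0.5 * np.sum(u ** 2)

# Random-walk Metropolis over the unconstrained parameters u.
u = np.zeros(3)
lp = log_post(u)
samples = []
for i in range(20000):
    u_new = u + rng.normal(0.0, 0.1, size=3)
    lp_new = log_post(u_new)
    if np.log(rng.uniform()) < lp_new - lp:
        u, lp = u_new, lp_new
    if i >= 5000:                           # discard burn-in
        samples.append(np.exp(u) / np.exp(u).sum())

post = np.array(samples)
print(post.mean(axis=0))                    # near [0.2, 0.3, 0.5]
print(np.percentile(post, [2.5, 97.5], axis=0))  # 95% credible intervals
```

The OFAT sensitivity analysis of the study amounts to rerunning such a sampler while changing one modelling choice at a time (priors, covariance structure, error model) and comparing the resulting posteriors.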

  14. Source Signals Separation and Reconstruction Following Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    WANG Cheng

    2014-02-01

    Full Text Available For the problem of separating and reconstructing source signals from observed signals, the physical meaning of the blind source separation model and of independent component analysis is not very clear, and the solution is not unique. To address these disadvantages, a new linear instantaneous mixing model and a novel method for separating and reconstructing source signals from observed signals based on principal component analysis (PCA) are put forward. The new model assumes that the source signals are statistically uncorrelated rather than independent, which differs from the traditional blind source separation model. A one-to-one relationship between the linear instantaneous mixing matrix of the new model and the linear compound matrix of PCA, and a one-to-one relationship between the uncorrelated source signals and the principal components, are demonstrated using the concept of a linear separation matrix and the uncorrelatedness of the source signals. Based on this theoretical link, the source signal separation and reconstruction problem is transformed into a PCA of the observed signals. The theoretical derivation and numerical simulation results show that, despite Gaussian measurement noise, both the waveform and the amplitude information of uncorrelated source signals can be separated and reconstructed by PCA when the linear mixing matrix is column orthogonal and normalized; only the waveform information can be separated and reconstructed when the mixing matrix is column orthogonal but not normalized; and uncorrelated source signals cannot be separated and reconstructed by PCA when the mixing matrix is not column orthogonal or the mixing is not linear.
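    The central claim above, that PCA recovers uncorrelated sources when the mixing matrix is column orthogonal and normalized, can be checked numerically. The sketch below uses two invented, approximately uncorrelated sources with distinct variances and an orthonormal mixing matrix; the principal components of the observations then reproduce the sources up to sign and ordering.

```python
import numpy as np

n = 4000
t = np.linspace(0, 1, n)
# Uncorrelated (not necessarily independent) zero-mean sources with
# distinct variances, as the PCA-based model requires.
s = np.vstack([3.0 * np.sin(2 * np.pi * 50 * t),
               1.0 * np.sign(np.sin(2 * np.pi * 13 * t))])

theta = 0.7                                   # column-orthonormal mixing matrix
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x = A @ s                                     # observed signals

# PCA of the observations: eigenvectors of the covariance matrix.
w, V = np.linalg.eigh(np.cov(x))
order = np.argsort(w)[::-1]
y = V[:, order].T @ x                         # principal components

# Up to sign and ordering, the components reproduce the sources.
corr = np.abs(np.corrcoef(np.vstack([s, y]))[:2, 2:])
print(np.round(corr, 3))
```

Repeating the experiment with a non-orthogonal `A` breaks the correspondence, which matches the abstract's negative result.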

  15. Gradient Flow Convolutive Blind Source Separation

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Nielsen, Chinton Møller

    2004-01-01

    Experiments have shown that the performance of instantaneous gradient flow beamforming by Cauwenberghs et al. is reduced significantly in reverberant conditions. By expanding the gradient flow principle to convolutive mixtures, separation in a reverberant environment is possible. By use of a circ...

  16. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  17. Noise source separation of diesel engine by combining binaural sound localization method and blind source separation method

    Science.gov (United States)

    Yao, Jiachi; Xiang, Yang; Qian, Sichong; Li, Shengyang; Wu, Shaowei

    2017-11-01

    In order to separate and identify the combustion noise and the piston slap noise of a diesel engine, a noise source separation and identification method that combines a binaural sound localization method and a blind source separation method is proposed. During a diesel engine noise and vibration test, because a diesel engine has many complex noise sources, a lead covering method was applied to the engine to isolate interference noise from the No. 1-5 cylinders; only the No. 6 cylinder parts were left bare. Two microphones that simulated the human ears were utilized to measure the radiated noise signals 1 m away from the diesel engine. First, a binaural sound localization method was adopted to separate the noise sources that are in different places. Then, for noise sources that are in the same place, a blind source separation method is utilized to further separate and identify the noise sources. Finally, a coherence function method, continuous wavelet time-frequency analysis, and prior knowledge of the diesel engine are combined to further identify the separated sources. The results show that the proposed method can effectively separate and identify the combustion noise and the piston slap noise of a diesel engine. The frequencies of the combustion noise and the piston slap noise are concentrated at 4350 Hz and 1988 Hz, respectively. Compared with the blind source separation method alone, the proposed method has superior separation and identification performance, and the separation results contain fewer interference components from other noise sources.

  18. An Extended Bayesian Framework for Atrial and Ventricular Activity Separation in Atrial Fibrillation.

    Science.gov (United States)

    Roonizi, Ebadollah Kheirati; Sassi, Roberto

    2017-11-01

    An extended nonlinear Bayesian filtering framework is introduced for the analysis of atrial fibrillation (AF), in particular with single-channel electrocardiographic (ECG) recordings. It is suitable for simultaneously tracking the fundamental frequency of atrial fibrillatory waves (f-waves) and separating the signals linked to atrial and ventricular activity during AF. In this framework, high-power ECG components, i.e., the Q, R, S, and T waves, are modeled using a sum of Gaussian functions. The atrial activity dynamical model is instead based on a trigonometric function with a fundamental frequency (the inverse of the dominant atrial cycle length) and its harmonics. The state variables of both dynamical models (QRS-T and f-waves) are hidden and then estimated, sample by sample, using a Kalman smoother. Remarkably, the scheme is capable of separating ventricular and atrial activity signals while simultaneously tracking the atrial fundamental frequency in time. The proposed method was evaluated using synthetic signals. In 290 ECGs in sinus rhythm from the PhysioNet PTB Diagnostic ECG Database, the P-waves were replaced with a synthetic f-wave. Broadband noise at different signal-to-noise ratios (SNR) (from 0 to 40 dB) was added to study the performance of the filter under different SNR conditions. The results of the study demonstrated superior performance in atrial and ventricular signal separation when compared with traditional average beat subtraction (ABS), one of the most widely used methods for QRS-T cancellation (normalized mean square error = 0.045 for the extended Kalman smoother (EKS) and 0.18 for ABS; the SNR improvement in f-wave extraction was 21.1 dB for EKS and 12.2 dB for ABS).
Various advantages of the proposed method have been addressed and demonstrated, including the problem of tracking the fundamental frequency of f-waves (root mean square error (RMSE) Hz for gradually changing frequency at SNR=15 dB) and of estimating robust QT/RR values during AF (RMSE at
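    The average beat subtraction (ABS) baseline mentioned above is simple enough to sketch directly: beats are aligned on the R peak, averaged, and the average is subtracted so that the residual approximates the atrial signal. The example below is a hypothetical illustration with a fixed RR interval, a Gaussian-bump QRS-T template, and a sinusoidal stand-in for the f-wave; real ABS must additionally handle R-peak detection and variable RR intervals.

```python
import numpy as np

rng = np.random.default_rng(3)
fs = 250                                       # sampling rate, Hz
beat = 200                                     # fixed RR interval, samples
n_beats = 30
n = beat * n_beats

# Hypothetical single-channel AF signal: a repeating QRS-T template
# (ventricular activity) plus a 6 Hz sinusoidal f-wave (atrial activity).
k = np.arange(beat)
template = 1.5 * np.exp(-0.5 * ((k - 60) / 5.0) ** 2) \
         + 0.3 * np.exp(-0.5 * ((k - 130) / 15.0) ** 2)   # QRS + T wave
t = np.arange(n) / fs
fwave = 0.15 * np.sin(2 * np.pi * 6.0 * t)
ecg = np.tile(template, n_beats) + fwave + 0.01 * rng.normal(size=n)

# Average beat subtraction: average the R-aligned beats, subtract the
# average from every beat; the residual approximates the atrial signal,
# because the f-wave's phase varies from beat to beat and averages out.
beats = ecg.reshape(n_beats, beat)
avg = beats.mean(axis=0)
atrial_est = (beats - avg).ravel()

err = np.sqrt(np.mean((atrial_est - fwave) ** 2)) / fwave.std()
print(round(float(err), 3))                    # small relative error
```

The extended Kalman smoother of the record improves on this by modelling the f-wave dynamics explicitly instead of relying on phase averaging.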

  19. Bayesian Blind Separation and Deconvolution of Dynamic Image Sequences Using Sparsity Priors

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej; Šmídl, Václav

    2015-01-01

    Roč. 34, č. 1 (2015), s. 258-266 ISSN 0278-0062 R&D Projects: GA ČR GA13-29225S Keywords : Functional imaging * Blind source separation * Computer-aided detection and diagnosis * Probabilistic and statistical methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.756, year: 2015 http://library.utia.cas.cz/separaty/2014/AS/tichy-0431090.pdf

  20. Knowledge, Attitude and Practice Survey of Source-Separation of ...

    African Journals Online (AJOL)

    Source-separation is a solid waste management strategy which aids recycling. This concept is relatively new in Nigeria. The study therefore documented the Knowledge, Attitude and Practice of Source-separation among workers such as Non- Academic Staff and Business Operators at the University of Ibadan, Nigeria.

  1. Sediment source apportionment in Laurel Hill Creek, PA, using Bayesian chemical mass balance and isotope fingerprinting

    Science.gov (United States)

    Stewart, Heather; Massoudieh, Arash; Gellis, Allen C.

    2015-01-01

    A Bayesian chemical mass balance (CMB) approach was used to assess the contribution of potential sources for fluvial samples from Laurel Hill Creek in southwest Pennsylvania. The Bayesian approach provides joint probability density functions of the sources' contributions considering the uncertainties due to source and fluvial sample heterogeneity and measurement error. Both elemental profiles of sources and fluvial samples and 13C and 15N isotopes were used for source apportionment. The sources considered include stream bank erosion, forest, roads and agriculture (pasture and cropland). Agriculture was found to have the largest contribution, followed by stream bank erosion. Also, road erosion was found to have a significant contribution in three of the samples collected during lower-intensity rain events. The source apportionment was performed with and without isotopes. The results were largely consistent; however, the use of isotopes was found to slightly increase the uncertainty in most of the cases. The correlation analysis between the contributions of sources shows strong correlations between stream bank and agriculture, whereas roads and forest seem to be less correlated to other sources. Thus, the method was better able to estimate road and forest contributions independently. The hypothesis that the contributions of sources are not seasonally changing was tested by assuming that all ten fluvial samples had the same source contributions. This hypothesis was rejected, demonstrating a significant seasonal variation in the sources of sediments in the stream.

  2. Bayesian spatial filters for source signal extraction: a study in the peripheral nerve.

    Science.gov (United States)

    Tang, Y; Wodlinger, B; Durand, D M

    2014-03-01

    The ability to extract physiological source signals to control various prosthetics offers tremendous therapeutic potential to improve the quality of life of patients suffering from motor disabilities. Regardless of the modality, recordings of physiological source signals are contaminated with noise and interference, along with crosstalk between the sources. These impediments render the task of isolating potential physiological source signals for control difficult. In this paper, a novel Bayesian Source Filter for signal Extraction (BSFE) algorithm for extracting physiological source signals for control is presented. The BSFE algorithm is based on the source localization method Champagne and constructs spatial filters using Bayesian methods that simultaneously maximize the signal-to-noise ratio of the recovered source signal of interest while minimizing crosstalk interference between sources. When evaluated on peripheral nerve recordings obtained in vivo, the algorithm achieved the highest signal-to-noise-interference ratio (7.00 ± 3.45 dB) amongst the group of methodologies compared, with an average correlation between the extracted and original source signals of R = 0.93. The results support the efficacy of the BSFE algorithm for extracting source signals from the peripheral nerve.

  3. Bayesian image processing of data from fuzzy pattern sources

    International Nuclear Information System (INIS)

    Liang, Z.; Hart, H.

    1986-01-01

    In some radioisotopic organ image applications, a priori or supplementary source information may exist and can be characterized in terms of probability density functions P(φ) of the source elements {φ_j} = φ (where φ_j (j = 1, 2, …, α) is the estimated average photon emission in voxel j per unit time at t = 0). For example, in cardiac imaging studies it is possible to evaluate the radioisotope concentration of the blood filling the cardiac chambers independently as a function of time by peripheral measurement. The blood concentration information in effect serves to limit amplitude uncertainty to the chamber boundary voxels and thus reduces the extent of amplitude ambiguities in the overall cardiac imaging reconstruction. The a priori or supplementary information may more generally be spatial, amplitude-dependent probability distributions P(φ), fuzzy patterns superimposed upon a background

  4. Bayesian Inference for Source Reconstruction: A Real-World Application

    Science.gov (United States)

    2014-09-25

    for the reconstruction of the location and emission rate from an actual contaminant source (emission from the Chalk River Laboratories medical isotope...) ... consists of a comprehensive network of seismic, hydroacoustic, infrasound, and radionuclide sensors as part of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty. (Figure 1: Locations of the three sampling stations; the source was at Chalk River Laboratories.)

  5. Open Source Bayesian Models. 1. Application to ADME/Tox and Drug Discovery Datasets

    Science.gov (United States)

    2015-01-01

    On the order of hundreds of absorption, distribution, metabolism, excretion, and toxicity (ADME/Tox) models have been described in the literature in the past decade which are more often than not inaccessible to anyone but their authors. Public accessibility is also an issue with computational models for bioactivity, and the ability to share such models still remains a major challenge limiting drug discovery. We describe the creation of a reference implementation of a Bayesian model-building software module, which we have released as an open source component that is now included in the Chemistry Development Kit (CDK) project, as well as implemented in the CDD Vault and in several mobile apps. We use this implementation to build an array of Bayesian models for ADME/Tox, in vitro and in vivo bioactivity, and other physicochemical properties. We show that these models possess cross-validation receiver operating characteristic curve values comparable to those generated previously in prior publications using alternative tools. We have now described how the implementation of Bayesian models with FCFP6 descriptors generated in the CDD Vault enables the rapid production of robust machine learning models from public data or the user's own datasets. The current study sets the stage for generating models in proprietary software (such as CDD) and exporting these models in a format that could be run in open source software using CDK components. This work also demonstrates that we can enable biocomputation across distributed private or public datasets to enhance drug discovery. PMID:25994950
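    The "Bayesian models" of this cheminformatics tradition are typically Laplace-smoothed naive Bayes classifiers over binary structural fingerprints. The sketch below implements that basic scheme on invented random data; it is not the CDK implementation, and the 64 toy features merely stand in for FCFP6 fingerprint bits.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy stand-in for a fingerprint dataset: 200 "compounds", 64 binary
# features; actives (y = 1) are enriched in the first 8 bits.
n, d = 200, 64
y = rng.integers(0, 2, size=n)
p_bit = np.full((n, d), 0.1)
p_bit[y == 1, :8] = 0.6
X = (rng.uniform(size=(n, d)) < p_bit).astype(float)

# Laplace-smoothed Bernoulli naive Bayes over the fingerprint bits.
def fit(X, y):
    priors = np.array([(y == c).mean() for c in (0, 1)])
    probs = np.array([(X[y == c].sum(axis=0) + 1.0) / ((y == c).sum() + 2.0)
                      for c in (0, 1)])                 # P(bit = 1 | class)
    return priors, probs

def predict(X, priors, probs):
    ll = (X @ np.log(probs).T + (1 - X) @ np.log(1 - probs).T
          + np.log(priors))                             # log joint per class
    return ll.argmax(axis=1)

priors, probs = fit(X[:150], y[:150])
acc = (predict(X[150:], priors, probs) == y[150:]).mean()
print(round(float(acc), 2))
```

On enriched bits like these the classifier separates actives from inactives well, which is the behaviour such ADME/Tox models rely on at much larger scale.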

  6. Comparison of methods for separating vibration sources in rotating machinery

    Science.gov (United States)

    Klein, Renata

    2017-12-01

    Vibro-acoustic signatures are widely used for diagnostics of rotating machinery. Vibration-based automatic diagnostics systems need to achieve a good separation between signals generated by different sources. The separation task may be challenging, since the effects of the different vibration sources often overlap. In particular, there is a need to separate signals related to the natural frequencies of the structure from signals resulting from the rotating components (signal whitening), as well as a need to separate signals generated by asynchronous components like bearings from signals generated by cyclo-stationary components like gears. Several methods have been proposed to achieve the above separation tasks. The present study compares some of these methods. The paper also presents a new method for whitening, Adaptive Clutter Separation, as well as a new efficient algorithm, dephase, which separates asynchronous from cyclo-stationary signals. For whitening, the study compares liftering of the high quefrencies with adaptive clutter separation. For separating the asynchronous from the cyclo-stationary signals, the study compares liftering in the quefrency domain with dephase. The methods are compared using both simulated signals and real data.
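    Whitening by liftering, one of the methods compared above, can be sketched as follows: the low-quefrency part of the real cepstrum carries the smooth spectral envelope (the structural resonances), so zeroing it while retaining the high quefrencies, where the rotating component's rahmonics sit, flattens the envelope. All signal parameters below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 4096
# Synthetic machine vibration: an impulse train (rotating component) plus
# broadband noise, both shaped by a structural resonance modelled as a
# decaying-sinusoid impulse response.
x = np.zeros(n)
x[::64] = 1.0                                   # impulses every 64 samples
m = np.arange(256)
h = np.exp(-m / 30.0) * np.sin(2 * np.pi * 0.12 * m)
v = np.convolve(x + 0.2 * rng.normal(size=n), h)[:n]

# Liftering: zero the low quefrencies of the real cepstrum; the rahmonics
# of the impulse train lie at quefrency 64 and its multiples, above cutoff.
log_spec = np.log(np.abs(np.fft.fft(v)) + 1e-12)
ceps = np.fft.ifft(log_spec).real
cutoff = 32                                     # below the first rahmonic
ceps[1:cutoff] = 0.0
ceps[-cutoff + 1:] = 0.0                        # keep the cepstrum symmetric
white_log_spec = np.fft.fft(ceps).real

# The smooth envelope of the whitened log-spectrum should be much flatter.
def envelope(ls, win=65):
    return np.convolve(ls, np.ones(win) / win, mode='same')

ratio = np.std(envelope(white_log_spec[: n // 2])) \
      / np.std(envelope(log_spec[: n // 2]))
print(round(float(ratio), 2))
```

The resonance envelope is removed while the harmonic comb of the rotating component survives, which is the separation the abstract calls signal whitening.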

  7. Source separation of household waste: A case study in China

    International Nuclear Information System (INIS)

    Zhuang Ying; Wu Songwei; Wang Yunlong; Wu Weixiang; Chen Yingxu

    2008-01-01

    A pilot program concerning source separation of household waste was launched in Hangzhou, the capital city of Zhejiang province, China. Detailed investigations of the composition and properties of household waste in the experimental communities revealed that high water content and a high percentage of food waste are the main limiting factors in the recovery of recyclables, especially paper, from household waste, and the main contributors to the high cost and low efficiency of waste disposal. On the basis of the investigation, a novel source separation method, according to which household waste is classified as food waste, dry waste and harmful waste, was proposed and performed in four selected communities. In addition, a corresponding household waste management system that involves all stakeholders, a recovery system and a mechanical dehydration system for food waste were established to promote source separation activity. Performance and questionnaire survey results showed that the active support and investment of a real estate company and a community residential committee play important roles in enhancing public participation and awareness of the importance of waste source separation. In comparison with the conventional mixed collection and transportation system for household waste, the established source separation and management system is cost-effective. It could be extended to the entire city and used by other cities in China as a source of reference

  8. A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)

    Science.gov (United States)

    Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.

    2017-12-01

    Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters, and often consider geodetic and seismic data jointly. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore the high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimation, we undertook the effort of developing BEAT, a python package that comprises all the above-mentioned features in one single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we

  9. Atmospheric Dispersion Unknown Source Parameters Determination Using AERMOD and Bayesian Inference Along Markov Chain Monte Carlo

    International Nuclear Information System (INIS)

    Haghighattalab, A.; Zolfaghari, A. R.; Minouchehr, A. H.; Kiya, H. A.

    2012-01-01

    Occurrence of hazardous accidents in nuclear power plants and industrial units usually leads to the release of radioactive materials and pollutants into the environment. These materials and pollutants can be transported far downstream by the wind flow. In this paper, we implemented an atmospheric dispersion code to solve the inverse problem. Having received and detected the pollutants in one region, we may estimate the rate and location of the unknown source. For the modeling, one needs a model capable of atmospheric dispersion calculation. Furthermore, it is required to implement a mathematical approach to infer the source location and the related rates. In this paper, the AERMOD software and Bayesian inference along with Markov chain Monte Carlo have been applied. Implementing the Bayesian approach and Markov chain Monte Carlo for this problem is not a new approach, but coupling them with AERMOD, a well-known regulatory model, enhances the reliability of the outcomes. To evaluate the method, an example is considered by defining pollutant concentrations in a specific region and then obtaining the source location and intensity by a direct calculation. The result of the calculation estimates the average source location at a distance of 7 km with an accuracy of 5 m, which is good enough to support the ability of the proposed algorithm.
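    The inversion described above can be emulated with a far simpler forward model. The sketch below substitutes a crude Gaussian-plume formula for AERMOD; all dispersion coefficients, receptor positions, priors, and noise levels are invented. A random-walk Metropolis sampler then infers the source's emission rate and location from synthetic detections.

```python
import numpy as np

rng = np.random.default_rng(5)

# Simplified Gaussian-plume forward model (a stand-in for AERMOD):
# ground-level concentration from a ground-level point source at (x0, y0)
# emitting Q g/s into a wind blowing along +x at 3 m/s.
def plume(Q, x0, y0, rx, ry):
    dx, dy = rx - x0, ry - y0
    dx = np.maximum(dx, 1.0)                   # receptors downwind only
    sy, sz = 0.08 * dx, 0.06 * dx              # crude dispersion coefficients
    return Q / (np.pi * 3.0 * sy * sz) * np.exp(-0.5 * (dy / sy) ** 2)

# Synthetic detections at fixed receptors for a "true" unknown source.
rx = np.array([800., 1200., 1600., 2000., 2400.])
ry = np.array([-120., 60., -40., 150., 20.])
true = (2.0, 100.0, 50.0)                      # Q, x0, y0
obs = plume(*true, rx, ry) * np.exp(rng.normal(0.0, 0.1, rx.size))

def log_post(q, x0, y0):
    if not (0 < q < 20 and 0 < x0 < 700 and -300 < y0 < 300):
        return -np.inf                         # flat priors on a box
    r = np.log(obs) - np.log(plume(q, x0, y0, rx, ry))
    return -0.5 * np.sum(r ** 2) / 0.1 ** 2    # lognormal measurement error

# Random-walk Metropolis over (Q, x0, y0).
th = np.array([1.0, 300.0, 0.0])
lp = log_post(*th)
chain = []
for i in range(30000):
    prop = th + rng.normal(0.0, 1.0, 3) * np.array([0.1, 10.0, 5.0])
    lp_new = log_post(*prop)
    if np.log(rng.uniform()) < lp_new - lp:
        th, lp = prop, lp_new
    if i >= 10000:
        chain.append(th.copy())

est = np.array(chain).mean(axis=0)
print(np.round(est, 1))                        # near the true (Q, x0, y0)
```

The record's approach is the same in structure, with AERMOD as the forward model inside the likelihood instead of this toy plume.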

  10. Bayesian Model Development for Analysis of Open Source Information to Support the Assessment of Nuclear Programs

    Energy Technology Data Exchange (ETDEWEB)

    Gastelum, Zoe N.; Whitney, Paul D.; White, Amanda M.

    2013-07-15

    Pacific Northwest National Laboratory has spent several years researching, developing, and validating large Bayesian network models to support integration of open source data sets for nuclear proliferation research. Our current work focuses on generating a set of interrelated models for multi-source assessment of nuclear programs, as opposed to a single comprehensive model. By using this approach, we can break down the models to cover logical sub-problems that can utilize different expertise and data sources. This approach allows researchers to utilize the models individually or in combination to detect and characterize a nuclear program and identify data gaps. The models operate at various levels of granularity, covering a combination of state-level assessments with more detailed models of site or facility characteristics. This paper will describe the current open source-driven, nuclear nonproliferation models under development, the pros and cons of the analytical approach, and areas for additional research.

  11. Extended Nonnegative Tensor Factorisation Models for Musical Sound Source Separation

    Directory of Open Access Journals (Sweden)

    Derry FitzGerald

    2008-01-01

    Recently, shift-invariant tensor factorisation algorithms have been proposed for the purposes of sound source separation of pitched musical instruments. In practice, however, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency, which causes problems when attempting to resynthesise the separated sources. Further, it is difficult to impose harmonicity constraints on the recovered basis functions. This paper proposes a new additive-synthesis-based approach which allows the use of linear-frequency spectrograms as well as imposing strict harmonic constraints, resulting in an improved model. These additional constraints also allow the addition of a source-filter model to the factorisation framework, and an extended model which is capable of separating mixtures of pitched and percussive instruments simultaneously.

  12. Extended nonnegative tensor factorisation models for musical sound source separation.

    Science.gov (United States)

    FitzGerald, Derry; Cranitch, Matt; Coyle, Eugene

    2008-01-01

    Recently, shift-invariant tensor factorisation algorithms have been proposed for the purposes of sound source separation of pitched musical instruments. In practice, however, existing algorithms require the use of log-frequency spectrograms to allow shift invariance in frequency, which causes problems when attempting to resynthesise the separated sources. Further, it is difficult to impose harmonicity constraints on the recovered basis functions. This paper proposes a new additive-synthesis-based approach which allows the use of linear-frequency spectrograms as well as imposing strict harmonic constraints, resulting in an improved model. These additional constraints also allow the addition of a source-filter model to the factorisation framework, and an extended model which is capable of separating mixtures of pitched and percussive instruments simultaneously.
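
    A minimal sketch of the strict-harmonicity idea (illustrative only, not the authors' algorithm): fix a dictionary of harmonic combs on a linear frequency axis and learn only the activations with multiplicative NMF updates, so the inactive candidate pitch receives no energy:

```python
import numpy as np

rng = np.random.default_rng(8)

F, T = 120, 30
# Strictly harmonic basis: for each candidate pitch f0 (a bin index),
# a comb of partials with decaying amplitudes on a linear frequency axis
pitches = [10, 13, 17]
W = np.zeros((F, len(pitches)))
for k, f0 in enumerate(pitches):
    for h in range(1, F // f0 + 1):
        W[h * f0 - 1, k] = 1.0 / h

# Two active notes with random envelopes form the toy "spectrogram"
H_true = np.zeros((3, T))
H_true[0] = np.abs(rng.normal(size=T))
H_true[2] = np.abs(rng.normal(size=T))
V = W @ H_true

# Only activations are learned; the harmonic dictionary stays fixed
H = rng.random((3, T)) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)

energy = H.sum(axis=1)
print("activation energy per candidate pitch:", np.round(energy, 2))
```

    Because the combs here do not overlap, the middle pitch's activation is driven exactly to zero and the active envelopes are recovered; real spectra share partials, which is what the paper's fuller model addresses.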

  13. Algorithms for Source Separation - with Cocktail Party Applications

    DEFF Research Database (Denmark)

    Olsson, Rasmus Kongsgaard

    2007-01-01

    In this thesis, a number of possible solutions to source separation are suggested. Although they differ significantly in shape and intent, they share a heavy reliance on prior domain knowledge. Most of the developed algorithms are intended for speech applications, and hence, structural features... of speech in a time-frequency representation. My own contributions are based on learning dictionaries for each speaker separately and subsequently applying a concatenation of these dictionaries to separate a mixture. Sparse decompositions required for the decomposition are computed using nonnegative matrix...
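
    The dictionary-based separation idea can be sketched as follows (a toy stand-in, not the thesis implementation: the "spectrograms" are synthetic, and the sparse decomposition step is plain multiplicative NMF). Per-speaker dictionaries are learned from isolated data, concatenated, and used to decompose the mixture:

```python
import numpy as np

rng = np.random.default_rng(1)

def nmf(V, k, n_iter=300):
    # Multiplicative-update NMF: V ~ W @ H with nonnegative factors
    W = rng.random((V.shape[0], k)) + 0.1
    H = rng.random((k, V.shape[1])) + 0.1
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)
    return W, H

# Toy magnitude "spectrograms" for two speakers with disjoint spectral peaks
F, T = 30, 40
W1_true = np.zeros((F, 2)); W1_true[3, 0] = W1_true[8, 1] = 1.0
W2_true = np.zeros((F, 2)); W2_true[15, 0] = W2_true[22, 1] = 1.0
V1 = W1_true @ np.abs(rng.normal(size=(2, T)))
V2 = W2_true @ np.abs(rng.normal(size=(2, T)))

# 1) Learn a small dictionary per speaker from isolated training data
W1, _ = nmf(V1, 2)
W2, _ = nmf(V2, 2)

# 2) Decompose the mixture on the concatenated dictionary (W held fixed)
V = V1 + V2
W = np.hstack([W1, W2])
H = rng.random((4, T)) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-9)

# 3) Reconstruct each source from its own dictionary columns
V1_hat = W[:, :2] @ H[:2]
err = np.linalg.norm(V1 - V1_hat) / np.linalg.norm(V1)
print("relative reconstruction error for speaker 1:", round(float(err), 4))
```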

  14. [Processing GC-FTIR by the blind source separation].

    Science.gov (United States)

    Yao, Zhi-Xiang; Huang, Hong; Liu, Huan-Bin

    2006-08-01

    An analysis method for separating overlapped chromatographic peaks and purifying infrared spectra is put forward, based on the blind source separation technique and the multi-dimensional data of GC-FTIR. Using the various information available from hyphenated instruments, the method completely separated an organic mixture, the xylene isomer system, a problem that is usually unsolvable. The results confirm the soundness of the theory and algorithm and give an integral interpretation of the independent component analysis data. The sources of error in quantitative analysis are also discussed.

  15. Common source-multiple load vs. separate source-individual load photovoltaic system

    Science.gov (United States)

    Appelbaum, Joseph

    1989-01-01

    A comparison of system performance is made for two possible system setups: (1) individual loads powered by separate solar cell sources; and (2) multiple loads powered by a common solar cell source. A proof for resistive loads is given that shows the advantage of a common source over a separate source photovoltaic system for a large range of loads. For identical loads, both systems perform the same.

  16. The Lifecycle of Bayesian Network Models Developed for Multi-Source Signature Assessment of Nuclear Programs

    Energy Technology Data Exchange (ETDEWEB)

    Gastelum, Zoe N.; White, Amanda M.; Whitney, Paul D.; Gosink, Luke J.; Sego, Landon H.

    2013-06-04

    The Multi-Source Signatures for Nuclear Programs project, part of Pacific Northwest National Laboratory’s (PNNL) Signature Discovery Initiative, seeks to computationally capture expert assessment of multi-type information such as text, sensor output, imagery, or audio/video files, to assess nuclear activities through a series of Bayesian network (BN) models. These models incorporate knowledge from a diverse range of information sources in order to help assess a country’s nuclear activities. The models span engineering topic areas, state-level indicators, and facility-specific characteristics. To illustrate the development, calibration, and use of BN models for multi-source assessment, we present a model that predicts a country’s likelihood to participate in the international nuclear nonproliferation regime. We validate this model by examining the extent to which the model assists non-experts in arriving at conclusions similar to those provided by nuclear proliferation experts. We also describe the PNNL-developed software used throughout the lifecycle of the Bayesian network model development.

  17. The Lifecycle of Bayesian Network Models Developed for Multi-Source Signature Assessment of Nuclear Programs

    International Nuclear Information System (INIS)

    Gastelum, Zoe N.; White, Amanda M.; Whitney, Paul D.; Gosink, Luke J.; Sego, Landon H.

    2013-01-01

    The Multi-Source Signatures for Nuclear Programs project, part of Pacific Northwest National Laboratory's (PNNL) Signature Discovery Initiative, seeks to computationally capture expert assessment of multi-type information such as text, sensor output, imagery, or audio/video files, to assess nuclear activities through a series of Bayesian network (BN) models. These models incorporate knowledge from a diverse range of information sources in order to help assess a country's nuclear activities. The models span engineering topic areas, state-level indicators, and facility-specific characteristics. To illustrate the development, calibration, and use of BN models for multi-source assessment, we present a model that predicts a country's likelihood to participate in the international nuclear nonproliferation regime. We validate this model by examining the extent to which the model assists non-experts in arriving at conclusions similar to those provided by nuclear proliferation experts. We also describe the PNNL-developed software used throughout the lifecycle of the Bayesian network model development.

  18. A Bayesian approach to quantify the contribution of animal-food sources to human salmonellosis

    DEFF Research Database (Denmark)

    Hald, Tine; Vose, D.; Wegener, Henrik Caspar

    2004-01-01

    Based on the data from the integrated Danish Salmonella surveillance in 1999, we developed a mathematical model for quantifying the contribution of each of the major animal-food sources to human salmonellosis. The model was set up to calculate the number of domestic and sporadic cases caused...... salmonellosis was also included. The joint posterior distribution was estimated by fitting the model to the reported number of domestic and sporadic cases per Salmonella type in a Bayesian framework using Markov Chain Monte Carlo simulation. The number of domestic and sporadic cases was obtained by subtracting.......8-10.4%) of the cases, respectively. Taken together, imported foods were estimated to account for 11.8% (95% CI: 5.0-19.0%) of the cases. Other food sources considered had only a minor impact, whereas 25% of the cases could not be associated with any source. This approach of quantifying the contribution of the various...
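
    A heavily simplified stand-in for this kind of source-attribution model (the structure and all numbers are invented for illustration; the published model has many more types, sources and prior terms): Poisson case counts per Salmonella type are explained by source amounts, type prevalences and per-source "ability" parameters inferred by Metropolis sampling, after which cases are attributed to sources:

```python
import numpy as np

rng = np.random.default_rng(9)

# Prevalence of 4 Salmonella types in 3 food sources, and amounts consumed
prev = np.array([[0.5, 0.1, 0.0],
                 [0.3, 0.2, 0.1],
                 [0.2, 0.3, 0.4],
                 [0.0, 0.4, 0.5]])
amount = np.array([200.0, 300.0, 100.0])

# True source-dependent "ability to cause disease" parameters
a_true = np.array([2.0, 0.5, 1.0])
lam_true = prev @ (amount * a_true)
cases = rng.poisson(lam_true)          # observed case counts per type

def log_post(log_a):
    # Poisson log-likelihood; flat prior on log a in this sketch
    lam = prev @ (amount * np.exp(log_a))
    return np.sum(cases * np.log(lam) - lam)

log_a = np.zeros(3)
lp = log_post(log_a)
samples = []
for i in range(12000):
    prop = log_a + rng.normal(0.0, 0.05, 3)
    lpp = log_post(prop)
    if np.log(rng.random()) < lpp - lp:
        log_a, lp = prop, lpp
    if i > 2000:
        samples.append(np.exp(log_a))

a_hat = np.mean(samples, axis=0)
share = amount * a_hat * prev.sum(axis=0)   # expected cases per source
share /= share.sum()
print("attributed case shares per source:", np.round(share, 3))
```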

  19. Application of hierarchical Bayesian unmixing models in river sediment source apportionment

    Science.gov (United States)

    Blake, Will; Smith, Hugh; Navas, Ana; Bodé, Samuel; Goddard, Rupert; Zou Kuzyk, Zou; Lennard, Amy; Lobb, David; Owens, Phil; Palazon, Leticia; Petticrew, Ellen; Gaspar, Leticia; Stock, Brian; Boeckx, Pacsal; Semmens, Brice

    2016-04-01

    Fingerprinting and unmixing concepts are used widely across environmental disciplines for forensic evaluation of pollutant sources. In aquatic and marine systems, this includes tracking the source of organic and inorganic pollutants in water and linking problem sediment to soil erosion and land use sources. It is, however, the particular complexity of ecological systems that has driven creation of the most sophisticated mixing models, primarily to (i) evaluate diet composition in complex ecological food webs, (ii) inform population structure and (iii) explore animal movement. In the context of the new hierarchical Bayesian unmixing model, MixSIAR, developed to characterise intra-population niche variation in ecological systems, we evaluate the linkage between ecological 'prey' and 'consumer' concepts and river basin sediment 'source' and sediment 'mixtures' to exemplify the value of ecological modelling tools to river basin science. Recent studies have outlined advantages presented by Bayesian unmixing approaches in handling complex source and mixture datasets while dealing appropriately with uncertainty in parameter probability distributions. MixSIAR is unique in that it allows individual fixed and random effects associated with mixture hierarchy, i.e. factors that might exert an influence on model outcome for mixture groups, to be explored within the source-receptor framework. This offers new and powerful ways of interpreting river basin apportionment data. In this contribution, key components of the model are evaluated in the context of common experimental designs for sediment fingerprinting studies, namely simple, nested and distributed catchment sampling programmes. Illustrative examples using geochemical and compound-specific stable isotope datasets are presented and used to discuss best practice with specific attention to (1) the tracer selection process, (2) incorporation of fixed effects relating to sample timeframe and sediment type in the modelling
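
    The source-mixture unmixing idea can be sketched with a minimal Bayesian mixing model (illustrative only, not MixSIAR: two tracers, three sources, a flat prior via a softmax parameterisation, and no fixed or random effects):

```python
import numpy as np

rng = np.random.default_rng(2)

# Tracer signatures (mean, sd) for three sediment sources over two tracers
src_mean = np.array([[10.0, 1.0], [1.0, 10.0], [5.0, 5.0]])
src_sd = np.full((3, 2), 0.5)

# Synthetic mixture samples drawn from true proportions (0.6, 0.3, 0.1)
p_true = np.array([0.6, 0.3, 0.1])
mix = p_true @ src_mean + rng.normal(0.0, 0.2, size=(25, 2))

def log_post(z):
    # z unconstrained; softmax gives proportions on the simplex
    # (softmax Jacobian ignored in this sketch)
    p = np.exp(z - z.max()); p /= p.sum()
    mu = p @ src_mean
    var = (p**2) @ (src_sd**2) + 0.2**2   # source spread + residual noise
    return -0.5 * np.sum((mix - mu)**2 / var + np.log(var))

z = np.zeros(3)
lp = log_post(z)
samples = []
for i in range(8000):
    prop = z + rng.normal(0.0, 0.15, 3)
    lpp = log_post(prop)
    if np.log(rng.random()) < lpp - lp:
        z, lp = prop, lpp
    if i > 2000:
        q = np.exp(z - z.max()); samples.append(q / q.sum())

p_hat = np.mean(samples, axis=0)
print("posterior mean source proportions:", np.round(p_hat, 3))
```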

  20. Carbon footprint of urban source separation for nutrient recovery.

    Science.gov (United States)

    Kjerstadius, H; Bernstad Saraiva, A; Spångberg, J; Davidsson, Å

    2017-07-15

    Source separation systems for the management of domestic wastewater and food waste have been suggested as more sustainable sanitation systems for urban areas. The present study used an attributional life cycle assessment to investigate the carbon footprint and nutrient recovery potential of two sanitation systems for a hypothetical urban area in Southern Sweden. The systems represented a typical Swedish conventional system and a possible source separation system with increased nutrient recovery. The assessment covered the management chain from household collection, transport and treatment to the final return of nutrients to agriculture or disposal of the residuals. The results for carbon footprint and nutrient recovery (phosphorus and nitrogen) showed that the source separation system could increase nutrient recovery (0.30-0.38 kg P capita^-1 year^-1 and 3.10-3.28 kg N capita^-1 year^-1) while decreasing the carbon footprint (-24 to -58 kg CO2-eq. capita^-1 year^-1) compared to the conventional system. Nutrient recovery was increased by the use of struvite precipitation and ammonium stripping at the wastewater treatment plant. The carbon footprint decreased mainly due to increased biogas production, greater replacement of mineral fertilizer in agriculture and lower emissions of nitrous oxide from wastewater treatment. In conclusion, the study showed that source separation systems could potentially be used to increase nutrient recovery from urban areas while decreasing the climate impact. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Blind separation of more sources than sensors in convolutive mixtures

    DEFF Research Database (Denmark)

    Olsson, Rasmus Kongsgaard; Hansen, Lars Kai

    2006-01-01

    We demonstrate that blind separation of more sources than sensors can be performed based solely on the second-order statistics of the observed mixtures. This is a generalization of well-known robust algorithms that are suited to an equal number of sources and sensors. It is assumed that the sources... are non-stationary and sparsely distributed in the time-frequency plane. The mixture model is convolutive, i.e. acoustic setups such as the cocktail party problem are contained. The limits of identifiability are determined in the framework of the PARAFAC model. In the experimental section...

  2. Underdetermined Wideband DOA Estimation for Off-Grid Sources with Coprime Array Using Sparse Bayesian Learning.

    Science.gov (United States)

    Qin, Yanhua; Liu, Yumin; Liu, Jianyi; Yu, Zhongyuan

    2018-01-16

    Sparse Bayesian learning (SBL) is applied to the coprime array for underdetermined wideband direction of arrival (DOA) estimation. Using the augmented covariance matrix, the coprime array can achieve a higher number of degrees of freedom (DOFs) to resolve more sources than the number of physical sensors. Grid-based sparse DOA estimation can suffer degraded detection and estimation performance because the sources may lie off the search grid, no matter how fine the grid is. This dictionary mismatch problem can be well resolved by SBL using fixed-point updates. SBL can automatically choose sparsity and approximately solve the non-convex optimization problem. Numerical simulations are conducted to validate the effectiveness of underdetermined wideband DOA estimation via SBL based on the coprime array. It is clear that SBL can obtain good performance in detection and estimation compared to the least absolute shrinkage and selection operator (LASSO), simultaneous orthogonal matching pursuit least squares (SOMP-LS), simultaneous orthogonal matching pursuit total least squares (SOMP-TLS) and off-grid sparse Bayesian inference (OGSBI).
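
    A compact sketch of SBL for sparse recovery (a generic random dictionary rather than a coprime-array manifold; sizes and the noise level are illustrative): per-coefficient prior variances are updated by the EM fixed point until only the true atoms survive:

```python
import numpy as np

rng = np.random.default_rng(3)

# Overcomplete dictionary: 20 measurements, 60 candidate atoms
M, N = 20, 60
A = rng.normal(size=(M, N)) / np.sqrt(M)
x_true = np.zeros(N); x_true[[7, 31, 50]] = [2.0, -1.5, 1.0]
sigma2 = 0.01
y = A @ x_true + rng.normal(0.0, np.sqrt(sigma2), M)

# SBL: zero-mean Gaussian prior per coefficient with individual variance
# gamma_i, updated by the EM fixed-point iteration
gamma = np.ones(N)
for _ in range(200):
    G = np.diag(gamma)
    Sigma_y = sigma2 * np.eye(M) + A @ G @ A.T
    K = np.linalg.solve(Sigma_y, A @ G)            # Sigma_y^{-1} A G
    mu = G @ A.T @ np.linalg.solve(Sigma_y, y)     # posterior mean of x
    Sigma_diag = gamma - np.einsum('ij,ji->i', G @ A.T, K)
    gamma = np.maximum(mu**2 + Sigma_diag, 1e-12)  # EM update, clipped

support = np.argsort(gamma)[-3:]
print("recovered support:", sorted(int(i) for i in support))
```

    The unused variances shrink towards zero, pruning the dictionary automatically; this is the mechanism that lets SBL sidestep explicit sparsity tuning.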

  3. Variable Step-Size Method Based on a Reference Separation System for Source Separation

    Directory of Open Access Journals (Sweden)

    Pengcheng Xu

    2015-01-01

    Traditional variable step-size methods are effective for choosing the step-size in adaptive blind source separation, but the initial setting of the learning rate is critical and convergence remains slow. This paper proposes a novel variable step-size method based on a reference separation system for online blind source separation. The correlation between the estimated source signals and the original source signals increases with iteration; we therefore introduce a reference separation system to approximately estimate this correlation in terms of the mean square error (MSE), which is used to update the step-size. The use of “minibatches” for the computation of the MSE reduces the complexity of the algorithm to some extent. Moreover, simulations demonstrate that the proposed method exhibits superior convergence and better steady-state performance than the fixed step-size method in the noise-free case, while converging faster than classical variable step-size methods in both stationary and nonstationary environments.

  4. Underdetermined Blind Audio Source Separation Using Modal Decomposition

    Directory of Open Access Journals (Sweden)

    Abdeldjalil Aïssa-El-Bey

    2007-03-01

    This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Audio signals, and in particular musical signals, can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source using vector clustering). For the signal analysis, two existing algorithms are considered and compared: the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.

  5. Underdetermined Blind Audio Source Separation Using Modal Decomposition

    Directory of Open Access Journals (Sweden)

    Aïssa-El-Bey Abdeldjalil

    2007-01-01

    This paper introduces new algorithms for the blind separation of audio sources using modal decomposition. Audio signals, and in particular musical signals, can be well approximated by a sum of damped sinusoidal (modal) components. Based on this representation, we propose a two-step approach consisting of a signal analysis (extraction of the modal components) followed by a signal synthesis (grouping of the components belonging to the same source using vector clustering). For the signal analysis, two existing algorithms are considered and compared: the EMD (empirical mode decomposition) algorithm and a parametric estimation algorithm using the ESPRIT technique. A major advantage of the proposed method resides in its validity for both instantaneous and convolutive mixtures and its ability to separate more sources than sensors. Simulation results are given to compare and assess the performance of the proposed algorithms.
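
    The two-step analysis/synthesis idea can be sketched on complex-valued toy signals (illustrative only: ESPRIT extracts the modal poles, and modes are grouped by their sensor amplitude ratio, a simplified form of the vector clustering step):

```python
import numpy as np

t = np.arange(200)

# Two "sources", each a sum of two damped complex exponentials (modes)
def mode(f, d):
    return np.exp((-d + 2j * np.pi * f) * t)
s1 = mode(0.05, 0.005) + 0.8 * mode(0.11, 0.004)
s2 = mode(0.19, 0.006) + 0.6 * mode(0.27, 0.005)

# Instantaneous 2-sensor mixture
x1 = 1.0 * s1 + 0.3 * s2
x2 = 0.4 * s1 + 1.0 * s2

def esprit_poles(x, p, L=80):
    # Hankel data matrix -> signal subspace -> shift invariance -> poles
    H = np.array([x[i:i + L] for i in range(len(x) - L)])
    U = np.linalg.svd(H.T, full_matrices=False)[0][:, :p]
    return np.linalg.eigvals(np.linalg.pinv(U[:-1]) @ U[1:])

z = esprit_poles(x1, 4)

# Per-mode complex amplitudes on each sensor by least squares
B = z[None, :] ** t[:, None]          # Vandermonde basis, shape (200, 4)
a1 = np.linalg.lstsq(B, x1, rcond=None)[0]
a2 = np.linalg.lstsq(B, x2, rcond=None)[0]

# Group modes by their sensor amplitude ratio (the spatial signature)
ratio = np.abs(a2 / a1)
group1 = ratio < 1.0                  # ratio ~0.4 -> source 1
s1_hat = B[:, group1] @ a1[group1]
err = np.linalg.norm(s1_hat - s1) / np.linalg.norm(s1)
print("source-1 modes found:", int(group1.sum()), "rel. error:", round(float(err), 6))
```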

  6. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  7. Bayesian inversion of marine controlled source electromagnetic data offshore Vancouver Island, Canada

    Science.gov (United States)

    Gehrmann, Romina A. S.; Schwalenberg, Katrin; Riedel, Michael; Spence, George D.; Spieß, Volkhard; Dosso, Stan E.

    2016-01-01

    This paper applies nonlinear Bayesian inversion to marine controlled source electromagnetic (CSEM) data collected near two sites of the Integrated Ocean Drilling Program (IODP) Expedition 311 on the northern Cascadia Margin to investigate subseafloor resistivity structure related to gas hydrate deposits and cold vents. The Cascadia margin, off the west coast of Vancouver Island, Canada, has a large accretionary prism where sediments are under pressure due to convergent plate boundary tectonics. Gas hydrate deposits and cold vent structures have previously been investigated by various geophysical methods and seabed drilling. Here, we invert time-domain CSEM data collected at Sites U1328 and U1329 of IODP Expedition 311 using Bayesian methods to derive subsurface resistivity model parameters and uncertainties. The Bayesian information criterion is applied to determine the amount of structure (number of layers in a depth-dependent model) that can be resolved by the data. The parameter space is sampled with the Metropolis-Hastings algorithm in principal-component space, utilizing parallel tempering to ensure wider and efficient sampling and convergence. Nonlinear inversion allows analysis of uncertain acquisition parameters such as time delays between receiver and transmitter clocks as well as input electrical current amplitude. Marginalizing over these instrument parameters in the inversion accounts for their contribution to the geophysical model uncertainties. One-dimensional inversion of time-domain CSEM data collected at measurement sites along a survey line allows interpretation of the subsurface resistivity structure. The data sets can be generally explained by models with 1 to 3 layers. Inversion results at U1329, at the landward edge of the gas hydrate stability zone, indicate a sediment unconformity as well as potential cold vents which were previously unknown. The resistivities generally increase upslope due to sediment erosion along the slope. Inversion
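
    The parallel-tempering ingredient can be sketched on a deliberately bimodal toy posterior (a stand-in for a multimodal misfit surface; the temperature ladder and step sizes are illustrative): hot chains cross between modes, and swap moves propagate that mixing down to the cold chain:

```python
import numpy as np

rng = np.random.default_rng(5)

# Bimodal toy posterior: two narrow modes at x = -4 and x = +4
def log_post(x):
    return np.logaddexp(-0.5 * ((x + 4) / 0.5)**2,
                        -0.5 * ((x - 4) / 0.5)**2)

temps = [1.0, 2.0, 4.0, 8.0]              # temperature ladder
x = np.zeros(len(temps))
lp = np.array([log_post(xi) for xi in x])
cold = []
for i in range(20000):
    # Within-chain Metropolis update at each temperature
    for k, T in enumerate(temps):
        prop = x[k] + rng.normal(0.0, 0.7)
        lpp = log_post(prop)
        if np.log(rng.random()) < (lpp - lp[k]) / T:
            x[k], lp[k] = prop, lpp
    # Swap proposal between a random pair of adjacent temperatures
    k = rng.integers(len(temps) - 1)
    d = (lp[k + 1] - lp[k]) * (1 / temps[k] - 1 / temps[k + 1])
    if np.log(rng.random()) < d:
        x[k], x[k + 1] = x[k + 1], x[k]
        lp[k], lp[k + 1] = lp[k + 1], lp[k]
    if i > 2000:
        cold.append(x[0])

cold = np.asarray(cold)
print("cold chain visited both modes:",
      bool((cold > 3).any() and (cold < -3).any()))
```

    A single cold chain with this step size would stay trapped in one mode; the ladder is what makes the sampler explore both.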

  8. Blind source separation and localization using microphone arrays

    Science.gov (United States)

    Sun, Longji

    The blind source separation and localization problem for audio signals is studied using microphone arrays. Pure delay mixtures of source signals typically encountered in outdoor environments are considered. Our proposed approach utilizes the subspace methods, including multiple signal classification (MUSIC) and estimation of signal parameters via rotational invariance techniques (ESPRIT) algorithms, to estimate the directions of arrival (DOAs) of the sources from the collected mixtures. Since audio signals are generally considered broadband, the DOA estimates at frequencies with the large sum of squared amplitude values are combined to obtain the final DOA estimates. Using the estimated DOAs, the corresponding mixing and demixing matrices are computed, and the source signals are recovered using the inverse short time Fourier transform. Subspace methods take advantage of the spatial covariance matrix of the collected mixtures to achieve robustness to noise. While the subspace methods have been studied for localizing radio frequency signals, audio signals have their special properties. For instance, they are nonstationary, naturally broadband and analog. All of these make the separation and localization for the audio signals more challenging. Moreover, our algorithm is essentially equivalent to the beamforming technique, which suppresses the signals in unwanted directions and only recovers the signals in the estimated DOAs. Several crucial issues related to our algorithm and their solutions have been discussed, including source number estimation, spatial aliasing, artifact filtering, different ways of mixture generation, and source coordinate estimation using multiple arrays. Additionally, comprehensive simulations and experiments have been conducted to examine various aspects of the algorithm. Unlike the existing blind source separation and localization methods, which are generally time consuming, our algorithm needs signal mixtures of only a short duration and
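
    The MUSIC step of the approach can be sketched for the narrowband case (illustrative only: a uniform linear array with synthetic snapshots; the broadband combination across frequencies described above is omitted):

```python
import numpy as np

rng = np.random.default_rng(6)

M, d = 8, 0.5                  # 8-element ULA, half-wavelength spacing
true_doas = np.deg2rad([-30.0, 20.0])

def steering(theta):
    return np.exp(2j * np.pi * d * np.arange(M)[:, None] * np.sin(theta))

# Narrowband snapshots: two uncorrelated sources plus sensor noise
A = steering(np.atleast_1d(true_doas))
S = rng.normal(size=(2, 500)) + 1j * rng.normal(size=(2, 500))
X = A @ S + 0.1 * (rng.normal(size=(M, 500)) + 1j * rng.normal(size=(M, 500)))

# MUSIC: noise subspace of the spatial covariance matrix
R = X @ X.conj().T / X.shape[1]
w, V = np.linalg.eigh(R)       # ascending eigenvalues
En = V[:, :-2]                 # noise subspace (M - 2 sources)

grid = np.deg2rad(np.arange(-90.0, 90.5, 0.5))
a = steering(grid)
P = 1.0 / np.linalg.norm(En.conj().T @ a, axis=0)**2   # MUSIC spectrum

# Pick the two largest well-separated peaks
est = []
for i in np.argsort(P)[::-1]:
    if all(abs(grid[i] - e) > np.deg2rad(5) for e in est):
        est.append(grid[i])
    if len(est) == 2:
        break
print("estimated DOAs (deg):", sorted(np.round(np.rad2deg(est), 1)))
```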

  9. Bayesian source term estimation of atmospheric releases in urban areas using LES approach.

    Science.gov (United States)

    Xue, Fei; Kikumoto, Hideki; Li, Xiaofeng; Ooka, Ryozo

    2018-05-05

    The estimation of source information from limited measurements of a sensor network is a challenging inverse problem, which can be viewed as an assimilation process of the observed concentration data and the predicted concentration data. When dealing with releases in built-up areas, the predicted data are generally obtained by the Reynolds-averaged Navier-Stokes (RANS) equations, which yields building-resolving results; however, RANS-based models are outperformed by large-eddy simulation (LES) in the predictions of both airflow and dispersion. Therefore, it is important to explore the possibility of improving the estimation of the source parameters by using the LES approach. In this paper, a novel source term estimation method is proposed based on LES approach using Bayesian inference. The source-receptor relationship is obtained by solving the adjoint equations constructed using the time-averaged flow field simulated by the LES approach based on the gradient diffusion hypothesis. A wind tunnel experiment with a constant point source downwind of a single building model is used to evaluate the performance of the proposed method, which is compared with that of the existing method using a RANS model. The results show that the proposed method reduces the errors of source location and releasing strength by 77% and 28%, respectively. Copyright © 2018 Elsevier B.V. All rights reserved.
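
    A grid-based sketch of the estimation step (illustrative only: a random source-receptor matrix stands in for the adjoint-derived one, and the prior over candidate cells is flat): for each candidate cell the best release rate is a least-squares fit, and the Gaussian likelihood yields a posterior over cells:

```python
import numpy as np

rng = np.random.default_rng(7)

# Source-receptor matrix: G[i, j] = concentration at sensor i per unit
# release rate from candidate cell j (from adjoint runs in practice)
n_sensors, n_cells = 6, 50
G = np.abs(rng.normal(size=(n_sensors, n_cells)))

# Synthetic observations: true source in cell 17 with rate q = 3.0
j_true, q_true, noise = 17, 3.0, 0.05
y = q_true * G[:, j_true] + rng.normal(0.0, noise, n_sensors)

# Profile out the rate per cell, then form the posterior over cells
logp = np.empty(n_cells)
q_hat = np.empty(n_cells)
for j in range(n_cells):
    g = G[:, j]
    q_hat[j] = max(g @ y / (g @ g), 0.0)   # nonnegative best-fit rate
    r = y - q_hat[j] * g
    logp[j] = -0.5 * (r @ r) / noise**2    # Gaussian log-likelihood
post = np.exp(logp - logp.max()); post /= post.sum()
j_map = int(np.argmax(post))
print("MAP cell:", j_map, "rate:", round(float(q_hat[j_map]), 2),
      "posterior prob:", round(float(post[j_map]), 3))
```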

  10. Blind source separation advances in theory, algorithms and applications

    CERN Document Server

    Wang, Wenwu

    2014-01-01

    Blind Source Separation reports new results from research on blind source separation (BSS). The book collects novel research ideas and tutorial material on BSS, independent component analysis (ICA), artificial intelligence and signal processing applications. Furthermore, research results previously scattered across many journals and conferences worldwide are methodically edited and presented in a unified form. The book is likely to be of interest to university researchers, R&D engineers and graduate students in computer science and electronics who wish to learn the core principles, methods, algorithms, and applications of BSS. Dr. Ganesh R. Naik works at the University of Technology, Sydney, Australia; Dr. Wenwu Wang works at the University of Surrey, UK.

  11. Energy conversion of source separated packaging; Energiutvinning ur kaellsorterade foerpackningsfraktioner

    Energy Technology Data Exchange (ETDEWEB)

    Blidholm, O.; Wiklund, S.E. [AaF-Energikonsult (Sweden); Bauer, A.C. [Energikonsult A. Bauer (Sweden)

    1997-02-01

    The basic idea of this project is to study the possibility of using source-separated combustible material for energy conversion in conventional solid fuel boilers (i.e. not municipal waste incineration plants). The project was carried out in three phases. During phases 1 and 2, a number of fuel analyses of different fractions were carried out. During phase 3, two combustion tests were performed: (1) in a grate-fired boiler equipped with a cyclone, an electrostatic precipitator and a flue gas condenser, and (2) in a bubbling fluidized bed boiler with an electrostatic precipitator and a flue gas condenser. During the tests, source-separated paper and plastic packaging was co-fired with biomass fuels at a mixing rate of approximately 15%. This study reports the results of phase 3 and the conclusions of the whole project. The technical conditions for using packaging as fuel are good: the technique is available for shredding both paper and plastic packaging, and the material can be co-fired with biomass. The economic conditions for using source-separated packaging for energy conversion can be very advantageous, but can also pose obstacles; the outcome is largely governed by how the fuel is collected, transported, reduced in size and handled at the combustion plant. The results of the combustion tests show that the environmental conditions for using source-separated packaging for energy conversion are good: the emissions of heavy metals into the atmosphere are very low, well below the emission standards for waste incineration plants. 35 figs, 13 tabs, 8 appendices

  12. Application of blind source separation to nuclear magnetic resonance

    International Nuclear Information System (INIS)

    Nuzillard, D.; Bourg, St.; Nuzillard, J.M.

    1999-01-01

    Blind source separation is intended to decompose signal mixtures into statistically independent components. Applications of this technique to the analysis of organic molecules by nuclear magnetic resonance (NMR) are presented. The goal of this study is to demonstrate the simplification of spectral analysis in three cases: the analysis of mixtures of compounds, the analysis of a single mixture, and the splitting of the spectrum of a pure compound into simpler sub-spectra. (authors)

  13. A Bayesian source model for the 2004 great Sumatra-Andaman earthquake

    Science.gov (United States)

    Bletery, Quentin; Sladen, Anthony; Jiang, Junle; Simons, Mark

    2016-07-01

    The 2004 Mw 9.1-9.3 Sumatra-Andaman earthquake is one of the largest earthquakes of the modern instrumental era. Despite considerable efforts to analyze this event, the different available observations have proven difficult to reconcile in a single finite-fault slip model. In particular, the critical near-field geodetic records contain variable and significant postseismic signal (between 2 weeks' and 2 months' worth), while the satellite altimetry records of the associated tsunami are affected by various sources of uncertainties (e.g., source rupture velocity and mesoscale oceanic currents). In this study, we investigate the quasi-static slip distribution of the Sumatra-Andaman earthquake by carefully accounting for the different sources of uncertainties in the joint inversion of available geodetic and tsunami data. To this end, we use nondiagonal covariance matrices reflecting both observational and modeling uncertainties in a fully Bayesian inversion framework. Modeling errors can be particularly large for great earthquakes. Here we consider a layered spherical Earth for the static displacement field, nonhydrostatic equations for the tsunami, and a 3-D megathrust interface geometry to alleviate some of the potential epistemic uncertainties. The Bayesian framework then enables us to derive families of possible models compatible with the unevenly distributed and sometimes ambiguous measurements. We infer two regions of high fault slip at 3°N-4°N and 7°N-8°N with amplitudes that likely reach values as large as 40 m and possibly larger. These values are a factor of 2 larger than typically found in previous studies—potentially an outcome of commonly assumed forms of regularization. Finally, we find that fault rupture very likely involved shallow slip. Within the resolution provided by the existing data, we cannot rule out the possibility that fault rupture reached the trench.

  14. A multi-variate blind source separation algorithm.

    Science.gov (United States)

    Goldhacker, M; Keck, P; Igel, A; Lang, E W; Tomé, A M

    2017-11-01

    The study follows the proposal of decomposing a given data matrix into a product of independent spatial and temporal component matrices. A multi-variate decomposition approach is presented, based on an approximate diagonalization of a set of matrices computed using a latent space representation. The proposed methodology follows an algebraic approach, which is common to spatial, temporal or spatiotemporal blind source separation algorithms. More specifically, the algebraic approach relies on singular value decomposition techniques, which avoid computationally costly and numerically unstable matrix inversion. The method is equally applicable to correlation matrices determined from second-order correlations or by considering fourth-order correlations. The resulting algorithms are applied to fMRI data sets either to extract the underlying fMRI components or to extract connectivity maps from resting state fMRI data collected for a dynamic functional connectivity analysis. Intriguingly, our algorithm shows increased spatial specificity compared to common approaches, while temporal precision stays similar. The study presents a novel spatiotemporal blind source separation algorithm, which is both robust and avoids parameters that are difficult to fine-tune. Applied to experimental data sets, the new method yields highly confined and focused areas with the least spatial extent in the retinotopy case, and similar results in the dynamic functional connectivity analyses compared to other blind source separation algorithms. Therefore, we conclude that our novel algorithm is highly competitive and yields results which are superior or at least similar to existing approaches. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Source signature estimation from multimode surface waves via mode-separated virtual real source method

    Science.gov (United States)

    Gao, Lingli; Pan, Yudi

    2018-02-01

The correct estimation of the seismic source signature is crucial to exploration geophysics. Based on seismic interferometry, the virtual real source (VRS) method provides a model-independent way for source signature estimation. However, when encountering multimode surface waves, which are commonly seen in shallow seismic surveys, strong spurious events appear in seismic interferometric results. These spurious events introduce errors in the virtual-source recordings and reduce the accuracy of the source signature estimated by the VRS method. In order to estimate a correct source signature from multimode surface waves, we propose a mode-separated VRS method. In this method, multimode surface waves are mode separated before seismic interferometry. Virtual-source recordings are then obtained by applying seismic interferometry to each mode individually. Therefore, artefacts caused by cross-mode correlation are excluded from the virtual-source recordings and the estimated source signatures. A synthetic example showed that a correct source signature can be estimated with the proposed method, while strong spurious oscillations occur in the estimated source signature if mode separation is not applied first. We also applied the proposed method to a field example, which verified its validity and effectiveness in estimating seismic source signature from shallow seismic shot gathers containing multimode surface waves.
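The interferometric principle underlying the VRS method can be shown with a single-mode toy example (wavelet, traveltimes, and geometry all invented): cross-correlating two receivers' recordings of one source places a correlation peak at the inter-receiver traveltime, as if one receiver had become a virtual source.

```python
import math

def ricker(t, f=25.0):
    # zero-phase Ricker wavelet, a common synthetic source signature
    a = (math.pi*f*t)**2
    return (1.0 - 2.0*a)*math.exp(-a)

dt, n = 0.001, 1000
tA, tB = 0.120, 0.200   # assumed traveltimes: source->receiver A, source->receiver B
recA = [ricker(i*dt - tA) for i in range(n)]
recB = [ricker(i*dt - tB) for i in range(n)]

def xcorr_peak(b, a, maxlag):
    # lag of the maximum of the cross-correlation of b with a
    best, bestlag = -1e30, 0
    for lag in range(maxlag):
        c = sum(b[i]*a[i-lag] for i in range(lag, len(b)))
        if c > best:
            best, bestlag = c, lag
    return bestlag

lag = xcorr_peak(recB, recA, 200)
print(round(lag*dt, 3))   # 0.08 s = tB - tA, the virtual-source traveltime
```

With several modes present, each mode pair contributes its own correlation peak, which is exactly the cross-mode artefact the proposed mode separation removes before correlating.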

  16. Identification of Watershed-scale Critical Source Areas Using Bayesian Maximum Entropy Spatiotemporal Analysis

    Science.gov (United States)

    Roostaee, M.; Deng, Z.

    2017-12-01

The states' environmental agencies are required by the Clean Water Act to assess all waterbodies and evaluate potential sources of impairments. Spatial and temporal distributions of water quality parameters are critical in identifying Critical Source Areas (CSAs). However, due to limited monetary resources and the large number of waterbodies, monitoring stations are typically sparse, with intermittent periods of data collection. Hence, the scarcity of water quality data is a major obstacle in addressing sources of pollution through management strategies. In this study, the spatiotemporal Bayesian Maximum Entropy (BME) method is employed to model the inherent temporal and spatial variability of measured water quality indicators, such as Dissolved Oxygen (DO) concentration, for the Turkey Creek Watershed. Turkey Creek is located in northern Louisiana and has been listed for DO impairment on the 303(d) list in Louisiana Water Quality Inventory Reports since 2014, owing to agricultural practices. The BME method has been shown to provide more accurate estimates than purely spatial analysis methods by incorporating the space/time distribution and the uncertainty in the available measured soft and hard data. This model is used to estimate DO concentration at unmonitored locations and times and subsequently to identify CSAs. The USDA's crop-specific land cover data layers of the watershed were then used to determine the practices/changes that led to low DO concentration in the identified CSAs. Preliminary results revealed that cultivation of corn and soybean, as well as urban runoff, are the main contributing sources of low dissolved oxygen in the Turkey Creek Watershed.

  17. Bayesian electromagnetic spatio-temporal imaging of extended sources with Markov Random Field and temporal basis expansion.

    Science.gov (United States)

    Liu, Ke; Yu, Zhu Liang; Wu, Wei; Gu, Zhenghui; Li, Yuanqing; Nagarajan, Srikantan

    2016-10-01

Estimating the locations and spatial extents of brain sources poses a long-standing challenge for electroencephalography and magnetoencephalography (E/MEG) source imaging. In the present work, a novel source imaging method, Bayesian Electromagnetic Spatio-Temporal Imaging of Extended Sources (BESTIES), which is built upon a Bayesian framework that determines the spatio-temporal smoothness of source activities in a fully data-driven fashion, is proposed to address this challenge. In particular, a Markov Random Field (MRF), which can precisely capture local cortical interactions, is employed to characterize the spatial smoothness of source activities, the temporal dynamics of which are modeled by a set of temporal basis functions (TBFs). Crucially, all of the unknowns in the MRF and TBF models are learned from the data. To accomplish model inference efficiently on high-resolution source spaces, a scalable algorithm is developed to approximate the posterior distribution of the source activities, which is based on the variational Bayesian inference and convex analysis. The performance of BESTIES is assessed using both simulated and actual human E/MEG data. Compared with L2-norm constrained methods, BESTIES is superior in reconstructing extended sources with less spatial diffusion and less localization error. By virtue of the MRF, BESTIES also overcomes the drawback of over-focal estimates in sparse constrained methods. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K. [Lloyd' s Register Consulting AB, Sundbyberg (Sweden)

    2013-10-15

The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method for including experts' beliefs in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)
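The core BBN update in such a tool can be illustrated with a deliberately tiny, hypothetical network (states, probabilities, and source terms all invented for illustration): an observable symptom updates the probability of each accident state, and each state maps to a pre-calculated source term.

```python
# Expert-defined CPTs: prior over accident states and P(symptom=high | state)
prior = {"intact": 0.90, "vented": 0.07, "bypass": 0.03}
p_high = {"intact": 0.05, "vented": 0.70, "bypass": 0.95}
source_term = {"intact": "negligible", "vented": "filtered release",
               "bypass": "large unfiltered release"}

# Bayes update by enumeration, given the symptom reads "high"
joint = {s: prior[s]*p_high[s] for s in prior}
z = sum(joint.values())
post = {s: joint[s]/z for s in joint}

# Output: possible source terms with associated probabilities
ranked = sorted(post.items(), key=lambda kv: -kv[1])
for state, p in ranked:
    print(f"{state}: {p:.2f} -> {source_term[state]}")
```

A real RASTEP-style network has many more nodes and observations, but each update is this same enumeration (or an efficient equivalent) over the CPTs.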

  19. Using Bayesian Belief Network (BBN) modelling for rapid source term prediction. Final report

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Dedda, F.D.; Hansson, F.; Sjoekvist, S.; Sunnegaerd, K.

    2013-10-01

The project presented in this report deals with a number of complex issues related to the development of a tool for rapid source term prediction (RASTEP), based on a plant model represented as a Bayesian belief network (BBN) and a source term module which is used for assigning relevant source terms to BBN end states. Thus, RASTEP uses a BBN to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, composition, timing, and release path of released radionuclides). The output is a set of possible source terms with associated probabilities. One major issue addressed has been the integration of probabilistic and deterministic analyses, dealing with the challenge of making the source term determination flexible enough to give reliable and valid output throughout the accident scenario. The potential for connecting RASTEP to a fast-running source term prediction code has been explored, as well as alternative ways of improving the deterministic connections of the tool. As part of the investigation, a comparison of two deterministic severe accident analysis codes has been performed. A second important task has been to develop a general method for including experts' beliefs in a systematic way when defining the conditional probability tables (CPTs) in the BBN. Using this iterative method results in a reliable BBN even though expert judgements, with their associated uncertainties, have been used. It also simplifies verification and validation of the considerable amounts of quantitative data included in a BBN. (Author)

  20. An open source Bayesian Monte Carlo isotope mixing model with applications in Earth surface processes

    Science.gov (United States)

    Arendt, Carli A.; Aciego, Sarah M.; Hetland, Eric A.

    2015-05-01

The implementation of isotopic tracers as constraints on source contributions has become increasingly relevant to understanding Earth surface processes. Interpretation of these isotopic tracers has become more accessible with the development of Bayesian Monte Carlo (BMC) mixing models, which allow uncertainty in mixing end-members and provide methodology for systems with multicomponent mixing. This study presents an open source multiple isotope BMC mixing model that is applicable to Earth surface environments with sources exhibiting distinct end-member isotopic signatures. Our model is first applied to new δ18O and δD measurements from the Athabasca Glacier; the results show expected seasonal melt evolution trends, and the statistical relevance of the resulting fractional estimates is rigorously assessed. To highlight the broad applicability of our model to a variety of Earth surface environments and relevant isotopic systems, we expand our model to two additional case studies: deriving melt sources from δ18O, δD, and 222Rn measurements of Greenland Ice Sheet bulk water samples and assessing nutrient sources from ɛNd and 87Sr/86Sr measurements of Hawaiian soil cores. The model produces results for the Greenland Ice Sheet and Hawaiian soil data sets that are consistent with the originally published fractional contribution estimates. The advantage of this method is that it quantifies the error induced by variability in the end-member compositions, unrealized by the models previously applied to the above case studies. Results from all three case studies demonstrate the broad applicability of this statistical BMC isotopic mixing model for estimating source contribution fractions in a variety of Earth surface systems.
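The basic BMC mixing computation can be sketched as follows (a hypothetical one-tracer, two-source case with invented values; the published model handles multiple isotopes and sources): sample the mixing fraction and the uncertain end-member values, then weight each sample by a Gaussian likelihood of the observed mixture.

```python
import math, random

random.seed(2)
obs, obs_sd = -20.0, 0.3          # measured mixture d18O and its uncertainty
src1 = (-24.0, 0.5)               # end-member 1: mean, sd (assumed)
src2 = (-16.0, 0.5)               # end-member 2: mean, sd (assumed)

fs, ws = [], []
for _ in range(50000):
    f = random.random()                          # uniform prior on the fraction
    e1 = random.gauss(*src1)                     # propagate end-member uncertainty
    e2 = random.gauss(*src2)
    mix = f*e1 + (1.0 - f)*e2                    # two-component mass balance
    ws.append(math.exp(-0.5*((mix - obs)/obs_sd)**2))  # Gaussian likelihood
    fs.append(f)

# Posterior mean of the fraction of source 1, via importance weighting
f_mean = sum(f*w for f, w in zip(fs, ws))/sum(ws)
print(round(f_mean, 2))   # ~0.5 here, since the mixture lies midway between sources
```

Because the end-member values are resampled every draw, the spread of the weighted fractions automatically includes the end-member variability the abstract highlights.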

  1. MEG Source Localization of Spatially Extended Generators of Epileptic Activity: Comparing Entropic and Hierarchical Bayesian Approaches

    Science.gov (United States)

    Chowdhury, Rasheda Arman; Lina, Jean Marc; Kobayashi, Eliane; Grova, Christophe

    2013-01-01

Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be distinguished from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm2 to 30 cm2, regardless of the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered. PMID:23418485

  2. MEG source localization of spatially extended generators of epileptic activity: comparing entropic and hierarchical bayesian approaches.

    Directory of Open Access Journals (Sweden)

    Rasheda Arman Chowdhury

Full Text Available Localizing the generators of epileptic activity in the brain using Electro-EncephaloGraphy (EEG) or Magneto-EncephaloGraphy (MEG) signals is of particular interest during the pre-surgical investigation of epilepsy. Epileptic discharges can be distinguished from background brain activity, provided they are associated with spatially extended generators. Using realistic simulations of epileptic activity, this study evaluates the ability of distributed source localization methods to accurately estimate the location of the generators and their sensitivity to the spatial extent of such generators when using MEG data. Source localization methods based on two types of realistic models have been investigated: (i) brain activity may be modeled using cortical parcels and (ii) brain activity is assumed to be locally smooth within each parcel. A Data Driven Parcellization (DDP) method was used to segment the cortical surface into non-overlapping parcels and diffusion-based spatial priors were used to model local spatial smoothness within parcels. These models were implemented within the Maximum Entropy on the Mean (MEM) and the Hierarchical Bayesian (HB) source localization frameworks. We proposed new methods in this context and compared them with other standard ones using Monte Carlo simulations of realistic MEG data involving sources of several spatial extents and depths. Detection accuracy of each method was quantified using Receiver Operating Characteristic (ROC) analysis and localization error metrics. Our results showed that methods implemented within the MEM framework were sensitive to all spatial extents of the sources ranging from 3 cm2 to 30 cm2, regardless of the number and size of the parcels defining the model. To reach a similar level of accuracy within the HB framework, a model using parcels larger than the size of the sources should be considered.

  3. Blind Separation of Piecewise Stationary NonGaussian Sources

    Czech Academy of Sciences Publication Activity Database

    Koldovský, Zbyněk; Málek, J.; Tichavský, Petr; Deville, Y.; Hosseini, S.

    2009-01-01

    Roč. 89, č. 12 (2009), s. 2570-2584 ISSN 0165-1684 R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Grant - others:GA ČR(CZ) GA102/07/P384 Institutional research plan: CEZ:AV0Z10750506 Keywords : Independent component analysis * blind source separation * Cramer-Rao lower bound Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.135, year: 2009 http://library.utia.cas.cz/separaty/2009/SI/tichavsky-blind separationofpiecewisestationarynon-gaussiansources.pdf

  4. Vermicomposting of source-separated human faeces for nutrient recycling.

    Science.gov (United States)

    Yadav, Kunwar D; Tare, Vinod; Ahammed, M Mansoor

    2010-01-01

The present study examined the suitability of vermicomposting technology for processing source-separated human faeces. Since the earthworm species Eisenia fetida could not survive in fresh faeces, modification in the physical characteristics of faeces was necessary before earthworms could be introduced to faeces. A preliminary study with six different combinations of faeces, soil and bulking material (vermicompost) in different layers was conducted to find out the best condition for biomass growth and reproduction of earthworms. The results indicated that SVFV combination (soil, vermicompost, faeces and vermicompost - bottom to top layers) was the best for earthworm biomass growth indicating the positive role of soil layer in earthworm biomass growth. Further studies with SVFV and VFV combinations, however, showed that soil layer did not enhance vermicompost production rate. Year-long study conducted with VFV combination to assess the quality and quantity of vermicompost produced showed an average vermicompost production rate of 0.30 kg-cast/kg-worm/day. The vermicompost produced was mature as indicated by low dissolved organic carbon (2.4 ± 0.43 mg/g) and low oxygen uptake rate (0.15 ± 0.09 mg O2/g VS/h). Complete inactivation of total coliforms was noted during the study, which is one of the important objectives of human faeces processing. Results of the study thus indicated the potential of vermicomposting for processing of source-separated human faeces.

  5. BAYESIAN CALIBRATION OF SAFETY CODES USING DATA FROM SEPARATE-AND INTEGRAL EFFECTS TESTS

    Energy Technology Data Exchange (ETDEWEB)

    Yurko, Joseph P.; Buongiorno, Jacopo; Youngblood, Robert

    2015-04-01

    Large-scale system codes for simulation of safety performance of nuclear plants may contain parameters whose values are not known very accurately. In order to be able to use the results of these simulation codes with confidence, it is important to learn how the uncertainty on the values of these parameters affects the output of the codes. New information from tests or operating experience is incorporated into safety codes by a process known as calibration, which reduces uncertainty in the output of the safety code, and thereby improves its support for decision-making. Modern analysis capabilities afford very significant improvements on classical ways of doing calibration, and the work reported here implements some of those improvements. The key innovation has come from development of safety code surrogate model (code emulator) construction and prediction algorithms. A surrogate is needed for calibration of plant-scale simulation codes because the multivariate nature of the problem (i.e., the need to adjust multiple uncertain parameters at once to fit multiple pieces of new information) calls for multiple evaluations of performance, which, for a computation-intensive model, makes calibration very computation-intensive. Use of a fast surrogate makes the calibration processes used here with Markov Chain Monte Carlo (MCMC) sampling feasible. Moreover, most traditional surrogates do not provide uncertainty information along with their predictions, but the Gaussian Process (GP) based code surrogates used here do. This improves the soundness of the code calibration process. Results are demonstrated on a simplified scenario with data from Separate and Integral Effect Tests.
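The role of a GP surrogate, namely prediction plus uncertainty, can be illustrated with a deliberately tiny example: a 1-D "code" emulated from two runs, with an assumed squared-exponential kernel and a hand-inverted 2x2 kernel matrix.

```python
import math

def k(a, b, ell=1.0, sig2=1.0):
    # squared-exponential kernel (hyperparameters assumed, not fitted)
    return sig2*math.exp(-0.5*((a - b)/ell)**2)

X = [0.0, 2.0]                      # two "expensive code runs"
y = [math.sin(0.0), math.sin(2.0)]  # pretend the code computes sin(x)

# Kernel matrix with tiny jitter, inverted analytically (2x2)
K = [[k(X[i], X[j]) + (1e-9 if i == j else 0.0) for j in range(2)] for i in range(2)]
det = K[0][0]*K[1][1] - K[0][1]*K[1][0]
Kinv = [[ K[1][1]/det, -K[0][1]/det],
        [-K[1][0]/det,  K[0][0]/det]]

def predict(xs):
    # standard GP posterior mean and variance at a new input xs
    ks = [k(xs, X[0]), k(xs, X[1])]
    alpha = [sum(Kinv[i][j]*y[j] for j in range(2)) for i in range(2)]
    mean = ks[0]*alpha[0] + ks[1]*alpha[1]
    var = k(xs, xs) - sum(ks[i]*Kinv[i][j]*ks[j] for i in range(2) for j in range(2))
    return mean, var

m_mid, v_mid = predict(1.0)    # between the training runs
m_far, v_far = predict(6.0)    # far from any training run
print(round(m_mid, 2), v_far > v_mid)  # mean between runs; variance grows with distance
```

Inside an MCMC calibration loop, `predict` replaces the expensive code evaluation, and the returned variance lets the sampler account for surrogate error rather than treating the emulator as exact.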

  6. Number of sources uncertainty in blind source separation. Application to EMG signal processing.

    Science.gov (United States)

    Snoussi, Hichem; Khanna, Saurabh; Hewson, David; Duchene, Jacques

    2007-01-01

    This contribution deals with the number of components uncertainty in blind source separation. The number of components is estimated by maximizing its marginal a posteriori probability which favors the simplest explanation of the observed data. Marginalizing (integrating over all the parameters) is implemented through the Laplace approximation based on an efficient wavelet spectral matching separating algorithm. The effectiveness of the proposed method is shown on EMG data processing.

  7. Bayesian Inference for Neural Electromagnetic Source Localization: Analysis of MEG Visual Evoked Activity

    International Nuclear Information System (INIS)

    George, J.S.; Schmidt, D.M.; Wood, C.C.

    1999-01-01

We have developed a Bayesian approach to the analysis of neural electromagnetic (MEG/EEG) data that can incorporate or fuse information from other imaging modalities and addresses the ill-posed inverse problem by sampling the many different solutions which could have produced the given data. From these samples one can draw probabilistic inferences about regions of activation. Our source model assumes a variable number of variable size cortical regions of stimulus-correlated activity. An active region consists of locations on the cortical surface, within a sphere centered on some location in cortex. The number and radii of active regions can vary up to defined maximum values. The goal of the analysis is to determine the posterior probability distribution for the set of parameters that govern the number, location, and extent of active regions. Markov Chain Monte Carlo is used to generate a large sample of sets of parameters distributed according to the posterior distribution. This sample is representative of the many different source distributions that could account for the given data, and allows identification of probable (i.e. consistent) features across solutions. Examples of the use of this analysis technique with both simulated and empirical MEG data are presented

  8. Isotope partitioning of soil respiration: A Bayesian solution to accommodate multiple sources of variability

    Science.gov (United States)

    Ogle, Kiona; Pendall, Elise

    2015-02-01

Isotopic methods offer great potential for partitioning trace gas fluxes such as soil respiration into their different source contributions. Traditional partitioning methods face challenges due to variability introduced by different measurement methods, fractionation effects, and end-member uncertainty. To address these challenges, we describe a hierarchical Bayesian (HB) approach for isotopic partitioning of soil respiration that directly accommodates such variability. We apply our HB method to data from an experiment conducted in a shortgrass steppe ecosystem, where decomposition was previously shown to be stimulated by elevated CO2. Our approach simultaneously fits Keeling plot (KP) models to observations of soil or soil-respired δ13C and [CO2] obtained via chambers and gas wells, corrects the KP intercepts for apparent fractionation (Δ) due to isotope-specific diffusion rates and/or method artifacts, estimates method- and treatment-specific values for Δ, propagates end-member uncertainty, and calculates proportional contributions from two distinct respiration sources ("old" and "new" carbon). The chamber KP intercepts were estimated with greater confidence than the well intercepts, and, compared to the theoretical value of 4.4‰, our results suggest that Δ varies between 2 and 5.2‰ depending on method (chambers versus wells) and CO2 treatment. Because elevated CO2 plots were fumigated with 13C-depleted CO2, the source contributions were tightly constrained, and new C accounted for 64% (range = 55-73%) of soil respiration. The contributions were less constrained for the ambient CO2 treatments, but new C accounted for significantly less (47%, range = 15-82%) of soil respiration. Our new HB partitioning approach contrasts with our original analysis (higher contribution of old C under elevated CO2) because it uses additional data sources, accounts for end-member bias, and estimates apparent fractionation effects.
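The Keeling-plot step at the heart of this approach reduces to a linear regression against inverse concentration. A minimal synthetic sketch (background and source values invented) shows why the intercept identifies the source signature:

```python
src, bg_co2, bg_d13 = -26.0, 400.0, -8.0   # assumed source and background values

# Two-component mixing generates the "observations": as source CO2 is added,
# measured d13C is exactly linear in 1/[CO2].
co2 = [450.0, 500.0, 600.0, 800.0]
d13 = [(bg_co2*bg_d13 + (c - bg_co2)*src)/c for c in co2]

# Ordinary least squares of d13C on 1/[CO2]
xs = [1.0/c for c in co2]
n = len(xs)
mx, my = sum(xs)/n, sum(d13)/n
slope = sum((x-mx)*(y-my) for x, y in zip(xs, d13)) / sum((x-mx)**2 for x in xs)
intercept = my - slope*mx
print(round(intercept, 1))   # -26.0: the Keeling intercept recovers the source d13C
```

The HB model in the abstract fits many such regressions jointly, then shifts each intercept by the estimated apparent fractionation Δ before computing source fractions.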

  9. Radiological risk assessment for the public under the loss of medium and large sources using bayesian methodology

    International Nuclear Information System (INIS)

    Kim, Joo Yeon; Jang, Han Ki; Lee, Jai Ki

    2005-01-01

Bayesian methodology is appropriate for use in PRA because subjective knowledge as well as objective data can be applied in the assessment. In this study, radiological risk based on Bayesian methodology is assessed for the loss of a source in field radiography. The exposure scenario for the lost source presented by the U.S. NRC is reconstructed by considering the domestic situation, and Bayes' theorem is applied to the updating of the failure probabilities of safety functions. In the updating of failure probabilities, the 5% Bayes credible intervals using the Jeffreys prior distribution are lower than those using a vague prior distribution. It is noted that the Jeffreys prior distribution is appropriate in risk assessment for systems having very low failure probabilities. Also, the mean of the expected annual dose for the public based on Bayesian methodology is higher than the dose based on classical methodology, because the means of the updated probabilities are higher than the classical probabilities. Databases for radiological risk assessment are sparse domestically. In summary, Bayesian methodology can be applied as a useful alternative for risk assessment, and this study will contribute to risk-informed regulation in the field of radiation safety
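The conjugate Bayes update behind the Jeffreys-versus-vague-prior comparison can be shown in a few lines (failure counts invented for illustration): with k failures in n demands, a Beta prior on the failure probability is conjugate to the binomial likelihood, so updating is just parameter addition.

```python
k, n = 1, 50                        # hypothetical observed failures / demands

def posterior_mean(a0, b0):
    # Beta(a0, b0) prior + Binomial(k, n) data -> Beta(a0+k, b0+n-k) posterior
    return (a0 + k) / (a0 + b0 + n)

jeffreys = posterior_mean(0.5, 0.5)   # Jeffreys prior Beta(1/2, 1/2)
vague = posterior_mean(1.0, 1.0)      # uniform ("vague") prior Beta(1, 1)
print(round(jeffreys, 4), round(vague, 4))   # -> 0.0294 0.0385
```

For rare-failure systems the Jeffreys posterior sits lower than the vague-prior posterior, consistent with the abstract's observation about credible intervals.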

  10. How Many Separable Sources? Model Selection In Independent Components Analysis

    DEFF Research Database (Denmark)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though...

  11. Blind Source Separation of Event-Related EEG/MEG.

    Science.gov (United States)

    Metsomaa, Johanna; Sarvas, Jukka; Ilmoniemi, Risto Juhani

    2017-09-01

    Blind source separation (BSS) can be used to decompose complex electroencephalography (EEG) or magnetoencephalography data into simpler components based on statistical assumptions without using a physical model. Applications include brain-computer interfaces, artifact removal, and identifying parallel neural processes. We wish to address the issue of applying BSS to event-related responses, which is challenging because of nonstationary data. We introduce a new BSS approach called momentary-uncorrelated component analysis (MUCA), which is tailored for event-related multitrial data. The method is based on approximate joint diagonalization of multiple covariance matrices estimated from the data at separate latencies. We further show how to extend the methodology for autocovariance matrices and how to apply BSS methods suitable for piecewise stationary data to event-related responses. We compared several BSS approaches by using simulated EEG as well as measured somatosensory and transcranial magnetic stimulation (TMS) evoked EEG. Among the compared methods, MUCA was the most tolerant one to noise, TMS artifacts, and other challenges in the data. With measured somatosensory data, over half of the estimated components were found to be similar by MUCA and independent component analysis. MUCA was also stable when tested with several input datasets. MUCA is based on simple assumptions, and the results suggest that MUCA is robust with nonideal data. Event-related responses and BSS are valuable and popular tools in neuroscience. Correctly designed BSS is an efficient way of identifying artifactual and neural processes from nonstationary event-related data.

  12. Open source software implementation of an integrated testing strategy for skin sensitization potency based on a Bayesian network.

    Science.gov (United States)

    Pirone, Jason R; Smith, Marjolein; Kleinstreuer, Nicole C; Burns, Thomas A; Strickland, Judy; Dancik, Yuri; Morris, Richard; Rinckel, Lori A; Casey, Warren; Jaworska, Joanna S

    2014-01-01

    An open-source implementation of a previously published integrated testing strategy (ITS) for skin sensitization using a Bayesian network has been developed using R, a free and open-source statistical computing language. The ITS model provides probabilistic predictions of skin sensitization potency based on in silico and in vitro information as well as skin penetration characteristics from a published bioavailability model (Kasting et al., 2008). The structure of the Bayesian network was designed to be consistent with the adverse outcome pathway published by the OECD (Jaworska et al., 2011, 2013). In this paper, the previously published data set (Jaworska et al., 2013) is improved by two data corrections and a modified application of the Kasting model. The new data set implemented in the original commercial software package and the new R version produced consistent results. The data and a fully documented version of the code are publicly available (http://ntp.niehs.nih.gov/go/its).

  13. Removing muscle and eye artifacts using blind source separation techniques in ictal EEG source imaging.

    Science.gov (United States)

    Hallez, H; De Vos, M; Vanrumste, B; Van Hese, P; Assecondi, S; Van Laere, K; Dupont, P; Van Paesschen, W; Van Huffel, S; Lemahieu, I

    2009-07-01

The contamination of muscle and eye artifacts during an ictal period of the EEG significantly distorts source estimation algorithms. Recent blind source separation (BSS) techniques based on canonical correlation (BSS-CCA) and independent component analysis with spatial constraints (SCICA) have shown much promise in the removal of these artifacts. In this study we use BSS-CCA and SCICA as a preprocessing step before source estimation during the ictal period. Both the contaminated and cleaned ictal EEG were subjected to the RAP-MUSIC algorithm. This is a multiple dipole source estimation technique based on the separation of the EEG into signal and noise subspaces. The source estimates were compared with the subtracted ictal SPECT (iSPECT) coregistered to magnetic resonance imaging (SISCOM) by means of the Euclidean distance between the iSPECT activations and the dipole location estimates. SISCOM results in an image denoting the ictal onset zone and its propagation. We applied the artifact removal and the source estimation to 8 patients. Qualitatively, we can see that 5 out of 8 patients show an improvement in the dipole estimates. The dipoles are nearer to, or have tighter clusters near, the iSPECT activation. From the median of the distance measure, we could see that 5 out of 8 patients showed improvement. The results show that BSS-CCA and SCICA can be applied to remove artifacts, but the results should be interpreted with care. The results of the source estimation can be misleading due to excessive noise or modeling errors. Therefore, the accuracy of the source estimation can be increased by preprocessing the ictal EEG segment with BSS-CCA and SCICA. This is a pilot study showing that EEG source localization in the presurgical evaluation can be made more reliable if preprocessing techniques such as BSS-CCA and SCICA are used prior to EEG source analysis on ictal episodes.

  14. Bayesian nitrate source apportionment to individual groundwater wells in the Central Valley by use of elemental and isotopic tracers

    Science.gov (United States)

    Ransom, Katherine M.; Grote, Mark N.; Deinhart, Amanda; Eppich, Gary; Kendall, Carol; Sanborn, Matthew E.; Souders, A. Kate; Wimpenny, Joshua; Yin, Qing-zhu; Young, Megan; Harter, Thomas

    2016-07-01

    Groundwater quality is a concern in alluvial aquifers that underlie agricultural areas, such as the San Joaquin Valley of California. Shallow domestic wells (less than 150 m deep) in agricultural areas are often contaminated by nitrate. Agricultural and rural nitrate sources include dairy manure, synthetic fertilizers, and septic waste. Knowledge of the relative proportion that each of these sources contributes to nitrate concentration in individual wells can aid future regulatory and land management decisions. We show that nitrogen and oxygen isotopes of nitrate, boron isotopes, and iodine concentrations are a useful, novel combination of groundwater tracers to differentiate between manure, fertilizers, septic waste, and natural sources of nitrate. Furthermore, in this work, we develop a new Bayesian mixing model in which these isotopic and elemental tracers are used to estimate the probability distribution of the fractional contributions of manure, fertilizers, septic waste, and natural sources to the nitrate concentration in an individual well. The approach was applied to 56 nitrate-impacted private domestic wells in the San Joaquin Valley. Model analysis found that some domestic wells were clearly dominated by the manure source, and suggested majority contributions from either the septic or the fertilizer source for other wells. However, predictions of the fractional contributions of the septic and fertilizer sources were often of similar magnitude, in part because the modeled uncertainty about each fraction was large. To validate the Bayesian model, the fractional estimates were compared to surrounding land use; the estimated source contributions were broadly consistent with nearby land-use types.
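The kind of Bayesian mixing model described here can be sketched with simple importance sampling: a Dirichlet prior over source fractions and a Gaussian likelihood comparing the predicted tracer mixture to a well's measurements. All signatures, spreads, and observations below are invented placeholders, not the study's fitted values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tracer signatures (rows: manure, fertilizer, septic, natural;
# cols: two tracers, e.g. d15N and d11B means). Invented for illustration.
mu = np.array([[12.0, 25.0],
               [ 3.0, 10.0],
               [ 9.0, 32.0],
               [ 5.0,  5.0]])
sigma = np.array([1.5, 3.0])     # assumed per-tracer spread
obs = np.array([10.0, 28.0])     # tracer values measured in one well

n = 100_000
f = rng.dirichlet(np.ones(mu.shape[0]), size=n)   # uniform prior on fractions
pred = f @ mu                                     # predicted mixture mean per draw
logw = -0.5 * (((obs - pred) / sigma) ** 2).sum(axis=1)
w = np.exp(logw - logw.max())
w /= w.sum()
post_mean = w @ f        # posterior mean fractional contribution of each source
print(post_mean)
```

The weighted draws approximate the full posterior, so credible intervals for each fraction come from the same samples.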

  15. Acoustic source separation for the detection of coronary artery sounds.

    Science.gov (United States)

    Cooper, Daniel B; Roan, Michael J; Vlachos, Pavlos P

    2011-12-01

    Coronary artery disease (CAD) is the leading cause of death in the United States, being responsible for more than 20% of all deaths in the country. This is in large part due to the difficulty of diagnostic screening for CAD. Phonoangiography seeks to detect CAD via the acoustic signature associated with turbulent flow near an abnormally constricted, or stenosed, region. However, the usefulness of the technique is severely hindered by the low strength of the CAD signal compared to the background noise within the chest. In this work, acoustic finite element analysis (FEA) was performed on physiologically accurate chest geometries to demonstrate the feasibility of an original acoustic source separation methodology for isolating coronary sounds. This approach is based upon pseudoinversion of mixing matrices determined through a combination of experiment and computation. This allows calculation of the sound emitted by the coronary arteries based upon measurements of the acoustic velocity on the chest surface. This work demonstrates the feasibility of such a technique computationally and examines the vulnerability of the proposed approach to measurement errors. © 2011 Acoustical Society of America
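The pseudoinversion step can be illustrated with a toy linear model: surface measurements v are modeled as v = H s for a mixing matrix H (in the paper, determined by experiment and FEA), and the source signals are recovered with the Moore-Penrose pseudoinverse. The matrix sizes and noise level below are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
H = rng.normal(size=(8, 3))          # 8 surface sensors, 3 internal sources
s_true = rng.normal(size=(3, 500))   # source waveforms
v = H @ s_true + 0.01 * rng.normal(size=(8, 500))   # noisy measurements

s_hat = np.linalg.pinv(H) @ v        # least-squares source estimate
err = np.linalg.norm(s_hat - s_true) / np.linalg.norm(s_true)
print(err)
```

The relative error stays small for a well-conditioned H; the paper's sensitivity analysis probes how measurement errors degrade this inversion.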

  16. Reverse osmosis brine for phosphorus recovery from source separated urine.

    Science.gov (United States)

    Tian, Xiujun; Wang, Guotian; Guan, Detian; Li, Jiuyi; Wang, Aimin; Li, Jin; Yu, Zhe; Chen, Yong; Zhang, Zhongguo

    2016-12-01

    Phosphorus (P) recovery from waste streams has recently been recognized as a key step in the sustainable supply of this indispensable and non-renewable resource. The feasibility of using brine from a reverse osmosis (RO) membrane unit treating cooling water as a precipitant for P recovery from source-separated urine was evaluated in the present study. P removal efficiency, process parameters and precipitate properties were investigated in batch and continuous-flow experiments. More than 90% P removal was obtained from both undiluted fresh and hydrolyzed urine by mixing with RO brine (1:1, v/v) at a pH above 9.0. Around 2.58 and 1.24 kg of precipitates could be recovered from 1 m3 of hydrolyzed and fresh urine, respectively, and the precipitated solids contained 8.1-19.0% P, 10.3-15.2% Ca, 3.7-5.0% Mg and 0.1-3.5% ammonium nitrogen. Satisfactory P removal performance was also achieved in a continuous-flow precipitation reactor with a hydraulic retention time of 3-6 h. RO brine could be considered as urinal and toilet flush water despite a marginally higher precipitation tendency than tap water. This study provides a widely available, low-cost and efficient precipitant for P recovery in urban areas, which will make P recovery from urine more economically attractive. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Quantity of dates trumps quality of dates for dense Bayesian radiocarbon sediment chronologies - Gas ion source 14C dating instructed by simultaneous Bayesian accumulation rate modeling

    Science.gov (United States)

    Rosenheim, B. E.; Firesinger, D.; Roberts, M. L.; Burton, J. R.; Khan, N.; Moyer, R. P.

    2016-12-01

    Radiocarbon (14C) sediment core chronologies benefit from a high density of dates, even when precision of individual dates is sacrificed. This is demonstrated by a combined approach of rapid 14C analysis of CO2 gas generated from carbonates and organic material coupled with Bayesian statistical modeling. Analysis of 14C is facilitated by the gas ion source on the Continuous Flow Accelerator Mass Spectrometry (CFAMS) system at the Woods Hole Oceanographic Institution's National Ocean Sciences Accelerator Mass Spectrometry facility. This instrument is capable of producing a 14C determination of +/- 100 14C y precision every 4-5 minutes, with limited sample handling (dissolution of carbonates and/or combustion of organic carbon in evacuated containers). Rapid analysis allows over-preparation of samples to include replicates at each depth and/or comparison of different sample types at particular depths in a sediment or peat core. Analysis priority is given to depths that have the least chronologic precision as determined by Bayesian modeling of the chronology of calibrated ages. Use of such a statistical approach to determine the order in which samples are run ensures that the chronology constantly improves so long as material is available for the analysis of chronologic weak points. Ultimately, accuracy of the chronology is determined by the material that is actually being dated, and our combined approach allows testing of different constituents of the organic carbon pool and the carbonate minerals within a core. We will present preliminary results from a deep-sea sediment core abundant in deep-sea foraminifera as well as coastal wetland peat cores to demonstrate statistical improvements in sediment- and peat-core chronologies obtained by increasing the quantity and decreasing the quality of individual dates.

  18. The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings

    Directory of Open Access Journals (Sweden)

    Chris von Borgstede

    2012-06-01

    Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste that are generated in both private homes and workplaces. Source separation at the workplace is commonly implemented by environmental management systems (EMS). In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to employees at different workplaces. The results show that respondents with awareness of EMS report higher levels of source separation at work, stronger environmental concern, personal and social norms, and perceive source separation to be less difficult. Furthermore, the results support the notion that after the adoption of EMS at the workplace, source separation at work spills over into source separation in the household. The potential implications for environmental management systems are discussed.

  19. Back-trajectory modeling of high time-resolution air measurement data to separate nearby sources

    Science.gov (United States)

    Strategies to isolate air pollution contributions from sources is of interest as voluntary or regulatory measures are undertaken to reduce air pollution. When different sources are located in close proximity to one another and have similar emissions, separating source emissions ...

  20. Towards Enhanced Underwater Lidar Detection via Source Separation

    Science.gov (United States)

    Illig, David W.

    Interest in underwater optical sensors has grown as technologies enabling autonomous underwater vehicles have been developed. Propagation of light through water is complicated by the dual challenges of absorption and scattering. While absorption can be reduced by operating in the blue-green region of the visible spectrum, reducing scattering is a more significant challenge. Collection of scattered light negatively impacts underwater optical ranging, imaging, and communications applications. This thesis concentrates on the ranging application, where scattering reduces operating range as well as range accuracy. The focus of this thesis is on the problem of backscatter, which can create a "clutter" return that may obscure submerged target(s) of interest. The main contributions of this thesis are explorations of signal processing approaches to increase the separation between the target and backscatter returns. Increasing this separation allows detection of weak targets in the presence of strong scatter, increasing both operating range and range accuracy. Simulation and experimental results will be presented for a variety of approaches as functions of water clarity and target position. This work provides several novel contributions to the underwater lidar field: 1. Quantification of temporal separation approaches: While temporal separation has been studied extensively, this work provides a quantitative assessment of the extent to which both high frequency modulation and spatial filter approaches improve the separation between target and backscatter. 2. Development and assessment of frequency separation: This work includes the first frequency-based separation approach for underwater lidar, in which the channel frequency response is measured with a wideband waveform. Transforming to the time-domain gives a channel impulse response, in which target and backscatter returns may appear in unique range bins and thus be separated. 3. Development and assessment of statistical

  1. Use of a Bayesian isotope mixing model to estimate proportional contributions of multiple nitrate sources in surface water

    International Nuclear Information System (INIS)

    Xue Dongmei; De Baets, Bernard; Van Cleemput, Oswald; Hennessy, Carmel; Berglund, Michael; Boeckx, Pascal

    2012-01-01

    To identify different NO3− sources in surface water and to estimate their proportional contributions to the nitrate mixture, a dual-isotope approach and a Bayesian isotope mixing model were applied to six surface waters affected by agriculture, greenhouses in an agricultural area, and households. Annual mean δ15N–NO3− ranged from 8.0 to 19.4‰, while annual mean δ18O–NO3− ranged from 4.5 to 30.7‰. SIAR was used to estimate the proportional contribution of five potential NO3− sources (NO3− in precipitation, NO3− fertilizer, NH4+ in fertilizer and rain, soil N, and manure and sewage). SIAR showed that “manure and sewage” contributed the most, “soil N”, “NO3− fertilizer” and “NH4+ in fertilizer and rain” contributed intermediate amounts, and “NO3− in precipitation” contributed the least. The SIAR output can be considered a “fingerprint” of the NO3− source contributions. However, the wide range of isotope values observed in the surface water and in the NO3− sources limits its applicability. - Highlights: ► The dual-isotope approach (δ15N– and δ18O–NO3−) identifies dominant nitrate sources in 6 surface waters. ► The SIAR model estimates proportional contributions of 5 nitrate sources. ► SIAR is a reliable approach to assess temporal and spatial variations of different NO3− sources. ► The wide range of isotope values observed in the surface water and in the nitrate sources limits its applicability. - This paper successfully applied a dual-isotope approach and a Bayesian isotopic mixing model to identify and quantify 5 potential nitrate sources in surface water.

  2. Nitrate source apportionment using a combined dual isotope, chemical and bacterial property, and Bayesian model approach in river systems

    Science.gov (United States)

    Xia, Yongqiu; Li, Yuefei; Zhang, Xinyu; Yan, Xiaoyuan

    2017-01-01

    Nitrate (NO3-) pollution is a serious problem worldwide, particularly in countries with intensive agricultural and population activities. Previous studies have used δ15N-NO3- and δ18O-NO3- to determine the NO3- sources in rivers. However, this approach is subject to substantial uncertainties and limitations because of the numerous NO3- sources, the wide isotopic ranges, and the existing isotopic fractionations. In this study, we outline a combined procedure for improving the determination of NO3- sources in a paddy agriculture-urban gradient watershed in eastern China. First, the main sources of NO3- in the Qinhuai River were examined by the dual-isotope biplot approach, in which we narrowed the isotope ranges using site-specific isotopic results. Next, the bacterial groups and chemical properties of the river water were analyzed to verify these sources. Finally, we introduced a Bayesian model to apportion the spatiotemporal variations of the NO3- sources. Denitrification was first incorporated into the Bayesian model because denitrification plays an important role in the nitrogen pathway. The results showed that fertilizer contributed large amounts of NO3- to the surface water in traditional agricultural regions, whereas manure effluents were the dominant NO3- source in intensified agricultural regions, especially during the wet seasons. Sewage effluents were important in all three land uses and exhibited great differences between the dry season and the wet season. This combined analysis quantitatively delineates the proportion of NO3- sources from paddy agriculture to urban river water for both dry and wet seasons and incorporates isotopic fractionation and uncertainties in the source compositions.

  3. Residents’ Household Solid Waste (HSW) Source Separation Activity: A Case Study of Suzhou, China

    Directory of Open Access Journals (Sweden)

    Hua Zhang

    2014-09-01

    Though the Suzhou government has provided household solid waste (HSW) source separation since 2000, the program remains largely ineffective. Between January and March 2014, the authors conducted an intercept survey in five different community groups in Suzhou, and 505 valid surveys were completed. Based on the survey, the authors used an ordered probit regression to study residents’ HSW source separation activities both for Suzhou as a whole and for the five community groups. Results showed that 43% of the respondents in Suzhou thought they knew how to source-separate HSW, and 29% of them source-separated HSW accurately. The results also indicated that the current HSW source separation pilot program in Suzhou is working, as both HSW source separation facilities and residents’ separation behavior improved steadily as the program was implemented. The main determinants of residents’ HSW source separation behavior are residents’ age, HSW source separation facilities and government preferential policies. Accessibility of the waste management service is particularly important. Attitudes and willingness do not have significant impacts on residents’ HSW source separation behavior.

  4. Synthesis of blind source separation algorithms on reconfigurable FPGA platforms

    Science.gov (United States)

    Du, Hongtao; Qi, Hairong; Szu, Harold H.

    2005-03-01

    Recent advances in intelligence technology have boosted the development of micro-unmanned air vehicles (UAVs), including Silver Fox, Shadow, and Scan Eagle, for various surveillance and reconnaissance applications. These affordable and reusable devices have to fit a series of size, weight, and power constraints. Cameras used on such micro-UAVs are therefore mounted directly at a fixed angle without any motion-compensated gimbals. This mounting scheme results in the so-called jitter effect, in which jitter is defined as sub-pixel or small-amplitude vibrations. The jitter blur caused by the jitter effect needs to be corrected before any other processing algorithms can be practically applied. Jitter restoration has been attempted with various optimization techniques, including Wiener approximation and maximum a-posteriori probability (MAP) estimation. However, these algorithms normally assume a spatially invariant blur model, which is not the case for jitter blur. Szu et al. developed a smart real-time algorithm based on auto-regression (AR), with a natural generalization to unsupervised artificial neural network (ANN) learning, to achieve restoration accuracy at the sub-pixel level. This algorithm resembles the capability of the human visual system, in which agreement between the pair of eyes indicates "signal"; otherwise, jitter noise. Using this non-statistical method, a deterministic blind source separation (BSS) process can be carried out independently for each pixel, based on a deterministic minimum of the Helmholtz free energy with a generalization of Shannon's information theory applied to open dynamic systems. From a hardware implementation point of view, jitter restoration of an image using Szu's algorithm can be optimized by pixel-based parallelization. In our previous work, a parallel-structured independent component analysis (ICA) algorithm has been implemented on both Field Programmable Gate Array (FPGA) and Application

  5. Model Based Beamforming and Bayesian Inversion Signal Processing Methods for Seismic Localization of Underground Source

    DEFF Research Database (Denmark)

    Oh, Geok Lian

    properties such as the elastic wave speeds and soil densities. One processing method is casting the estimation problem into an inverse problem to solve for the unknown material parameters. The forward models for the seismic signals used in the literature include ray tracing methods that consider only...... density values of the discretized ground medium, which leads to time-consuming computations and instability behaviour of the inversion process. In addition, the geophysics inverse problem is generally ill-posed due to a non-exact forward model that introduces errors. The Bayesian inversion method through...... the probability density function permits the incorporation of a priori information about the parameters, and also allows for the incorporation of theoretical errors. This opens up the possibility of applying the inverse paradigm to real-world geophysics inversion problems. In this PhD study, the Bayesian...

  6. Source Separation of Heartbeat Sounds for Effective E-Auscultation

    Science.gov (United States)

    Geethu, R. S.; Krishnakumar, M.; Pramod, K. V.; George, Sudhish N.

    2016-03-01

    This paper proposes a cost-effective solution for improving the effectiveness of e-auscultation. Auscultation is among the most difficult skills for a doctor, since it can be acquired only through experience. Heart sound mixtures are captured by placing four sensors at appropriate auscultation areas on the body. These sound mixtures are separated into their relevant components by a statistical method, independent component analysis (ICA). The separated heartbeat sounds can be further processed or stored for future reference. This idea can be used to build a low-cost, easy-to-use portable instrument that will benefit people living in remote areas who are unable to take advantage of advanced diagnostic methods.
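Independent component analysis of such sensor mixtures can be sketched with a minimal FastICA-style routine (tanh nonlinearity, symmetric orthogonalization). This is a generic ICA sketch for illustration, not the instrument's actual processing chain:

```python
import numpy as np

def fastica(X, n_iter=200, seed=0):
    """Minimal FastICA sketch: whiten X (channels x samples), then iterate
    the fixed-point update with symmetric decorrelation."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(X @ X.T / X.shape[1])
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T       # whitening matrix
    Z = V @ X
    n = Z.shape[0]
    W = np.linalg.qr(rng.normal(size=(n, n)))[0]  # random orthogonal start
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        # fixed-point update: E[z g(w'z)] - E[g'(w'z)] w, for all rows at once
        W_new = G @ Z.T / Z.shape[1] - np.diag((1.0 - G ** 2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)           # symmetric decorrelation
        W = u @ vt
    return W @ Z                                  # estimated sources
```

The estimates come back in arbitrary order and sign, as with any ICA method.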

  7. Combining Superdirective Beamforming and Frequency-Domain Blind Source Separation for Highly Reverberant Signals

    Directory of Open Access Journals (Sweden)

    Lin Wang

    2010-01-01

    Frequency-domain blind source separation (BSS) performs poorly under high reverberation because the independence assumption collapses in individual frequency bins as the number of bins increases. To improve the separation result, this paper proposes a method which combines the two techniques by using beamforming as a preprocessor for blind source separation. With the sound source locations assumed known, the mixed signals are dereverberated and enhanced by beamforming; the beamformed signals are then further separated by blind source separation. To implement the proposed method, a superdirective fixed beamformer is designed for beamforming, and an interfrequency-dependence-based permutation alignment scheme is presented for frequency-domain blind source separation. With beamforming shortening the mixing filters and reducing noise before blind source separation, the combined method works better in reverberation. The performance of the proposed method is investigated by separating up to 4 sources in different environments with reverberation times from 100 ms to 700 ms. Simulation results confirm that the proposed method outperforms beamforming or blind source separation used alone. Analysis demonstrates that the proposed method is computationally efficient and appropriate for real-time processing.
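The fixed-beamformer preprocessor can be illustrated with a basic frequency-domain delay-and-sum beamformer for a linear array. The paper designs a superdirective beamformer; delay-and-sum is shown here only as the simplest fixed-beamformer sketch, with the signal model x_m(t) = s(t - tau_m), tau_m = p_m sin(theta)/c:

```python
import numpy as np

def delay_and_sum(x, fs, mic_pos, doa_deg, c=343.0):
    """Steer a linear array toward doa_deg (from broadside) and average.

    x: (mics, samples) time signals; mic_pos: mic coordinates along the
    array axis in metres; returns the beamformed time signal."""
    n = x.shape[1]
    f = np.fft.rfftfreq(n, 1.0 / fs)
    delays = mic_pos * np.sin(np.deg2rad(doa_deg)) / c
    X = np.fft.rfft(x, axis=1)
    # advance each channel by its arrival delay, then average across mics
    steer = np.exp(2j * np.pi * f[None, :] * delays[:, None])
    return np.fft.irfft((X * steer).mean(axis=0), n=n)
```

Signals from the steered direction add coherently while reverberant energy from other directions is attenuated, which is what shortens the effective mixing filters before BSS.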

  8. Single-channel source separation using non-negative matrix factorization

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard

    , in which a number of methods for single-channel source separation based on non-negative matrix factorization are presented. In the papers, the methods are applied to separating audio signals such as speech and musical instruments and separating different types of tissue in chemical shift imaging....

  9. 30 CFR 56.6404 - Separation of blasting circuits from power source.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Separation of blasting circuits from power... MINES Explosives Electric Blasting § 56.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall be locked in the open position except...

  10. 30 CFR 57.6404 - Separation of blasting circuits from power source.

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Separation of blasting circuits from power... NONMETAL MINES Explosives Electric Blasting-Surface and Underground § 57.6404 Separation of blasting circuits from power source. (a) Switches used to connect the power source to a blasting circuit shall be...

  11. A Bayesian geostatistical approach for evaluating the uncertainty of contaminant mass discharges from point sources

    DEFF Research Database (Denmark)

    Troldborg, Mads; Nowak, Wolfgang; Binning, Philip John

    of both concentration and groundwater flow. For risk assessments or any other decisions that are being based on mass discharge estimates, it is essential to address these uncertainties. We present a novel Bayesian geostatistical approach for quantifying the uncertainty of the mass discharge across...... and the hydraulic gradient across the control plane and are consistent with measurements of both hydraulic conductivity and head at the site. An analytical macro-dispersive transport solution is employed to simulate the mean concentration distribution across the control plane, and a geostatistical model of the Box......Estimates of mass discharge (mass/time) are increasingly being used when assessing risks of groundwater contamination and designing remedial systems at contaminated sites. Mass discharge estimates are, however, prone to rather large uncertainties as they integrate uncertain spatial distributions...

  12. Bayesian accrual prediction for interim review of clinical studies: open source R package and smartphone application.

    Science.gov (United States)

    Jiang, Yu; Guarino, Peter; Ma, Shuangge; Simon, Steve; Mayo, Matthew S; Raghavan, Rama; Gajewski, Byron J

    2016-07-22

    Subject recruitment for medical research is challenging. Slow patient accrual leads to increased costs and delays in treatment advances. Researchers need reliable tools to manage and predict the accrual rate. The previously developed Bayesian method integrates researchers' experience from former trials with data from an ongoing study, providing a reliable prediction of the accrual rate for clinical studies. In this paper, we present a user-friendly graphical user interface program developed in R. A closed-form solution for the total number of subjects that can be recruited within a fixed time is derived. We also present a built-in Android application, written in Java, for web browsers and mobile devices. Using the accrual software, we re-evaluated the Veterans Affairs Cooperative Studies Program 558-ROBOTICS study. The application of the software in monitoring and management of recruitment is illustrated for different stages of the trial. The developed accrual software provides a convenient platform for estimation and prediction of the accrual process.
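A gamma-Poisson variant of Bayesian accrual prediction illustrates the closed-form flavor of such tools. This is a generic sketch, not necessarily the exact model in the R package, and all numbers are invented: the prior accrual rate is lambda ~ Gamma(a, b), observing n enrollments in t months gives the posterior Gamma(a + n, b + t), and the count over the remaining T months is negative-binomial distributed:

```python
# Prior worth b = 10 months of experience at about a/b = 2 subjects/month.
a, b = 20.0, 10.0
# Observed so far: 35 subjects in 12 months.
n, t = 35, 12.0
# Remaining study duration in months.
T = 24.0

a_post, b_post = a + n, b + t
pred_mean = a_post / b_post * T              # expected future enrollments
pred_var = pred_mean * (1.0 + T / b_post)    # negative-binomial variance
print(pred_mean, pred_var)
```

The predictive variance exceeds the Poisson mean because uncertainty in the rate itself is propagated, which is the point of the Bayesian treatment.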

  13. Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2014-01-01

    The computational cost of blind source separation based on bio-inspired intelligence optimization is high. To solve this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is used as the objective function, and the artificial bee colony algorithm is used to optimize it. The source signal component that is separated out is then removed from the mixtures using the deflation method. All the source signals can be recovered successfully by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves a significant reduction in computational cost and an improvement in separation quality compared to previous algorithms.
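The deflation step, in which each recovered source is removed from the mixtures before the next extraction, can be sketched as a least-squares projection (a generic deflation sketch, not the paper's exact formulation):

```python
import numpy as np

def deflate(X, s):
    """Remove a recovered source s (samples,) from the mixtures X (channels x samples).

    a = X s / (s . s) is the least-squares estimate of the mixing column
    of s; subtracting the outer product a s leaves the remaining sources."""
    a = X @ s / (s @ s)
    return X - np.outer(a, s)
```

By construction the residual is exactly orthogonal to the removed source, so repeated extraction does not re-find it; the cumulative-error issue arises because estimation error in s propagates into every later residual.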

  14. Sustainable wastewater management: life cycle assessment of conventional and source-separating urban sanitation systems.

    Science.gov (United States)

    Remy, C; Jekel, M

    2008-01-01

    Conventional and source-separating urban sanitation systems are compared with regard to their ecological sustainability using the methodology of Life Cycle Assessment. A substance flow model of all relevant processes in a settlement with 5,000 inhabitants is set up and evaluated with environmental indicators for resource demand and emissions to air, water, and soil. The comparison shows that source separation does not necessarily result in a system with less environmental impacts. If the conventional system is energetically optimized and equipped with extended nutrient removal, its impact is comparable to the source-separating systems. However, source separation has the potential to offer ecological benefits depending on the system configuration. Especially the input of toxic heavy metals to agriculture with sewage sludge can be substantially lowered if separately collected urine and faeces are used as organic fertilizer. IWA Publishing 2008.

  15. Investigation of model based beamforming and Bayesian inversion signal processing methods for seismic localization of underground sources

    DEFF Research Database (Denmark)

    Oh, Geok Lian; Brunskog, Jonas

    2014-01-01

    Techniques have been studied for the localization of an underground source with seismic interrogation signals. Much of the work has involved defining either a P-wave acoustic model or a dispersive surface wave model to the received signal and applying the time-delay processing technique...... and frequency-wavenumber processing to determine the location of the underground tunnel. Considering the case of determining the location of an underground tunnel, this paper proposed two physical models, the acoustic approximation ray tracing model and the finite difference time domain three-dimensional (3D......) elastic wave model to represent the received seismic signal. Two localization algorithms, beamforming and Bayesian inversion, are developed for each physical model. The beam-forming algorithms implemented are the modified time-and-delay beamformer and the F-K beamformer. Inversion is posed...

  16. A Separation Algorithm for Sources with Temporal Structure Only Using Second-order Statistics

    Directory of Open Access Journals (Sweden)

    J.G. Wang

    2013-09-01

    Unlike conventional blind source separation (BSS), which deals with independent identically distributed (i.i.d.) sources, this paper addresses separation from mixtures of sources with temporal structure, such as linear autocorrelations. Many sequential extraction algorithms have been reported, but the deflation scheme they use inevitably accumulates errors. We propose a robust separation algorithm that recovers the original sources simultaneously, through a joint diagonalizer of several averaged delayed covariance matrices at the optimal time delay and its integer multiples. The proposed algorithm is computationally simple and efficient, since it is based on second-order statistics only. Extensive simulation results confirm the validity and high performance of the algorithm. Compared with related extraction algorithms, its separation signal-to-noise ratio for a desired source can be up to 20 dB higher, and it is rather insensitive to estimation error in the time delay.
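The simplest two-matrix instance of this second-order idea (an AMUSE-style sketch, shown for illustration rather than the paper's multi-lag joint diagonalizer) recovers sources through a generalized eigendecomposition of the zero-lag and lag-tau covariance matrices:

```python
import numpy as np
from scipy.linalg import eigh

def amuse(X, tau=1):
    """Separate X (channels x samples) using second-order statistics only.

    The generalized eigenvectors W of (C_tau, C_0) satisfy W' C_0 W = I and
    W' C_tau W = diag, so W' X is decorrelated at both lags: the sources."""
    X = X - X.mean(axis=1, keepdims=True)
    C0 = X @ X.T / X.shape[1]
    C1 = X[:, tau:] @ X[:, :-tau].T / (X.shape[1] - tau)
    C1 = (C1 + C1.T) / 2          # symmetrize the delayed covariance
    _, W = eigh(C1, C0)           # generalized eigendecomposition
    return W.T @ X
```

Separation succeeds when the sources have distinct autocorrelations at lag tau; averaging several lags, as in the paper, makes the result robust to an unlucky choice of delay.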

  17. Overcomplete Blind Source Separation by Combining ICA and Binary Time-Frequency Masking

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan

    2005-01-01

    A limitation in many source separation tasks is that the number of source signals has to be known in advance. Further, in order to achieve good performance, the number of sources cannot exceed the number of sensors. In many real-world applications these limitations are too strict. We propose a no...

  18. Separation of radiation from two sources from their known radiated sum field

    DEFF Research Database (Denmark)

    Laitinen, Tommi; Pivnenko, Sergey

    2011-01-01

    This paper presents a technique for complete and exact separation of the radiated fields of two sources (at the same frequency) from the knowledge of their radiated sum field. The two sources can be arbitrary but it must be possible to enclose the sources inside their own non-intersecting minimum...

  19. Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE) using a Hierarchical Bayesian Approach

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2011-01-01

    We present an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model representation is motivated by the many random contributions to the path from sources to measurements including the tissue conductivity distribution, the geometry of the cortical...... models. Analysis of simulated and real EEG data provide evidence that reconstruction of the forward model leads to improved source estimates....

  20. An incentive-based source separation model for sustainable municipal solid waste management in China.

    Science.gov (United States)

    Xu, Wanying; Zhou, Chuanbin; Lan, Yajun; Jin, Jiasheng; Cao, Aixin

    2015-05-01

    Municipal solid waste (MSW) management (MSWM) is most important and challenging in large urban communities. Sound community-based waste management systems normally include waste reduction and material recycling elements, often entailing the separation of recyclable materials by the residents. To increase the efficiency of source separation and recycling, an incentive-based source separation model was designed and this model was tested in 76 households in Guiyang, a city of almost three million people in southwest China. This model embraced the concepts of rewarding households for sorting organic waste, government funds for waste reduction, and introducing small recycling enterprises for promoting source separation. Results show that after one year of operation, the waste reduction rate was 87.3%, and the comprehensive net benefit under the incentive-based source separation model increased by 18.3 CNY tonne(-1) (2.4 Euros tonne(-1)), compared to that under the normal model. The stakeholder analysis (SA) shows that the centralized MSW disposal enterprises had minimum interest and may oppose the start-up of a new recycling system, while small recycling enterprises had a primary interest in promoting the incentive-based source separation model, but they had the least ability to make any change to the current recycling system. The strategies for promoting this incentive-based source separation model are also discussed in this study. © The Author(s) 2015.

  1. Bayesian integration of isotope ratio for geographic sourcing of castor beans.

    Science.gov (United States)

    Webb-Robertson, Bobbie-Jo; Kreuzer, Helen; Hart, Garret; Ehleringer, James; West, Jason; Gill, Gary; Duckworth, Douglas

    2012-01-01

    Recent years have seen an increase in the forensic interest associated with the poison ricin, which is extracted from the seeds of the Ricinus communis plant. Both light element (C, N, O, and H) and strontium (Sr) isotope ratios have previously been used to associate organic material with geographic regions of origin. We present a Bayesian integration methodology that can more accurately predict the region of origin for a castor bean than individual models developed independently for light element stable isotopes or Sr isotope ratios. Our results demonstrate a clear improvement in the ability to correctly classify regions based on the integrated model, with a class accuracy of 60.9 ± 2.1% versus 55.9 ± 2.1% and 40.2 ± 1.8% for the light element and strontium (Sr) isotope ratios, respectively. In addition, we show graphically the strengths and weaknesses of each dataset with respect to class prediction and how the integration of these datasets strengthens the overall model.
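The integration idea can be sketched as a naive-Bayes fusion of two independently trained classifiers: assuming the two isotope data sets are conditionally independent given the region, the fused posterior is proportional to the product of the individual posteriors divided by the prior. The region names and probability values below are illustrative only, not from the study.

```python
import numpy as np

# Hypothetical per-region posteriors from two independently trained
# models for one castor-bean sample (values invented for this sketch).
regions = ["RegionA", "RegionB", "RegionC"]
p_light = np.array([0.5, 0.3, 0.2])   # light-element isotope model
p_sr    = np.array([0.2, 0.6, 0.2])   # Sr-isotope model
prior   = np.array([1/3, 1/3, 1/3])   # uniform prior over regions

# Naive-Bayes fusion: p(region | both) ∝ p1 * p2 / prior, assuming
# conditional independence of the two data sets given the region.
fused = p_light * p_sr / prior
fused /= fused.sum()
print(dict(zip(regions, fused.round(3))))
```

Note how the fused posterior can favor a region (RegionB here) that only one of the individual models ranked first, which is the mechanism behind the accuracy gain the abstract reports.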

  2. Bayesian Integration of Isotope Ratio for Geographic Sourcing of Castor Beans

    Directory of Open Access Journals (Sweden)

    Bobbie-Jo Webb-Robertson

    2012-01-01

    Full Text Available Recent years have seen an increase in the forensic interest associated with the poison ricin, which is extracted from the seeds of the Ricinus communis plant. Both light element (C, N, O, and H) and strontium (Sr) isotope ratios have previously been used to associate organic material with geographic regions of origin. We present a Bayesian integration methodology that can more accurately predict the region of origin for a castor bean than individual models developed independently for light element stable isotopes or Sr isotope ratios. Our results demonstrate a clear improvement in the ability to correctly classify regions based on the integrated model, with a class accuracy of 60.9±2.1% versus 55.9±2.1% and 40.2±1.8% for the light element and strontium (Sr) isotope ratios, respectively. In addition, we show graphically the strengths and weaknesses of each dataset with respect to class prediction and how the integration of these datasets strengthens the overall model.

  3. New sonorities for early jazz recordings using sound source separation and automatic mixing tools

    OpenAIRE

    Matz, Daniel; Cano, Estefanía; Abeßer, Jakob

    2015-01-01

    In this paper, a framework for automatic mixing of early jazz recordings is presented. In particular, we propose the use of sound source separation techniques as a preprocessing step of the mixing process. In addition to an initial solo and accompaniment separation step, the proposed mixing framework is composed of six processing blocks: harmonic-percussive separation (HPS), cross-adaptive multi-track scaling (CAMTS), cross-adaptive equalizer (CAEQ), cross-adaptive dynamic spectral panning (C...

  4. Semi-blind Source Separation Using Head-related Transfer Functions

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Hansen, Lars Kai; Kjems, Ulrik

    2004-01-01

    An online blind source separation algorithm which is a special case of the geometric algorithm by Parra and Fancourt has been implemented for the purpose of separating sounds recorded at microphones placed at each side of the head. By using the assumption that the position of the two sounds...

  5. A Nonlinear Blind Source Separation Method Based On Radial Basis Function and Quantum Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Wang Pidong

    2016-01-01

    Full Text Available Blind source separation is a hot topic in signal processing. Most existing works focus on linearly combined signals, while in practice we often encounter nonlinearly mixed signals. To address the problem of nonlinear source separation, in this paper we propose a novel algorithm using a radial basis function neural network, optimized by a multi-universe parallel quantum genetic algorithm. Experiments show the effectiveness of the proposed method.

  6. Bayesian Inference for Source Term Estimation: Application to the International Monitoring System Radionuclide Network

    Science.gov (United States)

    2014-10-01

    Laboratories (CRL) medical isotope production facility. The sampling of the resulting posterior distribution of the source parameters is undertaken... International Monitoring System radionuclide network used for Case 2. The location of the Xe-133 tracer source (red marker) was at Chalk River Laboratories... space. 5 Applications: The International Monitoring System (IMS) consists of a comprehensive network of seismic, hydroacoustic, infrasound, and

  7. Factors influencing source separation intention and willingness to pay for improving waste management in Bangkok, Thailand

    Directory of Open Access Journals (Sweden)

    Sujitra Vassanadumrongdee

    2018-03-01

    Full Text Available Source separation for recycling has been recognized as a way to achieve sustainable municipal solid waste (MSW) management. However, most developing countries, including Thailand, have been facing a lack of recycling facilities and a low level of source separation practice. Employing questionnaire surveys, this study investigated Bangkok residents' source separation intention and willingness to pay (WTP) for improving MSW service and recycling facilities (n = 1076). This research extended the theory of planned behavior to explore the effects of both internal and external factors. The survey highlighted perceived inconvenience and mistrust of MSW collection as major barriers to carrying out source separation in Bangkok. Promoting source separation at the workplace may create a spill-over effect on people's intention to recycle their waste at home. Both subjective norms and knowledge of the MSW situation were found to be positively correlated with Bangkok residents' source separation intention and WTP (p < 0.001). Moreover, the average WTP values are higher than the existing rate for the waste collection service, which shows that Bangkok residents have a preference for recycling programs. However, the WTP figures are still much lower than the average MSW management cost. These findings suggest that the Bangkok Metropolitan Administration should target improving people's knowledge of waste problems that could adversely affect the economy and well-being of Bangkok residents, and improve its MSW collection service, as these factors positively influence residents' WTP.

  8. Hierarchical Bayesian Model for Simultaneous EEG Source and Forward Model Reconstruction (SOFOMORE)

    DEFF Research Database (Denmark)

    Stahlhut, Carsten; Mørup, Morten; Winther, Ole

    2009-01-01

    In this paper we propose an approach to handle forward model uncertainty for EEG source reconstruction. A stochastic forward model is motivated by the many uncertain contributions that form the forward propagation model, including the tissue conductivity distribution, the cortical surface, and ele...... and real EEG data demonstrate that invoking a stochastic forward model leads to improved source estimates....

  9. Bayesian estimation of source parameters and associated Coulomb failure stress changes for the 2005 Fukuoka (Japan) Earthquake

    Science.gov (United States)

    Dutta, Rishabh; Jónsson, Sigurjón; Wang, Teng; Vasyura-Bathke, Hannes

    2018-04-01

    Several researchers have studied the source parameters of the 2005 Fukuoka (northwestern Kyushu Island, Japan) earthquake (Mw 6.6) using teleseismic, strong motion and geodetic data. However, in all previous studies, errors of the estimated fault solutions have been neglected, making it impossible to assess the reliability of the reported solutions. We use Bayesian inference to estimate the location, geometry and slip parameters of the fault and their uncertainties using Interferometric Synthetic Aperture Radar and Global Positioning System data. The offshore location of the earthquake makes the fault parameter estimation challenging, with geodetic data coverage mostly to the southeast of the earthquake. To constrain the fault parameters, we use a priori constraints on the magnitude of the earthquake and the location of the fault with respect to the aftershock distribution and find that the estimated fault slip ranges from 1.5 to 2.5 m with decreasing probability. The marginal distributions of the source parameters show that the location of the western end of the fault is poorly constrained by the data whereas that of the eastern end, located closer to the shore, is better resolved. We propagate the uncertainties of the fault model and calculate the variability of Coulomb failure stress changes for the nearby Kego fault, located directly below Fukuoka city, showing that the main shock increased stress on the fault and brought it closer to failure.
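The flavor of Bayesian fault-parameter estimation with an a priori magnitude constraint can be sketched with a toy one-parameter linear inversion, where the Gaussian prior on slip plays the role of the magnitude constraint and the posterior is available in closed form (the study itself samples a full nonlinear multi-parameter posterior). All numbers below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy Bayesian slip estimation: d = G m + noise, with a Gaussian prior
# on slip m standing in for the a priori magnitude constraint.
G = rng.standard_normal((20, 1))        # synthetic Green's functions (20 offsets)
m_true = 2.0                            # "true" slip, in metres
d = G @ [m_true] + 0.1 * rng.standard_normal(20)
sigma_d = 0.1                           # data noise std
mu_m, sigma_m = 1.5, 1.0                # prior mean and std on slip

# Conjugate Gaussian update: posterior precision is the sum of prior
# precision and data precision; the mean is the precision-weighted blend.
prec = 1 / sigma_m**2 + (G.T @ G)[0, 0] / sigma_d**2
mean = (mu_m / sigma_m**2 + (G.T @ d)[0] / sigma_d**2) / prec
post_sd = prec ** -0.5
print(round(mean, 2), round(post_sd, 3))
```

With well-constraining data the posterior mean sits near the true slip and the posterior standard deviation quantifies exactly the kind of parameter uncertainty that the abstract notes is neglected in earlier studies.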

  10. DAFNE: A Matlab toolbox for Bayesian multi-source remote sensing and ancillary data fusion, with application to flood mapping

    Science.gov (United States)

    D'Addabbo, Annarita; Refice, Alberto; Lovergine, Francesco P.; Pasquariello, Guido

    2018-03-01

    High-resolution, remotely sensed images of the Earth's surface have proven helpful in producing detailed flood maps, thanks to their synoptic overview of the flooded area and frequent revisits. However, flood scenarios can be complex situations, requiring the integration of different data in order to provide accurate and robust flood information. Several processing approaches have recently been proposed to efficiently combine and integrate heterogeneous information sources. In this paper, we introduce DAFNE, a Matlab®-based, open source toolbox conceived to produce flood maps from remotely sensed and other ancillary information through a data fusion approach. DAFNE is based on Bayesian networks and is composed of several independent modules, each performing a different task. Multi-temporal and multi-sensor data can be easily handled, with the possibility of following the evolution of an event through multi-temporal output flood maps. Each DAFNE module can be easily modified or upgraded to meet different user needs. The DAFNE suite is presented together with an example of its application.

  11. Bayesian Estimation of Source Parameters and Associated Coulomb Failure Stress Changes for the 2005 Fukuoka (Japan) Earthquake

    KAUST Repository

    Dutta, Rishabh

    2017-12-20

    Several researchers have studied the source parameters of the 2005 Fukuoka (northwestern Kyushu Island, Japan) earthquake (MW 6.6) using teleseismic, strong motion and geodetic data. However, in all previous studies, errors of the estimated fault solutions have been neglected, making it impossible to assess the reliability of the reported solutions. We use Bayesian inference to estimate the location, geometry and slip parameters of the fault and their uncertainties using Interferometric Synthetic Aperture Radar (InSAR) and Global Positioning System (GPS) data. The offshore location of the earthquake makes the fault parameter estimation challenging, with geodetic data coverage mostly to the southeast of the earthquake. To constrain the fault parameters, we use a priori constraints on the magnitude of the earthquake and the location of the fault with respect to the aftershock distribution and find that the estimated fault slip ranges from 1.5 m to 2.5 m with decreasing probability. The marginal distributions of the source parameters show that the location of the western end of the fault is poorly constrained by the data whereas that of the eastern end, located closer to the shore, is better resolved. We propagate the uncertainties of the fault model and calculate the variability of Coulomb failure stress changes for the nearby Kego fault, located directly below Fukuoka city, showing that the mainshock increased stress on the fault and brought it closer to failure.

  12. A particle velocity based method for separating all multi incoherent sound sources

    NARCIS (Netherlands)

    Winkel, J.C.; Yntema, Doekle Reinder; Druyvesteyn, W.F.; de Bree, H.E.

    2006-01-01

    In this paper we present a method to separate the contributions of different uncorrelated sound sources to the total sound field. When the contribution of each sound source to the total sound field is known, techniques with array-applications like direct sound field measurements or inverse acoustics

  13. Nitrate source identification in the Baltic Sea using its isotopic ratios in combination with a Bayesian isotope mixing model

    Science.gov (United States)

    Korth, F.; Deutsch, B.; Frey, C.; Moros, C.; Voss, M.

    2014-09-01

    Nitrate (NO3-) is the major nutrient responsible for coastal eutrophication worldwide and its production is related to intensive food production and fossil-fuel combustion. In the Baltic Sea NO3- inputs have increased 4-fold over recent decades and now remain constantly high. NO3- source identification is therefore an important consideration in environmental management strategies. In this study focusing on the Baltic Sea, we used a method to estimate the proportional contributions of NO3- from atmospheric deposition, N2 fixation, and runoff from pristine soils as well as from agricultural land. Our approach combines data on the dual isotopes of NO3- (δ15N-NO3- and δ18O-NO3-) in winter surface waters with a Bayesian isotope mixing model (Stable Isotope Analysis in R, SIAR). Based on data gathered from 47 sampling locations over the entire Baltic Sea, the majority of the NO3- in the southern Baltic was shown to derive from runoff from agricultural land (33-100%), whereas in the northern Baltic, i.e. the Gulf of Bothnia, NO3- originates from nitrification in pristine soils (34-100%). Atmospheric deposition accounts for only a small percentage of NO3- levels in the Baltic Sea, except for contributions from northern rivers, where the levels of atmospheric NO3- are higher. An additional important source in the central Baltic Sea is N2 fixation by diazotrophs, which contributes 49-65% of the overall NO3- pool at this site. The results obtained with this method are in good agreement with source estimates based upon δ15N values in sediments and a three-dimensional ecosystem model, ERGOM. We suggest that this approach can be easily modified to determine NO3- sources in other marginal seas or larger near-coastal areas where NO3- is abundant in winter surface waters when fractionation processes are minor.
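The Bayesian isotope mixing idea can be sketched with importance sampling under a flat Dirichlet prior over the four source fractions, weighting each candidate fraction vector by a Gaussian likelihood around the observed dual-isotope signature. The source signatures and observation below are invented for illustration; SIAR's actual model is hierarchical and accounts for signature uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dual-isotope (d15N, d18O) source signatures, in per mille.
# These numbers are made up for the sketch, not taken from the study.
sources = np.array([[0.0, 70.0],    # atmospheric deposition
                    [-1.0, 0.0],    # N2 fixation by diazotrophs
                    [3.0, 2.0],     # runoff from pristine soils
                    [8.0, 3.0]])    # runoff from agricultural land
obs = np.array([6.0, 2.5])          # dual-isotope values of the mixture
sigma = 1.0                         # assumed measurement/process noise

# Importance sampling with the flat Dirichlet prior as proposal:
# weight each candidate fraction vector by its Gaussian likelihood.
F = rng.dirichlet(np.ones(4), size=200_000)   # draws from the prior simplex
mix = F @ sources                             # predicted mixture signatures
logw = -0.5 * np.sum((obs - mix) ** 2, axis=1) / sigma**2
w = np.exp(logw - logw.max())
w /= w.sum()
post_mean = w @ F                             # posterior mean source fractions
print(np.round(post_mean, 2))
```

The atmospheric fraction is driven toward zero by its extreme δ18O signature, mirroring how distinct isotopic end-members make the mixing proportions identifiable.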

  14. Nitrate source identification using its isotopic ratios in combination with a Bayesian isotope mixing model in the Baltic Sea

    Science.gov (United States)

    Korth, F.; Deutsch, B.; Frey, C.; Moros, C.; Voss, M.

    2014-04-01

    Nitrate (NO3-) is the major nutrient responsible for coastal eutrophication worldwide and its production is related to intensive food production and fossil-fuel combustion. In the Baltic Sea NO3- inputs have increased four-fold over recent decades and now remain constantly high. NO3- source identification is therefore an important consideration in environmental management strategies. In this study focusing on the Baltic Sea, we used a method to estimate the proportional contributions of NO3- from atmospheric deposition, N2 fixation, and runoff from pristine soils as well as from agricultural land. Our approach combines data on the dual isotopes of NO3- (δ15N-NO3- and δ18O-NO3-) in winter surface waters with a Bayesian isotope mixing model (Stable Isotope Analysis in R, SIAR). Based on data gathered from 46 sampling locations over the entire Baltic Sea, the majority of the NO3- in the southern Baltic was shown to derive from runoff from agricultural land (30-70%), whereas in the northern Baltic, i.e., the Gulf of Bothnia, NO3- originates from nitrification in pristine soils (47-100%). Atmospheric deposition accounts for only a small percentage of NO3- levels in the Baltic Sea, except for contributions from northern rivers, where the levels of atmospheric NO3- are higher. An additional important source in the central Baltic Sea is N2 fixation by diazotrophs, which contributes 31-62% of the overall NO3- pool at this site. The results obtained with this method are in good agreement with source estimates based upon δ15N values in sediments and a three-dimensional ecosystem model, ERGOM. We suggest that this approach can be easily modified to determine NO3- sources in other marginal seas or larger near-coastal areas where NO3- is abundant in winter surface waters when fractionation processes are minor.

  15. Bayesian inverse modeling and source location of an unintended 131I release in Europe in the fall of 2011

    Science.gov (United States)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, Miroslav; Stohl, Andreas

    2017-10-01

    In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. Third, we select the most probable location of

  16. Bayesian inverse modeling and source location of an unintended 131I release in Europe in the fall of 2011

    Directory of Open Access Journals (Sweden)

    O. Tichý

    2017-10-01

    Full Text Available In the fall of 2011, iodine-131 (131I) was detected at several radionuclide monitoring stations in central Europe. After investigation, the International Atomic Energy Agency (IAEA) was informed by Hungarian authorities that 131I was released from the Institute of Isotopes Ltd. in Budapest, Hungary. It was reported that a total activity of 342 GBq of 131I was emitted between 8 September and 16 November 2011. In this study, we use the ambient concentration measurements of 131I to determine the location of the release as well as its magnitude and temporal variation. As the location of the release and an estimate of the source strength became eventually known, this accident represents a realistic test case for inversion models. For our source reconstruction, we use no prior knowledge. Instead, we estimate the source location and emission variation using only the available 131I measurements. Subsequently, we use the partial information about the source term available from the Hungarian authorities for validation of our results. For the source determination, we first perform backward runs of atmospheric transport models and obtain source-receptor sensitivity (SRS) matrices for each grid cell of our study domain. We use two dispersion models, FLEXPART and Hysplit, driven with meteorological analysis data from the global forecast system (GFS) and from European Centre for Medium-range Weather Forecasts (ECMWF) weather forecast models. Second, we use a recently developed inverse method, least-squares with adaptive prior covariance (LS-APC), to determine the 131I emissions and their temporal variation from the measurements and computed SRS matrices. For each grid cell of our simulation domain, we evaluate the probability that the release was generated in that cell using Bayesian model selection. The model selection procedure also provides information about the most suitable dispersion model for the source term reconstruction. Third, we select the most
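The core inversion step, recovering a nonnegative emission time series from concentration measurements and an SRS matrix, can be sketched with plain nonnegative least squares (LS-APC additionally learns an adaptive prior covariance that smooths and sparsifies the solution, omitted here). The SRS matrix and release profile below are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(3)

# Toy source-term inversion: y = M x, where M is a source-receptor
# sensitivity (SRS) matrix (observations x emission time steps) and
# x is the nonnegative emission time series. Values are synthetic.
n_obs, n_steps = 40, 12
M = rng.uniform(0, 1, (n_obs, n_steps))
x_true = np.zeros(n_steps)
x_true[3:7] = [2.0, 5.0, 4.0, 1.0]      # a short release episode
y = M @ x_true + 0.01 * rng.standard_normal(n_obs)

# Nonnegative least squares recovers the emission profile when the
# problem is well conditioned and noise is low.
x_est, _ = nnls(M, y)
print(np.round(x_est, 1))
```

The nonnegativity constraint is what prevents the inversion from producing physically meaningless negative emissions, which is also why source-term methods do not use unconstrained least squares.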

  17. Using spatiotemporal source separation to identify prominent features in multichannel data without sinusoidal filters.

    Science.gov (United States)

    Cohen, Michael X

    2017-09-27

    The number of simultaneously recorded electrodes in neuroscience is steadily increasing, providing new opportunities for understanding brain function, but also new challenges for appropriately dealing with the increase in dimensionality. Multivariate source separation analysis methods have been particularly effective at improving signal-to-noise ratio while reducing the dimensionality of the data and are widely used for cleaning, classifying and source-localizing multichannel neural time series data. Most source separation methods produce a spatial component (that is, a weighted combination of channels to produce one time series); here, this is extended to apply source separation to a time series, with the idea of obtaining a weighted combination of successive time points, such that the weights are optimized to satisfy some criteria. This is achieved via a two-stage source separation procedure, in which an optimal spatial filter is first constructed and then its optimal temporal basis function is computed. This second stage is achieved with a time-delay-embedding matrix, in which additional rows of a matrix are created from time-delayed versions of existing rows. The optimal spatial and temporal weights can be obtained by solving a generalized eigendecomposition of covariance matrices. The method is demonstrated in simulated data and in an empirical electroencephalogram study on theta-band activity during response conflict. Spatiotemporal source separation has several advantages, including defining empirical filters without the need to apply sinusoidal narrowband filters. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
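The first stage of the procedure, an optimal spatial filter obtained from a generalized eigendecomposition of two covariance matrices, can be sketched as follows; the second, temporal stage with time-delay embedding is omitted for brevity. All data are simulated.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(2)
n_ch, n_t = 8, 5000
pattern = rng.standard_normal(n_ch)                      # source projection weights
s = 2 * np.sin(2 * np.pi * 10 * np.arange(n_t) / 1000)   # 10 Hz source, 1 kHz rate
X_sig = np.outer(pattern, s) + rng.standard_normal((n_ch, n_t))  # source + noise
X_ref = rng.standard_normal((n_ch, n_t))                 # reference data: noise only

# Spatial filter w maximizing the variance ratio w'Sw / w'Rw, found by
# a generalized eigendecomposition of the signal covariance S against
# the reference covariance R.
S = X_sig @ X_sig.T / n_t
R = X_ref @ X_ref.T / n_t
evals, evecs = eigh(S, R)        # generalized eigenvalues, ascending
w = evecs[:, -1]                 # filter with the largest ratio
component = w @ X_sig            # spatially filtered time series

r = abs(np.corrcoef(component, s)[0, 1])
print(round(r, 2))
```

The same eigendecomposition machinery is then reapplied in the paper's second stage to a time-delay-embedded matrix of the filtered component, yielding an empirical temporal filter without sinusoidal narrowband filtering.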

  18. High efficiency noble gas electron impact ion source for isotope separation

    Energy Technology Data Exchange (ETDEWEB)

    Appelhans, A. D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Olson, J. E. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dahl, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ward, M. B. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-07-01

    An electron impact ion source has been designed for the generation of noble gas ions in a compact isotope separator. The source utilizes a circular filament that surrounds an ionization chamber, enabling multiple passes of electrons through the ionization chamber. This report presents the ion optical design and the results of efficiency and sensitivity measurements performed in an ion source test chamber and in the compact isotope separator. The cylindrical design produced xenon ions at an efficiency of 0.37% with a sensitivity of ~24 µA/Pa at 300 µA of electron current.

  19. Adaptation of urine source separation in tropical cities: Process optimization and odor mitigation.

    Science.gov (United States)

    Zhang, Jiefeng; Giannis, Apostolos; Chang, Victor W C; Ng, Bernard J H; Wang, Jing-Yuan

    2013-04-01

    Source-separating urine from other domestic wastewaters promotes a more sustainable municipal wastewater treatment system. This study investigated the feasibility and potential issues of applying a urine source-separation system in tropical urban settings. The results showed that source-separated urine underwent rapid urea hydrolysis (ureolysis) at temperatures between 34 and 40 degrees C, stale/fresh urine ratios greater than 40%, and/or with slight fecal cross-contamination. Undiluted (or low-diluted) urine favored ureolysis; this can be monitored by measuring conductivity as a reliable and efficient indicator. The optimized parameters demonstrated that an effective urine source-separation system is achievable in tropical urban areas. On the other hand, the initial release of CO2 and NH3 led to an elevated pressure in the headspace of the collection reservoir, which then dropped to a negative value, primarily due to oxygen depletion by microbial activity in the gradually alkalized urine. Another potential odor source during the ureolysis process was the high production of volatile fatty acids (VFA), mainly acetic, propanoic, and butyric acids. Health concerns related to odor issues might limit the application of source separation systems in urban areas; it is therefore vital to systematically monitor and control the odor emissions from a source separation system. As such, an enhanced ureolysis process can attenuate the odor emissions. Urine source separation is promising for improving the management of domestic wastewater in a more sustainable way. The work demonstrates the achievability of an effective urine source-separation system in tropical urban areas. The installation of urine-stabilization tanks beneath high-rise buildings lowers the risk of pipe clogging. Conductivity measurement can be utilized as a reliable process indicator for an automated system.
However, urine hydrolysis raises a strong potential of odor emission (both inorganic

  20. Multiple Speech Source Separation Using Inter-Channel Correlation and Relaxed Sparsity

    Directory of Open Access Journals (Sweden)

    Maoshen Jia

    2018-01-01

    Full Text Available In this work, a multiple speech source separation method using inter-channel correlation and relaxed sparsity is proposed. A B-format microphone with four spatially located channels is adopted; its compact array size preserves the spatial parameter integrity of the original signal. Specifically, we first measure the proportion of overlapped components among multiple sources and find that many overlapped time-frequency (TF) components exist as the number of sources increases. Then, considering the relaxed sparsity of speech sources, we propose a dynamic threshold-based separation approach for sparse components, where the threshold is determined by the inter-channel correlation among the recorded signals. After conducting a statistical analysis of the number of active sources at each TF instant, a form of relaxed sparsity called the half-K assumption is proposed, whereby the number of active sources in a given TF bin does not exceed half the total number of simultaneously occurring sources. By applying the half-K assumption, the non-sparse components are recovered by using the extracted sparse components as a guide, combined with vector decomposition and matrix factorization. Eventually, the final TF coefficients of each source are recovered by the synthesis of sparse and non-sparse components. The proposed method has been evaluated using up to six simultaneous speech sources under both anechoic and reverberant conditions. Both objective and subjective evaluations validated that the perceptual quality of the speech separated by the proposed approach outperforms that of existing blind source separation (BSS) approaches. Moreover, it is robust across different speech signals, yielding separated signals of consistently similar perceptual quality.
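The sparsity assumption underlying TF masking, that in (almost) every time-frequency bin one source dominates, can be illustrated with an ideal binary mask applied to a toy two-tone mixture; real speech and the paper's half-K procedure with inter-channel correlation are considerably more involved.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 8000
t = np.arange(2 * fs) / fs
s1 = np.sin(2 * np.pi * 440 * t)     # stand-in "speech" sources: two tones
s2 = np.sin(2 * np.pi * 1320 * t)
mix = s1 + s2                        # single-channel mixture

# Ideal binary mask: keep the TF bins where source 1 is the stronger
# contributor, zero out the rest, then invert the masked mixture STFT.
_, _, S1 = stft(s1, fs)
_, _, S2 = stft(s2, fs)
_, _, M = stft(mix, fs)
mask = np.abs(S1) > np.abs(S2)
_, est1 = istft(M * mask, fs)

n = min(len(est1), len(s1))
r = np.corrcoef(est1[:n], s1[:n])[0, 1]
print(round(r, 2))
```

Because the two tones occupy nearly disjoint TF regions, masking the mixture recovers the first source almost perfectly; overlapping TF components, the case the half-K assumption addresses, are exactly where such a binary mask breaks down.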

  1. Estimates of water source contributions in a dynamic urban water supply system inferred via a Bayesian stable isotope mixing model

    Science.gov (United States)

    Jameel, M. Y.; Brewer, S.; Fiorella, R.; Tipple, B. J.; Bowen, G. J.; Terry, S.

    2017-12-01

    Public water supply systems (PWSS) are complex distribution systems and critical infrastructure, making them vulnerable to physical disruption and contamination. Exploring the susceptibility of PWSS to such perturbations requires detailed knowledge of the supply system structure and operation. Although the physical structure of supply systems (i.e., pipeline connections) is usually well documented for developed cities, the actual flow patterns of water in these systems are typically unknown or estimated from hydrodynamic models with limited observational validation. Here, we present a novel method for mapping the flow structure of water in a large, complex PWSS, building upon recent work highlighting the potential of stable isotopes of water (SIW) to document water management practices within complex PWSS. We sampled a major water distribution system of the Salt Lake Valley, Utah, measuring SIW of water sources, treatment facilities, and numerous sites within the supply system. We then developed a hierarchical Bayesian (HB) isotope mixing model to quantify the proportion of water supplied by different sources at sites within the supply system. Known production volumes and spatial distance effects were used to define the prior probabilities for each source; however, we did not include other physical information about the supply system. Our results were in general agreement with those obtained by hydrodynamic models and provide quantitative estimates of the contributions of different water sources to a given site, along with robust estimates of uncertainty. Secondary properties of the supply system, such as regions of "static" and "dynamic" sourcing (e.g., regions supplied dominantly by one source vs. those experiencing active mixing between multiple sources), can be inferred from the results. The isotope-based HB mixing model offers a new investigative technique for analyzing PWSS and documenting aspects of supply system structure and operation that are

  2. Life cycle assessment of a household solid waste source separation programme: a Swedish case study.

    Science.gov (United States)

    Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik

    2011-10-01

    The environmental impact of an extended property-close source-separation system for solid household waste (i.e., a system for collection of recyclables at domestic properties) is investigated in a residential area in southern Sweden. Since 2001, households have been able to source-separate waste into six fractions of dry recyclables in addition to food waste. The current system was evaluated using the EASEWASTE life cycle assessment tool. Current status is compared with an ideal scenario in which households display perfect source-separation behaviour and a scenario without any material recycling. Results show that current recycling provides substantial environmental benefits compared to a non-recycling alternative. The environmental benefit varies greatly between recyclable fractions, and the recyclables currently most frequently source-separated by households are often not the most beneficial from an environmental perspective. With optimal source-separation of all recyclables, the current net contribution to global warming could be changed to a net avoidance, while current avoidance of nutrient enrichment, acidification and photochemical ozone formation could be doubled. Sensitivity analyses show that the type of energy substituted by incineration of non-recycled waste, as well as the energy used in recycling processes and in the production of materials substituted by waste recycling, is highly relevant to the attained results.

  3. Municipal solid waste source-separated collection in China: A comparative analysis

    International Nuclear Information System (INIS)

    Tai Jun; Zhang Weiqian; Che Yue; Feng Di

    2011-01-01

    A pilot program focusing on municipal solid waste (MSW) source-separated collection was launched in eight major cities throughout China in 2000. Detailed investigations were carried out and a comprehensive system was constructed to evaluate the effects of the eight-year implementation in those cities. This paper provides an overview of the different methods of collection, transportation, and treatment of MSW in the eight cities, as well as a comparative analysis of MSW source-separated collection in China. Information about the quantity and composition of MSW shows that the characteristics of MSW are similar across the cities: low calorific value, high moisture content and a high proportion of organic matter. Differences among the eight cities in municipal solid waste management (MSWM) are also presented. Only Beijing and Shanghai demonstrated relatively effective results in the implementation of MSW source-separated collection, while the six remaining cities performed poorly. Considering the current status of MSWM, source-separated collection should be a key priority, and a wider range of cities should participate in this program instead of merely the eight pilot cities. It is evident that an integrated MSWM system is urgently needed. Kitchen waste and recyclables are encouraged to be separated at the source. The stakeholders involved play an important role in MSWM, so their responsibilities should be clearly identified. Improvements in legislation, coordination mechanisms and public education are problematic issues that need to be addressed.

  4. Improved Convolutive and Under-Determined Blind Audio Source Separation with MRF Smoothing.

    Science.gov (United States)

    Zdunek, Rafał

    2013-01-01

    Convolutive and under-determined blind audio source separation from noisy recordings is a challenging problem. Several computational strategies have been proposed to address this problem. This study is concerned with several modifications to the expectation-maximization-based algorithm, which iteratively estimates the mixing and source parameters. This strategy assumes that each entry in a source spectrogram is modeled using superimposed Gaussian components, which are mutually and individually independent across frequency and time bins. In our approach, we relax this independence assumption by considering a locally smooth temporal and frequency structure in the source power spectrograms. Local smoothness is enforced by incorporating a Gibbs prior in the complete data likelihood function, which models the interactions between neighboring spectrogram bins using a Markov random field. Simulations using audio files derived from the stereo audio source separation evaluation campaign 2008 demonstrate the high efficiency of the proposed improvement.
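    The local-smoothness idea can be illustrated with a toy Gibbs (MRF) prior whose log-density rewards small differences between neighbouring time-frequency bins of a log-power spectrogram. This is a minimal sketch of the general mechanism, not the paper's exact potential function or EM update.

```python
import numpy as np

def gibbs_smoothness_logprior(log_power, beta=1.0):
    """Toy Gibbs (MRF) prior: penalize squared differences between
    4-neighbour time-frequency bins; a higher value means a smoother
    spectrogram (up to an additive normalization constant)."""
    d_time = np.diff(log_power, axis=1)   # neighbours along time
    d_freq = np.diff(log_power, axis=0)   # neighbours along frequency
    return -beta * (np.sum(d_time ** 2) + np.sum(d_freq ** 2))

flat = np.zeros((8, 10))                               # perfectly smooth
noisy = np.random.default_rng(0).normal(size=(8, 10))  # rough
smooth_score = gibbs_smoothness_logprior(flat)
noisy_score = gibbs_smoothness_logprior(noisy)
```

In an EM framework such a term would be added to the complete-data log-likelihood, biasing the source-parameter updates toward locally smooth spectrograms.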

  5. CO2 emission factors for waste incineration: Influence from source separation of recyclable materials

    DEFF Research Database (Denmark)

    Larsen, Anna Warberg; Astrup, Thomas

    2011-01-01

    CO2 loads from combustible waste are important inputs for national CO2 inventories and life-cycle assessments (LCA). CO2 emissions from waste incinerators are often expressed by emission factors in kg fossil CO2 emitted per GJ energy content of the waste. Various studies have shown considerable variations between emission factors for different incinerators, but the background for these variations has not been thoroughly examined. One important reason may be variations in the collection of recyclable materials, as source separation alters the composition of the residual waste incinerated. The objective of this study was to quantify the importance of source separation for determination of emission factors for incineration of residual household waste. This was done by mimicking various source separation scenarios and, based on waste composition data, calculating the resulting emission factors for residual waste…
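    The emission-factor arithmetic behind such studies can be sketched as follows. The waste fractions, fossil carbon contents, and lower heating values below are illustrative placeholders, not data from the cited study.

```python
# Illustrative only: fractions and material properties are invented.
waste = {
    #            wet-mass fraction, kg fossil C per kg, LHV (GJ per kg)
    "plastics":  (0.10, 0.60, 0.035),
    "paper":     (0.25, 0.00, 0.014),
    "organics":  (0.40, 0.00, 0.004),
    "textiles":  (0.05, 0.20, 0.016),
    "other":     (0.20, 0.05, 0.010),
}

def emission_factor(mix):
    """kg fossil CO2 emitted per GJ of energy content in the waste mix."""
    co2_per_c = 44.0 / 12.0  # kg CO2 per kg fossil carbon burned
    fossil_co2 = sum(f * c * co2_per_c for f, c, _ in mix.values())
    energy = sum(f * lhv for f, _, lhv in mix.values())
    return fossil_co2 / energy

ef = emission_factor(waste)  # about 25.7 kg fossil CO2 per GJ here
```

Removing a fraction (e.g., source-separating plastics) changes both the numerator and the denominator, which is why source separation shifts the emission factor of the residual waste.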

  6. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    Science.gov (United States)

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources than analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility surveyed by a detector aboard a passing airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection system are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
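    The time-interval idea can be sketched with a two-hypothesis Bayes test: inter-arrival times of a Poisson process are exponential, so the likelihoods under "background only" and "background plus source" follow directly. The rates and prior below are assumptions for illustration, not the paper's detection system.

```python
import math

def log_likelihood(intervals, rate):
    """Log-likelihood of exponential inter-arrival times at a given count rate."""
    return sum(math.log(rate) - rate * t for t in intervals)

def posterior_source(intervals, bkg_rate, src_rate, prior_source=0.5):
    """Posterior probability that a source is present (two hypotheses)."""
    l0 = log_likelihood(intervals, bkg_rate)              # background only
    l1 = log_likelihood(intervals, bkg_rate + src_rate)   # background + source
    a = math.exp(l1) * prior_source
    b = math.exp(l0) * (1.0 - prior_source)
    return a / (a + b)

# Short intervals (high count rate) favour the source hypothesis.
fast = [0.05] * 20   # mean interval 0.05 s -> ~20 counts per second
slow = [0.50] * 20   # mean interval 0.50 s -> ~2 counts per second
p_fast = posterior_source(fast, bkg_rate=2.0, src_rate=18.0)
p_slow = posterior_source(slow, bkg_rate=2.0, src_rate=18.0)
```

A multi-energy version would form such a posterior per energy region and combine them; an online version would update the posterior after every recorded interval.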

  7. Paleotempestological chronology developed from gas ion source AMS analysis of carbonates determined through real-time Bayesian statistical approach

    Science.gov (United States)

    Wallace, D. J.; Rosenheim, B. E.; Roberts, M. L.; Burton, J. R.; Donnelly, J. P.; Woodruff, J. D.

    2014-12-01

    Is a small quantity of high-precision ages more robust than a larger quantity of lower-precision ages for sediment core chronologies? AMS radiocarbon ages have been available to researchers for several decades now, and the precision of the technique has continued to improve. Analysis time and cost are high, though, and projects are often limited in the number of dates that can be used to develop a chronology. The Gas Ion Source at the National Ocean Sciences Accelerator Mass Spectrometry Facility (NOSAMS), while providing lower precision (uncertainty of order 100 14C y for a sample), is significantly less expensive and far less time consuming than conventional age dating and offers the unique opportunity to obtain large numbers of ages. Here we couple two approaches, one analytical and one statistical, to investigate the utility of an age model comprised of these lower-precision ages for paleotempestology. We use a gas ion source interfaced to a gas-bench type device to generate radiocarbon dates approximately every 5 minutes while determining the order of sample analysis using the published Bayesian accumulation histories for deposits (Bacon). During two day-long sessions, several dates were obtained from carbonate shells in living position in a sediment core comprised of sapropel gel from Mangrove Lake, Bermuda. Samples were prepared where large shells were available, and the order of analysis was determined by the depth with the highest uncertainty according to Bacon. We present the results of these analyses as well as a prognosis for a future where such age models can be constructed from many dates that are quickly obtained relative to conventional radiocarbon dates. This technique is currently limited to carbonates, but development of a system for organic material dating is underway. We will demonstrate the extent to which sacrificing some analytical precision in favor of more dates improves age models.

  8. Improved Tensor-Based Singular Spectrum Analysis Based on Single Channel Blind Source Separation Algorithm and Its Application to Fault Diagnosis

    Directory of Open Access Journals (Sweden)

    Dan Yang

    2017-04-01

    To solve the problem of multi-fault blind source separation (BSS) in the case that the observed signals are under-determined, a novel approach for single channel blind source separation (SCBSS) based on improved tensor-based singular spectrum analysis (TSSA) is proposed. As the most natural representation of high-dimensional data, a tensor can preserve the intrinsic structure of the data to the maximum extent. Thus, the TSSA method can be employed to extract multi-fault features from the measured single-channel vibration signal. However, SCBSS based on TSSA still has some limitations, mainly the unsatisfactory convergence of TSSA in many cases and the difficulty of accurately estimating the number of source signals. Therefore, an improved TSSA algorithm based on canonical decomposition and parallel factors (CANDECOMP/PARAFAC) weighted optimization, namely CP-WOPT, is proposed in this paper. The CP-WOPT algorithm processes the factor matrix using a first-order optimization approach instead of the original least squares method in TSSA, so as to improve the convergence of the algorithm. In order to accurately estimate the number of source signals in BSS, the EMD-SVD-BIC (empirical mode decomposition - singular value decomposition - Bayesian information criterion) method, instead of the SVD in the conventional TSSA, is introduced. To validate the proposed method, we applied it to the analysis of a numerical simulation signal and multi-fault rolling bearing signals.
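    The classic matrix form of singular spectrum analysis, which TSSA generalizes to tensors, can be sketched in a few lines: embed the single-channel signal in a trajectory matrix, take its SVD, and rebuild component groups by diagonal averaging. This is the conventional SSA baseline, not the CP-WOPT tensor variant proposed in the paper.

```python
import numpy as np

def ssa(x, window, groups):
    """Classic (matrix) singular spectrum analysis: embed, SVD,
    and reconstruct each group of singular components by diagonal
    averaging over the anti-diagonals of the trajectory matrix."""
    n = len(x)
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # window x k
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    comps = []
    for group in groups:
        m = sum(s[r] * np.outer(u[:, r], vt[r]) for r in group)
        rec = np.zeros(n)
        cnt = np.zeros(n)
        for i in range(window):
            for j in range(k):
                rec[i + j] += m[i, j]
                cnt[i + j] += 1
        comps.append(rec / cnt)
    return comps

t = np.arange(200) / 200.0
x = np.sin(2 * np.pi * 5 * t) + 0.2 * np.sin(2 * np.pi * 40 * t)
# Leading singular pair ~ dominant 5 Hz tone; all components reproduce x.
low, full = ssa(x, window=50, groups=[(0, 1), range(50)])
```

Grouping the singular components (here by hand; BIC-style criteria automate it) is what separates the superimposed "fault" signatures.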

  9. Audio visual speech source separation via improved context dependent association model

    Science.gov (United States)

    Kazemi, Alireza; Boostani, Reza; Sobhanmanesh, Fariborz

    2014-12-01

    In this paper, we exploit the non-linear relation between a speech source and its associated lip video as a source of extra information to propose an improved audio-visual speech source separation (AVSS) algorithm. The audio-visual association is modeled using a neural associator which estimates the visual lip parameters from a temporal context of acoustic observation frames. We define an objective function based on the mean square error (MSE) between estimated and target visual parameters. This function is minimized to estimate the de-mixing vector/filters that separate the relevant source from linear instantaneous or time-domain convolutive mixtures. We have also proposed a hybrid criterion which uses AV coherency together with kurtosis as a non-Gaussianity measure. Experimental results are presented and compared in terms of visually relevant speech detection accuracy and the output signal-to-interference ratio (SIR) of source separation. The suggested audio-visual model significantly improves relevant speech classification accuracy compared to the existing GMM-based model, and the proposed AVSS algorithm improves speech separation quality compared to reference ICA- and AVSS-based methods.

  10. scarlet: Source separation in multi-band images by Constrained Matrix Factorization

    Science.gov (United States)

    Melchior, Peter; Moolekamp, Fred; Jerdee, Maximilian; Armstrong, Robert; Sun, Ai-Lei; Bosch, James; Lupton, Robert

    2018-03-01

    SCARLET performs source separation (aka "deblending") on multi-band images. It is geared towards optical astronomy, where scenes are composed of stars and galaxies, but it is straightforward to apply it to other imaging data. Separation is achieved through a constrained matrix factorization, which models each source with a Spectral Energy Distribution (SED) and a non-parametric morphology, or multiple such components per source. The code performs forced photometry (with PSF matching if needed) using an optimal weight function given by the signal-to-noise weighted morphology across bands. The approach works well if the sources in the scene have different colors and can be further strengthened by imposing various additional constraints/priors on each source. Because of its generic utility, this package provides a stand-alone implementation that contains the core components of the source separation algorithm. However, the development of this package is part of the LSST Science Pipeline; the meas_deblender package contains a wrapper to implement the algorithms here for the LSST stack.
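    The core idea of factorizing a multi-band image into per-source SEDs and morphologies can be illustrated with plain nonnegative matrix factorization using Lee-Seung multiplicative updates. This toy sketch deliberately omits scarlet's additional machinery (PSF matching, monotonicity and symmetry constraints, proximal optimization) and is not its actual algorithm.

```python
import numpy as np

def nmf(y, rank, iters=500, seed=0):
    """Nonnegative matrix factorization Y ~ A @ S (Frobenius loss,
    Lee-Seung multiplicative updates): A holds per-source SEDs,
    S holds per-source morphologies."""
    rng = np.random.default_rng(seed)
    n_bands, n_pix = y.shape
    a = rng.random((n_bands, rank)) + 0.1
    s = rng.random((rank, n_pix)) + 0.1
    eps = 1e-12
    for _ in range(iters):
        s *= (a.T @ y) / (a.T @ a @ s + eps)
        a *= (y @ s.T) / (a @ s @ s.T + eps)
    return a, s

# Two toy "sources" with distinct colours (SEDs) and morphologies.
sed = np.array([[1.0, 0.1], [0.2, 1.0], [0.6, 0.6]])        # 3 bands x 2 sources
morph = np.array([[1, 2, 3, 2, 1, 0, 0, 0, 0],
                  [0, 0, 0, 0, 1, 3, 5, 3, 1]], float)      # 2 sources x 9 pixels
y = sed @ morph
a, s = nmf(y, rank=2)
err = np.linalg.norm(y - a @ s) / np.linalg.norm(y)
```

As the abstract notes, the separation works best when sources have different colors; identical SED columns would make the factorization degenerate without extra constraints.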

  11. Cultured Cortical Neurons Can Perform Blind Source Separation According to the Free-Energy Principle.

    Directory of Open Access Journals (Sweden)

    Takuya Isomura

    2015-12-01

    Blind source separation is the computation underlying the cocktail party effect: a partygoer can distinguish a particular talker's voice from the ambient noise. Early studies indicated that the brain might use blind source separation as a signal processing strategy for sensory perception, and numerous mathematical models have been proposed; however, it remains unclear how neural networks extract particular sources from a complex mixture of inputs. We discovered that neurons in cultures of dissociated rat cortical cells could learn to represent particular sources while filtering out other signals. Specifically, distinct classes of neurons in the culture learned to respond to distinct sources after repeated training stimulation. Moreover, the neural network structures changed to reduce free energy, as predicted by the free-energy principle, a candidate unified theory of learning and memory, and by Jaynes' principle of maximum entropy. This implicit learning can only be explained by some form of Hebbian plasticity. These results are the first in vitro (as opposed to in silico) demonstration of neural networks performing blind source separation, and the first formal demonstration of neuronal self-organization under the free-energy principle.

  12. Residents' preferences for household kitchen waste source separation services in Beijing: a choice experiment approach.

    Science.gov (United States)

    Yuan, Yalin; Yabe, Mitsuyasu

    2014-12-23

    A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents' preferences for an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, according to their age level, for source separation services and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in the Haidian and Dongcheng districts of Beijing through a choice experiment. The results show that preferences for the service attributes differ between young, middle-aged, and older residents. Low compensation is not a major factor in encouraging young and middle-aged residents to accept the proposed separation services. However, on average, most of them prefer services with frequent collection, evening collection, and plastic bags, and without an instructor. This study indicates that there is potential for the local government to improve the current separation services accordingly.
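    The multinomial logit machinery behind a choice experiment reduces to a softmax over alternative utilities. The attribute coefficients and alternatives below are invented for illustration; they are not the paper's estimates.

```python
import math

def choice_probs(utilities):
    """Multinomial logit: P(j) = exp(V_j) / sum_k exp(V_k),
    computed with a max-shift for numerical stability."""
    m = max(utilities)
    ex = [math.exp(v - m) for v in utilities]
    z = sum(ex)
    return [e / z for e in ex]

# Hypothetical utilities V = b_freq*freq + b_evening*evening
#                          + b_bag*bag + b_comp*compensation
b = {"freq": 0.8, "evening": 0.5, "bag": 0.6, "comp": 0.01}
alts = [
    {"freq": 1, "evening": 1, "bag": 1, "comp": 0},   # improved service
    {"freq": 0, "evening": 0, "bag": 0, "comp": 50},  # status quo + payment
    {"freq": 0, "evening": 0, "bag": 0, "comp": 0},   # opt out
]
v = [sum(b[k] * alt[k] for k in b) for alt in alts]
p = choice_probs(v)  # improved service gets the highest probability
```

In estimation, the coefficients b are fitted by maximizing the likelihood of observed choices; the ratio of an attribute coefficient to the compensation coefficient gives the marginal willingness to accept.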

  13. Residents’ Preferences for Household Kitchen Waste Source Separation Services in Beijing: A Choice Experiment Approach

    Directory of Open Access Journals (Sweden)

    Yalin Yuan

    2014-12-01

    A source separation program for household kitchen waste has been in place in Beijing since 2010. However, the participation rate of residents is far from satisfactory. This study was carried out to identify residents’ preferences for an improved management strategy for household kitchen waste source separation. We determine the preferences of residents in an ad hoc sample, according to their age level, for source separation services and their marginal willingness to accept compensation for the service attributes. We used a multinomial logit model to analyze the data, collected from 394 residents in the Haidian and Dongcheng districts of Beijing through a choice experiment. The results show that preferences for the service attributes differ between young, middle-aged, and older residents. Low compensation is not a major factor in encouraging young and middle-aged residents to accept the proposed separation services. However, on average, most of them prefer services with frequent collection, evening collection, and plastic bags, and without an instructor. This study indicates that there is potential for the local government to improve the current separation services accordingly.

  14. Survival of enteric bacteria in source-separated human urine used ...

    African Journals Online (AJOL)

    MAKAYA

    Urine in Pumpkin (Cucurbita maxima) Cultivation. Agric. Food Sci. 18:57-68. Pronk W, Koné D (2010). Options for urine treatment in developing countries. Desalination 251:360-368. Schönning C, Leeming R, Stenström TA (2002). Faecal contamination of source-separated human urine based on the content of faecal sterols ...

  15. Micropollutant removal in an algal treatment system fed with source separated wastewater streams

    NARCIS (Netherlands)

    Wilt, de H.A.; Butkovskyi, A.; Tuantet, K.; Hernandez Leal, L.; Fernandes, T.; Langenhoff, A.A.M.; Zeeman, G.

    2016-01-01

    Micropollutant removal in an algal treatment system fed with source separated wastewater streams was studied. Batch experiments with the microalgae Chlorella sorokiniana grown on urine, anaerobically treated black water and synthetic urine were performed to assess the removal of six spiked

  16. Audio Source Separation in Reverberant Environments Using β-Divergence-Based Nonnegative Factorization

    DEFF Research Database (Denmark)

    Fakhry, Mahmoud; Svaizer, Piergiorgio; Omologo, Maurizio

    2017-01-01

    -maximization algorithm and used to separate the signals by means of multichannel Wiener filtering. We propose to estimate these parameters by applying nonnegative factorization based on prior information on source variances. In the nonnegative factorization, spectral basis matrices can be defined as the prior...

  17. Optimizing source-separated feces degradation and fertility using nitrifying microorganisms.

    Science.gov (United States)

    Hashemi, Shervin; Han, Mooyoung

    2018-01-15

    Resource-oriented sanitation (ROS) systems play an important role in handling source-separated human sanitary wastes intended to be used for other purposes. Usually, the purpose of employing such systems is to render the source-separated human feces suitable as fertilizer or soil conditioner. However, the high volume, low degradation rate, and lack of fertility management pose challenges to such enterprises. Accordingly, treatment by applying controlled amounts of nitrifying microorganisms could be useful. The effect of adding different amounts of Nitrosomonas europaea bio-seed, along with a certain amount of Nitrobacter winogradskyi bio-seed, to source-separated human feces was investigated. The results show that adding 7000-8000 or more N. europaea cells, along with 10,000 N. winogradskyi cells, to 1 g of feces resulted in up to 90% degradation of the organic matter by enhancing the growth of heterotrophic microorganisms. Moreover, the nitrogen composition and pH of the degraded feces were optimized to meet the criteria for standard fertilizer. The results can be useful for managing source-separated feces in ROS systems in accordance with the specific aims of such systems, i.e., reducing feces volume by bio-degradation and increasing fertility to meet the standard criteria for fertilizer. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Resource recovery from source separated domestic waste(water) streams; Full scale results

    NARCIS (Netherlands)

    Zeeman, G.; Kujawa, K.

    2011-01-01

    A major fraction of nutrients emitted from households are originally present in only 1% of total wastewater volume. New sanitation concepts enable the recovery and reuse of these nutrients from feces and urine. Two possible sanitation concepts are presented, with varying degree of source separation

  19. Fate of personal care and household products in source separated sanitation

    NARCIS (Netherlands)

    Butkovskyi, A.; Rijnaarts, H.H.M.; Zeeman, G.; Hernandez Leal, L.

    2016-01-01

    Removal of twelve micropollutants, namely biocides, fragrances, ultraviolet (UV)-filters and preservatives in source separated grey and black water treatment systems was studied. All compounds were present in influent grey water in μg/l range. Seven compounds were found in influent black water.

  20. Source separation of municipal solid waste: The effects of different separation methods and citizens' inclination-case study of Changsha, China.

    Science.gov (United States)

    Chen, Haibin; Yang, Yan; Jiang, Wei; Song, Mengjie; Wang, Ying; Xiang, Tiantian

    2017-02-01

    A case study on the source separation of municipal solid waste (MSW) was performed in Changsha, the capital city of Hunan Province, China. The objective of this study is to analyze the effects of different separation methods and compare their effects with citizens' attitudes and inclination. An effect evaluation method based on accuracy rate and miscellany rate was proposed to study the performance of different separation methods. A large-scale questionnaire survey was conducted to determine citizens' attitudes and inclination toward source separation. The survey result shows that the vast majority of respondents hold consciously positive attitudes toward participation in source separation. Moreover, the respondents ignore the operability of separation methods and would rather choose a complex separation method involving four or more subclassed categories. As for the effects of the separation methods, the site experiment demonstrates that the relatively simple separation method involving two categories (food waste and other waste) achieves the best effect, with the highest accuracy rate (83.1%) and the lowest miscellany rate (16.9%) among the proposed experimental alternatives. The outcome reflects the inconsistency between people's environmental awareness and behavior. Such inconsistency and conflict may be attributed to a lack of environmental knowledge. Environmental education is assumed to be a fundamental solution for improving the effect of source separation of MSW in Changsha. Important management tips on source separation, including the reformation of the current pay-as-you-throw (PAYT) system, are presented in this work.
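    One plausible reading of the two metrics, assumed here rather than taken from the paper's formal definitions: the accuracy rate is the mass share of a bin's target category, and the miscellany rate is the mis-sorted remainder.

```python
def bin_rates(masses_kg, target):
    """Accuracy rate = mass of the bin's target category / total bin mass;
    miscellany rate = 1 - accuracy (assumed definitions, not the paper's)."""
    total = sum(masses_kg.values())
    accuracy = masses_kg.get(target, 0.0) / total
    return accuracy, 1.0 - accuracy

# Hypothetical contents of a "food waste" bin, in kg:
food_bin = {"food": 4.0, "plastic": 0.6, "paper": 0.4}
accuracy, miscellany = bin_rates(food_bin, "food")  # 0.8 and 0.2
```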

  1. Bayesian Nitrate Source Apportionment to Individual Groundwater Wells in the Central Valley by use of Nitrogen, Oxygen, and Boron Isotopic Tracers

    Science.gov (United States)

    Lockhart, K.; Harter, T.; Grote, M.; Young, M. B.; Eppich, G.; Deinhart, A.; Wimpenny, J.; Yin, Q. Z.

    2014-12-01

    Groundwater quality is a concern in alluvial aquifers underlying agricultural areas worldwide, an example of which is the San Joaquin Valley, California. Nitrate from land-applied fertilizers or from animal waste can leach to groundwater and contaminate drinking water resources. Dairy manure and synthetic fertilizers are the major sources of nitrate in groundwater in the San Joaquin Valley; however, septic waste can be a major source in some areas. As in other such regions around the world, the rural population in the San Joaquin Valley relies almost exclusively on shallow domestic wells (≤150 m deep), many of which have been affected by nitrate. Consumption of water containing nitrate above the drinking water limit has been linked to major health effects, including low blood oxygen in infants and certain cancers. Knowledge of the proportion of each of the three main nitrate sources (manure, synthetic fertilizer, and septic waste) contributing to individual well nitrate can aid future regulatory decisions. Nitrogen, oxygen, and boron isotopes can be used as tracers to differentiate between the three main nitrate sources. Mixing models quantify the proportional contributions of sources to a mixture by using the concentration of conservative tracers within each source as a source signature. Deterministic mixing models are common, but do not allow for variability in the tracer source concentration or overlap of tracer concentrations between sources. Bayesian statistics used in conjunction with mixing models can incorporate variability in the source signature. We developed a Bayesian mixing model on a pilot network of 32 private domestic wells in the San Joaquin Valley for which nitrate as well as nitrogen, oxygen, and boron isotopes were measured. Probability distributions for nitrogen, oxygen, and boron isotope source signatures for manure, fertilizer, and septic waste were compiled from the literature and from a previous groundwater monitoring project on several
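    A minimal sketch of the Bayesian mixing idea: three sources, two tracers, a flat prior on the simplex of source fractions, and a random-walk Metropolis sampler with a simple renormalizing proposal (only approximately symmetric, acceptable for a toy). All source signatures and the observation are invented; the study's model additionally uses informative, literature-derived priors.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative (invented) source signatures: mean and sd of two tracers.
sources = {
    "manure":     (np.array([12.0, 10.0]), np.array([3.0, 5.0])),
    "fertilizer": (np.array([2.0, 15.0]),  np.array([2.0, 8.0])),
    "septic":     (np.array([8.0, 1.0]),   np.array([3.0, 2.0])),
}
obs = np.array([9.0, 8.0])        # measured well signature (invented)
obs_sd = np.array([1.0, 1.0])

mus = np.array([m for m, _ in sources.values()])   # 3 sources x 2 tracers
sds = np.array([s for _, s in sources.values()])

def log_post(f):
    """Gaussian likelihood of the mixture; flat prior on the simplex."""
    if np.any(f <= 0):
        return -np.inf
    mix_mu = f @ mus
    mix_var = (f ** 2) @ (sds ** 2) + obs_sd ** 2   # propagate source spread
    return -0.5 * np.sum((obs - mix_mu) ** 2 / mix_var + np.log(mix_var))

f = np.ones(3) / 3
lp = log_post(f)
samples = []
for _ in range(20000):                 # Metropolis on the simplex
    prop = f + rng.normal(scale=0.05, size=3)
    prop = prop / prop.sum()           # renormalize back onto the simplex
    lpp = log_post(prop)
    if np.log(rng.random()) < lpp - lp:
        f, lp = prop, lpp
    samples.append(f)
mean_f = np.mean(samples[5000:], axis=0)   # posterior-mean source fractions
```

The posterior spread of each fraction, not shown here, is exactly the "robust estimate of uncertainty" such models provide over deterministic mixing.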

  2. Practical Bayesian tomography

    Science.gov (United States)

    Granade, Christopher; Combes, Joshua; Cory, D. G.

    2016-03-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  3. A review on automated sorting of source-separated municipal solid waste for recycling.

    Science.gov (United States)

    Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul

    2017-02-01

    A crucial prerequisite for recycling to form an integral part of municipal solid waste (MSW) management is the sorting of useful materials from source-separated MSW. Researchers have been exploring automated sorting techniques to improve the overall efficiency of the recycling process. This paper reviews recent advances in the physical processes, sensors, and actuators used, as well as control and autonomy related issues, in the area of automated sorting and recycling of source-separated MSW. We believe that this paper will provide a comprehensive overview of the state of the art and will help future system designers in the area. In this paper, we also present research challenges in the field of automated waste sorting and recycling. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Efficient image enhancement using sparse source separation in the Retinex theory

    Science.gov (United States)

    Yoon, Jongsu; Choi, Jangwon; Choe, Yoonsik

    2017-11-01

    Color constancy is the feature of the human vision system (HVS) that ensures the relative constancy of the perceived color of objects under varying illumination conditions. The Retinex theory of machine vision systems is based on the HVS. Among Retinex algorithms, the physics-based algorithms are efficient; however, they generally do not satisfy the local characteristics of the original Retinex theory because they eliminate global illumination from their optimization. We apply the sparse source separation technique to the Retinex theory to present a physics-based algorithm that satisfies the locality characteristic of the original Retinex theory. Previous Retinex algorithms have limited use in image enhancement because the total variation Retinex results in an overly enhanced image and the sparse source separation Retinex cannot completely restore the original image. In contrast, our proposed method preserves image edges and can very nearly replicate the original image without any special operation.
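    For background, the classic single-scale Retinex (not the paper's sparse-source-separation formulation) estimates illumination with a Gaussian blur and takes reflectance as the log-ratio of image to illumination:

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur via 1-D convolutions ('same' mode with
    zero padding, so values within ~3*sigma of the border are darkened)."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)

def single_scale_retinex(img, sigma=4.0, eps=1e-6):
    """Classic SSR: reflectance ~ log(image) - log(estimated illumination)."""
    return np.log(img + eps) - np.log(gaussian_blur(img, sigma) + eps)

# A flat scene under a left-to-right illumination gradient:
illum = np.linspace(0.2, 1.0, 64)[None, :] * np.ones((64, 1))
reflectance = np.full((64, 64), 0.5)
img = illum * reflectance
out = single_scale_retinex(img, sigma=4.0)
center = out[16:48, 16:48]   # away from borders, the gradient is flattened
```

Because the blur is a purely global illumination estimate, this baseline exhibits exactly the locality limitation the paper addresses.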

  5. Quantitative evaluation of artifact removal in real magnetoencephalogram signals with blind source separation

    OpenAIRE

    Escudero, Javier; Hornero, Roberto; Fernandez Perez, Alvaro; Abasolo, Daniel

    2011-01-01

    The magnetoencephalogram (MEG) is contaminated with undesired signals, which are called artifacts. Some of the most important ones are the cardiac and the ocular artifacts (CA and OA, respectively), and the power line noise (PLN). Blind source separation (BSS) has been used to reduce the influence of the artifacts in the data. There is a plethora of BSS-based artifact removal approaches, but few comparative analyses. In this study, MEG background activity from 26 subjects was processed with f...

  6. Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study

    OpenAIRE

    Cees Buisman; Grietje Zeeman; Lucía Hernández; Trang Hoang; Taina Tervahauta

    2013-01-01

    Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data to compare energy and water balance, nutrient recovery, chemical use, effluent quality and land area requirement in four different sanitation concepts: (1) centralized; (2) centralized with source...

  7. Weight adjusted tensor method for blind separation of underdetermined mixtures of nonstationary sources

    Czech Academy of Sciences Publication Activity Database

    Tichavský, Petr; Koldovský, Zbyněk

    2011-01-01

    Roč. 59, č. 3 (2011), s. 1037-1047 ISSN 1053-587X R&D Projects: GA MŠk 1M0572; GA ČR GA102/09/1278 Institutional research plan: CEZ:AV0Z10750506 Keywords : blind source separation * tensor decomposition * Cramer-Rao lower bound Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.628, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/tichavsky-0356666.pdf

  8. An adaptive Gaussian process-based method for efficient Bayesian experimental design in groundwater contaminant source identification problems: ADAPTIVE GAUSSIAN PROCESS-BASED INVERSION

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiangjiang [College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Li, Weixuan [Pacific Northwest National Laboratory, Richland Washington USA; Zeng, Lingzao [College of Environmental and Resource Sciences, Zhejiang University, Hangzhou China; Wu, Laosheng [Department of Environmental Sciences, University of California, Riverside California USA

    2016-08-01

Surrogate models are commonly used in Bayesian approaches such as Markov chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimation of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or by implementing MCMC in a two-stage manner. Since two-stage MCMC requires extra original model evaluations, the computational cost is still high. If the measurement information is incorporated, a locally accurate approximation of the original model can be constructed adaptively at low computational cost. Based on this idea, we propose a Gaussian process (GP) surrogate-based Bayesian experimental design and parameter estimation approach for groundwater contaminant source identification problems. A major advantage of the GP surrogate is that it provides a convenient estimate of the approximation error, which can be incorporated in the Bayesian formula to avoid over-confident estimation of the posterior distribution. The proposed approach is tested with a numerical case study. Without sacrificing estimation accuracy, the new approach achieves a speed-up of about 200 times compared to our previous work using two-stage MCMC.
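The error-aware surrogate idea in this record can be sketched in a few lines of numpy: a toy Gaussian-process surrogate is fitted to a handful of runs of a hypothetical 1-D forward model, and the surrogate's predictive variance is added to the measurement-noise variance in the likelihood so the posterior over the source parameter is not over-confident. The forward model, kernel length-scale, and noise level below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel with unit signal variance
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_fit(X, y, ls=0.5, jitter=1e-6):
    K = rbf(X, X, ls) + jitter * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return L, alpha

def gp_predict(X, L, alpha, Xs, ls=0.5):
    Ks = rbf(Xs, X, ls)
    mu = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = 1.0 - np.sum(v ** 2, axis=0)   # prior variance is 1 for this kernel
    return mu, np.maximum(var, 0.0)

# Hypothetical (monotone) forward model: observation as a function of source parameter s
f = lambda s: s + 0.3 * np.sin(3.0 * s)

# Train a cheap GP surrogate on a few "expensive" model runs
Xtr = np.linspace(0.0, 2.0, 8)
L, alpha = gp_fit(Xtr, f(Xtr))

# Bayesian inversion on a parameter grid; the surrogate variance is added to
# the measurement-noise variance so the posterior is not over-confident
grid = np.linspace(0.0, 2.0, 401)
mu, var = gp_predict(Xtr, L, alpha, grid)
s_true, sigma2 = 1.2, 0.01
y_obs = f(s_true)
loglik = -0.5 * (y_obs - mu) ** 2 / (sigma2 + var)   # GP error inflates the variance
post = np.exp(loglik - loglik.max())
post /= post.sum()
s_map = grid[np.argmax(post)]
```

At the training points the surrogate variance vanishes and the likelihood reduces to the usual Gaussian one; between them the inflated variance widens the posterior, which is the bias-correction mechanism the abstract describes.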

  9. Concepts and Criteria for Blind Quantum Source Separation and Blind Quantum Process Tomography

    Directory of Open Access Journals (Sweden)

    Alain Deville

    2017-07-01

Full Text Available Blind Source Separation (BSS) is an active domain of classical information processing, with well-identified methods and applications. The development of quantum information processing has made possible the appearance of Blind Quantum Source Separation (BQSS), with a recent extension towards Blind Quantum Process Tomography (BQPT). This article investigates the use of several fundamental quantum concepts in the BQSS context and establishes properties already used without justification in that context. It mainly considers a pair of electron spins initially prepared separately in pure states and then subjected to an undesired exchange coupling between the spins. Some consequences for BQSS solutions of the entanglement phenomenon and of the probabilistic aspect of quantum measurements are discussed. An unentanglement criterion is established for the state of an arbitrary qubit pair, expressed first with probability amplitudes and then with probabilities. The interest of using the concept of a random quantum state in the BQSS context is presented. It is stressed that the concept of statistical independence of the sources, widely used in classical BSS, should be used with care in BQSS, and possibly replaced by some disentanglement principle. It is shown that the coefficients of the expansion of any qubit-pair pure state over the states of an orthonormal basis can be expressed with the probabilities of results in measurements of well-chosen spin components.

  10. Carbon Dioxide Capture and Separation Techniques for Gasification-based Power Generation Point Sources

    Energy Technology Data Exchange (ETDEWEB)

    Pennline, H.W.; Luebke, D.R.; Jones, K.L.; Morsi, B.I. (Univ. of Pittsburgh, PA); Heintz, Y.J. (Univ. of Pittsburgh, PA); Ilconich, J.B. (Parsons)

    2007-06-01

    The capture/separation step for carbon dioxide (CO2) from large-point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large-point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (post-combustion from fossil fuel-fired combustors) and from fuel gas (precombustion, such as integrated gasification combined cycle or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Pertaining to another separation technology, fabrication techniques and mechanistic studies for membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic ionic liquids encapsulated into a polymeric substrate have been investigated for permeability and selectivity. Finally, dry, regenerable processes based on sorbents are additional techniques for CO2 capture from fuel gas. An overview of these novel techniques is presented along with a research progress status of technologies related to membranes and physical solvents.

  11. Source-based neurofeedback methods using EEG recordings: Training altered brain activity in a functional brain source derived from Blind Source Separation

    Directory of Open Access Journals (Sweden)

    David James White

    2014-10-01

Full Text Available A developing literature explores the use of neurofeedback in the treatment of a range of clinical conditions, particularly ADHD and epilepsy, whilst neurofeedback also provides an experimental tool for studying the functional significance of endogenous brain activity. A critical component of any neurofeedback method is the underlying physiological signal which forms the basis for the feedback. While the past decade has seen the emergence of fMRI-based protocols training spatially confined BOLD activity, traditional neurofeedback has utilized a small number of electrode sites on the scalp. As scalp EEG at a given electrode site reflects a linear mixture of activity from multiple brain sources and artifacts, efforts to successfully acquire some level of control over the signal may be confounded by these extraneous sources. Further, in the event of successful training, these traditional neurofeedback methods are likely influencing multiple brain regions and processes. The present work describes the use of source-based signal processing methods in EEG neurofeedback. The feasibility and potential utility of such methods were explored in an experiment training increased theta oscillatory activity in a source derived from Blind Source Separation of EEG data obtained during completion of a complex cognitive task (spatial navigation). Learned increases in theta activity were observed in two of the four participants to complete 20 sessions of neurofeedback targeting this individually defined functional brain source. Source-based EEG neurofeedback methods using Blind Source Separation may offer important advantages over traditional neurofeedback, by targeting the desired physiological signal in a more functionally and spatially specific manner. Having provided preliminary evidence of the feasibility of these methods, future work may study a range of clinically and experimentally relevant brain processes targeting individual brain sources by source-based EEG

  12. Short-Sampled Blind Source Separation of Rotating Machinery Signals Based on Spectrum Correction

    Directory of Open Access Journals (Sweden)

    Xiangdong Huang

    2016-01-01

Full Text Available Nowadays, the existing blind source separation (BSS) algorithms in rotating machinery fault diagnosis can hardly meet the demands of fast response, high stability, and low complexity simultaneously. Therefore, this paper proposes a spectrum-correction-based BSS algorithm. Through the incorporation of FFT, spectrum correction, a screening procedure (consisting of frequency merging, candidate pattern selection, and single-source-component recognition), modified k-means-based source number estimation, and mixing matrix estimation, the proposed BSS algorithm can accurately achieve harmonics sensing for field rotating machinery faults in the case of short-sampled observations. Both numerical simulation and practical experiment verify the proposed BSS algorithm's superiority in recovery quality, stability with insufficient samples, and efficiency over existing ICA-based methods. Besides rotating machinery fault diagnosis, the proposed BSS algorithm also possesses vast potential in other harmonics-related application fields.
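The sub-bin "spectrum correction" step this record relies on can be illustrated with a standard three-point parabolic interpolation around the FFT peak of a short, windowed record; the sampling rate, record length, and test frequency below are invented, and the paper's exact correction formula may differ.

```python
import numpy as np

fs, n = 1000.0, 256            # short record: only 256 samples
f_true = 50.3                  # machine harmonic that falls off the FFT grid
t = np.arange(n) / fs
x = np.sin(2 * np.pi * f_true * t)

win = np.hanning(n)
spec = np.abs(np.fft.rfft(x * win))
k = int(np.argmax(spec))       # coarse peak bin

# Three-point parabolic interpolation around the peak corrects the
# coarse bin frequency k*fs/n to sub-bin accuracy.
a, b, c = spec[k - 1], spec[k], spec[k + 1]
delta = 0.5 * (a - c) / (a - 2 * b + c)
f_est = (k + delta) * fs / n
```

With only 256 samples the raw bin spacing is fs/n ≈ 3.9 Hz; the corrected estimate recovers the off-grid frequency to a small fraction of a bin.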

  13. Sources and speciation of heavy metals in municipal solid waste (MSW) and its effect on the separation technique

    Energy Technology Data Exchange (ETDEWEB)

    Biollaz, S.; Ludwig, Ch.; Stucki, S. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1999-08-01

    A literature search was carried out to determine sources and speciation of heavy metals in MSW. A combination of thermal and mechanical separation techniques is necessary to achieve the required high degrees of metal separation. Metallic goods should be separated mechanically, chemically bound heavy metals by a thermal process. (author) 1 fig., 1 tab., 6 refs.

  14. Resonance ionization laser ion sources for on-line isotope separators (invited)

    International Nuclear Information System (INIS)

    Marsh, B. A.

    2014-01-01

    A Resonance Ionization Laser Ion Source (RILIS) is today considered an essential component of the majority of Isotope Separator On Line (ISOL) facilities; there are seven laser ion sources currently operational at ISOL facilities worldwide and several more are under development. The ionization mechanism is a highly element selective multi-step resonance photo-absorption process that requires a specifically tailored laser configuration for each chemical element. For some isotopes, isomer selective ionization may even be achieved by exploiting the differences in hyperfine structures of an atomic transition for different nuclear spin states. For many radioactive ion beam experiments, laser resonance ionization is the only means of achieving an acceptable level of beam purity without compromising isotope yield. Furthermore, by performing element selection at the location of the ion source, the propagation of unwanted radioactivity downstream of the target assembly is reduced. Whilst advances in laser technology have improved the performance and reliability of laser ion sources and broadened the range of suitable commercially available laser systems, many recent developments have focused rather on the laser/atom interaction region in the quest for increased selectivity and/or improved spectral resolution. Much of the progress in this area has been achieved by decoupling the laser ionization from competing ionization processes through the use of a laser/atom interaction region that is physically separated from the target chamber. A new application of gas catcher laser ion source technology promises to expand the capabilities of projectile fragmentation facilities through the conversion of otherwise discarded reaction fragments into high-purity low-energy ion beams. A summary of recent RILIS developments and the current status of laser ion sources worldwide is presented

  15. Generation of dynamo waves by spatially separated sources in the Earth and other celestial bodies

    Science.gov (United States)

    Popova, E.

    2017-12-01

The amplitude and the spatial configuration of planetary and stellar magnetic fields can change over the years. Celestial bodies can have cyclic, chaotic, or time-invariant magnetic activity, which is connected with a dynamo mechanism. This mechanism is based on the joint influence of the alpha-effect and differential rotation. Dynamo sources can be located at different depths (active layers) of the celestial body and can have different intensities. This concept admits different forms of solutions, some of which include waves propagating inside the celestial body. We showed analytically that in the case of spatially separated sources of magnetic field, each source generates a wave whose frequency depends on the physical parameters of that source. We estimated the source parameters required for the generation of nondecaying waves. We discuss the structure of such sources and the motion of matter (including meridional circulation) in the liquid outer core of the Earth and in the active layers of other celestial bodies.

  16. Identifying the source of farmed escaped Atlantic salmon (Salmo salar): Bayesian clustering analysis increases accuracy of assignment

    DEFF Research Database (Denmark)

    Glover, Kevin A.; Hansen, Michael Møller; Skaala, Oystein

    2009-01-01

    Farmed Atlantic salmon escapees represent a significant threat to the genetic integrity of natural populations. Not all escapement events are reported, and consequently, there is a need to develop an effective tool for the identification of escapees. In this study, > 2200 salmon were collected from...... 44 cages located on 26 farms in the Hardangerfjord, western Norway. This fjord represents one of the major salmon farming areas in Norway, with a production of 57,000 t in 2007. Based upon genetic data from 17 microsatellite markers, significant but highly variable differentiation was observed among...... the 44 samples (cages), with pair-wise FST values ranging between 0.000 and 0.185. Bayesian clustering of the samples revealed five major genetic groups, into which the 44 samples were re-organised. Bayesian clustering also identified two samples consisting of fish with mixed genetic background...

  17. Bayesian inverse modeling and source location of an unintended 131I release in Europe in the fall of 2011

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek; Šindelářová, Kateřina; Hýža, M.; Stohl, A.

    2017-01-01

    Roč. 17, č. 20 (2017), s. 12677-12696 ISSN 1680-7316 R&D Projects: GA MŠk(CZ) 7F14287 Institutional support: RVO:67985556 Keywords : Bayesian inverse modeling * iodine-131 * consequences of the iodine release Subject RIV: BB - Applied Statistics, Operational Research OBOR OECD: Statistics and probability Impact factor: 5.318, year: 2016 http://library.utia.cas.cz/separaty/2017/AS/tichy-0480506.pdf

  18. Convolutive blind source separation of surface EMG measurements of the respiratory muscles.

    Science.gov (United States)

    Petersen, Eike; Buchner, Herbert; Eger, Marcus; Rostalski, Philipp

    2017-04-01

    Electromyography (EMG) has long been used for the assessment of muscle function and activity and has recently been applied to the control of medical ventilation. For this application, the EMG signal is usually recorded invasively by means of electrodes on a nasogastric tube which is placed inside the esophagus in order to minimize noise and crosstalk from other muscles. Replacing these invasive measurements with an EMG signal obtained non-invasively on the body surface is difficult and requires techniques for signal separation in order to reconstruct the contributions of the individual respiratory muscles. In the case of muscles with small cross-sectional areas, or with muscles at large distances from the recording site, solutions to this problem have been proposed previously. The respiratory muscles, however, are large and distributed widely over the upper body volume. In this article, we describe an algorithm for convolutive blind source separation (BSS) that performs well even for large, distributed muscles such as the respiratory muscles, while using only a small number of electrodes. The algorithm is derived as a special case of the TRINICON general framework for BSS. To provide evidence that it shows potential for separating inspiratory, expiratory, and cardiac activities in practical applications, a joint numerical simulation of EMG and ECG activities was performed, and separation success was evaluated in a variety of noise settings. The results are promising.

  19. Need for improvements in physical pretreatment of source-separated household food waste.

    Science.gov (United States)

    Bernstad, A; Malmquist, L; Truedsson, C; la Cour Jansen, J

    2013-03-01

The aim of the present study was to investigate the efficiency of physical pretreatment processes for source-separated solid organic household waste. An investigation of seventeen Swedish full-scale pretreatment facilities, currently receiving separately collected food waste from households for subsequent anaerobic digestion, shows that problems with the quality of the produced biomass and high maintenance costs are common. Four full-scale physical pretreatment plants, three using screw-press technology and one using dispersion technology, were compared in relation to resource efficiency, losses of nitrogen and potential methane production from biodegradable matter, as well as the ratio of unwanted materials in produced biomass intended for wet anaerobic digestion. Refuse generated in the processes represents 13-39% of TS in incoming wet waste. The methane yield from these fractions corresponds to 14-36 Nm³/ton of separately collected solid organic household waste. Also, 13-32% of the total N in incoming food waste is found in the refuse. Losses of both biodegradable material and nutrients were larger in the three facilities using screw-press technology than in the facility using dispersion technology. Thus, there is large potential to increase both the methane yield and nutrient recovery from separately collected solid organic household waste through increased efficiency in facilities for physical pretreatment. Improved pretreatment processes could thereby increase the overall environmental benefits of anaerobic digestion as a treatment alternative for solid organic household waste. Copyright © 2012 Elsevier Ltd. All rights reserved.

  20. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization.

  1. Blind source separation of ex-vivo aorta tissue multispectral images.

    Science.gov (United States)

    Galeano, July; Perez, Sandra; Montoya, Yonatan; Botina, Deivid; Garzón, Johnson

    2015-05-01

Blind Source Separation (BSS) methods aim at the decomposition of a given signal into its main components, or source signals. These techniques have been widely used in the literature for the analysis of biomedical images, in order to extract the main components of an organ or tissue under study. The analysis of skin images for the extraction of melanin and hemoglobin is one example of the use of BSS. This paper presents a proof of concept of the use of source separation on ex-vivo aorta tissue multispectral images. The images are acquired with an interference-filter-based imaging system and processed by means of two algorithms: Independent Component Analysis and Non-negative Matrix Factorization. In both cases, it is possible to obtain maps that quantify the concentration of the main chromophores present in aortic tissue. The algorithms also allow for estimation of the spectral absorbance of the main tissue components. These spectral signatures were compared against the theoretical ones using correlation coefficients, which report values close to 0.9, a good indicator of the method's performance. The correlation coefficients also allow identification of the concentration maps according to the evaluated chromophore. The results suggest that multi/hyperspectral systems together with image processing techniques are a potential tool for the analysis of cardiovascular tissue.
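As a rough sketch of the chromophore-unmixing idea (not the authors' pipeline), the following numpy code factorizes synthetic multispectral data with plain multiplicative-update NMF and scores the recovered spectra by correlation, mirroring the correlation coefficients close to 0.9 reported above; the two Gaussian "chromophore" spectra and the pixel count are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical chromophore absorbance spectra over 40 wavelength bands
w = np.linspace(0, 1, 40)
s1 = np.exp(-((w - 0.3) / 0.10) ** 2)   # stand-in for e.g. hemoglobin
s2 = np.exp(-((w - 0.7) / 0.15) ** 2)   # stand-in for a second pigment
S = np.vstack([s1, s2])                  # (2, bands)

# Nonnegative concentration maps over 500 "pixels"
C = rng.random((500, 2))
V = C @ S                                # multispectral data (pixels, bands)

# Plain multiplicative-update NMF (Lee-Seung, Frobenius loss)
k, eps = 2, 1e-9
W = rng.random((500, k)) + 0.1
H = rng.random((k, 40)) + 0.1
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

# Match each recovered spectrum to the ground truth by correlation
corr = np.corrcoef(np.vstack([H, S]))[:k, k:]
best = np.abs(corr).max(axis=1)          # best match per recovered component
```

The rows of H play the role of the extracted absorbance signatures and the columns of W the concentration maps; correlating H against the known spectra is the same validation step the abstract describes.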

  2. Carbon dioxide capture and separation techniques for advanced power generation point sources

    Energy Technology Data Exchange (ETDEWEB)

    Pennline, H.W.; Luebke, D.R.; Morsi, B.I.; Heintz, Y.J.; Jones, K.L.; Ilconich, J.B.

    2006-09-01

The capture/separation step for carbon dioxide (CO2) from large-point sources is a critical one with respect to the technical feasibility and cost of the overall carbon sequestration scenario. For large-point sources, such as those found in power generation, the carbon dioxide capture techniques being investigated by the in-house research area of the National Energy Technology Laboratory possess the potential for improved efficiency and reduced costs as compared to more conventional technologies. The investigated techniques can have wide applications, but the research has focused on capture/separation of carbon dioxide from flue gas (postcombustion from fossil fuel-fired combustors) and from fuel gas (precombustion, such as integrated gasification combined cycle, or IGCC). With respect to fuel gas applications, novel concepts are being developed in wet scrubbing with physical absorption; chemical absorption with solid sorbents; and separation by membranes. In one concept, a wet scrubbing technique is being investigated that uses a physical solvent process to remove CO2 from fuel gas of an IGCC system at elevated temperature and pressure. The need to define an ideal solvent has led to the study of the solubility and mass transfer properties of various solvents. Fabrication techniques and mechanistic studies for hybrid membranes separating CO2 from the fuel gas produced by coal gasification are also being performed. Membranes that consist of CO2-philic silanes incorporated into an alumina support or ionic liquids encapsulated into a polymeric substrate have been investigated for permeability and selectivity. An overview of two novel techniques is presented along with a research progress status of each technology.

  3. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  4. Blind Source Separation in Farsi Language by Using Hermitian Angle in Convolutive Environment

    Directory of Open Access Journals (Sweden)

    Atefeh Soltani

    2013-04-01

Full Text Available This paper presents a T-F masking method for convolutive blind source separation based on the Hermitian angle concept. The Hermitian angle is calculated between the T-F domain mixture vector and a reference vector. Two different reference vectors are assumed for calculating two different Hermitian angles, and these angles are then clustered with the k-means or FCM method to estimate unmixing masks. The well-known permutation problem is solved based on k-means clustering of the estimated masks, which are partitioned into small groups. The experimental results show an improvement in performance when using two different reference vectors compared to only one.
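The Hermitian-angle masking idea can be sketched as follows, under the simplifying assumptions of W-disjoint sources, a single reference vector e1 = [1, 0], and synthetic T-F points (the paper uses two reference vectors and real speech mixtures).

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic T-F points: two sources, W-disjoint (each cell owned by one source)
npts = 400
active = rng.random(npts) < 0.5                 # True where source 1 is active
s = rng.standard_normal(npts) + 1j * rng.standard_normal(npts)
a1 = np.array([1.0, 0.2])                       # mixing vector of source 1
a2 = np.array([0.3, 1.0])                       # mixing vector of source 2
X = np.where(active, s, 0)[None, :] * a1[:, None] + \
    np.where(~active, s, 0)[None, :] * a2[:, None]   # (2, npts) mixture

# Hermitian angle between each mixture vector and the reference e1 = [1, 0]:
# cos(theta) = |<x, e1>| / (||x|| * ||e1||) = |x_1| / ||x||
cos_theta = np.abs(X[0]) / (np.linalg.norm(X, axis=0) + 1e-12)
theta = np.arccos(np.clip(cos_theta, 0, 1))

# Tiny 1-D k-means (k=2) on the angles to build binary T-F masks
c = np.array([theta.min(), theta.max()])
for _ in range(50):
    lab = np.abs(theta[None, :] - c[:, None]).argmin(axis=0)
    c = np.array([theta[lab == 0].mean(), theta[lab == 1].mean()])
mask0 = lab == 0                                # mask for the source nearest e1
```

Because each mixing vector makes a fixed Hermitian angle with the reference, the angles of W-disjoint cells cluster tightly around two values, and the k-means labels directly give the separation masks.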

  5. Separation of sources in radiofrequency measurements of partial discharges using time-power ratio maps.

    Science.gov (United States)

    Albarracin, R; Robles, G; Martinez-Tarifa, J M; Ardila-Rey, J

    2015-09-01

Partial discharge measurement is one of the most useful tools for condition monitoring of high-voltage (HV) equipment. These phenomena can be measured on-line in radiofrequency (RF) with sensors such as the Vivaldi antenna used in this paper, which improves the signal-to-noise ratio by rejecting the FM and low-frequency TV bands. Additionally, the power ratios (PR), a signal-processing technique based on the power distribution of the incoming signals in frequency bands, are used to characterize different sources of PD and electromagnetic noise (EMN). The calculation of the time length of the pulses is introduced to separate signals where the PR alone do not give a conclusive solution. Thus, if several EM sources can be previously calibrated, it is possible to detect pulses corresponding to PD activity. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
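A minimal sketch of the power-ratio plus time-length features described above, with invented pulse parameters, band edge, and sampling rate (the paper's PR maps and Vivaldi-antenna data are not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 100e6                       # hypothetical 100 MS/s acquisition
t = np.arange(2048) / fs

def pulse(f0, tau):
    # Damped oscillation as a crude stand-in for an RF pulse
    return np.exp(-t / tau) * np.sin(2 * np.pi * f0 * t)

def features(x):
    X = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), 1 / fs)
    pr_high = X[f >= 20e6].sum() / X.sum()     # power ratio above 20 MHz
    env = np.abs(x)
    dur = (env > 0.1 * env.max()).sum() / fs   # crude pulse time length
    return pr_high, dur

# Two hypothetical source families with jittered parameters:
# short high-frequency "PD-like" pulses vs. long low-frequency "noise-like" ones
pd_like = [features(pulse(30e6 * rng.uniform(0.9, 1.1), 0.2e-6))
           for _ in range(5)]
noise_like = [features(pulse(5e6 * rng.uniform(0.9, 1.1), 2e-6))
              for _ in range(5)]
```

In this toy setting the two families occupy disjoint regions of the (power ratio, time length) plane, which is the separation mechanism the abstract describes.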

  6. Economic viability and critical influencing factors assessment of black water and grey water source-separation sanitation system.

    Science.gov (United States)

    Thibodeau, C; Monette, F; Glaus, M; Laflamme, C B

    2011-01-01

The black water and grey water source-separation sanitation system aims at efficient use of energy (biogas), water and nutrients, but currently lacks the evidence of economic viability needed to be considered a credible alternative to the conventional system. This study intends to demonstrate economic viability, identify the main cost contributors and assess critical influencing factors. A techno-economic model was built based on a new neighbourhood in a Canadian context. Three implementation scales of the source-separation system are defined: 500, 5,000 and 50,000 inhabitants. The results show that the source-separation system is 33% to 118% more costly than the conventional system, with the larger cost differential obtained at the smaller implementation scales. A sensitivity analysis demonstrates that reducing the vacuum toilet flow from 1.0 to 0.25 L/flush decreases the source-separation system cost by 23 to 27%. It also shows that high resource costs can be beneficial or unfavourable to the source-separation system depending on whether the vacuum toilet flow is low or normal. Therefore, the future of this configuration of the source-separation system lies mainly in vacuum toilet flow reduction or the introduction of new efficient effluent-volume-reduction processes (e.g. reverse osmosis).

  7. PSFGAN: a generative adversarial network system for separating quasar point sources and host galaxy light

    Science.gov (United States)

    Stark, Dominic; Launet, Barthelemy; Schawinski, Kevin; Zhang, Ce; Koss, Michael; Turp, M. Dennis; Sartori, Lia F.; Zhang, Hantian; Chen, Yiru; Weigel, Anna K.

    2018-03-01

The study of unobscured active galactic nuclei (AGN) and quasars depends on the reliable decomposition of the light from the AGN point source and the extended host galaxy light. The problem is typically approached with parametric fitting routines using separate models for the host galaxy and the point spread function (PSF). We present a new approach using a Generative Adversarial Network (GAN) trained on galaxy images. We test the method using Sloan Digital Sky Survey (SDSS) r-band images with artificial AGN point sources added, which are then removed using the GAN and with parametric methods using GALFIT. When the AGN point source (PS) is more than twice as bright as the host galaxy, we find that our method, PSFGAN, can recover PS and host galaxy magnitudes with smaller systematic error and a lower average scatter (49%). PSFGAN is more tolerant of poor knowledge of the PSF than parametric methods. Our tests show that PSFGAN is robust against a broadening of the PSF width of ±50% if it is trained on multiple PSFs. We demonstrate that while a matched training set does improve performance, we can still subtract point sources using a PSFGAN trained on non-astronomical images. While initial training is computationally expensive, evaluating PSFGAN on data is more than 40 times faster than GALFIT fitting two components. Finally, PSFGAN is more robust and easier to use than parametric methods, as it requires no input parameters.

  8. Sources of variability among replicate samples separated by two-dimensional gel electrophoresis.

    Science.gov (United States)

    Bland, Alison M; Janech, Michael G; Almeida, Jonas S; Arthur, John M

    2010-04-01

Two-dimensional gel electrophoresis (2DE) offers high-resolution separation for intact proteins. However, variability in the appearance of spots can limit the ability to identify true differences between conditions. Variability can occur at a number of levels. Individual samples can differ because of biological variability. Technical variability can occur during protein extraction, processing, or storage. Another potential source of variability occurs during analysis of the gels and is not a result of any of the causes named above. We performed a study designed to focus only on the variability caused by analysis. We separated three aliquots of rat left ventricle and analyzed differences in protein abundance on the replicate 2D gels. As the samples loaded on each gel were identical, differences in protein abundance are caused by variability in separation or interpretation of the gels. Protein spots were compared across gels by quantile values to determine differences. Fourteen percent of spots had a maximum difference in intensity of 0.4 quantile values or more between replicates. We then looked individually at the spots to determine the cause of differences between the measured intensities. Reasons for differences were: failure to identify a spot (59%), differences in spot boundaries (13%), differences in peak height (6%), and a combination of these factors (21%). This study demonstrates that spot identification and characterization make major contributions to the variability seen with 2DE. Methods to highlight why measured protein spot abundance differs could reduce these errors.
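The quantile-value comparison described above can be sketched with synthetic gel data; the 0.4-quantile flagging threshold is taken from the abstract, while the spot intensities, noise level, and the deliberately "missed" spot are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

# Intensities of 200 spots on three replicate gels of the same sample
base = rng.lognormal(0, 1, 200)
gels = np.stack([base * rng.normal(1, 0.05, 200) for _ in range(3)])
gels[:, 0] = [5.0, 0.1, 5.0]    # spot 0 mis-detected (tiny) on the second gel

# Convert each gel to quantile values: rank of each spot within its own gel
order = gels.argsort(axis=1).argsort(axis=1)
q = order / (gels.shape[1] - 1.0)

# Flag spots whose quantile value differs by >= 0.4 between any two gels
spread = q.max(axis=0) - q.min(axis=0)
flagged = np.where(spread >= 0.4)[0]
```

Because quantile values are rank-based, small multiplicative intensity noise barely moves them, so only genuine analysis failures (like the missed spot) cross the 0.4 threshold.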

  9. Nonnegative signal factorization with learnt instrument models for sound source separation in close-microphone recordings

    Science.gov (United States)

    Carabias-Orti, Julio J.; Cobos, Máximo; Vera-Candeas, Pedro; Rodríguez-Serrano, Francisco J.

    2013-12-01

    Close-microphone techniques are extensively employed in many live music recordings, allowing for interference rejection and reducing the amount of reverberation in the resulting instrument tracks. However, despite the use of directional microphones, the recorded tracks are not completely free from source interference, a problem which is commonly known as microphone leakage. While source separation methods are potentially a solution to this problem, few approaches take into account the huge amount of prior information available in this scenario. In fact, besides the special properties of close-microphone tracks, the knowledge on the number and type of instruments making up the mixture can also be successfully exploited for improved separation performance. In this paper, a nonnegative matrix factorization (NMF) method making use of all the above information is proposed. To this end, a set of instrument models are learnt from a training database and incorporated into a multichannel extension of the NMF algorithm. Several options to initialize the algorithm are suggested, exploring their performance in multiple music tracks and comparing the results to other state-of-the-art approaches.
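A compact sketch of the supervised variant described above: the dictionary stays fixed during separation and only the activations are updated, with random templates standing in for instrument models learnt from a training database (the paper's multichannel extension and real music tracks are not reproduced).

```python
import numpy as np

rng = np.random.default_rng(3)

# "Learnt" spectral templates: 4 basis vectors per instrument, two instruments.
# These stand in for models trained on a database of isolated recordings.
B = rng.random((40, 8)) + 0.05           # (freq bins, bases), fixed at test time

# Mixture spectrogram generated from known sparse activations
G_true = rng.random((8, 120)) * (rng.random((8, 120)) < 0.3)
V = B @ G_true + 1e-6

# Supervised NMF: dictionary B fixed, multiplicative updates on activations only
G = rng.random((8, 120)) + 0.1
for _ in range(300):
    G *= (B.T @ V) / (B.T @ B @ G + 1e-9)

# Per-instrument reconstruction via Wiener-style masking (instrument 1 = bases 0-3)
V1 = (B[:, :4] @ G[:4]) / (B @ G + 1e-9) * V
```

Fixing B makes the problem convex in G, so the updates converge reliably, and the ratio mask distributes the mixture energy between the instruments in proportion to their modeled contributions.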

  10. Mining nutrients (N, K, P) from urban source-separated urine by forward osmosis dewatering.

    Science.gov (United States)

    Zhang, Jiefeng; She, Qianhong; Chang, Victor W C; Tang, Chuyang Y; Webster, Richard D

    2014-03-18

    Separating urine from domestic wastewater promotes a more sustainable municipal wastewater treatment system. This study investigated the feasibility of applying a forward osmosis (FO) dewatering process for nutrient recovery from source-separated urine under different conditions, using seawater or desalination brine as a low-cost draw solution. The filtration process with the active layer facing the feed solution exhibited relatively high water fluxes, up to 20 L/m²·h. The process also showed relatively low rejection of neutral organic nitrogen (urea-N) in fresh urine but improved rejection of ammonium (50-80%) in hydrolyzed urine and high rejection (>90%) of phosphate and potassium in most cases. Compared to simulation based on the solution-diffusion mechanism, higher water flux and solute flux were obtained using fresh or hydrolyzed urine as the feed, which was attributed to the intensive forward nutrient permeation (i.e., of urea, ammonium, and potassium). Membrane fouling could be avoided by prior removal of the spontaneously precipitated crystals in urine. Compared to other urine treatment options, the current process was cost-effective and environmentally friendly for nutrient recovery from urban wastewater at source, yet a comprehensive life-cycle impact assessment might be needed to evaluate and optimize the overall system performance at pilot and full scale operation.
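    The solution-diffusion simulation mentioned above reduces, for water transport in FO mode, to a flux proportional to the osmotic pressure difference across the membrane. A back-of-the-envelope sketch using the van't Hoff relation is shown below; the water permeability A and the solution molarities are invented for illustration and are not values from the study:

```python
R = 0.08314  # gas constant, L·bar/(mol·K)

def osmotic_pressure(molarity, i=2, T=298.15):
    """van't Hoff osmotic pressure (bar) for a dissociating solute
    (i = 2 for NaCl-like 1:1 salts)."""
    return i * molarity * R * T

def water_flux(A, pi_draw, pi_feed):
    """Solution-diffusion water flux J_w = A * (pi_draw - pi_feed),
    in L/m2-h when A is in L/(m2-h-bar); hydraulic pressure is ~0 in FO."""
    return A * (pi_draw - pi_feed)

# Illustrative: seawater-like draw (~0.6 M NaCl) against a dilute urine feed
pi_d = osmotic_pressure(0.6)
pi_f = osmotic_pressure(0.15)
Jw = water_flux(A=0.7, pi_draw=pi_d, pi_feed=pi_f)   # A assumed
print(round(Jw, 1))
```

    With these assumed numbers the flux lands in the same order of magnitude as the fluxes reported above; the study's observation is that real urine feeds exceed this idealized prediction because forward nutrient permeation adds to the driving force.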

  11. A Novel Damage Detection Algorithm using Time-Series Analysis-Based Blind Source Separation

    Directory of Open Access Journals (Sweden)

    A. Sadhu

    2013-01-01

    Full Text Available In this paper, a novel damage detection algorithm is developed based on blind source separation in conjunction with time-series analysis. Blind source separation (BSS) is a powerful signal processing tool that is used to identify the modal responses and mode shapes of a vibrating structure using only the knowledge of responses. In the proposed method, BSS is first employed to estimate the modal responses using the vibration measurements. Time-series analysis is then performed to characterize the mono-component modal responses, and the resulting time-series models are utilized for one-step-ahead prediction of the modal response. With the occurrence of newer measurements containing the signature of the damaged system, a variance-based damage index is used to identify the damage instant. Once the damage instant is identified, the damaged and undamaged modal parameters of the system are estimated in an adaptive fashion. The proposed method addresses the classical damage detection questions of damage instant, location, and severity. The proposed damage detection algorithm is verified using extensive numerical simulations followed by a full-scale study of the UCLA Factor building using the measured responses under the Parkfield earthquake.
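    The second stage of the method, one-step-ahead prediction of a mono-component modal response and a variance-based damage index, can be sketched as below. The AR order, window length and the synthetic frequency-shift "damage" are invented for illustration; the BSS stage that would produce the modal response is not shown:

```python
import numpy as np

def fit_ar(x, p=4):
    """Least-squares AR(p) coefficients for a mono-component modal response."""
    X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
    return np.linalg.lstsq(X, x[p:], rcond=None)[0]

def damage_index(signal, train_len, p=4, win=50):
    """Variance of the one-step-ahead prediction error in a sliding window.

    The AR model is fitted on the (presumed healthy) first train_len samples;
    a jump in the index marks the damage instant.
    """
    a = fit_ar(signal[:train_len], p)
    pred = np.array([signal[t - 1 - np.arange(p)] @ a
                     for t in range(p, len(signal))])
    err = signal[p:] - pred
    return np.array([err[i - win:i].var() for i in range(win, len(err))])

# Healthy response: one modal frequency; "damage": frequency shift halfway
rng = np.random.default_rng(1)
t = np.arange(1000)
x = np.where(t < 500, np.sin(0.3 * t), np.sin(0.45 * t))
x = x + 0.01 * rng.standard_normal(1000)
di = damage_index(x, train_len=500)
print(di[:50].mean() < di[-50:].mean())   # index rises after the damage
```

    A simple threshold on the index (relative to its pre-damage level) then localizes the damage instant in time.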

  12. Modelling fuel consumption in kerbside source segregated food waste collection: separate collection and co-collection.

    Science.gov (United States)

    Chu, T W; Heaven, S; Gredmaier, L

    2015-01-01

    Source separated food waste is a valuable feedstock for renewable energy production through anaerobic digestion, and a variety of collection schemes for this material have recently been introduced. The aim of this study was to identify options that maximize collection efficiency and reduce fuel consumption as part of the overall energy balance. A mechanistic model was developed to calculate the fuel consumption of kerbside collection of source segregated food waste, co-mingled dry recyclables and residual waste. A hypothetical city of 20,000 households was considered and nine scenarios were tested with different combinations of collection frequencies, vehicle types and waste types. The results showed that the potential fuel savings from weekly and fortnightly co-collection of household waste range from 7.4% to 22.4% and 1.8% to 26.6%, respectively, when compared to separate collection. A compartmentalized vehicle split 30:70 always performed better than one with two compartments of equal size. Weekly food waste collection with alternate weekly collection of the recyclables and residual waste by two-compartment collection vehicles was the best option to reduce the overall fuel consumption.

  13. Separation of radio-frequency sources and localization of partial discharges in noisy environments.

    Science.gov (United States)

    Robles, Guillermo; Fresno, José Manuel; Martínez-Tarifa, Juan Manuel

    2015-04-27

    The detection of partial discharges (PD) can help in early-warning detection systems to protect critical assets in power systems. The radio-frequency emission of these events can be measured with antennas even when the equipment is in service, which dramatically reduces maintenance costs and favours the implementation of condition-based monitoring systems. The drawback of these types of measurements is the difficulty of having a reference signal to study the events in a classical phase-resolved partial discharge pattern (PRPD). Therefore, in open-air substations and overhead lines, where interference from radio and TV broadcasting and mobile communications is an important source of noise and other pulsed interferences from rectifiers or inverters can be present, it is difficult to identify whether there is partial discharge activity or not. This paper proposes a robust method to separate the events captured with the antennas, identify which of them are partial discharges, and localize the piece of equipment that is having problems. The separation is done with power ratio (PR) maps based on the spectral characteristics of the signal, and the identification of the type of event is done by localizing the source with an array of four antennas. Several classical methods to calculate the time differences of arrival (TDOA) of the emission at the antennas have been tested, and the localization is done using particle swarm optimization (PSO) to minimize a distance function.
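    The localization step, minimizing a TDOA mismatch function with a particle swarm, can be sketched in two dimensions as follows. The antenna geometry, source position and all PSO parameters are invented for illustration; a real PD setup would work in 3-D with measured TDOAs:

```python
import numpy as np

C = 3e8  # propagation speed (m/s)

def tdoa(source, antennas):
    """Time differences of arrival relative to the first antenna."""
    d = np.linalg.norm(antennas - source, axis=1)
    return (d - d[0]) / C

def locate_pso(measured, antennas, n_particles=60, n_iter=200):
    """Minimize the squared TDOA mismatch with a bare-bones particle swarm."""
    rng = np.random.default_rng(0)
    pos = rng.uniform(-10, 10, (n_particles, 2))
    vel = np.zeros_like(pos)
    cost = lambda p: np.sum((tdoa(p, antennas) - measured) ** 2)
    pbest = pos.copy()
    pbest_c = np.array([cost(p) for p in pos])
    gbest = pbest[pbest_c.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos += vel
        c = np.array([cost(p) for p in pos])
        better = c < pbest_c
        pbest[better], pbest_c[better] = pos[better], c[better]
        gbest = pbest[pbest_c.argmin()].copy()
    return gbest

antennas = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 5.0], [5.0, 5.0]])
true_src = np.array([2.0, 3.0])
est = locate_pso(tdoa(true_src, antennas), antennas)
print(est)
```

    With four antennas, the three independent TDOAs over-determine the two (or three) source coordinates, which is what makes the swarm-based minimization well posed.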

  14. Separation of Radio-Frequency Sources and Localization of Partial Discharges in Noisy Environments

    Directory of Open Access Journals (Sweden)

    Guillermo Robles

    2015-04-01

    Full Text Available The detection of partial discharges (PD) can help in early-warning detection systems to protect critical assets in power systems. The radio-frequency emission of these events can be measured with antennas even when the equipment is in service, which dramatically reduces maintenance costs and favours the implementation of condition-based monitoring systems. The drawback of these types of measurements is the difficulty of having a reference signal to study the events in a classical phase-resolved partial discharge pattern (PRPD). Therefore, in open-air substations and overhead lines, where interference from radio and TV broadcasting and mobile communications is an important source of noise and other pulsed interferences from rectifiers or inverters can be present, it is difficult to identify whether there is partial discharge activity or not. This paper proposes a robust method to separate the events captured with the antennas, identify which of them are partial discharges, and localize the piece of equipment that is having problems. The separation is done with power ratio (PR) maps based on the spectral characteristics of the signal, and the identification of the type of event is done by localizing the source with an array of four antennas. Several classical methods to calculate the time differences of arrival (TDOA) of the emission at the antennas have been tested, and the localization is done using particle swarm optimization (PSO) to minimize a distance function.

  15. Blind Source Separation and Dynamic Fuzzy Neural Network for Fault Diagnosis in Machines

    International Nuclear Information System (INIS)

    Huang, Haifeng; Ouyang, Huajiang; Gao, Hongli

    2015-01-01

    Many assessment and detection methods are used to diagnose faults in machines. High accuracy in fault detection and diagnosis can be achieved by using numerical methods with noise-resistant properties. However, to some extent, noise always exists in measured data on real machines, which affects the identification results, especially in the diagnosis of early-stage faults. In view of this situation, a damage assessment method based on blind source separation and a dynamic fuzzy neural network (DFNN) is presented in this paper to diagnose early-stage machinery faults. In the processing of measurement signals, blind source separation is adopted to reduce noise. Sensitive features of these faults are then obtained by extracting low-dimensional manifold characteristics from the signals. The model for fault diagnosis is established based on the DFNN. Furthermore, on-line computation is accelerated by means of compressed sensing. Numerical vibration signals of ball screw fault modes are processed with the model for mechanical fault diagnosis, and the results are in good agreement with the actual condition even at the early stage of fault development. This detection method is very useful in practice and feasible for early-stage fault diagnosis. (paper)

  16. Source-based neurofeedback methods using EEG recordings: training altered brain activity in a functional brain source derived from blind source separation

    Science.gov (United States)

    White, David J.; Congedo, Marco; Ciorciari, Joseph

    2014-01-01

    A developing literature explores the use of neurofeedback in the treatment of a range of clinical conditions, particularly ADHD and epilepsy, whilst neurofeedback also provides an experimental tool for studying the functional significance of endogenous brain activity. A critical component of any neurofeedback method is the underlying physiological signal which forms the basis for the feedback. While the past decade has seen the emergence of fMRI-based protocols training spatially confined BOLD activity, traditional neurofeedback has utilized a small number of electrode sites on the scalp. As scalp EEG at a given electrode site reflects a linear mixture of activity from multiple brain sources and artifacts, efforts to successfully acquire some level of control over the signal may be confounded by these extraneous sources. Further, in the event of successful training, these traditional neurofeedback methods are likely influencing multiple brain regions and processes. The present work describes the use of source-based signal processing methods in EEG neurofeedback. The feasibility and potential utility of such methods were explored in an experiment training increased theta oscillatory activity in a source derived from Blind Source Separation (BSS) of EEG data obtained during completion of a complex cognitive task (spatial navigation). Learned increases in theta activity were observed in two of the four participants who completed 20 sessions of neurofeedback targeting this individually defined functional brain source. Source-based EEG neurofeedback methods using BSS may offer important advantages over traditional neurofeedback, by targeting the desired physiological signal in a more functionally and spatially specific manner. Having provided preliminary evidence of the feasibility of these methods, future work may study a range of clinically and experimentally relevant brain processes where individual brain sources may be targeted by source-based EEG neurofeedback.

  17. Functional Semi-Blind Source Separation Identifies Primary Motor Area Without Active Motor Execution.

    Science.gov (United States)

    Porcaro, Camillo; Cottone, Carlo; Cancelli, Andrea; Salustri, Carlo; Tecchio, Franca

    2018-04-01

    High-time-resolution techniques are crucial for investigating the brain in action. Here, we propose a method to identify a section of the upper-limb motor area representation (FS_M1) by means of electroencephalographic (EEG) signals recorded during a completely passive condition (FS_M1bySS). We delivered galvanic stimulation to the median nerve and applied to the EEG the semi-blind source separation (s-BSS) algorithm named Functional Source Separation (FSS). In order to prove that FS_M1bySS is part of FS_M1, we also collected EEG in a motor condition, i.e. during a voluntary movement task (isometric handgrip), and in a rest condition, i.e. at rest with eyes open and closed. In the motor condition, we show that the cortico-muscular coherence (CMC) of FS_M1bySS does not differ from the FS_M1 CMC (0.04 for both sources). Moreover, we show that the ongoing whole-band activity of FS_M1bySS during the motor and both rest conditions displays high mutual information and time correlation with FS_M1 (above 0.900 and 0.800, respectively), whereas much smaller values with the primary somatosensory cortex [Formula: see text] (about 0.300 and 0.500, [Formula: see text]). FS_M1bySS as a marker of the upper-limb FS_M1 representation, obtainable without the execution of an active motor task, is a notable achievement of the FSS algorithm, relevant to most experimental, neurological and psychiatric protocols.

  18. GBIS (Geodetic Bayesian Inversion Software): Rapid Inversion of InSAR and GNSS Data to Estimate Surface Deformation Source Parameters and Uncertainties

    Science.gov (United States)

    Bagnardi, M.; Hooper, A. J.

    2017-12-01

    Inversions of geodetic observational data, such as Interferometric Synthetic Aperture Radar (InSAR) and Global Navigation Satellite System (GNSS) measurements, are often performed to obtain information about the source of surface displacements. Inverse problem theory has been applied to study magmatic processes, the earthquake cycle, and other phenomena that cause deformation of the Earth's interior and of its surface. Together with increasing improvements in data resolution, both spatial and temporal, new satellite missions (e.g., the European Commission's Sentinel-1 satellites) are providing the unprecedented opportunity to access space-geodetic data within hours of their acquisition. To truly take advantage of these opportunities we must be able to interpret geodetic data in a rapid and robust manner. Here we present the open-source Geodetic Bayesian Inversion Software (GBIS; available for download at http://comet.nerc.ac.uk/gbis). GBIS is written in Matlab and offers a series of user-friendly and interactive pre- and post-processing tools. For example, an interactive function has been developed to estimate the characteristics of noise in InSAR data by calculating the experimental semi-variogram. The inversion software uses a Markov-chain Monte Carlo algorithm, incorporating the Metropolis-Hastings algorithm with adaptive step size, to efficiently sample the posterior probability distribution of the different source parameters. The probabilistic Bayesian approach allows the user to retrieve estimates of the optimal (best-fitting) deformation source parameters together with the associated uncertainties produced by errors in the data (and, through scaling, errors in the model).
The current version of GBIS (V1.0) includes fast analytical forward models for magmatic sources of different geometry (e.g., point source, finite spherical source, prolate spheroid source, penny-shaped sill-like source, and dipping-dike with uniform opening) and for dipping faults with uniform
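    GBIS itself implements its adaptive Metropolis-Hastings sampler in Matlab; a minimal Python analogue of the idea is sketched below. The scalar "source parameter", Gaussian likelihood, synthetic data and the multiplicative step-size rule are all illustrative stand-ins for the geodetic forward models and adaptation scheme the software actually uses:

```python
import numpy as np

def log_posterior(theta, data, sigma=1.0):
    """Log-likelihood of a scalar source parameter under Gaussian noise
    (a stand-in for a geodetic forward model plus a flat prior)."""
    return -0.5 * np.sum((data - theta) ** 2) / sigma**2

def adaptive_mh(data, n_steps=20000):
    """Random-walk Metropolis-Hastings with a simple multiplicative
    step-size adaptation (equilibrates near ~1/3 acceptance)."""
    rng = np.random.default_rng(0)
    theta, step = 0.0, 1.0
    lp = log_posterior(theta, data)
    chain = np.empty(n_steps)
    for i in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_posterior(prop, data)
        if np.log(rng.random()) < lp_prop - lp:   # accept
            theta, lp = prop, lp_prop
            step *= 1.02                          # accepted: widen proposal
        else:
            step *= 0.99                          # rejected: narrow proposal
        chain[i] = theta
    return chain

data = np.array([4.8, 5.1, 5.3, 4.9, 5.0])   # synthetic observations
posterior = adaptive_mh(data)[5000:]          # discard burn-in
print(round(posterior.mean(), 1))
```

    The retained samples approximate the posterior, so best-fitting parameters and their uncertainties come from the same chain (e.g., its mean and percentiles).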

  19. A system dynamics model to evaluate effects of source separation of municipal solid waste management: A case of Bangkok, Thailand.

    Science.gov (United States)

    Sukholthaman, Pitchayanin; Sharp, Alice

    2016-06-01

    Municipal solid waste has been considered one of the most immediate and serious problems confronting urban government in most developing and transitional economies. Solid waste service performance depends strongly on the effectiveness of the waste collection and transportation process. Generally, this process involves large expenditures and poses complex, dynamic operational problems. Source separation has a major impact on the effectiveness of a waste management system, as it causes significant changes in the quantity and quality of waste reaching final disposal. To evaluate the impact of effective source separation on waste collection and transportation, this study adopts a decision support tool to comprehend the cause-and-effect interactions of different variables in the waste management system. A system dynamics model that captures the relationships between source separation and the effectiveness of waste management in Bangkok, Thailand is presented. Influential factors that affect waste separation attitudes are addressed, and the effect of changes in the perception of waste separation is explained. The impacts of different separation rates on the effectiveness of the provided collection service are compared in six scenarios. 'Scenario 5', in which 40% of residents are willing to conduct organic and recyclable waste separation, gives the most promising opportunities. The results show that better waste collection and transportation service, lower monthly expense, extended landfill life, and a satisfactory efficiency of the provided service of 60.48% will be achieved at the end of the simulation period. Implications for how to get the public involved in source separation are proposed. Copyright © 2016 Elsevier Ltd. All rights reserved.
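    The stock-and-flow logic behind such a model, where the separation rate diverts a flow of waste away from the landfill stock and thereby extends landfill life, can be reduced to a toy simulation. Every number below (household count, generation rate, capacity) is invented; the actual Bangkok model has many more feedback loops:

```python
def years_to_fill(households=20000, kg_per_hh_day=2.5,
                  separation_rate=0.4, landfill_capacity=2e8, horizon=30):
    """Minimal stock-and-flow sketch: separated waste bypasses the landfill.

    Returns the year in which the landfill stock reaches capacity,
    or None if it survives the simulation horizon.
    """
    landfill = 0.0                                   # stock (kg)
    for year in range(1, horizon + 1):
        generated = households * kg_per_hh_day * 365  # inflow (kg/year)
        landfill += generated * (1 - separation_rate)
        if landfill >= landfill_capacity:
            return year
    return None

print(years_to_fill(separation_rate=0.0), years_to_fill(separation_rate=0.4))
```

    Even this caricature shows the qualitative result of the study: raising the separation rate stretches landfill life, which feeds back into collection cost and service efficiency in the full model.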

  20. Time-Domain Convolutive Blind Source Separation Employing Selective-Tap Adaptive Algorithms

    Directory of Open Access Journals (Sweden)

    Pan Qiongfeng

    2007-01-01

    Full Text Available We investigate novel algorithms to improve the convergence and reduce the complexity of time-domain convolutive blind source separation (BSS) algorithms. First, we propose the MMax partial update time-domain convolutive BSS (MMax BSS) algorithm. We demonstrate that the partial update scheme applied in the single-channel MMax LMS algorithm can be extended to multichannel time-domain convolutive BSS with little deterioration in performance and a possible saving in computational complexity. Next, we propose an exclusive maximum selective-tap time-domain convolutive BSS algorithm (XM BSS) that reduces the interchannel coherence of the tap-input vectors and improves the conditioning of the autocorrelation matrix, resulting in an improved convergence rate and reduced misalignment. Moreover, the computational complexity is reduced since only half of the tap inputs are selected for updating. Simulation results have shown a significant improvement in convergence rate compared to existing techniques.
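    The MMax partial-update idea, updating only the taps whose input entries currently have the largest magnitude, is easiest to see in the single-channel LMS setting from which the paper generalizes. The filter length, step size and synthetic system below are invented for illustration:

```python
import numpy as np

def mmax_lms(x, d, n_taps=8, m_update=4, mu=0.05):
    """LMS adaptive filter that, at each step, updates only the m_update
    taps whose tap-input entries have the largest magnitude (MMax)."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1: n + 1][::-1]        # tap-input vector
        e = d[n] - w @ u                          # a-priori error
        sel = np.argsort(np.abs(u))[-m_update:]   # MMax tap selection
        w[sel] += mu * e * u[sel]                 # partial update
    return w

rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = np.array([0.6, -0.4, 0.2, 0.1, 0.0, 0.0, 0.0, 0.0])  # unknown system
d = np.convolve(x, h)[:len(x)]
w = mmax_lms(x, d)
print(np.round(w[:4], 2))
```

    Updating half the taps roughly halves the update cost per sample, and because the skipped taps are those with small inputs, the gradient energy lost, and hence the convergence penalty, is small.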

  1. Time-Domain Convolutive Blind Source Separation Employing Selective-Tap Adaptive Algorithms

    Directory of Open Access Journals (Sweden)

    Qiongfeng Pan

    2007-04-01

    Full Text Available We investigate novel algorithms to improve the convergence and reduce the complexity of time-domain convolutive blind source separation (BSS) algorithms. First, we propose the MMax partial update time-domain convolutive BSS (MMax BSS) algorithm. We demonstrate that the partial update scheme applied in the single-channel MMax LMS algorithm can be extended to multichannel time-domain convolutive BSS with little deterioration in performance and a possible saving in computational complexity. Next, we propose an exclusive maximum selective-tap time-domain convolutive BSS algorithm (XM BSS) that reduces the interchannel coherence of the tap-input vectors and improves the conditioning of the autocorrelation matrix, resulting in an improved convergence rate and reduced misalignment. Moreover, the computational complexity is reduced since only half of the tap inputs are selected for updating. Simulation results have shown a significant improvement in convergence rate compared to existing techniques.

  2. Resonance Raman Spectroscopy of human brain metastasis of lung cancer analyzed by blind source separation

    Science.gov (United States)

    Zhou, Yan; Liu, Cheng-Hui; Pu, Yang; Cheng, Gangge; Yu, Xinguang; Zhou, Lixin; Lin, Dongmei; Zhu, Ke; Alfano, Robert R.

    2017-02-01

    Resonance Raman (RR) spectroscopy offers a novel optical biopsy method for cancer discrimination by means of enhancement in Raman scattering. It is widely acknowledged that the RR spectrum of tissue is a superposition of the spectra of various key building-block molecules. In this study, the RR spectra of human brain metastases of lung cancer and of normal brain tissue, excited at a selected visible wavelength of 532 nm, are used to explore spectral changes caused by tumor evolution. The potential application of RR spectra to human brain metastasis of lung cancer was investigated by blind source separation, namely Principal Component Analysis (PCA). PCA is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components (PCs). The results show significant RR spectral differences between metastatic and normal brain tissues when analyzed by PCA. To evaluate the efficacy for cancer detection, a linear discriminant analysis (LDA) classifier is utilized to calculate the sensitivity and specificity, and receiver operating characteristic (ROC) curves are used to evaluate the performance of this criterion. Excellent sensitivity (0.97), specificity (close to 1.00) and area under the ROC curve (AUC, 0.99) are achieved under optimal conditions. This research demonstrates that RR spectroscopy is effective for detecting changes in tissue due to the development of brain metastasis of lung cancer. RR spectroscopy analyzed by blind source separation may become a new addition to the diagnostic armamentarium.
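    The PCA-then-LDA-then-AUC pipeline described above can be sketched end to end on synthetic "spectra". The band count, the location of the disease-related peak and all class sizes are invented; only the pipeline structure mirrors the study:

```python
import numpy as np

def pca(X, k=2):
    """Project rows of X onto the top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    ranks = np.empty(len(scores))
    ranks[np.argsort(scores)] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Synthetic "spectra": the tumour class has an extra peak at one band
rng = np.random.default_rng(0)
normal = rng.standard_normal((40, 100)) * 0.3
tumour = rng.standard_normal((40, 100)) * 0.3
tumour[:, 30] += 1.5                      # disease-related band (invented)
X = np.vstack([normal, tumour])
y = np.array([0] * 40 + [1] * 40)

Z = pca(X, k=2)
# 1-D Fisher (LDA) discriminant computed on the PC scores
m0, m1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
Sw = np.cov(Z[y == 0].T) + np.cov(Z[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
print(auc(Z @ w, y))
```

    Running LDA on the low-dimensional PC scores rather than on the raw bands is what keeps the classifier well conditioned when spectra have far more bands than samples.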

  3. A review of output-only structural mode identification literature employing blind source separation methods

    Science.gov (United States)

    Sadhu, A.; Narasimhan, S.; Antoni, J.

    2017-09-01

    Output-only modal identification has seen significant activity in recent years, especially in large-scale structures where controlled input force generation is often difficult to achieve. This has led to the development of new system identification methods which do not require controlled input. They often work satisfactorily if they satisfy some general assumptions - not overly restrictive - regarding the stochasticity of the input. Hundreds of papers covering a wide range of applications appear every year related to the extraction of modal properties from output measurement data in more than two dozen mechanical, aerospace and civil engineering journals. In little more than a decade, concepts of blind source separation (BSS) from the field of acoustic signal processing have been adopted by several researchers, who have shown that they can be attractive tools for output-only modal identification. Originally intended to separate distinct audio sources from a mixture of recordings, their mathematical equivalence to problems in linear structural dynamics has since been firmly established. This has enabled many of the developments in the field of BSS to be modified and applied to output-only modal identification problems. This paper reviews over a hundred articles related to the application of BSS and its variants to output-only modal identification. The main contribution of the paper is to present a literature review of the papers which have appeared on the subject. While a brief treatment of the basic ideas is presented where relevant, a comprehensive and critical explanation of their contents is not attempted. Specific issues related to output-only modal identification and the relative advantages and limitations of BSS methods, from both theoretical and application standpoints, are discussed. Gap areas requiring additional work are also summarized, and the paper concludes with possible future trends in this area.

  4. Saline sewage treatment and source separation of urine for more sustainable urban water management.

    Science.gov (United States)

    Ekama, G A; Wilsenach, J A; Chen, G H

    2011-01-01

    While energy consumption and its associated carbon emission should be minimized in wastewater treatment, this has a much lower priority than human and environmental health, which are both closely related to efficient water quality management. So conservation of surface water quality and quantity is more important for sustainable development than greenhouse gas (GHG) emissions per se. In this paper, two urban water management strategies to conserve fresh water quality and quantity are considered: (1) source separation of urine for improved water quality and (2) saline (e.g. sea) water toilet flushing for reduced fresh water consumption in coastal and mining cities. The former holds promise for simpler and shorter sludge age activated sludge wastewater treatment plants (no nitrification and denitrification), nutrient (Mg, K, P) recovery and improved effluent quality (reduced endocrine disruptor and environmental oestrogen concentrations), and the latter for significantly reduced fresh water consumption, sludge production and oxygen demand (through using anaerobic bioprocesses) and hence energy consumption. Combining source separation of urine and saline water toilet flushing can reduce sewer crown corrosion and reduce effluent P concentrations. Realizing the advantages of these two approaches will require significant changes in urban water management, in that both need dual (fresh and saline) water distribution and (yellow and grey/brown) wastewater collection systems. While considerable work is still required to evaluate these new approaches and quantify their advantages and disadvantages, it would appear that the investment in dual water distribution and wastewater collection systems may be worth making to unlock their benefits for more sustainable urban development.

  5. Life cycle assessment and costing of urine source separation: Focus on nonsteroidal anti-inflammatory drug removal.

    Science.gov (United States)

    Landry, Kelly A; Boyer, Treavor H

    2016-11-15

    Urine source separation has the potential to reduce pharmaceutical loading to the environment, while enhancing nutrient recovery. The focus of this life cycle assessment (LCA) was to evaluate the environmental impacts and economic costs to manage nonsteroidal anti-inflammatory drugs (NSAIDs) (i.e., diclofenac, ibuprofen, ketoprofen and naproxen) and nutrients in human urine. Urine source separation was compared with centralized wastewater treatment (WWT) (biological or upgraded with ozonation). The current treatment method (i.e., centralized biological WWT) was compared with hypothetical treatment scenarios (i.e., centralized biological WWT upgraded with ozonation, and urine source separation). Alternative urine source separation scenarios included varying collection and handling methods (i.e., collection by vacuum truck, vacuum sewer, or decentralized treatment), pharmaceuticals removal by ion-exchange, and struvite precipitation. Urine source separation scenarios had 90% lower environmental impact (based on the TRACI impact assessment method) compared with the centralized wastewater scenarios due to reduced potable water production for flush water, reduced electricity use at the wastewater treatment plant, and nutrient offsets from struvite precipitation. Despite the greatest reduction of pharmaceutical toxicity, centralized treatment upgraded with ozone had the greatest ecotoxicity impacts due to ozonation operation and infrastructure. Among urine source separation scenarios, decentralized treatment of urine and centralized treatment of urine collected by vacuum truck had negligible cost differences compared with centralized wastewater treatment. Centralized treatment of urine collected by vacuum sewer and centralized treatment with ozone cost 30% more compared with conventional wastewater treatment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Detection of sudden structural damage using blind source separation and time–frequency approaches

    International Nuclear Information System (INIS)

    Morovati, V; Kazemi, M T

    2016-01-01

    Seismic signal processing is one of the most reliable methods of detecting structural damage during earthquakes. In this paper, the use of a hybrid method of blind source separation (BSS) and time–frequency analysis (TFA) is explored to detect changes in structural response data. The combination of BSS and TFA is applied to seismic signals because of their non-stationary nature. Firstly, the second-order blind identification technique is used to decompose the structural vibration response signal into modal coordinate signals, which are mono-components suitable for TFA. Each mono-component signal is then analyzed to extract the instantaneous frequency of the structure. Numerical simulations and a real-world seismic-excited structure with time-varying frequencies show the accuracy and robustness of the developed algorithm. TFA of the extracted sources shows that the method can be successfully applied to structural damage detection. The results also demonstrate that the combined method can identify the time instant of structural damage occurrence more sharply and effectively than the use of TFA alone. (paper)
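    The second stage of the pipeline, extracting the instantaneous frequency of a mono-component signal from its analytic phase, can be illustrated with a numpy-only Hilbert transform. The sampling rate, the two frequencies and the step change standing in for "damage" are invented; the BSS decomposition that would produce the mono-component is not shown:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the frequency domain (discrete Hilbert transform)."""
    N = len(x)
    X = np.fft.fft(x)
    h = np.zeros(N)
    h[0] = 1
    h[1:(N + 1) // 2] = 2      # double positive frequencies
    if N % 2 == 0:
        h[N // 2] = 1          # Nyquist bin for even N
    return np.fft.ifft(X * h)

def instantaneous_frequency(x, fs):
    """Instantaneous frequency (Hz) from the unwrapped analytic phase."""
    phase = np.unwrap(np.angle(analytic_signal(x)))
    return np.diff(phase) * fs / (2 * np.pi)

fs = 1000.0
t = np.arange(0, 2, 1 / fs)
# Modal frequency steps from 5 Hz to 8 Hz at t = 1 s (a "damage"-like change)
x = np.where(t < 1, np.sin(2 * np.pi * 5 * t), np.sin(2 * np.pi * 8 * t))
f = instantaneous_frequency(x, fs)
print(np.median(f[100:900]), np.median(f[1100:1900]))
```

    The abrupt jump in the instantaneous frequency track is exactly the signature the combined BSS+TFA method uses to pinpoint the damage instant.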

  7. Blind Source Separation for Unimodal and Multimodal Brain Networks: A Unifying Framework for Subspace Modeling.

    Science.gov (United States)

    Silva, Rogers F; Plis, Sergey M; Sui, Jing; Pattichis, Marios S; Adalı, Tülay; Calhoun, Vince D

    2016-10-01

    In the past decade, numerous advances in the study of the human brain were fostered by successful applications of blind source separation (BSS) methods to a wide range of imaging modalities. The main focus has been on extracting "networks" represented as the underlying latent sources. While the broad success in learning latent representations from multiple datasets has promoted the wide presence of BSS in modern neuroscience, it also introduced a wide variety of objective functions, underlying graphical structures, and parameter constraints for each method. Such diversity, combined with a host of datatype-specific know-how, can cause a sense of disorder and confusion, hampering a practitioner's judgment and impeding further development. We organize the diverse landscape of BSS models by exposing its key features and combining them to establish a novel unifying view of the area. In the process, we unveil important connections among models according to their properties and subspace structures. Consequently, a high-level descriptive structure is exposed, ultimately helping practitioners select the right model for their applications. Equipped with that knowledge, we review the current state of BSS applications to neuroimaging. The gained insight into model connections elicits a broader sense of generalization, highlighting several directions for model development. In light of that, we discuss emerging multi-dataset multidimensional (MDM) models and summarize their benefits for the study of the healthy brain and disease-related changes.

  8. The Crowding-Out Effects of Garbage Fees and Voluntary Source Separation Programs on Waste Reduction: Evidence from China

    Directory of Open Access Journals (Sweden)

    Hongyun Han

    2016-07-01

    Full Text Available This paper examines how and to what degree government policies of garbage fees and voluntary source separation programs, with free indoor containers and garbage bags, can affect the effectiveness of municipal solid waste (MSW) management, in the sense of achieving a desirable reduction of per capita MSW generation. Based on city-level panel data for the years 1998–2012 in China, our empirical analysis indicates that per capita MSW generation is increasing with per capita disposable income, average household size, education levels of households, and lagged per capita MSW. While both garbage fees and source separation programs have separately led to reductions in per capita waste generation, the interaction of the two policies has resulted in an increase in per capita waste generation due to the following crowding-out effects. Firstly, the positive effect of income dominates the negative effect of the garbage fee. Secondly, there are crowding-out effects of the mandatory charging system and the subsidized voluntary source separation on per capita MSW generation. Thirdly, small subsidies and tax punishments have reduced the intrinsic motivation for voluntary source separation of MSW. Thus, a compatible fee-charging system, higher levels of subsidies, and well-designed public information and education campaigns are required to promote household waste source separation and reduction.

  9. Bench-scale composting of source-separated human faeces for sanitation.

    Science.gov (United States)

    Niwagaba, C; Nalubega, M; Vinnerås, B; Sundberg, C; Jönsson, H

    2009-02-01

    In urine-diverting toilets, urine and faeces are collected separately so that their nutrient content can be recycled unmixed. Faeces should be sanitized before use in agricultural fields due to the possible presence of enteric pathogens. Composting of human faeces with food waste was evaluated as a possible method for this treatment. Temperatures were monitored in three 78-L wooden compost reactors fed with faeces-to-food waste substrates (F:FW) in wet weight ratios of 1:0, 3:1 and 1:1, which were observed for approximately 20 days. To achieve temperatures more than 15 °C above ambient, insulation was required for the reactors. Use of 25-mm thick styrofoam insulation around the entire exterior of the compost reactors and turning of the compost twice a week resulted in sanitizing temperatures (≥50 °C) being maintained for 8 days in the F:FW=1:1 compost and for 4 days in the F:FW=3:1 compost. In these composts, a reduction of >3 log(10) for E. coli and >4 log(10) for Enterococcus spp. was achieved. The F:FW=1:0 compost, which did not maintain ≥50 °C for a sufficiently long period, was not sanitized, as the counts of E. coli and Enterococcus spp. increased between days 11 and 15. This research provides useful information on the design and operation of family-size compost units for the treatment of source-separated faeces and starchy food residues, most likely available amongst the less affluent rural/urban society in Uganda.

  10. Developing methodologies for source attribution. Glass phase separation in Trinitite using NF{sub 3}

    Energy Technology Data Exchange (ETDEWEB)

    Koeman, Elizabeth C.; Simonetti, Antonio [Notre Dame Univ., IN (United States). Dept. of Civil and Environmental Engineering and Earth Sciences; McNamara, Bruce K.; Smith, Frances N. [Pacific Northwest National Laboratory, Richland, WA (United States). Nuclear Chemistry and Engineering; Burns, Peter C. [Notre Dame Univ., IN (United States). Dept. of Civil and Environmental Engineering and Earth Sciences; Notre Dame Univ., IN (United States). Dept. of Chemistry and Biochemistry

    2017-08-01

    separation of silica from minerals (i.e. naturally occurring crystalline materials) and glasses (i.e. amorphous materials), leaving behind non-volatile fluorinated species and refractory phases. The results from our investigation clearly indicate that the NF{sub 3} treatment of nuclear materials is a technique that provides effective separation of bomb components from complex matrices (e.g. post-detonation samples), which will aid with rapid and accurate source attribution.

  11. Fate of the Urinary Tract Virus BK Human Polyomavirus in Source-Separated Urine.

    Science.gov (United States)

    Goetsch, Heather E; Zhao, Linbo; Gnegy, Mariah; Imperiale, Michael J; Love, Nancy G; Wigginton, Krista R

    2018-04-01

    Human polyomaviruses are emerging pathogens that infect a large percentage of the human population and are excreted in urine. Consequently, urine that is collected for fertilizer production often has high concentrations of polyomavirus genes. We studied the fate of infectious double-stranded DNA (dsDNA) BK human polyomavirus (BKPyV) in hydrolyzed source-separated urine with infectivity assays and quantitative PCR (qPCR). Although BKPyV genomes persisted in the hydrolyzed urine for long periods of time (T90 [time required for 90% reduction in infectivity or gene copies] of >3 weeks), the viruses were rapidly inactivated (T90 of 1.1 to 11 h) in most of the tested urine samples. Interestingly, the infectivity of the dsDNA bacteriophage surrogate T3 (T90 of 24 to 46 days) was much more persistent than that of BKPyV, highlighting a major shortcoming of using bacteriophages as human virus surrogates. Pasteurization and filtration experiments suggest that BKPyV inactivation was due to microorganism activity in the source-separated urine, and SDS-PAGE Western blots showed that BKPyV protein capsid disassembly is concurrent with inactivation. Our results imply that stored urine does not pose a substantial risk of BKPyV transmission, that qPCR and infectivity of the dsDNA surrogate do not accurately depict BKPyV fate, and that microbial inactivation is driven by structural elements of the BKPyV capsid. IMPORTANCE We demonstrate that a common urinary tract virus has a high susceptibility to the conditions in hydrolyzed urine and consequently would not be a substantial exposure route to humans using urine-derived fertilizers. The results have significant implications for understanding virus fate. First, by demonstrating that the dsDNA (double-stranded DNA) genome of the polyomavirus lasts for weeks despite infectivity lasting for hours to days, our work highlights the shortcomings of using qPCR to estimate risks from unculturable viruses. Second, commonly used ds
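The T90 figures above follow simple first-order decay arithmetic: under N(t) = N0·10^(−t/T90), the number of log10 reductions after time t is just t/T90. A minimal sketch (the T90 values are taken from the ranges quoted in the abstract; the first-order decay assumption and variable names are ours):

```python
import math

def log10_reduction(t_hours, t90_hours):
    """Log10 reductions accumulated after time t under first-order
    decay, where T90 is the time for one log10 (90%) reduction."""
    return t_hours / t90_hours

def decay_rate(t90_hours):
    """First-order rate constant k (per hour) implied by a T90:
    N(t) = N0 * exp(-k * t), with N(T90) / N0 = 0.1."""
    return math.log(10) / t90_hours

# Slow end of the infectivity range (T90 = 11 h) versus a genome-copy
# T90 of three weeks (504 h): what one day in stored urine does to each.
infectivity_drop_24h = log10_reduction(24, 11)   # a bit over 2 log10
genome_drop_24h = log10_reduction(24, 504)       # well under 0.1 log10
```

This gap between genome persistence and infectivity loss is exactly why qPCR alone overstates the risk.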

  12. The distressed brain: a group blind source separation analysis on tinnitus.

    Directory of Open Access Journals (Sweden)

    Dirk De Ridder

    Full Text Available Tinnitus, the perception of a sound without an external sound source, can lead to variable amounts of distress. In a group of tinnitus patients with variable amounts of tinnitus-related distress, as measured by the Tinnitus Questionnaire (TQ), an electroencephalography (EEG) was performed, evaluating the patients' resting-state electrical brain activity. This resting-state electrical activity is compared with a control group and between patients with low (N = 30) and high distress (N = 25). The groups are homogeneous for tinnitus type, tinnitus duration and tinnitus laterality. A group blind source separation (BSS) analysis is performed using a large normative sample (N = 84), generating seven normative components to which high- and low-distress tinnitus patients are compared. A correlation analysis of the obtained normative components' relative power and distress is performed. Furthermore, the functional connectivity as reflected by lagged phase synchronization is analyzed between the brain areas defined by the components. Finally, a group BSS analysis on the tinnitus group as a whole is performed. Tinnitus can be characterized by at least four BSS components, two of which are posterior cingulate based, one based on the subgenual anterior cingulate and one based on the parahippocampus. Only the subgenual component correlates with distress. When performed on a normative sample, group BSS reveals that distress is characterized by two anterior cingulate based components. Spectral analysis of these components demonstrates that distress in tinnitus is related to alpha and beta changes in a network consisting of the subgenual anterior cingulate cortex extending to the pregenual and dorsal anterior cingulate cortex as well as the ventromedial prefrontal cortex/orbitofrontal cortex, insula, and parahippocampus. This network overlaps partially with brain areas implicated in distress in patients suffering from pain, functional somatic syndromes and posttraumatic stress

  13. Using discharge data to reduce structural deficits in a hydrological model with a Bayesian inference approach and the implications for the prediction of critical source areas

    Science.gov (United States)

    Frey, M. P.; Stamm, C.; Schneider, M. K.; Reichert, P.

    2011-12-01

    A distributed hydrological model was used to simulate the distribution of fast runoff formation as a proxy for critical source areas for herbicide pollution in a small agricultural catchment in Switzerland. We tested to what degree predictions based on prior knowledge without local measurements could be improved by relying on observed discharge. This learning process consisted of five steps: For the prior prediction (step 1), knowledge of the model parameters was coarse and predictions were fairly uncertain. In the second step, discharge data were used to update the prior parameter distribution. Effects of uncertainty in input data and model structure were accounted for by an autoregressive error model. This step decreased the width of the marginal distributions of parameters describing the lower boundary (percolation rates) but hardly affected soil hydraulic parameters. Residual analysis (step 3) revealed model structure deficits. We modified the model, and in the subsequent Bayesian updating (step 4) the widths of the posterior marginal distributions were reduced for most parameters compared to those of the prior. This incremental procedure led to a strong reduction in the uncertainty of the spatial prediction. Thus, despite using only spatially integrated data (discharge), the improved model structure can be expected to also improve the spatially distributed predictions. The fifth step consisted of a test with independent spatial data on herbicide losses and revealed ambiguous results. The comparison depended critically on the ratio of event to pre-event water that was discharged. This ratio cannot be estimated from hydrological data only. The results demonstrate that the value of local data is strongly dependent on a correct model structure. An iterative procedure of Bayesian updating, model testing, and model modification is suggested.
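The Bayesian updating in steps 2 and 4 can be illustrated with the simplest conjugate case: a normal prior on a single parameter narrowed by discharge-derived observations. This is a toy sketch, not the study's actual model (which used an autoregressive error model and MCMC); the parameter name and all numbers are hypothetical:

```python
import math

def normal_update(prior_mean, prior_sd, obs, obs_sd):
    """Conjugate normal-normal Bayesian update: combine a normal prior
    with independent normal observations of known noise sd."""
    prec = 1.0 / prior_sd ** 2 + len(obs) / obs_sd ** 2
    post_var = 1.0 / prec
    post_mean = post_var * (prior_mean / prior_sd ** 2 + sum(obs) / obs_sd ** 2)
    return post_mean, math.sqrt(post_var)

# Coarse prior on a percolation-rate parameter (mm/d) -- hypothetical values
prior_mean, prior_sd = 2.0, 1.0
# Effective information extracted from the discharge record (also hypothetical)
obs = [1.4, 1.6, 1.5, 1.7, 1.5]
post_mean, post_sd = normal_update(prior_mean, prior_sd, obs, obs_sd=0.5)
```

The posterior standard deviation is necessarily smaller than the prior's, which is the "decreased width of the marginal distributions" the abstract describes.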

  14. Insects associated with the composting process of solid urban waste separated at the source

    Directory of Open Access Journals (Sweden)

    Gladis Estela Morales

    2010-01-01

    Full Text Available Sarcosaprophagous macroinvertebrates (earthworms, termites and a number of Diptera larvae) enhance changes in the physical and chemical properties of organic matter during degradation and stabilization processes in composting, causing a decrease in the molecular weights of compounds. This activity makes these organisms excellent recyclers of organic matter. This article evaluates the succession of insects associated with the decomposition of solid urban waste separated at the source. The study was carried out in the city of Medellin, Colombia. A total of 11,732 individuals were determined, belonging to the classes Insecta and Arachnida. Species of three orders of Insecta were identified: Diptera, Coleoptera and Hymenoptera. Diptera, corresponding to 98.5% of the total, was the most abundant and diverse group, with 16 families (Calliphoridae, Drosophilidae, Psychodidae, Fanniidae, Muscidae, Milichiidae, Ulidiidae, Scatopsidae, Sepsidae, Sphaeroceridae, Heleomyzidae, Stratiomyidae, Syrphidae, Phoridae, Tephritidae and Curtonotidae), followed by Coleoptera with five families (Carabidae, Staphylinidae, Ptiliidae, Hydrophilidae and Phalacridae). Three stages were observed during the composting process, allowing species associated with each stage to be identified. Other species were also present throughout the whole process. In terms of number of species, Diptera was the most important group observed, particularly Ornidia obesa, considered a highly invasive species, and Hermetia illucens, both reported as beneficial for decomposition of organic matter.

  15. Liquid digestate from anaerobic treatment of source-separated household waste as fertilizer to barley.

    Science.gov (United States)

    Haraldsen, Trond Knapp; Andersen, Uno; Krogstad, Tore; Sørheim, Roald

    2011-12-01

    This study examined the efficiency of different organic waste materials as NPK fertilizer, in addition to the risk of leaching losses related to shower precipitation in the first part of the growing season. The fertilizers were tested in a pot trial on a sandy soil in a greenhouse. Six organic fertilizers were evaluated: liquid anaerobic digestate (LAD) from source-separated household waste, nitrified liquid anaerobic digestate (NLAD) of the same origin as LAD, meat and bone meal (MBM), hydrolysed salmon protein (HSP), reactor-composted catering waste (CW) and cattle manure (CM). An unfertilized control, calcium nitrate (CN) and Fullgjødsel® 21-4-10 were used as reference fertilizers. At equal amounts of mineral nitrogen, both LAD and Fullgjødsel® gave equal yields of barley and equal uptake of N, P, and K in barley grain. NLAD gave significantly lower barley yield than the original LAD due to leaching of nitrate-N after a simulated surplus of precipitation (28 mm) at Zadoks 14. There was significantly increased leaching of nitrate-N from the treatments receiving 160 kg N ha(-1) of CN and NLAD in comparison with all the other organic fertilizers. In this study LAD performed as well as Fullgjødsel® NPK fertilizer, and it was concluded that LAD can be recommended as fertilizer for cereals. Nitrification of the ammonium N in the digestate caused significantly increased nitrate leaching, and cannot be recommended.

  16. Energy and time modelling of kerbside waste collection: Changes incurred when adding source separated food waste.

    Science.gov (United States)

    Edwards, Joel; Othman, Maazuza; Burn, Stewart; Crossin, Enda

    2016-10-01

    The collection of source-separated kerbside municipal food waste (SSFW) is being incentivised in Australia; however, such a collection is likely to increase the fuel and time a collection truck fleet requires. Therefore, waste managers need to determine whether the incentives outweigh the cost. With literature scarcely describing the magnitude of the increase, and with local parameters playing a crucial role in accurately modelling kerbside collection, this paper develops a new general mathematical model that predicts the energy and time requirements of a collection regime whilst incorporating the unique variables of different jurisdictions. The model, Municipal solid waste collect (MSW-Collect), is validated and shown to be more accurate at predicting fuel consumption and trucks required than other common collection models. When predicting changes incurred for five different SSFW collection scenarios, results show that SSFW scenarios require an increase in fuel ranging from 1.38% to 57.59%. There is also a need for additional trucks across most SSFW scenarios tested. All SSFW scenarios are ranked and analysed with regard to fuel consumption; sensitivity analysis is conducted to test key assumptions. Crown Copyright © 2016. Published by Elsevier Ltd. All rights reserved.
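Collection models of this kind typically decompose fuel into a stop-and-go component plus a haul component, and convert total stop time into truck-shifts. The sketch below is a generic illustration of that structure, not MSW-Collect itself; every parameter value and name is an assumption:

```python
import math

def collection_estimate(households, set_out_rate, kg_per_stop,
                        truck_payload_kg, fuel_per_stop_l,
                        haul_km, fuel_per_km_l, sec_per_stop, shift_h=8.0):
    """Rough kerbside-collection estimator: fuel = per-stop fuel plus
    round-trip haul fuel per full load; trucks = stop-time / shift length.
    All parameter values are illustrative, not published figures."""
    stops = households * set_out_rate
    mass = stops * kg_per_stop
    trips = math.ceil(mass / truck_payload_kg)      # full loads to the facility
    fuel = stops * fuel_per_stop_l + trips * 2 * haul_km * fuel_per_km_l
    hours = stops * sec_per_stop / 3600.0
    trucks = math.ceil(hours / shift_h)             # single-shift fleet size
    return fuel, trucks

# Baseline residual-waste round vs. an added separate food-waste (SSFW) round
base_fuel, base_trucks = collection_estimate(20000, 0.85, 12.0, 10000, 0.05, 15, 0.6, 40)
ssfw_fuel, ssfw_trucks = collection_estimate(20000, 0.70, 4.0, 10000, 0.05, 15, 0.6, 40)
```

Because the extra SSFW round adds its own stop-and-go and haul components on top of the baseline, total fleet fuel and truck requirements both rise, which is the effect the paper quantifies.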

  17. Feature Selection and Blind Source Separation in an EEG-Based Brain-Computer Interface

    Directory of Open Access Journals (Sweden)

    Michael H. Thaut

    2005-11-01

    Full Text Available Most EEG-based BCI systems make use of well-studied patterns of brain activity. However, those systems involve tasks that indirectly map to simple binary commands such as “yes” or “no” or require many weeks of biofeedback training. We hypothesized that signal processing and machine learning methods can be used to discriminate EEG in a direct “yes”/“no” BCI from a single session. Blind source separation (BSS) and spectral transformations of the EEG produced a 180-dimensional feature space. We used a modified genetic algorithm (GA) wrapped around a support vector machine (SVM) classifier to search the space of feature subsets. The GA-based search found feature subsets that outperform full feature sets and random feature subsets. Also, BSS transformations of the EEG outperformed the original time series, particularly in conjunction with a subset search of both spaces. The results suggest that BSS and feature selection can be used to improve the performance of even a “direct,” single-session BCI.
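The wrapper idea — a GA searching the space of binary feature masks, scoring each mask by classifier accuracy — can be sketched as below. To stay self-contained, a nearest-centroid classifier stands in for the SVM, and the synthetic data, GA settings, and fitness split are all illustrative assumptions:

```python
import random

random.seed(7)

def make_data(n_per_class=60, n_feat=12, n_informative=3):
    """Synthetic 2-class data: only the first n_informative features
    carry class signal, mimicking a noisy high-dimensional feature space."""
    data = []
    for label in (0, 1):
        for _ in range(n_per_class):
            x = [random.gauss((label * 2 - 1) * 1.5, 1.0) if j < n_informative
                 else random.gauss(0.0, 1.0) for j in range(n_feat)]
            data.append((x, label))
    random.shuffle(data)
    return data

def centroid_accuracy(train, test, mask):
    """Nearest-centroid classifier restricted to the masked feature subset
    (a stand-in for the SVM wrapped by the GA in the paper)."""
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    cents = {}
    for label in (0, 1):
        pts = [x for x, y in train if y == label]
        cents[label] = [sum(p[j] for p in pts) / len(pts) for j in feats]
    correct = 0
    for x, y in test:
        d = {lab: sum((x[j] - c[i]) ** 2 for i, j in enumerate(feats))
             for lab, c in cents.items()}
        correct += (min(d, key=d.get) == y)
    return correct / len(test)

data = make_data()
train, test = data[:80], data[80:]

def fitness(mask):
    return centroid_accuracy(train, test, mask)

# Simple GA: elitism, tournament-style parent pool, uniform crossover, bit-flip mutation
pop = [[random.randint(0, 1) for _ in range(12)] for _ in range(20)]
for gen in range(15):
    scored = sorted(pop, key=fitness, reverse=True)
    nxt = scored[:2]                                # elitism: keep the 2 best
    while len(nxt) < len(pop):
        a, b = random.sample(scored[:10], 2)        # parents from the top half
        child = [ai if random.random() < 0.5 else bi for ai, bi in zip(a, b)]
        if random.random() < 0.3:
            child[random.randrange(12)] ^= 1        # bit-flip mutation
        nxt.append(child)
    pop = nxt

best = max(pop, key=fitness)
best_acc = fitness(best)
```

In the paper's setting the mask has 180 bits and the fitness is SVM cross-validation accuracy, but the search loop has the same shape.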

  18. Blind Source Separation Algorithms Using Hyperbolic and Givens Rotations for High-Order QAM Constellations

    KAUST Repository

    Shah, Syed Awais Wahab

    2017-11-24

    This paper addresses the problem of blind demixing of instantaneous mixtures in a multiple-input multiple-output communication system. The main objective is to present efficient blind source separation (BSS) algorithms dedicated to moderate- or high-order QAM constellations. Four new iterative batch BSS algorithms are presented, dealing with the multimodulus (MM) and alphabet-matched (AM) criteria. For the optimization of these cost functions, iterative methods of Givens and hyperbolic rotations are used. A pre-whitening operation is also utilized to reduce the complexity of the design problem. It is noticed that the algorithms designed using Givens rotations alone give satisfactory performance only for large numbers of samples. However, for small numbers of samples, the algorithms designed by combining both Givens and hyperbolic rotations compensate for the ill-whitening that occurs in this case and thus improve the performance. Two algorithms dealing with the MM criterion are presented for moderate-order QAM signals such as 16-QAM. The other two, dealing with the AM criterion, are presented for high-order QAM signals. These methods are finally compared with state-of-the-art batch BSS algorithms in terms of signal-to-interference-and-noise ratio, symbol error rate and convergence rate. Simulation results show that the proposed methods outperform the contemporary batch BSS algorithms.
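The core of the MM criterion with a Givens rotation can be illustrated on a real-valued two-source toy problem: pre-whitened (here, purely rotated) sub-Gaussian sources are recovered by searching for the Givens angle that minimizes the multimodulus dispersion cost. The paper derives closed-form rotation updates instead; the brute-force grid search and uniform sources below are simplifying assumptions of this sketch:

```python
import math
import random

random.seed(1)

# Two independent sub-Gaussian sources (uniform, a 1-D stand-in for a QAM
# alphabet), mixed by a pure rotation -- i.e. the data are assumed already
# pre-whitened, as after the paper's pre-whitening step.
n = 8000
s = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]
phi = math.radians(25.0)                      # "unknown" mixing angle
x = [(math.cos(phi) * a - math.sin(phi) * b,
      math.sin(phi) * a + math.cos(phi) * b) for a, b in s]

R = 0.6  # dispersion constant E[s^4]/E[s^2] for uniform(-1, 1) sources

def mm_cost(theta):
    """Multimodulus-style dispersion cost of the Givens-rotated outputs."""
    c, d = math.cos(theta), math.sin(theta)
    total = 0.0
    for u, v in x:
        y1, y2 = c * u + d * v, -d * u + c * v    # y = G(theta)^T x
        total += (y1 * y1 - R) ** 2 + (y2 * y2 - R) ** 2
    return total / n

# Coarse 1-degree grid search over the Givens angle, just to show that the
# criterion is minimized at the separating angle (phi modulo 90 degrees).
best_cost, est = min((mm_cost(math.radians(t)), t) for t in range(0, 90))
```

The closed-form Givens/hyperbolic updates in the paper optimize this same kind of cost per rotation parameter, without any grid search.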

  19. A multilevel reuse system with source separation process for printing and dyeing wastewater treatment: A case study.

    Science.gov (United States)

    Wang, Rui; Jin, Xin; Wang, Ziyuan; Gu, Wantao; Wei, Zhechao; Huang, Yuanjie; Qiu, Zhuang; Jin, Pengkang

    2018-01-01

    This paper proposes a new system of multilevel reuse with source separation in printing and dyeing wastewater (PDWW) treatment in order to dramatically improve the water reuse rate from 35%. By analysing the characteristics of the sources and concentrations of pollutants produced in different printing and dyeing processes, special, highly, and less contaminated wastewaters (SCW, HCW, and LCW, respectively) were collected and treated separately. Specifically, the large quantity of LCW was sequentially reused at multiple levels to meet the water quality requirements of different production processes. Based on this concept, a multilevel reuse system with a source separation process was established in a typical printing and dyeing enterprise. The water reuse rate increased dramatically to 62%, and the reclaimed water was reused in different printing and dyeing processes based on the water quality. This study provides promising leads in water management for wastewater reclamation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Life cycle comparison of centralized wastewater treatment and urine source separation with struvite precipitation: Focus on urine nutrient management.

    Science.gov (United States)

    Ishii, Stephanie K L; Boyer, Treavor H

    2015-08-01

    Alternative approaches to wastewater management including urine source separation have the potential to simultaneously improve multiple aspects of wastewater treatment, including reduced use of potable water for waste conveyance and improved contaminant removal, especially nutrients. In order to pursue such radical changes, system-level evaluations of urine source separation in community contexts are required. The focus of this life cycle assessment (LCA) is managing nutrients from urine produced in a residential setting with urine source separation and struvite precipitation, as compared with a centralized wastewater treatment approach. The life cycle impacts evaluated in this study pertain to construction of the urine source separation system and operation of drinking water treatment, decentralized urine treatment, and centralized wastewater treatment. System boundaries include fertilizer offsets resulting from the production of urine-based struvite fertilizer. As calculated by the Tool for the Reduction and Assessment of Chemical and Other Environmental Impacts (TRACI), urine source separation with MgO addition for subsequent struvite precipitation with high P recovery (Scenario B) has the smallest environmental cost relative to existing centralized wastewater treatment (Scenario A) and urine source separation with MgO and Na3PO4 addition for subsequent struvite precipitation with concurrent high P and N recovery (Scenario C). Preliminary economic evaluations show that the three urine management scenarios are relatively equal on a monetary basis (<13% difference). The impacts of each urine management scenario are most sensitive to the assumed urine composition, the selected urine storage time, and the assumed electricity required to treat influent urine and toilet water used to convey urine at the centralized wastewater treatment plant. The importance of full nutrient recovery from urine in combination with the substantial chemical inputs required for N recovery

  1. Re-examining mortality sources and population trends in a declining seabird: using Bayesian methods to incorporate existing information and new data.

    Directory of Open Access Journals (Sweden)

    Tim Reid

    Full Text Available The population of flesh-footed shearwaters (Puffinus carneipes) breeding on Lord Howe Island was shown to be declining from the 1970s to the early 2000s. This was attributed to destruction of breeding habitat and fisheries mortality in the Australian Eastern Tuna and Billfish Fishery. Recent evidence suggests these impacts have ceased, presumably leading to population recovery. We used Bayesian statistical methods to combine data from the literature with more recent, but incomplete, field data to estimate population parameters and trends. This approach easily accounts for sources of variation and uncertainty while formally incorporating data and variation from different sources into the estimate. There is a 70% probability that the flesh-footed shearwater population on Lord Howe continued to decline during 2003-2009, and a number of possible reasons for this are suggested. During the breeding season, road-based mortality of adults on Lord Howe Island is likely to result in reduced adult survival, and there is evidence that breeding success is negatively impacted by marine debris. Interactions with fisheries on flesh-footed shearwater winter grounds should be further investigated.
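The kind of synthesis described — a literature-based prior formally combined with recent, noisier field data — reduces in its simplest form to a precision-weighted normal combination, from which a probability of continued decline falls out. All numbers below are hypothetical illustrations, not the paper's estimates:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def combine_normal(mu_prior, sd_prior, mu_data, sd_data):
    """Precision-weighted (conjugate normal) combination of a
    literature-based prior with a new, noisier field estimate."""
    w_p, w_d = 1.0 / sd_prior ** 2, 1.0 / sd_data ** 2
    mu = (w_p * mu_prior + w_d * mu_data) / (w_p + w_d)
    sd = math.sqrt(1.0 / (w_p + w_d))
    return mu, sd

# Hypothetical annual growth-rate estimates r (negative = decline):
# a prior from historical counts plus a recent incomplete-survey estimate.
mu, sd = combine_normal(-0.02, 0.015, -0.005, 0.02)
p_decline = norm_cdf((0.0 - mu) / sd)   # posterior P(r < 0)
```

The posterior probability of decline plays the same role as the paper's "70% probability" statement: a single calibrated number summarizing both information sources and their uncertainties.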

  2. The management challenge for household waste in emerging economies like Brazil: realistic source separation and activation of reverse logistics.

    Science.gov (United States)

    Fehr, M

    2014-09-01

    Business opportunities in the household waste sector in emerging economies still evolve around the activities of bulk collection and tipping with an open material balance. This research, conducted in Brazil, pursued the objective of shifting opportunities from tipping to reverse logistics in order to close the balance. To do this, it illustrated how specific knowledge of sorted waste composition and reverse logistics operations can be used to determine realistic temporal and quantitative landfill diversion targets in an emerging economy context. Experimentation constructed and confirmed the recycling trilogy that consists of source separation, collection infrastructure and reverse logistics. The study on source separation demonstrated the vital difference between raw and sorted waste compositions. Raw waste contained 70% biodegradable and 30% inert matter. Source separation produced 47% biodegradable, 20% inert and 33% mixed material. The study on collection infrastructure developed the necessary receiving facilities. The study on reverse logistics identified private operators capable of collecting and processing all separated inert items. Recycling activities for biodegradable material were scarce and erratic. Only farmers would take the material as animal feed. No composting initiatives existed. The management challenge was identified as stimulating these activities in order to complete the trilogy and divert the 47% source-separated biodegradable discards from the landfills. © The Author(s) 2014.

  3. Chemical characteristics and methane potentials of source-separated and pre-treated organic municipal solid waste

    DEFF Research Database (Denmark)

    Hansen, Trine Lund; Svärd, Å; Angelidaki, Irini

    2003-01-01

    A research project has investigated the biogas potential of pre-screened source-separated organic waste. Wastes from five Danish cities have been pre-treated by three methods: screw press; disc screen; and shredder and magnet. This paper outlines the sampling procedure used, the chemical composition of the wastes and the estimated methane potentials.

  4. Response of spinach and komatsuna to biogas effluent made from source-separated kitchen garbage.

    Science.gov (United States)

    Furukawa, Yuichiro; Hasegawa, Hiroshi

    2006-01-01

    Recycling of kitchen garbage is an urgent task for reducing public spending and environmental burdens by incineration and/or landfill. There is an interesting regional effort in Ogawa, Saitama prefecture, Japan, in which source-separated kitchen garbage is anaerobically fermented with a biogas plant and the resultant effluent is used as a quick-release organic fertilizer by surrounding farmers. However, scientific assessments of fertilizer values and risks in the use of the effluent were lacking. Thus, a field experiment was conducted from 2003 to 2004 in Tohoku National Agricultural Research Center to grow spinach (Spinacia oleracea L.) and komatsuna (Brassica rapa var. perviridis L. H. Bailey) for evaluating the fertilizer value of the kitchen garbage effluent (KGE), nitrate, coliform group (CG), Escherichia coli, fecal streptococci (FS), and Vibrio parahaemolyticus concentrations of KGE and in the soil and the plant leaves. A cattle manure effluent (CME) and chemical fertilizers (NPK) were used as controls. Total nitrogen (N) and ammonium N concentrations of the KGE were 1.47 and 1.46 g kg(-1), respectively. The bacteria tested were detected in both biogas effluents in the order of 2 to 3 log CFU g(-1), but there was little evidence that the biogas effluents increased these bacteria in the soil and the plant leaves. At the rate of 22 g N m(-2), yield, total N uptake, apparent N recovery rate, and leaf nitrate ion concentration at harvest of spinach and komatsuna in the KGE plot were mostly comparable to those in the NPK and CME plots. We conclude that the KGE is a quick-release N fertilizer comparable to chemical fertilizers and does not cause contamination of CG, E. coli, FS, or V. parahaemolyticus in the soil and spinach and komatsuna leaves.

  5. Complete nutrient recovery from source-separated urine by nitrification and distillation.

    Science.gov (United States)

    Udert, K M; Wächter, M

    2012-02-01

    In this study we present a method to recover all nutrients from source-separated urine in a dry solid by combining biological nitrification with distillation. In a first process step, a membrane-aerated biofilm reactor was operated stably for more than 12 months, producing a nutrient solution with a pH between 6.2 and 7.0 (depending on the pH set-point), and an ammonium to nitrate ratio between 0.87 and 1.15 gN gN(-1). The maximum nitrification rate was 1.8 ± 0.3 gN m(-2) d(-1). Process stability was achieved by controlling the pH via the influent. In the second process step, real nitrified urine and synthetic solutions were concentrated in lab-scale distillation reactors. All nutrients were recovered in a dry powder except for some ammonia (less than 3% of total nitrogen). We estimate that the primary energy demand for a simple nitrification/distillation process is four to five times higher than removing nitrogen and phosphorus in a conventional wastewater treatment plant and producing the equivalent amount of phosphorus and nitrogen fertilizers. However, the primary energy demand can be reduced to values very close to conventional treatment, if 80% of the water is removed with reverse osmosis and distillation is operated with vapor compression. The ammonium nitrate content of the solid residue is below the limit at which stringent EU safety regulations for fertilizers come into effect; nevertheless, we propose some additional process steps that will increase the thermal stability of the solid product. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Anaerobic treatment as a core technology for energy, nutrients and water from source-separated domestic waste(water)

    NARCIS (Netherlands)

    Zeeman, G.; Kujawa, K.; Mes, de T.Z.D.; Graaff, de M.S.; Abu-Ghunmi, L.N.A.H.; Mels, A.R.; Meulman, B.; Temmink, B.G.; Buisman, C.J.N.; Lier, van J.B.; Lettinga, G.

    2008-01-01

    Based on results of pilot scale research with source-separated black water (BW) and grey water (GW), a new sanitation concept is proposed. BW and GW are both treated in a UASB (-septic tank) for recovery of CH4 gas. Kitchen waste is added to the anaerobic BW treatment for doubling the biogas

  7. Optimization of on-site treatment systems : filtration using geo-textile filters for source separated black wastewater

    OpenAIRE

    Welesameil, Mengstab Tilahun

    2012-01-01

    The filtration performance of three different non-woven geo-textiles (i.e. polypropylene and jute wool) with highly concentrated source-separated black wastewater influent was evaluated at laboratory scale, aiming to optimize the treatment process when the filters are used as pretreatment.

  8. Property-close source separation of hazardous waste and waste electrical and electronic equipment--a Swedish case study.

    Science.gov (United States)

    Bernstad, Anna; la Cour Jansen, Jes; Aspegren, Henrik

    2011-03-01

    Through an agreement with EEE producers, Swedish municipalities are responsible for collection of hazardous waste and waste electrical and electronic equipment (WEEE). In most Swedish municipalities, collection of these waste fractions is concentrated to waste recycling centres where households can source-separate and deposit hazardous waste and WEEE free of charge. However, the centres are often located on the outskirts of city centres and cars are needed in order to use the facilities in most cases. A full-scale experiment was performed in a residential area in southern Sweden to evaluate effects of a system for property-close source separation of hazardous waste and WEEE. After the system was introduced, results show a clear reduction in the amount of hazardous waste and WEEE disposed of incorrectly amongst residual waste or dry recyclables. The systems resulted in a source separation ratio of 70 wt% for hazardous waste and 76 wt% in the case of WEEE. Results show that households in the study area were willing to increase source separation of hazardous waste and WEEE when accessibility was improved and that this and similar collection systems can play an important role in building up increasingly sustainable solid waste management systems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  9. Long-term operation of a pilot-scale reactor for phosphorus recovery as struvite from source-separated urine

    NARCIS (Netherlands)

    Zamora, Patricia; Georgieva, Tanya; Salcedo, Inmaculada; Elzinga, Nico; Kuntke, Philipp; Buisman, Cees J.N.

    2017-01-01

    BACKGROUND: The treatment of source separated urine allows for a more sustainable approach to nutrients recovery in actual wastewater treatment plants. Struvite precipitation from urine yields a slow-release fertilizer (struvite) with a high marketable value for agricultural use. Extensive

  10. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean

  11. Pathogens and pharmaceuticals in source-separated urine in eThekwini, South Africa.

    Science.gov (United States)

    Bischel, Heather N; Özel Duygan, Birge D; Strande, Linda; McArdell, Christa S; Udert, Kai M; Kohn, Tamar

    2015-11-15

    In eThekwini, South Africa, the production of agricultural fertilizers from human urine collected from urine-diverting dry toilets is being evaluated at a municipality scale as a way to help finance a decentralized, dry sanitation system. The present study aimed to assess a range of human and environmental health hazards in source-separated urine, which was presumed to be contaminated with feces, by evaluating the presence of human pathogens, pharmaceuticals, and an antibiotic resistance gene. Composite urine samples from households enrolled in a urine collection trial were obtained from urine storage tanks installed in three regions of eThekwini. Polymerase chain reaction (PCR) assays targeted 9 viral and 10 bacterial human pathogens transmitted by the fecal-oral route. The most frequently detected viral pathogens were JC polyomavirus, rotavirus, and human adenovirus, in 100%, 34% and 31% of samples, respectively. Aeromonas spp. and Shigella spp. were the most frequently detected gram-negative bacteria, found in 94% and 61% of samples, respectively. The gram-positive bacterium Clostridium perfringens, which is known to survive for extended times in urine, was found in 72% of samples. A screening of 41 trace organic compounds in the urine facilitated selection of 12 priority pharmaceuticals for further evaluation. The antibiotics sulfamethoxazole and trimethoprim, which are frequently prescribed as prophylaxis for HIV-positive patients, were detected in 95% and 85% of samples, reaching maximum concentrations of 6800 μg/L and 1280 μg/L, respectively. The antiretroviral drug emtricitabine was also detected in 40% of urine samples. A sulfonamide antibiotic resistance gene (sul1) was detected in 100% of urine samples. By coupling analysis of pathogens and pharmaceuticals in geographically dispersed samples in eThekwini, this study reveals a range of human and environmental health hazards in urine intended for fertilizer production. Collection of urine offers the benefit of

  12. Noisy blind source separation based on CEEMD and Savitzky-Golay filter

    Science.gov (United States)

    Peng, Hua-Fu; Huang, Gao-Ming

    2017-09-01

    Because the standard independent component analysis (ICA) algorithm has difficulty extracting signals under noisy conditions, a blind separation algorithm based on a denoising pretreatment is proposed. The mixed signals are first decomposed into several stationary intrinsic mode function (IMF) components using complementary ensemble empirical mode decomposition (CEEMD), and the high-frequency IMF components are filtered with a Savitzky-Golay filter. The mixed signals are then reconstructed from all the components, and finally fast independent component analysis (FastICA) is applied to separate the reconstructed signals. Simulation results show that the proposed method improves blind signal separation at low signal-to-noise ratios.
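    The denoise-then-separate pipeline described above can be sketched with numpy alone. This is a hedged illustration rather than the authors' implementation: the CEEMD decomposition stage is omitted, the Savitzky-Golay smoother is built directly from a local least-squares polynomial fit, and `fastica` is a compact symmetric FastICA with a tanh nonlinearity; all signal parameters and function names are invented for the example.

```python
import numpy as np

def savgol_smooth(x, window=11, order=3):
    # Savitzky-Golay smoothing: fit a local polynomial by least squares
    # and keep its centre value; this reduces to a fixed convolution.
    half = window // 2
    t = np.arange(-half, half + 1)
    V = np.vander(t, order + 1, increasing=True)
    w = np.linalg.pinv(V)[0]               # weights of the centre estimate
    xp = np.pad(x, half, mode="edge")
    return np.convolve(xp, w[::-1], mode="valid")

def fastica(X, iters=200, seed=0):
    # compact symmetric FastICA with a tanh nonlinearity
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))       # whitening transform
    Z = (E @ np.diag(d ** -0.5) @ E.T) @ X
    W = rng.standard_normal((Z.shape[0], Z.shape[0]))
    for _ in range(iters):
        G = np.tanh(W @ Z)
        W = G @ Z.T / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W)        # symmetric decorrelation
        W = U @ Vt
    return W @ Z

# noisy two-channel mixture of a sine and a square wave
t = np.linspace(0.0, 1.0, 4000)
S = np.vstack([np.sin(2 * np.pi * 7 * t),
               np.sign(np.sin(2 * np.pi * 3 * t))])
A = np.array([[1.0, 0.6], [0.5, 1.0]])
X = A @ S + 0.1 * np.random.default_rng(1).standard_normal(S.shape)
X_denoised = np.vstack([savgol_smooth(x) for x in X])   # pretreatment
Y = fastica(X_denoised)                                  # separation
```

    Each row of `Y` should match one original source up to permutation, sign and scale.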

  13. Anaerobic treatment in decentralised and source-separation-based sanitation concepts

    NARCIS (Netherlands)

    Kujawa-Roeleveld, K.; Zeeman, G.

    2006-01-01

    Anaerobic digestion of wastewater should be a core technology employed in decentralised sanitation systems especially when their objective is also resource conservation and reuse. The most efficient system involves separate collection and anaerobic digestion of the most concentrated domestic

  14. Waste disposal and households' heterogeneity. Identifying factors shaping attitudes towards source-separated recycling in Bogotá, Colombia.

    Science.gov (United States)

    J Padilla, Alcides; Trujillo, Juan C

    2018-04-01

    Solid waste management in many cities of developing countries is not environmentally sustainable. People traditionally dispose of their solid waste in unsuitable urban areas like sidewalks and satellite dumpsites. This situation has nowadays become a serious public health problem in big Latin American conurbations. Among these densely-populated urban spaces, Colombia's capital and main city stands out as a special case. In this study, we aim to identify the factors that shape attitudes towards source-separated recycling among households in Bogotá. Using data from the Colombian Department of Statistics and Bogotá's multi-purpose survey, we estimated a multivariate Probit model. In general, our results show that the higher a household's socioeconomic class, the greater its effort to separate solid waste. Likewise, our findings allowed us to characterize household profiles regarding solid waste separation for each socioeconomic class. Among these profiles, we found that in the lower socioeconomic classes, attitudes towards solid waste separation are influenced by use of the Internet, membership in an environmentalist organization, the education level of the head of household, and homeownership. Hence, increasing education levels within the poorest segment of the population, promoting affordable housing policies and facilitating Internet access for the vulnerable population could reinforce households' attitudes towards a greater source-separated recycling effort. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Eco-sewerage System Design for Modern Office Buildings: based on Vacuum and Source-separation Technology

    Science.gov (United States)

    Xu, Kangning; Wang, Chengwen; Zheng, Min; Yuan, Xin

    2010-11-01

    This study aimed to construct an on-site eco-sewerage system for modern office buildings in urban areas based on combined innovative vacuum and source-separation technologies. Results showed that the source-separated grey water had low concentrations of pollutants, which facilitated its reuse. However, the system had a low separation efficiency between the yellow water and the brown water, caused by plugging problems in urine collection from the urine-diverting toilets. During storage of the yellow water for liquid fertilizer production, nearly all urea nitrogen was converted to ammonium nitrogen, and about two-thirds of the phosphorus was lost to struvite precipitation. Total bacteria and coliforms increased at first during storage, but then decreased to low concentrations. The anaerobic/anoxic/aerobic MBR achieved high elimination rates for COD, ammonium nitrogen and total nitrogen in the brown water of 94.2%, 98.1% and 95.1%, respectively. However, the effluent still had high levels of colour, nitrate and phosphorus, which limited its use as flushing water; even so, the effluent might be used as dilution water for the yellow-water fertilizer. Based on these results and the assumption of ideal operation of the vacuum source-separation system, a future plan for an on-site eco-sewerage system for modern office buildings was constructed. Its sustainability was validated by analysis of the water and nutrient substance flows.

  16. Calibration in a Bayesian modelling framework

    NARCIS (Netherlands)

    Jansen, M.J.W.; Hagenaars, T.H.J.

    2004-01-01

    Bayesian statistics may constitute the core of a consistent and comprehensive framework for the statistical aspects of modelling complex processes that involve many parameters whose values are derived from many sources. Bayesian statistics holds great promise for model calibration, provides the

  17. EEG dipole analysis of motor-priming foreperiod activity reveals separate sources for motor and spatial attention components.

    Science.gov (United States)

    Mathews, Simon; Ainsley Dean, Phil John; Sterr, Annette

    2006-12-01

    This study employed EEG source localisation procedures to study the contribution of motor preparatory and attentional processing to foreperiod activity in an S1-S2 motor priming task. Behavioural and high-density event-related potential (ERP) data were recorded in an S1-S2 priming task in which participants responded to S2 with a left- or right-hand button press. S1 either provided information about response hand (informative) or ambiguous information (uninformative). Responses were significantly faster in informative trials than in uninformative trials. Dipole source analysis of foreperiod lateralized ERPs revealed sources of motor preparatory activity in the dorsolateral premotor cortex (PMd), in line with previous work. In addition, two spatial attention components (ADAN, LDAP) were identified, with generators in the PMd and in occipitotemporal visual areas in the middle temporal (MT) region, respectively. Separation of motor-related and attentional PMd source locations was reliable along the rostral-caudal axis. The presence of attentional components in a motor priming paradigm supports the premotor theory of attention, which suggests a close link between attention and motor preparatory processes. Separation of components in the premotor cortex is in accord with a functional division of PMd into rostral (higher-order processing) and caudal (motor-related processing) areas, as suggested by imaging work. A prime for response preparation is a trigger for separate, but closely linked, attention-related activity in premotor areas.

  18. Nonnegative Matrix Factor 2-D Deconvolution for Blind Single Channel Source Separation

    DEFF Research Database (Denmark)

    Schmidt, Mikkel N.; Mørup, Morten

    2006-01-01

    We present a novel method for blind separation of instruments in polyphonic music based on a non-negative matrix factor 2-D deconvolution algorithm. Using a model which is convolutive in both time and frequency, we factorize a spectrogram representation of music into components corresponding to individual instruments. Based on this factorization we separate the instruments using spectrogram masking. The proposed algorithm has applications in computational auditory scene analysis, music information retrieval, and automatic music transcription.
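    As a simplified, hedged illustration of the factorization idea, the sketch below uses plain NMF with Lee-Seung multiplicative updates (Euclidean cost) rather than the 2-D deconvolution variant described above, factorizing a synthetic nonnegative "spectrogram" V ≈ WH; all dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
F, T, K = 30, 40, 4                      # freq bins, frames, components
W_true = rng.random((F, K))
H_true = rng.random((K, T))
V = W_true @ H_true                      # synthetic nonnegative "spectrogram"

# random nonnegative initialization, bounded away from zero
W = rng.random((F, K)) + 0.1
H = rng.random((K, T)) + 0.1
eps = 1e-9
err = []
for _ in range(200):
    # Lee-Seung multiplicative updates: each step keeps W, H nonnegative
    # and does not increase the Euclidean reconstruction error.
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)
    err.append(np.linalg.norm(V - W @ H))
```

    Spectrogram masking, as in the record above, would then assign each time-frequency bin to the component that dominates its reconstruction.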

  19. Prospects of Source-Separation-Based Sanitation Concepts: A Model-Based Study

    NARCIS (Netherlands)

    Tervahauta, T.H.; Trang Hoang,; Hernández, L.; Zeeman, G.; Buisman, C.J.N.

    2013-01-01

    Separation of different domestic wastewater streams and targeted on-site treatment for resource recovery has been recognized as one of the most promising sanitation concepts to re-establish the balance in carbon, nutrient and water cycles. In this study a model was developed based on literature data

  20. Separation of radiated sound field components from waves scattered by a source under non-anechoic conditions

    DEFF Research Database (Denmark)

    Fernandez Grande, Efren; Jacobsen, Finn

    2010-01-01

    A method of estimating the sound field radiated by a source under non-anechoic conditions has been examined. The method uses near field acoustic holography based on a combination of pressure and particle velocity measurements in a plane near the source for separating outgoing and ingoing wave components. The outgoing part of the sound field is composed of both radiated and scattered waves. The method compensates for the scattered components of the outgoing field on the basis of the boundary condition of the problem, exploiting the fact that the sound field is reconstructed very close to the source. Thus the radiated free-field component is estimated simultaneously with solving the inverse problem of reconstructing the sound field near the source. The method is particularly suited to cases in which the overall contribution of reflected sound in the measurement plane is significant.

  1. SEPARATION OF Ca AND Fe METAL ION IN SOURCE WATER BY ADSORPTION COLUMN TECHNIC WITH LOCAL ZEOLITE AND ACTIVE CARBON

    Directory of Open Access Journals (Sweden)

    Suyanta Suyanta

    2016-04-01

    Full Text Available This research aims to separate Ca and Fe metal ions from source water with local zeolite and activated carbon using a column adsorption technique. The separation efficiency is controlled through the adsorption time and the zeolite particle size. The method used was column adsorption with a flow system in which the sample is applied to a filtration tube containing zeolite and activated carbon. Initial and final concentrations of the samples were analyzed with an atomic absorption spectrophotometer. The results show that the adsorption capability of the zeolite for Ca and Fe metal ions is good: zeolite 1 (10 mesh) reduced the iron concentration by up to 93.98% and zeolite 2 (5 mesh) by up to 98.88% over a 1-4 week period. The reduction of the calcium concentration was poorer; over a two-week period, adsorption of calcium ions was about 50%.   Keywords: adsorption, zeolite, source water

  2. Design of Low-Cost FPGA Hardware for Real-time ICA-Based Blind Source Separation Algorithm

    Directory of Open Access Journals (Sweden)

    Farook Sattar

    2005-11-01

    Full Text Available Blind source separation (BSS of independent sources from their convolutive mixtures is a problem in many real-world multisensor applications. In this paper, we propose and implement an efficient FPGA hardware architecture for the realization of a real-time BSS. The architecture can be implemented using a low-cost FPGA (field programmable gate array. The architecture offers a good balance between hardware requirement (gate count and minimal clock speed and separation performance. The FPGA design implements the modified Torkkola's BSS algorithm for audio signals based on ICA (independent component analysis technique. Here, the separation is performed by implementing noncausal filters, instead of the typical causal filters, within the feedback network. This reduces the required length of the unmixing filters as well as provides better separation and faster convergence. Description of the hardware as well as discussion of some issues regarding the practical hardware realization are presented. Results of various FPGA simulations as well as real-time testing of the final hardware design in real environment are given.

  3. A Hybrid Technique for Blind Separation of Non-Gaussian and Time-Correlated Sources Using a Multicomponent Approach

    Czech Academy of Sciences Publication Activity Database

    Tichavský, Petr; Koldovský, Zbyněk; Yeredor, A.; Gómez-Herrero, G.; Doron, E.

    2008-01-01

    Roč. 19, č. 3 (2008), s. 421-430 ISSN 1045-9227 R&D Projects: GA MŠk 1M0572 Grant - others:GA ČR(CZ) GP102/07/P384 Program:GP Institutional research plan: CEZ:AV0Z10750506 Keywords : blind source separation * independent component analysis Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.726, year: 2008

  4. Adaptive dictionary based approach for background noise and speaker classification and subsequent source separation

    OpenAIRE

    Girish, K V Vijay; Ramakrishnan, A G; Ananthapadmanabha, T V

    2016-01-01

    A judicious combination of dictionary learning methods, block sparsity and source recovery algorithm are used in a hierarchical manner to identify the noises and the speakers from a noisy conversation between two people. Conversations are simulated using speech from two speakers, each with a different background noise, with varied SNR values, down to -10 dB. Ten each of randomly chosen male and female speakers from the TIMIT database and all the noise sources from the NOISEX database are used...

  5. Use of oleaginous plants in phytotreatment of grey water and yellow water from source separation of sewage.

    Science.gov (United States)

    Lavagnolo, Maria Cristina; Malagoli, Mario; Alibardi, Luca; Garbo, Francesco; Pivato, Alberto; Cossu, Raffaello

    2017-05-01

    Efficient and economic reuse of waste is one of the pillars of modern environmental engineering. In the field of domestic sewage management, source separation of yellow (urine), brown (faecal matter) and grey waters aims to recover the organic substances concentrated in brown water and the nutrients (nitrogen and phosphorus) in the urine, and to ensure easier treatment and recycling of grey waters. With the objective of emphasizing the potential for resource recovery in sewage management, a lab-scale research study was carried out at the University of Padova to evaluate the performance of oleaginous plants (suitable for biodiesel production) in the phytotreatment of source-separated yellow and grey waters. The plant species used were Brassica napus (rapeseed), Glycine max (soybean) and Helianthus annuus (sunflower). Phytotreatment tests were carried out using 20 L pots. Different testing runs were performed at increasing nitrogen concentrations in the feedstock. The results proved that oleaginous species can conveniently be used for the phytotreatment of grey and yellow waters from source separation of domestic sewage, displaying high removal efficiencies for nutrients and organic substances (nitrogen >80%; phosphorus >90%; COD nearly 90%). No inhibition was registered in the growth of plants irrigated with different mixtures of yellow and grey waters, where the characteristics of the two streams were reciprocally and beneficially integrated. Copyright © 2016. Published by Elsevier B.V.

  6. Multi-Antenna Data Collector for Smart Metering Networks with Integrated Source Separation by Spatial Filtering

    Science.gov (United States)

    Quednau, Philipp; Trommer, Ralph; Schmidt, Lorenz-Peter

    2016-03-01

    Wireless transmission systems in smart metering networks offer lower installation costs, because separate wired infrastructure can be dispensed with, but suffer from transmission problems. In this paper the issue of interference of wirelessly transmitted smart meter data with third-party systems and with data from other meters is investigated, and an approach for solving the problem is presented. A multi-channel wireless M-Bus receiver was developed to separate the desired data from unwanted interferers by spatial filtering. The corresponding algorithms are presented and the influence of different antenna types on the spatial filtering is investigated. The performance of the spatial filtering is evaluated by extensive measurements in a realistic environment with several hundred active wireless M-Bus transponders. These measurements correspond to the future environment for data collectors, as they took place in rural and urban areas with smart gas meters equipped with wireless M-Bus transponders installed in almost all surrounding buildings.
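    A minimal numpy sketch of the spatial-filtering idea, using a standard MVDR beamformer on a uniform linear array: steer toward the desired transmitter and let the data covariance place a null on a co-channel interferer. This is a generic textbook formulation, not the receiver's actual algorithm; the array geometry, angles and signal levels are all illustrative.

```python
import numpy as np

def steering(n, theta, d=0.5):
    # narrowband steering vector, half-wavelength element spacing
    return np.exp(-2j * np.pi * d * np.arange(n) * np.sin(theta))

rng = np.random.default_rng(0)
n_ant, n_snap = 4, 2000
a_sig = steering(n_ant, np.deg2rad(10.0))    # desired meter's direction
a_int = steering(n_ant, np.deg2rad(-40.0))   # interferer's direction

s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
intf = 3.0 * (rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap))
noise = 0.1 * (rng.standard_normal((n_ant, n_snap))
               + 1j * rng.standard_normal((n_ant, n_snap)))
X = np.outer(a_sig, s) + np.outer(a_int, intf) + noise

# MVDR weights: w = R^-1 a / (a^H R^-1 a), distortionless toward a_sig
R = X @ X.conj().T / n_snap
w = np.linalg.solve(R, a_sig)
w /= a_sig.conj() @ w
y = w.conj() @ X                              # spatially filtered output
```

    The constraint keeps the desired signal's gain at exactly one while the minimum-variance criterion suppresses the stronger interferer.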

  7. An evaluation of the social dimensions in public participation in rural domestic waste source-separated collection in Guilin, China.

    Science.gov (United States)

    Ma, Jing; Hipel, Keith W; Hanson, Mark L

    2017-12-21

    A comprehensive evaluation of public participation in rural domestic waste (RDW) source-separated collection in China was carried out within a social-dimension framework, specifically in terms of public perception, awareness, attitude, and willingness to pay for RDW management. The evaluation was based on a case study conducted in Guilin, Guangxi Zhuang Autonomous Region, China, which is representative of most inland areas of the country, with a GDP around the national average. It was found that, unlike urban residents, rural residents maintained a high rate of recycling, but in a spontaneous manner; they paid more attention to issues closely related to their daily lives, but less attention to those at the general level; their awareness of RDW source-separated collection was low, and different age groups showed significantly different preferences regarding sources of knowledge acquisition. Among potential information sources, village committees played a very important role in knowledge dissemination; for the respondents' pro-environmental attitudes, the influencing factor "lack of legislation/policy" was found to be significant; mandatory charges for waste collection and disposal had a high rate of acceptance among rural residents; and high monthly incomes were positively correlated with both pro-environmental attitudes and willingness to pay extra charges levied by RDW management. These observations imply that, for decision-makers, implementing mandatory RDW source-separated collection programs with enforced guidelines and economic compensation is more effective in the short term, while promoting pro-environmental education among rural residents is more important in the long run.

  8. Separating contributions from natural and anthropogenic sources in atmospheric methane from the Black Sea region, Romania

    International Nuclear Information System (INIS)

    Cuna, Stela; Pendall, Elise; Miller, John B.; Tans, Pieter P.; Dlugokencky, Ed; White, James W.C.

    2008-01-01

    The Danube Delta-Black Sea region of Romania is an important wetland, and this preliminary study evaluates the significance of this region as a source of atmospheric CH4. Measurements of the mixing ratio and δ13C of CH4 are reported from air and water samples collected at eight sites in the Danube Delta. High mixing ratios of CH4 were found in air (2500-14,000 ppb) and dissolved in water samples (∼1-10 μmol L-1), demonstrating that the Danube Delta is an important natural source of CH4. The intercepts on Keeling plots of about -62 per mille show that the main source of CH4 in this region is microbial, probably resulting primarily from acetate fermentation. Atmospheric CH4 and CO data from the NOAA/ESRL (National Oceanic and Atmospheric Administration/Earth System Research Laboratory) were used to make a preliminary estimate of biogenic CH4 at the Black Sea sampling site at Constanta (BSC). These data were used to calculate CH4/CO ratios in air samples, and using an assumed anthropogenic CH4/CO emissions ratio of 0.6, fossil fuel emissions at BSC were estimated. Biogenic CH4 emissions were then estimated by a simple mass balance approach. Keeling plots of well-mixed air from the BSC site suggested a stronger wetland source in summer and a stronger fossil fuel source in winter
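    The two estimation steps described above (the Keeling-plot intercept and the CH4/CO mass balance) are simple enough to sketch. The numbers below are synthetic, chosen only so that the mixing line has a -62 per mille source signature like the study's; the 0.6 emission ratio is the assumption quoted in the record.

```python
import numpy as np

# Synthetic two-component mixing: background air plus a microbial source
# with a -62 per mille signature (all values illustrative).
bg_c, bg_d, src_d = 1850.0, -47.0, -62.0
added = np.linspace(200.0, 8000.0, 25)          # ppb of source CH4 added
c = bg_c + added
# isotopic mass balance: delta_mix * c = delta_bg * bg_c + delta_src * added
d = (bg_d * bg_c + src_d * added) / c

# Keeling plot: regress delta against 1/concentration; the intercept
# (the limit 1/c -> 0) is the isotopic signature of the added source.
slope, intercept = np.polyfit(1.0 / c, d, 1)

# CH4/CO split with an assumed anthropogenic emission ratio of 0.6:
ch4_excess, co_excess = 900.0, 250.0            # ppb above background
fossil_ch4 = 0.6 * co_excess                    # fossil-fuel CH4 estimate
biogenic_ch4 = ch4_excess - fossil_ch4          # mass-balance remainder
```

    On exact two-component mixing data, the fitted intercept recovers the source signature to machine precision.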

  9. Denoising of Ictal EEG Data Using Semi-Blind Source Separation Methods Based on Time-Frequency Priors.

    Science.gov (United States)

    Hajipour Sardouie, Sepideh; Bagher Shamsollahi, Mohammad; Albera, Laurent; Merlet, Isabelle

    2015-05-01

    Removing muscle activity from ictal ElectroEncephaloGram (EEG) data is an essential preprocessing step in diagnosis and study of epileptic disorders. Indeed, at the very beginning of seizures, ictal EEG has a low amplitude and its morphology in the time domain is quite similar to muscular activity. Contrary to the time domain, ictal signals have specific characteristics in the time-frequency domain. In this paper, we use the time-frequency signature of ictal discharges as a priori information on the sources of interest. To extract the time-frequency signature of ictal sources, we use the Canonical Correlation Analysis (CCA) method. Then, we propose two time-frequency based semi-blind source separation approaches, namely the Time-Frequency-Generalized EigenValue Decomposition (TF-GEVD) and the Time-Frequency-Denoising Source Separation (TF-DSS), for the denoising of ictal signals based on these time-frequency signatures. The performance of the proposed methods is compared with that of CCA and Independent Component Analysis (ICA) approaches for the denoising of simulated ictal EEGs and of real ictal data. The results show the superiority of the proposed methods in comparison with CCA and ICA.
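    The GEVD step underlying TF-GEVD can be illustrated with a time-domain toy example: given epochs where the source of interest is known a priori to be active, the generalized eigenvalue problem Ct v = λ Cx v yields the spatial filter maximizing power in those epochs relative to total power. Everything below (signals, mixing matrix, activity window) is synthetic; the actual method derives its priors from time-frequency signatures obtained via CCA.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samp = 5000
t = np.arange(n_samp)
active = slice(2000, 3000)           # a priori signature: activity window
s1 = np.zeros(n_samp)
s1[active] = np.sin(2 * np.pi * 0.05 * t[active])   # source of interest
s2 = rng.standard_normal(n_samp)                     # background activity
X = np.array([[1.0, 0.8], [0.4, 1.0]]) @ np.vstack([s1, s2])

Cx = np.cov(X)                       # covariance over the whole record
Ct = np.cov(X[:, active])            # covariance over the marked epochs
# solve Ct v = lambda Cx v by Cholesky whitening of Cx
L = np.linalg.cholesky(Cx)
Li = np.linalg.inv(L)
lam, U = np.linalg.eigh(Li @ Ct @ Li.T)   # ascending eigenvalues
W = (Li.T @ U).T                     # generalized eigenvectors as rows
y = W[-1] @ X                        # filter with the largest power ratio
```

    The top generalized eigenvector extracts the component whose energy is most concentrated in the marked window, up to sign and scale.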

  10. Time-domain beamforming and blind source separation speech input in the car environment

    CERN Document Server

    Bourgeois, Julien

    2009-01-01

    The development of computer and telecommunication technologies led to a revolution in the way that people work and communicate with each other. One of the results is that large amounts of information will increasingly be held in a form that is natural for users, as speech in natural language. In the presented work, we investigate the speech signal capture problem, which includes the separation of multiple interfering speakers using microphone arrays. Adaptive beamforming is a classical approach which has been developed since the seventies. However it requires a double-talk detector (DTD) that interrupts th

  11. Bayesian flood forecasting methods: A review

    Science.gov (United States)

    Han, Shasha; Coulibaly, Paulin

    2017-08-01

    Over the past few decades, floods have been among the most common and widely distributed natural disasters in the world. If floods could be accurately forecasted in advance, their negative impacts could be greatly reduced. It is widely recognized that quantification and reduction of the uncertainty associated with hydrologic forecasts is of great importance for flood estimation and rational decision making. The Bayesian forecasting system (BFS) offers an ideal theoretical framework for uncertainty quantification that can be developed for probabilistic flood forecasting via any deterministic hydrologic model. It provides a suitable theoretical structure, empirically validated models and reasonable analytic-numerical computation methods, and can be developed into various Bayesian forecasting approaches. This paper presents a comprehensive review of Bayesian forecasting approaches applied to flood forecasting from 1999 to the present. The review starts with an overview of the fundamentals of BFS and recent advances in BFS, followed by BFS applications in river stage forecasting and real-time flood forecasting; it then moves to a critical analysis evaluating the advantages and limitations of Bayesian forecasting methods and other predictive uncertainty assessment approaches in flood forecasting, and finally discusses future research directions in Bayesian flood forecasting. Results show that the Bayesian flood forecasting approach is an effective and advanced way to perform flood estimation: it considers all sources of uncertainty and produces a predictive distribution of the river stage, river discharge or runoff, thus giving more accurate and reliable flood forecasts. Some emerging Bayesian forecasting methods (e.g. the ensemble Bayesian forecasting system and Bayesian multi-model combination) were shown to overcome the limitations of a single model or fixed model weights and to effectively reduce predictive uncertainty.
In recent years, various Bayesian flood forecasting approaches have been
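    The central idea of the review above, producing a predictive distribution of the river stage rather than a point forecast, can be reduced to a toy conjugate example. This is a deliberately minimal normal-normal update, not any of the reviewed BFS variants, and every number in it is illustrative.

```python
import numpy as np

mu0, tau0 = 3.0, 0.8               # prior mean / std of river stage (m)
sigma = 0.5                        # assumed model/observation error std
obs = np.array([3.6, 3.4, 3.7])    # recent stage observations (m)

# conjugate normal-normal update: precisions add, means combine by weight
n = obs.size
post_var = 1.0 / (1.0 / tau0 ** 2 + n / sigma ** 2)
post_mean = post_var * (mu0 / tau0 ** 2 + obs.sum() / sigma ** 2)

# predictive distribution for the next stage: N(post_mean, post_var + sigma^2)
pred_std = np.sqrt(post_var + sigma ** 2)
```

    The predictive standard deviation combines parameter uncertainty (post_var) with irreducible model error (sigma), which is exactly why the Bayesian forecast is wider, and more honest, than a deterministic one.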

  12. Gaussian process based independent analysis for temporal source separation in fMRI

    DEFF Research Database (Denmark)

    Hald, Ditte Høvenhoff; Henao, Ricardo; Winther, Ole

    2017-01-01

    Functional Magnetic Resonance Imaging (fMRI) gives us a unique insight into the processes of the brain, and opens up for analyzing the functional activation patterns of the underlying sources. Task-inferred supervised learning with restrictive assumptions in the regression set-up, restricts the e...

  13. A simple alkali-metal and noble gas ion source for SIMS equipments with mass separation of the primary ions

    International Nuclear Information System (INIS)

    Duesterhoeft, H.; Pippig, R.

    1986-01-01

    An alkali-metal ion source working without a reservoir of alkali metals is described. The alkali-metal ions are produced by evaporation of alkali salts and ionization in a low-voltage arc discharge stabilized with a noble gas plasma or, in the case of small alkali-metal ion currents, on the basis of well-known thermal ionization at a hot tungsten wire. The source is very simple in construction and produces a stable ion current of 0.3 μA for more than 100 h. It is possible to change the ion species in a short time. This source is applicable to all SIMS equipment using mass separation of the primary ions. (author)

  14. A Bayesian Network Approach to Ontology Mapping

    National Research Council Canada - National Science Library

    Pan, Rong; Ding, Zhongli; Yu, Yang; Peng, Yun

    2005-01-01

    .... In this approach, the source and target ontologies are first translated into Bayesian networks (BN); the concept mapping between the two ontologies is treated as evidential reasoning between the two translated BNs...

  15. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for a Multivariate Normal mean vector; Bayesian inference for the Multiple Linear Regression Model; and Computati...

  16. Separation of mouse testis cells on a Celsep (TM) apparatus and their usefulness as a source of high molecular weight DNA or RNA

    Science.gov (United States)

    Wolgemuth, D. J.; Gizang-Ginsberg, E.; Engelmyer, E.; Gavin, B. J.; Ponzetto, C.

    1985-01-01

    The use of a self-contained unit-gravity cell separation apparatus for separation of populations of mouse testicular cells is described. The apparatus, a Celsep (TM), maximizes the unit area over which sedimentation occurs, reduces the amount of separation medium employed, and is quite reproducible. Cells thus isolated have been good sources for isolation of DNA, and notably, high molecular weight RNA.

  17. MAP-Based Underdetermined Blind Source Separation of Convolutive Mixtures by Hierarchical Clustering and ℓ1-Norm Minimization

    Directory of Open Access Journals (Sweden)

    Kellermann Walter

    2007-01-01

    Full Text Available We address the problem of underdetermined BSS. While most previous approaches are designed for instantaneous mixtures, we propose a time-frequency-domain algorithm for convolutive mixtures. We adopt a two-step method based on a general maximum a posteriori (MAP) approach. In the first step, we estimate the mixing matrix based on hierarchical clustering, assuming that the source signals are sufficiently sparse. The algorithm works directly on the complex-valued data in the time-frequency domain and shows better convergence than algorithms based on self-organizing maps. The assumption of Laplacian priors for the source signals in the second step leads to an algorithm for estimating the source signals. It involves the ℓ1-norm minimization of complex numbers because of the use of the time-frequency-domain approach. We compare a combinatorial approach initially designed for real numbers with a second-order cone programming (SOCP) approach designed for complex numbers. We found that although the former approach is not theoretically justified for complex numbers, its results are comparable to, or even better than, the SOCP solution. The advantage is a lower computational cost for problems with low input/output dimensions.
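    The second step, MAP estimation under Laplacian priors, reduces (for an M x N mixing matrix with M < N) to picking, for each observation, the exact solution built from M columns of the mixing matrix that has the smallest ℓ1 norm; this is the combinatorial variant mentioned above, sketched here for real-valued data with an assumed-known mixing matrix (in the paper the matrix comes from hierarchical clustering and the data are complex-valued).

```python
import numpy as np
from itertools import combinations

def l1_separate(A, X):
    # For each sample x, enumerate the M-column submatrices of A, solve
    # A_sub s = x exactly, and keep the solution with minimal l1 norm.
    m, n = A.shape
    S = np.zeros((n, X.shape[1]))
    for t, x in enumerate(X.T):
        best, best_s, best_idx = np.inf, None, None
        for idx in combinations(range(n), m):
            sub = A[:, idx]
            if abs(np.linalg.det(sub)) < 1e-12:
                continue                       # skip singular column pairs
            s = np.linalg.solve(sub, x)
            cost = np.abs(s).sum()
            if cost < best:
                best, best_s, best_idx = cost, s, idx
        S[list(best_idx), t] = best_s
    return S

rng = np.random.default_rng(0)
# sparse sources: at most 2 of the 3 sources active in any sample
S_true = rng.laplace(size=(3, 200))
for t in range(200):
    S_true[rng.integers(3), t] = 0.0
A = np.array([[1.0, 0.5, 0.2],
              [0.2, 0.5, 1.0]])
A /= np.linalg.norm(A, axis=0)        # unit-norm mixing columns
X = A @ S_true                        # 2 mixtures of 3 sources
S_est = l1_separate(A, X)
```

    Every estimated column reconstructs the mixture exactly and activates at most M sources, which is the structure the Laplacian MAP solution imposes.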

  18. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology with Bayesian network learning and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
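    As a minimal illustration of the kind of Bayesian network reasoning such texts cover, exact inference by enumeration on the standard textbook rain/sprinkler/wet-grass network looks like the following (the network and probabilities are the common teaching example, not taken from the book itself):

```python
# Query P(Rain | GrassWet = True) by summing the joint over the
# unobserved Sprinkler variable and normalizing.
P_rain = 0.2
P_sprinkler = {True: 0.01, False: 0.4}            # P(Sprinkler | Rain)
P_wet = {(True, True): 0.99, (True, False): 0.9,  # P(Wet | Sprinkler, Rain)
         (False, True): 0.8, (False, False): 0.0}

def joint(rain, wet=True):
    """P(Rain = rain, Wet = wet), summing over Sprinkler."""
    p_r = P_rain if rain else 1.0 - P_rain
    total = 0.0
    for sprinkler in (True, False):
        p_s = P_sprinkler[rain] if sprinkler else 1.0 - P_sprinkler[rain]
        p_w = P_wet[(sprinkler, rain)] if wet else 1.0 - P_wet[(sprinkler, rain)]
        total += p_s * p_w
    return p_r * total

posterior = joint(True) / (joint(True) + joint(False))  # ≈ 0.3577
```

    Enumeration is exponential in the number of unobserved variables; the efficient inference algorithms the book describes (e.g., junction trees) exploit the network structure to avoid that cost.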

  19. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundations, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology. New to the Second Edition: a new chapter on Bayesian network classifiers and a new section on object-oriented Bayesian networks.

  20. Recent Administrative and Judicial Decisions Regarding Consideration of Source Separation in Determining BACT Under PSD

    Science.gov (United States)

    This document may be of assistance in applying the New Source Review (NSR) air permitting regulations including the Prevention of Significant Deterioration (PSD) requirements. This document is part of the NSR Policy and Guidance Database. Some documents in the database are a scanned or retyped version of a paper photocopy of the original. Although we have taken considerable effort to quality assure the documents, some may contain typographical errors. Contact the office that issued the document if you need a copy of the original.

  1. Query-by-Example Music Information Retrieval by Score-Informed Source Separation and Remixing Technologies

    Directory of Open Access Journals (Sweden)

    Goto Masataka

    2010-01-01

    Full Text Available We describe a novel query-by-example (QBE) approach in music information retrieval that allows a user to customize query examples by directly modifying the volume of different instrument parts. The underlying hypothesis of this approach is that the musical mood of retrieved results changes in relation to the volume balance of different instruments. On the basis of this hypothesis, we aim to clarify the relationship between the change in the volume balance of a query and the genre of the retrieved pieces, called genre classification shift. Such an understanding would allow us to instruct users in how to generate alternative queries without finding other appropriate pieces. Our QBE system first separates all instrument parts from the audio signal of a piece with the help of its musical score, and then it allows users to remix these parts to change the acoustic features that represent the musical mood of the piece. Experimental results showed that the genre classification shift was actually caused by the volume change in the vocal, guitar, and drum parts.

  2. Noise-Source Separation Using Internal and Far-Field Sensors for a Full-Scale Turbofan Engine

    Science.gov (United States)

    Hultgren, Lennart S.; Miles, Jeffrey H.

    2009-01-01

    Noise-source separation techniques for the extraction of the sub-dominant combustion noise from the total noise signatures obtained in static-engine tests are described. Three methods are applied to data from a static, full-scale engine test. Both 1/3-octave and narrow-band results are discussed. The results are used to assess the combustion-noise prediction capability of the Aircraft Noise Prediction Program (ANOPP). A new additional phase-angle-based discriminator for the three-signal method is also introduced.
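    The three-signal method referenced above rests on a known cross-spectral identity: for three sensors observing one common (coherent) source plus mutually independent noise, the common source's auto-spectrum at sensor 1 equals |G12||G13|/|G23|, where Gij are the cross-spectra. A synthetic sketch of that identity (signals and parameters invented here, not engine data):

```python
import numpy as np

def cross_spectrum(x, y, nseg=64, seglen=256):
    """Segment-averaged cross-spectral density estimate (Welch-style)."""
    win = np.hanning(seglen)
    G = np.zeros(seglen // 2 + 1, dtype=complex)
    for k in range(nseg):
        a = np.fft.rfft(win * x[k * seglen:(k + 1) * seglen])
        b = np.fft.rfft(win * y[k * seglen:(k + 1) * seglen])
        G += a * np.conj(b)
    return G / nseg

rng = np.random.default_rng(0)
n = 64 * 256
common = np.sin(2 * np.pi * 0.125 * np.arange(n))  # shared tone at FFT bin 32
x1 = common + 0.5 * rng.standard_normal(n)         # three sensors with
x2 = common + 0.5 * rng.standard_normal(n)         # independent noise
x3 = common + 0.5 * rng.standard_normal(n)

G11 = cross_spectrum(x1, x1)
G12 = cross_spectrum(x1, x2)
G13 = cross_spectrum(x1, x3)
G23 = cross_spectrum(x2, x3)
# Three-signal estimate of the coherent (common-source) spectrum at sensor 1.
coherent = np.abs(G12) * np.abs(G13) / (np.abs(G23) + 1e-12)
```

    At the tone's bin the estimate recovers the common-source power while the independent sensor noise averages out of the cross-spectra, which is what makes the technique useful for pulling sub-dominant combustion noise out of far-field signatures.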

  3. A mathematical model for source separation of MMG signals recorded with a coupled microphone-accelerometer sensor pair.

    Science.gov (United States)

    Silva, Jorge; Chau, Tom

    2005-09-01

    Recent advances in sensor technology for muscle activity monitoring have resulted in the development of a coupled microphone-accelerometer sensor pair for physiological acoustic signal recording. This sensor can be used to eliminate interfering sources in practical settings where the contamination of an acoustic signal by ambient noise confounds detection but cannot be easily removed [e.g., mechanomyography (MMG), swallowing sounds, respiration, and heart sounds]. This paper presents a mathematical model for the coupled microphone-accelerometer vibration sensor pair, specifically applied to muscle activity monitoring (i.e., MMG) and noise discrimination in externally powered prostheses for below-elbow amputees. While the model provides a simple and reliable source separation technique for MMG signals, it can also be easily adapted to other applications where the recording of low-frequency (< 1 kHz) physiological vibration signals is required.

  4. Non-parametric Bayesian models of response function in dynamic image sequences

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej; Šmídl, Václav

    2016-01-01

    Roč. 151, č. 1 (2016), s. 90-100 ISSN 1077-3142 R&D Projects: GA ČR GA13-29225S Institutional support: RVO:67985556 Keywords : Response function * Blind source separation * Dynamic medical imaging * Probabilistic models * Bayesian methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 2.498, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/tichy-0456983.pdf

  5. Nonnegative Tensor Factorization Approach Applied to Fission Chamber’s Output Signals Blind Source Separation

    Science.gov (United States)

    Laassiri, M.; Hamzaoui, E.-M.; Cherkaoui El Moursli, R.

    2018-02-01

    Inside nuclear reactors, gamma-rays emitted from nuclei together with the neutrons introduce unwanted backgrounds in neutron spectra. For this reason, powerful extraction methods are needed to extract the useful neutron signal from the recorded mixture and thus obtain a clearer neutron flux spectrum. Several techniques have been developed to discriminate between neutrons and gamma-rays in a mixed radiation field. Most of these tackle the problem using analogue discrimination methods; others use organic scintillators to achieve the discrimination task. Recently, systems based on digital signal processors have become commercially available to replace the analogue systems. As an alternative to these systems, we aim in this work to verify the feasibility of using Nonnegative Tensor Factorization (NTF) to blindly extract the neutron component from mixture signals recorded at the output of a fission chamber (WL-7657). The latter was simulated with Geant4 linked to Garfield++ using a 252Cf neutron source. To achieve the best possible neutron-gamma discrimination, we applied two different NTF algorithms, which were found to be the most suitable methods for analysing this kind of nuclear data.
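    The core idea of nonnegative factorization is easiest to see in the matrix (NMF) special case, of which NTF is the tensor generalization. A generic sketch with invented rank-2 data standing in for overlapped components, using the standard Lee-Seung multiplicative updates rather than the authors' specific algorithm or detector signals:

```python
import numpy as np

rng = np.random.default_rng(1)

def nmf(V, r, iters=1000, eps=1e-9):
    """Factor V ≈ W H with nonnegative factors via multiplicative updates,
    which monotonically decrease ||V - W H||_F^2."""
    n, m = V.shape
    W = rng.random((n, r)) + 0.1
    H = rng.random((r, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# A rank-2 nonnegative mixture standing in for two overlapped spectral components.
V = (np.outer([1.0, 0.2, 0.0], [0.0, 1.0, 2.0, 1.0])
     + np.outer([0.0, 0.5, 1.0], [1.0, 0.5, 0.0, 0.0]))
W, H = nmf(V, 2)
rel_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

    Because the updates are multiplicative, nonnegativity of the factors is preserved automatically, which is the property that makes the recovered components physically interpretable as spectra.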

  6. Evaluation of a new pulping technology for pre-treating source-separated organic household waste prior to anaerobic digestion.

    Science.gov (United States)

    Naroznova, Irina; Møller, Jacob; Larsen, Bjarne; Scheutz, Charlotte

    2016-04-01

    A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed screw mechanism. The pre-treatment technology rejects more than 95% (wet weight) of non-biodegradable impurities in waste collected from households and generates biopulp ready for anaerobic digestion. Overall, 84-99% of biodegradable material (on a dry weight basis) in the waste was recovered in the biopulp. The biochemical methane potential for the biopulp was 469 ± 7 mL CH4/g ash-free mass. Moreover, all Danish and European Union requirements regarding the content of hazardous substances in biomass intended for land application were fulfilled. Compared to other pre-treatment alternatives, the screw-pulping technology showed higher biodegradable material recovery, lower electricity consumption and comparable water consumption. The higher material recovery achieved with the technology was associated with greater transfer of nutrients (N and P), carbon (total and biogenic) but also heavy metals (except Pb) to the produced biomass. The data generated in this study could be used for the environmental assessment of the technology and thus help in selecting the best pre-treatment technology for source separated organic household waste. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Quantitative assessment of distance to collection point and improved sorting information on source separation of household waste.

    Science.gov (United States)

    Rousta, Kamran; Bolton, Kim; Lundin, Magnus; Dahlén, Lisa

    2015-06-01

    The present study measures the participation of households in a source separation scheme and, in particular, whether the households' application of the scheme improved after two interventions: (a) a shorter distance to the drop-off point and (b) easy access to correct sorting information. The effect of these interventions was quantified and, as far as possible, isolated from other factors that can influence recycling behaviour. The study was based on households located in an urban residential area in Sweden, where waste composition studies were performed before and after the interventions by manual sorting (pick analysis). Statistical analyses of the results indicated a significant decrease (28%) of packaging and newsprint in the residual waste after establishing a property-close collection system (intervention (a)), as well as a significant decrease (70%) of the mis-sorted fraction in bags intended for food waste after new information stickers were introduced (intervention (b)). Providing a property-close collection system to collect more waste fractions, as well as finding new communication channels for information about sorting, can be used as tools to increase the source separation ratio. This contribution also highlights the need to evaluate the effects of different types of information and communication concerning sorting instructions in a property-close collection system. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Joint source separation of simultaneous EEG-fMRI recording in two experimental conditions using common spatial patterns.

    Science.gov (United States)

    Tan, Ao; Fu, Zening; Tu, Yiheng; Hung, Yeung Sam; Zhang, Zhiguo

    2015-01-01

    Simultaneous collection of electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) data has become increasingly popular in neuroscientific studies, because it can provide neural information with both high spatial and temporal resolution. In order to maximally utilize the information contained in simultaneous EEG-fMRI recording, many sophisticated multimodal data-mining methods, such as joint ICA, have been developed. However, these methods normally deal with data recorded in one experimental condition, and they cannot effectively extract information on activities that are distinct in two conditions. In this paper, a new data decomposition method called joint common spatial pattern (jCSP) is proposed. Compared with previous methods, the jCSP method exploits inter-conditional difference in the strength of brain source activities to achieve source separation, and is able to uncover the source activities with the strongest discriminative power. A group analysis based on clustering is further proposed to reveal distinctive jCSP patterns at group level. We applied joint CSP to a simultaneous EEG-fMRI dataset collected from 21 subjects under two different resting-state conditions (eyes-closed and eyes-open). Results show a distinct dynamic pattern shared by EEG alpha power and fMRI signal during eyes-open resting-state.
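    The standard single-modality CSP computation that jCSP builds on can be sketched as a generalized eigenvalue problem between the two conditions' covariance matrices. The data below are synthetic and the code illustrates plain CSP, not the paper's joint EEG-fMRI extension:

```python
import numpy as np

def csp(X1, X2):
    """Common spatial patterns: generalized eigenvectors of (C1, C1 + C2).
    X1, X2: (channels, samples) arrays from the two conditions."""
    C1 = X1 @ X1.T / X1.shape[1]
    C2 = X2 @ X2.T / X2.shape[1]
    d, U = np.linalg.eigh(C1 + C2)     # whiten the composite covariance
    P = U / np.sqrt(d)
    lam, B = np.linalg.eigh(P.T @ C1 @ P)
    W = P @ B                          # spatial filters (columns)
    return W[:, ::-1], lam[::-1]       # descending: first filter favors condition 1

rng = np.random.default_rng(2)
X1 = rng.standard_normal((3, 2000)); X1[0] *= 3.0   # condition 1: channel 0 strong
X2 = rng.standard_normal((3, 2000)); X2[2] *= 3.0   # condition 2: channel 2 strong
W, lam = csp(X1, X2)
```

    The eigenvalues lam lie in [0, 1] and measure how much of a filtered signal's variance comes from condition 1, which is exactly the inter-conditional power difference the jCSP method exploits for source separation.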

  9. Bayesian nonparametric hierarchical modeling.

    Science.gov (United States)

    Dunson, David B

    2009-04-01

    In biomedical research, hierarchical models are very widely used to accommodate dependence in multivariate and longitudinal data and for borrowing of information across data from different sources. A primary concern in hierarchical modeling is sensitivity to parametric assumptions, such as linearity and normality of the random effects. Parametric assumptions on latent variable distributions can be challenging to check and are typically unwarranted, given available prior knowledge. This article reviews some recent developments in Bayesian nonparametric methods motivated by complex, multivariate and functional data collected in biomedical studies. The author provides a brief review of flexible parametric approaches relying on finite mixtures and latent class modeling. Dirichlet process mixture models are motivated by the need to generalize these approaches to avoid assuming a fixed finite number of classes. Focusing on an epidemiology application, the author illustrates the practical utility and potential of nonparametric Bayes methods.
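    The Dirichlet process mixtures reviewed here avoid fixing the number of classes in advance; their mixture weights admit a simple stick-breaking construction, sketched below with arbitrary concentration parameters chosen purely for illustration:

```python
import numpy as np

def stick_breaking(alpha, rng, tol=1e-8):
    """Draw DP mixture weights: repeatedly break off a Beta(1, alpha)
    fraction of the remaining stick until (numerically) nothing is left."""
    weights, remaining = [], 1.0
    while remaining > tol:
        frac = rng.beta(1.0, alpha)
        weights.append(remaining * frac)
        remaining *= 1.0 - frac
    return np.array(weights)

rng = np.random.default_rng(3)
w_small = stick_breaking(1.0, rng)    # small alpha: mass concentrated on few components
w_large = stick_breaking(20.0, rng)   # large alpha: mass spread over many components
```

    The concentration parameter thus controls the effective number of classes, letting the data determine how many mixture components are actually used.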

  10. Global warming potential of material fractions occurring in source-separated organic household waste treated by anaerobic digestion or incineration under different framework conditions

    DEFF Research Database (Denmark)

    Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte

    2016-01-01

    This study compared the environmental profiles of anaerobic digestion (AD) and incineration, in relation to global warming potential (GWP), for treating individual material fractions that may occur in source-separated organic household waste (SSOHW). Different framework conditions representative...

  11. Evaluation of a new pulping technology for pre-treating source-separated organic household waste prior to anaerobic digestion

    DEFF Research Database (Denmark)

    Naroznova, Irina; Møller, Jacob; Larsen, Bjarne

    2016-01-01

    A new technology for pre-treating source-separated organic household waste prior to anaerobic digestion was assessed, and its performance was compared to existing alternative pre-treatment technologies. This pre-treatment technology is based on waste pulping with water, using a specially developed...... screw mechanism. The pre-treatment technology rejects more than 95% (wet weight) of non-biodegradable impurities in waste collected from households and generates biopulp ready for anaerobic digestion. Overall, 84-99% of biodegradable material (on a dry weight basis) in the waste was recovered......-pulping technology showed higher biodegradable material recovery, lower electricity consumption and comparable water consumption. The higher material recovery achieved with the technology was associated with greater transfer of nutrients (N and P), carbon (total and biogenic) but also heavy metals (except Pb...

  12. Source separation for materials recovery guidelines. Part 246. [High-grade paper, corrugated container, and residential materials recovery

    Energy Technology Data Exchange (ETDEWEB)

    1976-04-23

    These guidelines are issued under the Authority of Section 209(a) of the Solid Waste Disposal Act of 1965 (PL 89-272), as amended by the Resource Recovery Act of 1970 (PL 91-512). Chapter I of Title 40 of the Code of Federal Regulations is amended effective May 24, 1976 by adding a new Part 246. The guidelines are applicable to the source separation of residential, commercial, and institutional solid wastes. Explicitly excluded are mining, agricultural, and industrial solid wastes; hazardous wastes; sludges; construction and demolition wastes; infectious wastes; classified waste. Specific requirements and recommended procedures for high-grade paper recovery, residential materials recovery, and corrugated container recovery are included in Part 246. (MCW)

  13. Separate norovirus outbreaks linked to one source of imported frozen raspberries by molecular analysis, Denmark, 2010–2011

    DEFF Research Database (Denmark)

    Müller, L.; Schultz, Anna Charlotte; Fonager, J.

    2015-01-01

    Norovirus outbreaks occur frequently in Denmark and it can be difficult to establish whether apparently independent outbreaks have the same origin. Here we report on six outbreaks linked to frozen raspberries, investigated separately over a period of 3 months. Norovirus from stools were sequence...... capsid P2 region. In one outbreak at a hospital canteen, frozen raspberries were associated with illness by cohort investigation (relative risk 6·1, 95% confidence interval 3·2–11). Bags of raspberries suspected to be the source were positive for genogroup I and II noroviruses, one typable virus...... was genotype GI.6 (capsid). These molecular investigations showed that the apparently independent outbreaks were the result of one contamination event of frozen raspberries. The contaminated raspberries originated from a single producer in Serbia and were originally not considered to belong to the same batch...

  14. LCA of the Collection, Transportation, Treatment and Disposal of Source Separated Municipal Waste: A Southern Italy Case Study

    Directory of Open Access Journals (Sweden)

    Giovanni De Feo

    2016-10-01

    Full Text Available This study performed a Life Cycle Assessment of the collection, transport, treatment and disposal of source-separated municipal waste (MW) in Baronissi, a town of 17,000 inhabitants in the Campania region of Italy. Baronissi is a high-performing town in a region with a scarcity of MW facilities. The environmental impacts were assessed with three different methods (IPCC 2007, Ecological Footprint and ReCiPe 2008) in order to evaluate how the choice of method influences the results, with particular attention to global warming, since it is one of the major environmental concerns of the public. The results showed how fundamental the presence of facilities in the area is: their absence means high environmental loads due to the transportation of materials over long distances, particularly for the organic fraction. The presence of a composting plant 10 km from the municipality would result in a 65% decrease of the impacts due to external transport, regardless of the evaluation method. The results obtained with ReCiPe 2008 and Ecological Footprint agreed, while those obtained with IPCC 2007 were very different, since global warming is strongly affected by the transport phase. IPCC 2007 does not take into account the advantages obtainable with a good level of separate collection. Considering a single impact evaluation method, there is a high risk of coming to misleading conclusions.

  15. Sequential combination of multi-source satellite observations for separation of surface deformation associated with serial seismic events

    Science.gov (United States)

    Chen, Qiang; Xu, Qian; Zhang, Yijun; Yang, Yinghui; Yong, Qi; Liu, Guoxiang; Liu, Xianwen

    2018-03-01

    A single satellite geodetic technique has weaknesses for mapping the sequence of ground deformation associated with serial seismic events; for example, InSAR with a long revisit period readily yields mixed, complex deformation signals from multiple events. This challenges the capability of any single satellite geodetic technique to accurately recognize individual surface deformation fields and earthquake models. The rapidly increasing availability of various satellite observations provides a good solution for overcoming this issue. In this study, we explore a sequential combination of multiple overlapping datasets from ALOS/PALSAR, ENVISAT/ASAR and GPS observations to separate the surface deformation associated with the 2011 Mw 9.0 Tohoku-Oki main shock and two strong aftershocks, the Mw 6.6 Iwaki and Mw 5.8 Ibaraki events. We first estimate the fault slip model of the main shock with ASAR interferometry and GPS displacements as constraints. Because the PALSAR interferogram spans the period of all the events, we then remove the surface deformation of the main shock through a forward-calculated prediction, thus obtaining the PALSAR InSAR deformation associated with the two strong aftershocks. The inversion for the source parameters of the Iwaki aftershock is conducted using the refined PALSAR deformation, considering that the higher-magnitude Iwaki quake contributes more deformation than the Ibaraki event. After removing the deformation component of the Iwaki event, we determine the fault slip distribution of the Ibaraki shock using the remaining PALSAR InSAR deformation. Finally, the complete source models for the serial seismic events are clearly identified from the sequential combination of multi-source satellite observations, which suggest that the main quake is a predominantly mega-thrust rupture, whereas the two aftershocks are normal faulting motions. The estimated seismic moment magnitudes for the Tohoku-Oki, Iwaki and Ibaraki events are Mw 9.0, Mw 6.85 and Mw 6.11, respectively.

  16. Adaptive Langevin sampler for separation of t-distribution modelled astrophysical maps.

    Science.gov (United States)

    Kayabol, Koray; Kuruoglu, Ercan E; Sanz, José Luis; Sankur, Bülent; Salerno, Emanuele; Herranz, Diego

    2010-09-01

    We propose to model the image differentials of astrophysical source maps by Student's t-distribution and to use them in the Bayesian source separation method as priors. We introduce an efficient Markov Chain Monte Carlo (MCMC) sampling scheme to unmix the astrophysical sources and describe the derivation details. In this scheme, we use the Langevin stochastic equation for transitions, which enables parallel drawing of random samples from the posterior, and reduces the computation time significantly (by two orders of magnitude). In addition, Student's t-distribution parameters are updated throughout the iterations. The results on astrophysical source separation are assessed with two performance criteria defined in the pixel and the frequency domains.
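    The Langevin transition used in such samplers has the generic form x ← x + (ε/2)∇log p(x) + √ε ξ with Gaussian ξ. A minimal unadjusted-Langevin sketch on a toy Gaussian target (step size and target invented for the demo; the paper's actual scheme targets the source-separation posterior with t-distribution priors):

```python
import numpy as np

def grad_log_p(x):
    """Score of the target density; here a standard 2-D Gaussian toy target."""
    return -x

rng = np.random.default_rng(4)
eps = 0.05                      # Langevin step size (invented for the demo)
x = np.zeros(2)
samples = []
for t in range(20000):
    # Langevin transition: drift along the score plus Gaussian exploration noise.
    x = x + 0.5 * eps * grad_log_p(x) + np.sqrt(eps) * rng.standard_normal(2)
    if t >= 2000:               # discard burn-in
        samples.append(x)
samples = np.array(samples)
```

    Because the drift term points every coordinate toward higher density simultaneously, updates of this form can be drawn in parallel across pixels, which is the source of the speedup the paper reports over conventional MCMC transitions.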

  17. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  18. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  19. Bayesian analysis for the social sciences

    CERN Document Server

    Jackman, Simon

    2009-01-01

    Bayesian methods are increasingly being used in the social sciences, as the problems encountered lend themselves so naturally to the subjective qualities of Bayesian methodology. This book provides an accessible introduction to Bayesian methods, tailored specifically for social science students. It contains lots of real examples from political science, psychology, sociology, and economics, exercises in all chapters, and detailed descriptions of all the key concepts, without assuming any background in statistics beyond a first course. It features examples of how to implement the methods using WinBUGS (the most widely used Bayesian analysis software in the world) and R (an open-source statistical language). The book is supported by a website featuring WinBUGS and R code, and data sets.

  20. Bayesian analyses of cognitive architecture.

    Science.gov (United States)

    Houpt, Joseph W; Heathcote, Andrew; Eidels, Ami

    2017-06-01

    The question of cognitive architecture (how cognitive processes are temporally organized) has arisen in many areas of psychology. This question has proved difficult to answer, with many proposed solutions turning out to be spurious. Systems factorial technology (Townsend & Nozawa, 1995) provided the first rigorous empirical and analytical method of identifying cognitive architecture, using the survivor interaction contrast (SIC) to determine when people are using multiple sources of information in parallel or in series. Although the SIC is based on rigorous nonparametric mathematical modeling of response time distributions, for many years inference about cognitive architecture has relied solely on visual assessment. Houpt and Townsend (2012) recently introduced null hypothesis significance tests, and here we develop both parametric and nonparametric (encompassing prior) Bayesian inference. We show that the Bayesian approaches can have considerable advantages. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
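    The survivor interaction contrast named above is SIC(t) = S_LL(t) − S_LH(t) − S_HL(t) + S_HH(t), where L/H denote low/high salience on each of two processing channels. A simulation sketch (exponential channel completion times invented for illustration) reproduces the signature prediction that a parallel first-terminating model yields SIC(t) ≥ 0 for all t:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100000
grid = np.linspace(0.0, 5.0, 100)
rate = {"L": 1.0, "H": 2.0}     # low salience = slow channel, high = fast channel

def survivor(samples, t_grid):
    """Empirical survivor function S(t) = P(T > t) on a time grid."""
    return (samples[None, :] > t_grid[:, None]).mean(axis=1)

S = {}
for a in "LH":
    for b in "LH":
        # Parallel first-terminating (OR) model: respond with the faster channel.
        rt = np.minimum(rng.exponential(1.0 / rate[a], n),
                        rng.exponential(1.0 / rate[b], n))
        S[a + b] = survivor(rt, grid)

sic = S["LL"] - S["LH"] - S["HL"] + S["HH"]   # nonnegative for parallel-OR
```

    For this parallel-OR model the survivor functions factor across channels, so SIC(t) = (S_L(t) − S_H(t))² is nonnegative by construction; serial and exhaustive architectures produce different SIC signatures, which is what makes the contrast diagnostic.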

  1. Life cycle and economic assessment of source-separated MSW collection with regard to greenhouse gas emissions: a case study in China.

    Science.gov (United States)

    Dong, Jun; Ni, Mingjiang; Chi, Yong; Zou, Daoan; Fu, Chao

    2013-08-01

    In China, the continuously increasing amount of municipal solid waste (MSW) has resulted in an urgent need for changing the current municipal solid waste management (MSWM) system based on mixed collection. A pilot program focusing on source-separated MSW collection was thus launched (2010) in Hangzhou, China, to lessen the related environmental loads. And greenhouse gas (GHG) emissions (Kyoto Protocol) are singled out in particular. This paper uses life cycle assessment modeling to evaluate the potential environmental improvement with regard to GHG emissions. The pre-existing MSWM system is assessed as baseline, while the source separation scenario is compared internally. Results show that 23 % GHG emissions can be decreased by source-separated collection compared with the base scenario. In addition, the use of composting and anaerobic digestion (AD) is suggested for further optimizing the management of food waste. 260.79, 82.21, and -86.21 thousand tonnes of GHG emissions are emitted from food waste landfill, composting, and AD, respectively, proving the emission reduction potential brought by advanced food waste treatment technologies. Realizing the fact, a modified MSWM system is proposed by taking AD as food waste substitution option, with additional 44 % GHG emissions saved than current source separation scenario. Moreover, a preliminary economic assessment is implemented. It is demonstrated that both source separation scenarios have a good cost reduction potential than mixed collection, with the proposed new system the most cost-effective one.

  2. Isomer separation of $^{70g}Cu$ and $^{70m}Cu$ with a resonance ionization laser ion source

    CERN Document Server

    Köster, U; Mishin, V I; Weissman, L; Huyse, M; Kruglov, K; Müller, W F; Van Duppen, P; Van Roosbroeck, J; Thirolf, P G; Thomas, H C; Weisshaar, D W; Schulze, W; Borcea, R; La Commara, M; Schatz, H; Schmidt, K; Röttger, S; Huber, G; Sebastian, V; Kratz, K L; Catherall, R; Georg, U; Lettry, Jacques; Oinonen, M; Ravn, H L; Simon, H

    2000-01-01

    Radioactive copper isotopes were ionized with the resonance ionization laser ion source at the on-line isotope separator ISOLDE (CERN). Using the different hyperfine structure in the 3d/sup 10/ 4s /sup 2/S/sub 1/2/-3d/sup 10/ 4p /sup 2/P/sub 1/2//sup 0/ transition the low- and high-spin isomers of /sup 70/Cu were selectively enhanced by tuning the laser wavelength. The light was provided by a narrow-bandwidth dye laser pumped by copper vapor lasers and frequency doubled in a BBO crystal. The ground state to isomeric state intensity ratio could be varied by a factor of 30, allowing to assign gamma transitions unambiguously to the decay of the individual isomers. It is shown that the method can also be used to determine magnetic moments. In a first experiment for the 1/sup +/ ground state of /sup 70/Cu a magnetic moment of (+)1.8(3) mu /sub N/ and for the high-spin isomer of /sup 70/Cu a magnetic moment of (+or-)1.2(3) mu /sub N/ could be deduced. (20 refs).

  3. A Combined Independent Source Separation and Quality Index Optimization Method for Fetal ECG Extraction from Abdominal Maternal Leads.

    Science.gov (United States)

    Billeci, Lucia; Varanini, Maurizio

    2017-05-16

    The non-invasive fetal electrocardiogram (fECG) technique has recently received considerable interest for monitoring fetal health. The aim of our paper is to propose a novel fECG algorithm based on the combination of the criteria of independent source separation and quality index optimization (ICAQIO-based). The algorithm was compared with two methods applying the two criteria independently (the ICA-based and the QIO-based methods), which were previously developed by our group. All three methods were tested on the recently implemented Fetal ECG Synthetic Database (FECGSYNDB). Moreover, the performance of the algorithm was tested on real data from the PhysioNet fetal ECG Challenge 2013 Database. The proposed combined method outperformed the other two algorithms on the FECGSYNDB (ICAQIO-based: 98.78%, QIO-based: 97.77%, ICA-based: 97.61%). Significant differences were obtained in particular under conditions in which uterine contractions and maternal and fetal ectopic beats occurred. On the real data, all three methods obtained very high performances, with the QIO-based method proving slightly better than the other two (ICAQIO-based: 99.38%, QIO-based: 99.76%, ICA-based: 99.37%). The findings from this study suggest that the proposed method could potentially be applied as a novel algorithm for the accurate extraction of fECG, especially in critical recording conditions.
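    As a generic illustration of the independent-source-separation ingredient, here is a numpy-only symmetric FastICA sketch on invented sine/square sources; this is not the authors' ICAQIO pipeline and uses no real abdominal recordings:

```python
import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Symmetric FastICA with a tanh nonlinearity (numpy-only sketch).
    X: (mixtures, samples); returns the unmixed component estimates."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))        # whiten via the covariance eigenbasis
    Z = (E / np.sqrt(d)).T @ X
    n = Z.shape[0]
    W = rng.standard_normal((n, n))
    for _ in range(n_iter):
        G = np.tanh(W @ Z)
        # Fixed-point update: E[z g(w'z)] - E[g'(w'z)] w for every row of W.
        W_new = G @ Z.T / Z.shape[1] - np.diag((1.0 - G**2).mean(axis=1)) @ W
        U, _, Vt = np.linalg.svd(W_new)     # symmetric decorrelation of W
        W = U @ Vt
    return W @ Z

# Invented demo sources: a sine and a square wave, linearly mixed.
t = np.linspace(0, 8, 4000)
S = np.vstack([np.sin(2 * np.pi * t), np.sign(np.sin(3 * np.pi * t))])
A = np.array([[1.0, 0.6], [0.4, 1.0]])
Y = fast_ica(A @ S)
```

    ICA recovers the sources only up to permutation and sign, which is why pipelines like the one above still need a quality index to pick out which recovered component is the fetal channel.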

  4. Determining the Source of Water Vapor in a Cerium Oxide Electrochemical Oxygen Separator to Achieve Aviator Grade Oxygen

    Science.gov (United States)

    Graf, John; Taylor, Dale; Martinez, James

    2014-01-01

    Combined with a mechanical compressor, a Solid Electrolyte Oxygen Separator (SEOS) should be capable of producing ABO grade oxygen at pressures >2400 psia on the space station. Feasibility tests using a SEOS integrated with a mechanical compressor identified an unexpected contaminant in the oxygen: water vapour was found in the oxygen product, sometimes at concentrations higher than 40 ppm (the ABO limit for water vapour is 7 ppm). If solid electrolyte membranes are really "infinitely selective" to oxygen, as they are reported to be, where did the water come from? If water is getting into the oxygen, what other contaminants might get into the oxygen? Microscopic analyses of wafers, welds, and oxygen delivery tubes were performed in an attempt to find the source of the water vapour contamination. Hot and cold pressure decay tests were performed. Measurements of water vapour as a function of O2 delivery rate, O2 delivery pressure, and process air humidity levels were the most instructive in finding the source of water contamination (Fig 3). Water contamination was directly affected by the oxygen delivery rate (doubling the oxygen production rate cut the water level in half). Water was affected by process air humidity levels and delivery pressure in a way that indicates the water was diffusing into the oxygen delivery system.

  5. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...

  6. The Bayesian Score Statistic

    NARCIS (Netherlands)

    Kleibergen, F.R.; Kleijn, R.; Paap, R.

    2000-01-01

    We propose a novel Bayesian test under a (noninformative) Jeffreys' prior specification. We check whether the fixed scalar value of the so-called Bayesian Score Statistic (BSS) under the null hypothesis is a plausible realization from its known and standardized distribution under the alternative. Unlike

  7. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  8. Growth and yield of tomato cultivated on composted duck excreta enriched wood shavings and source-separated municipal solid waste

    Directory of Open Access Journals (Sweden)

    Vincent Zoes

    2011-02-01

    A greenhouse experiment was conducted to evaluate the use of growth substrates, made with duck excreta enriched wood shaving compost (DMC) and the organic fraction of source-separated municipal solid waste (MSW) compost, on the growth and yield of tomato (Lycopersicum esculentum Mill. cv. Campbell 1327). Substrate A consisted of a 3:2 (W/W) proportion of DMC and MSW composts. Substrates B and C were the same as A but contained a 15% (W/W) ratio of brick dust and shredded plastic, respectively. Three control substrates consisted of a commercially available peat-based substrate (Pr), an in-house sphagnum peat-based substrate (Gs), and black earth mixed with sandy loam soil (BE/S) in a 1:4 (W/W) ratio. The substrates (A, B, C) and controls received nitrogen (N), phosphate (P) and potassium (K) at equivalent rates of 780 mg/pot, 625 mg/pot, and 625 mg/pot, respectively, or were used without mineral fertilizers. Compared to the controls (Pr, Gs and BE/S), tomato plants grown on A, B, and C produced a greater total number and dry mass of fruits, with no significant differences between them. On average, total plant dry-matter biomass in substrates A, B, and C was 19% lower than that produced on Pr, but 28% greater than the biomass obtained for plants grown on Gs and BE/S. Plant height, stem diameter and chlorophyll concentrations indicate that substrates A, B, and C were particularly suitable for plant growth. Although the presence of excess N in the composted substrates favoured vegetative rather than reproductive growth, the continuous supply of nutrients throughout the growing cycle, as well as the high water retention capacity that allowed watering to be reduced by 50%, suggest that substrates A, B, and C were suitable growing mixes, offering environmental and agronomic advantages.

  9. Emissions of volatile organic compounds (VOCs) from concentrated animal feeding operations (CAFOs): chemical compositions and separation of sources

    Science.gov (United States)

    Yuan, Bin; Coggon, Matthew M.; Koss, Abigail R.; Warneke, Carsten; Eilerman, Scott; Peischl, Jeff; Aikin, Kenneth C.; Ryerson, Thomas B.; de Gouw, Joost A.

    2017-04-01

    Concentrated animal feeding operations (CAFOs) emit a large number of volatile organic compounds (VOCs) to the atmosphere. In this study, we conducted mobile laboratory measurements of VOCs, methane (CH4) and ammonia (NH3) downwind of dairy cattle, beef cattle, sheep and chicken CAFO facilities in northeastern Colorado using a hydronium ion time-of-flight chemical-ionization mass spectrometer (H3O+ ToF-CIMS), which can detect numerous VOCs. Regional measurements of CAFO emissions in northeastern Colorado were also performed using the NOAA WP-3D aircraft during the Shale Oil and Natural Gas Nexus (SONGNEX) campaign. Alcohols and carboxylic acids dominate VOC concentrations and the reactivity of the VOCs with hydroxyl (OH) radicals. Sulfur-containing and phenolic species provide the largest contributions to the odor activity values and the nitrate radical (NO3) reactivity of VOC emissions, respectively. VOC compositions determined from mobile laboratory and aircraft measurements generally agree well with each other. The high time-resolution mobile measurements allow for the separation of the sources of VOCs from different parts of the operations occurring within the facilities. We show that the emissions of ethanol are primarily associated with feed storage and handling. Based on mobile laboratory measurements, we apply a multivariate regression analysis using NH3 and ethanol as tracers to determine the relative importance of animal-related emissions (animal exhalation and waste) and feed-related emissions (feed storage and handling) for different VOC species. Feed storage and handling contribute significantly to emissions of alcohols, carbonyls, carboxylic acids and sulfur-containing species. Emissions of phenolic species and nitrogen-containing species are predominantly associated with animals and their waste.
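
    The tracer regression described in this record can be sketched compactly: regress a VOC's time series onto NH3 (the animal-related tracer) and ethanol (the feed-related tracer) and compare the two fitted contributions. The data below are synthetic stand-ins, not SONGNEX measurements; the roughly 30%/70% animal/feed split is built into the toy signal.

```python
# Sketch of tracer-based source attribution: least-squares regression of a
# VOC mixing ratio onto NH3 (animal tracer) and ethanol (feed tracer).
import numpy as np

rng = np.random.default_rng(0)
n = 200
nh3 = rng.uniform(10, 100, n)        # ppb, animal-related tracer
ethanol = rng.uniform(5, 50, n)      # ppb, feed-related tracer
# Synthetic VOC built as 0.03*NH3 + 0.14*ethanol plus measurement noise
voc = 0.03 * nh3 + 0.14 * ethanol + rng.normal(0, 0.1, n)

A = np.column_stack([nh3, ethanol])
coef, *_ = np.linalg.lstsq(A, voc, rcond=None)

animal_part = coef[0] * nh3.mean()   # mean animal-related contribution
feed_part = coef[1] * ethanol.mean() # mean feed-related contribution
feed_share = feed_part / (animal_part + feed_part)
print(f"feed-related share of this VOC: {feed_share:.2f}")
```

The same two-tracer fit, applied species by species, is how the relative importance of animal-related and feed-related emissions can be quantified.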

  10. Characterisation of the biochemical methane potential (BMP) of individual material fractions in Danish source-separated organic household waste.

    Science.gov (United States)

    Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte

    2016-04-01

    This study is dedicated to characterising the chemical composition and biochemical methane potential (BMP) of individual material fractions in untreated Danish source-separated organic household waste (SSOHW). First, data on SSOHW in different countries, available in the literature, were evaluated and then, secondly, laboratory analyses for eight organic material fractions comprising Danish SSOHW were conducted. No data were found in the literature that fully covered the objectives of the present study. Based on laboratory analyses, all fractions were assigned according to their specific properties in relation to BMP, protein content, lipids, lignocellulose biofibres and easily degradable carbohydrates (carbohydrates other than lignocellulose biofibres). The three components in lignocellulose biofibres, i.e. lignin, cellulose and hemicellulose, were differentiated, and theoretical BMP (TBMP) and material degradability (BMP from laboratory incubation tests divided by TBMP) were expressed. Moreover, the degradability of lignocellulose biofibres (the share of volatile lignocellulose biofibre solids degraded in laboratory incubation tests) was calculated. Finally, BMP for average SSOHW composition in Denmark (untreated) was calculated, and the BMP contribution of the individual material fractions was then evaluated. Material fractions of the two general waste types, defined as "food waste" and "fibre-rich waste," were found to be anaerobically degradable with considerable BMP. Material degradability of material fractions such as vegetation waste, moulded fibres, animal straw, dirty paper and dirty cardboard, however, was constrained by lignin content. BMP for overall SSOHW (untreated) was 404 mL CH4 per g VS, which might increase if the relative content of material fractions, such as animal and vegetable food waste, kitchen tissue and dirty paper in the waste, becomes larger. Copyright © 2016 Elsevier Ltd. All rights reserved.
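
    The overall BMP reported here (404 mL CH4 per g VS) is, in effect, a volatile-solids-weighted average of the individual fractions' BMPs. A minimal sketch of that bookkeeping, with illustrative shares and BMP values rather than the study's measured data:

```python
fractions = {
    # material fraction: (share of total volatile solids, BMP in mL CH4/g VS)
    # Illustrative placeholder numbers, NOT the study's measured values.
    "vegetable food waste": (0.35, 420),
    "animal food waste":    (0.20, 530),
    "kitchen tissue":       (0.15, 380),
    "dirty paper":          (0.20, 330),
    "vegetation waste":     (0.10, 250),
}

def overall_bmp(fractions):
    """VS-weighted mean BMP of the waste mix (mL CH4 per g VS)."""
    total_share = sum(share for share, _ in fractions.values())
    assert abs(total_share - 1.0) < 1e-9, "VS shares must sum to 1"
    return sum(share * bmp for share, bmp in fractions.values())

print(f"overall BMP = {overall_bmp(fractions):.0f} mL CH4 per g VS")
```

Increasing the shares of high-BMP fractions (food waste, kitchen tissue, dirty paper) raises the weighted sum, which is the mechanism behind the study's closing observation.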

  11. System approach to robust acoustic echo cancellation through semi-blind source separation based on independent component analysis

    Science.gov (United States)

    Wada, Ted S.

    In this dissertation, we build a foundation for what we refer to as the system approach to signal enhancement as we focus on the acoustic echo cancellation (AEC) problem. Such a “system” perspective aims for the integration of individual components, or algorithms, into a cohesive unit for the benefit of the system as a whole to cope with real-world enhancement problems. The standard system identification approach by minimizing the mean square error (MSE) of a linear system is sensitive to distortions that greatly affect the quality of the identification result. Therefore, we begin by examining in detail the technique of using a noise-suppressing nonlinearity in the adaptive filter error feedback-loop of the LMS algorithm when there is an interference at the near end, where the source of distortion may be linear or nonlinear. We provide a thorough derivation and analysis of the error recovery nonlinearity (ERN) that “enhances” the filter estimation error prior to the adaptation to transform the corrupted error’s distribution into a desired one, or very close to it, in order to assist the linear adaptation process. We reveal important connections of the residual echo enhancement (REE) technique to other existing AEC and signal enhancement procedures, where the technique is well-founded in the information-theoretic sense and has strong ties to independent component analysis (ICA), which is the basis for blind source separation (BSS) that permits unsupervised adaptation in the presence of multiple interfering signals. Notably, the single-channel AEC problem can be viewed as a special case of semi-blind source separation (SBSS) where one of the source signals is partially known, i.e., the far-end microphone signal that generates the near-end acoustic echo. Indeed, SBSS optimized via ICA leads to the system combination of the LMS algorithm with the ERN that allows continuous and stable adaptation even during double talk. Next, we extend the system perspective

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE — BAYSAVER TECHNOLOGIES, INC. BAYSAVER SEPARATION SYSTEM, MODEL 10K

    Science.gov (United States)

    Verification testing of the BaySaver Separation System, Model 10K was conducted on a 10 acre drainage basin near downtown Griffin, Georgia. The system consists of two water tight pre-cast concrete manholes and a high-density polyethylene BaySaver Separator Unit. The BaySaver Mod...

  13. Bayesian data analysis for newcomers.

    Science.gov (United States)

    Kruschke, John K; Liddell, Torrin M

    2018-02-01

    This article explains the foundational concepts of Bayesian data analysis using virtually no mathematical notation. Bayesian ideas already match your intuitions from everyday reasoning and from traditional data analysis. Simple examples of Bayesian data analysis are presented that illustrate how the information delivered by a Bayesian analysis can be directly interpreted. Bayesian approaches to null-value assessment are discussed. The article clarifies misconceptions about Bayesian methods that newcomers might have acquired elsewhere. We discuss prior distributions and explain how they are not a liability but an important asset. We discuss the relation of Bayesian data analysis to Bayesian models of mind, and we briefly discuss what methodological problems Bayesian data analysis is not meant to solve. After you have read this article, you should have a clear sense of how Bayesian data analysis works and the sort of information it delivers, and why that information is so intuitive and useful for drawing conclusions from data.
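
    The kind of directly interpretable result the article advocates can be shown with the simplest possible example: a conjugate Beta-Binomial update for an unknown proportion. The flat Beta(1, 1) prior and the 14-successes-in-20-trials data are our illustrative choices, not the article's.

```python
def beta_posterior(successes, failures, a=1.0, b=1.0, grid=10001):
    """Posterior mean and central 95% credible interval for a Binomial
    proportion theta under a Beta(a, b) prior, via a simple grid."""
    a_post, b_post = a + successes, b + failures
    thetas = [i / (grid - 1) for i in range(grid)]
    # Unnormalized Beta(a_post, b_post) density on the grid
    dens = [t ** (a_post - 1) * (1 - t) ** (b_post - 1) for t in thetas]
    z = sum(dens)
    probs = [d / z for d in dens]
    mean = sum(t * p for t, p in zip(thetas, probs))
    # Central 95% credible interval from the cumulative distribution
    cum, lo, hi = 0.0, None, None
    for t, p in zip(thetas, probs):
        cum += p
        if lo is None and cum >= 0.025:
            lo = t
        if hi is None and cum >= 0.975:
            hi = t
    return mean, (lo, hi)

mean, interval = beta_posterior(successes=14, failures=6)
print(f"posterior mean {mean:.3f}, 95% credible interval "
      f"({interval[0]:.3f}, {interval[1]:.3f})")
```

The output is exactly the sort of statement the article highlights: a probability distribution over the quantity of interest, read off directly as a mean and a credible interval.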

  14. An application of the theory of planned behaviour to study the influencing factors of participation in source separation of food waste

    International Nuclear Information System (INIS)

    Karim Ghani, Wan Azlina Wan Ab.; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni

    2013-01-01

    Highlights: ► A theory of planned behaviour (TPB) study was conducted to identify the factors influencing participation in source separation of food waste, using self-administered questionnaires. ► The findings suggest several implications for the development and implementation of a waste-separation-at-home programme. ► The analysis indicates that attitude towards waste separation is the main predictor, which in turn could be a significant predictor of the respondent's actual food waste separation behaviour. ► To date, no similar study has been reported elsewhere, and this finding will be beneficial to local authorities as an indicator when designing campaigns that promote waste separation programmes and reinforce positive attitudes. - Abstract: Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separating food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill, and the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive public attitudes and high participation rates in the scheme. Thus, a social survey (using questionnaires) to analyse the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff at Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and

  15. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Contents: Approaches for statistical inference (Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models); The Bayes approach (Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods); Bayesian computation (Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods); Model criticism and selection (Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors

  16. Co-digestion and model simulations of source separated municipal organic waste with cattle manure under batch and continuously stirred tank reactors

    DEFF Research Database (Denmark)

    Tsapekos, Panagiotis; Kougias, Panagiotis; Kuthiala, Sidhant

    2018-01-01

    This study investigates the co-digestion of source separated municipal organic waste (SSMOW), pretreated using a biopulper, and cattle manure both in batch and continuous stirred tank reactors. The optimum co-digestion feeding mixture consisted of 90% SSMOW and 10% cattle manure on organic...

  17. An application of the theory of planned behaviour to study the influencing factors of participation in source separation of food waste.

    Science.gov (United States)

    Karim Ghani, Wan Azlina Wan Ab; Rusli, Iffah Farizan; Biak, Dayang Radiah Awang; Idris, Azni

    2013-05-01

    Tremendous increases in biodegradable (food) waste generation significantly impact the local authorities, who are responsible for managing, treating and disposing of this waste. Separating food waste at its generation source is identified as an effective means of reducing the amount of food waste sent to landfill, and the separated waste can be reused as feedstock for downstream treatment processes, namely composting or anaerobic digestion. However, these efforts will only succeed with positive public attitudes and high participation rates in the scheme. Thus, a social survey (using questionnaires) to analyse the public's views and the factors influencing participation in source separation of food waste in households, based on the theory of planned behaviour (TPB), was performed in June and July 2011 among selected staff at Universiti Putra Malaysia, Serdang, Selangor. The survey demonstrates that the public has positive intentions to participate provided that the opportunities, facilities and knowledge on waste separation at source are adequately prepared by the respective local authorities. Furthermore, good moral values and situational factors such as storage convenience and collection times also encourage the public's involvement and, consequently, the participation rate. The findings from this study may provide a useful indicator to the waste management authorities in Malaysia in identifying mechanisms for the future development and implementation of food waste source separation activities in household programmes, and for communication campaigns which advocate the use of these programmes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Statistics: a Bayesian perspective

    National Research Council Canada - National Science Library

    Berry, Donald A

    1996-01-01

    ...: it is the only introductory textbook based on Bayesian ideas, it combines concepts and methods, it presents statistics as a means of integrating data into the scientific process, it develops ideas...

  19. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  20. Variational Bayesian Filtering

    Czech Academy of Sciences Publication Activity Database

    Šmídl, Václav; Quinn, A.

    2008-01-01

    Roč. 56, č. 10 (2008), s. 5020-5030 ISSN 1053-587X R&D Projects: GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian filtering * particle filtering * Variational Bayes Subject RIV: BC - Control Systems Theory Impact factor: 2.335, year: 2008 http://library.utia.cas.cz/separaty/2008/AS/smidl-variational bayesian filtering.pdf

  1. Bayesian Networks An Introduction

    CERN Document Server

    Koski, Timo

    2009-01-01

    Bayesian Networks: An Introduction provides a self-contained introduction to the theory and applications of Bayesian networks, a topic of interest and importance for statisticians, computer scientists and those involved in modelling complex data sets. The material has been extensively tested in classroom teaching and assumes a basic knowledge of probability, statistics and mathematics. All notions are carefully explained and feature exercises throughout. Features include:.: An introduction to Dirichlet Distribution, Exponential Families and their applications.; A detailed description of learni

  2. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
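
    The MCMC machinery referred to above can be illustrated at toy scale: a random-walk Metropolis sampler inferring a scalar source strength from noisy data under an invented linear forward model G(s) = 2s. This sketch omits the surrogate-posterior acceleration that is the paper's contribution.

```python
import math
import random

random.seed(1)

def log_posterior(s, data, sigma=0.5):
    """Log posterior: Gaussian likelihood around G(s) = 2*s plus a wide
    zero-mean Gaussian prior on the source strength s."""
    log_prior = -0.5 * (s / 10.0) ** 2
    log_lik = sum(-0.5 * ((y - 2.0 * s) / sigma) ** 2 for y in data)
    return log_prior + log_lik

true_s = 3.0
data = [2.0 * true_s + random.gauss(0, 0.5) for _ in range(20)]

s, samples = 0.0, []
for _ in range(20000):
    prop = s + random.gauss(0, 0.3)                     # random-walk proposal
    delta = log_posterior(prop, data) - log_posterior(s, data)
    if random.random() < math.exp(min(0.0, delta)):     # Metropolis accept
        s = prop
    samples.append(s)

post = samples[5000:]                                   # discard burn-in
est = sum(post) / len(post)
print(f"posterior mean of s = {est:.2f} (true value {true_s})")
```

Each accept/reject step calls the forward model twice; when the forward model is an expensive PDE solve rather than a one-line function, replacing it with a cheap surrogate is exactly the bottleneck the paper addresses.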

  3. Residents’ Waste Separation Behaviors at the Source: Using SEM with the Theory of Planned Behavior in Guangzhou, China

    Science.gov (United States)

    Zhang, Dongliang; Huang, Guangqing; Yin, Xiaoling; Gong, Qinghua

    2015-01-01

    Understanding the factors that affect residents’ waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors. PMID:26274969

  4. Residents' Waste Separation Behaviors at the Source: Using SEM with the Theory of Planned Behavior in Guangzhou, China.

    Science.gov (United States)

    Zhang, Dongliang; Huang, Guangqing; Yin, Xiaoling; Gong, Qinghua

    2015-08-12

    Understanding the factors that affect residents' waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors.

  5. Residents’ Waste Separation Behaviors at the Source: Using SEM with the Theory of Planned Behavior in Guangzhou, China

    Directory of Open Access Journals (Sweden)

    Dongliang Zhang

    2015-08-01

    Understanding the factors that affect residents’ waste separation behaviors helps in constructing effective environmental campaigns for a community. Using the theory of planned behavior (TPB), this study examines factors associated with waste separation behaviors by analyzing responses to questionnaires distributed in Guangzhou, China. Data drawn from 208 of 1000 field questionnaires were used to assess socio-demographic factors and the TPB constructs (i.e., attitudes, subjective norms, perceived behavioral control, intentions, and situational factors). The questionnaire data revealed that attitudes, subjective norms, perceived behavioral control, intentions, and situational factors significantly predicted household waste behaviors in Guangzhou, China. Through a structural equation modeling analysis, we concluded that campaigns targeting moral obligations may be particularly effective for increasing the participation rate in waste separation behaviors.

  6. A Compound fault diagnosis for rolling bearings method based on blind source separation and ensemble empirical mode decomposition.

    Science.gov (United States)

    Wang, Huaqing; Li, Ruitong; Tang, Gang; Yuan, Hongfang; Zhao, Qingliang; Cao, Xi

    2014-01-01

    A compound fault signal usually contains multiple characteristic signals and strong confounding noise, which makes it difficult to separate the weak fault signals using conventional methods such as FFT-based envelope detection, the wavelet transform, or empirical mode decomposition individually. To improve the diagnosis of compound faults in rolling bearings via signal separation, this paper proposes a new method to identify compound faults from measured mixed signals, based on the ensemble empirical mode decomposition (EEMD) method and the independent component analysis (ICA) technique. With this approach, a vibration signal is first decomposed into intrinsic mode functions (IMFs) by EEMD to obtain multichannel signals. Then, according to a cross-correlation criterion, the corresponding IMFs are selected as the input matrix for ICA. Finally, the compound faults can be separated effectively by executing the ICA method, which makes the fault features easier to extract and more clearly identifiable. Experimental results validate the effectiveness of the proposed method in compound fault separation, which works not only for an outer race defect, but also for roller defects and an unbalance fault of the experimental system.
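
    The ICA stage of the pipeline can be sketched as follows. Two synthetic mixtures stand in for the EEMD-selected IMFs, and a minimal FastICA with a tanh nonlinearity recovers the underlying components. This is a self-contained toy, not the authors' implementation; a real pipeline would first run EEMD (e.g. via the PyEMD package) and select IMFs by cross-correlation.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
s1 = np.sign(np.sin(2 * np.pi * 13 * t))    # impulsive-like square wave
s2 = np.sin(2 * np.pi * 5 * t)              # smooth harmonic component
S = np.vstack([s1, s2])
X = np.array([[0.7, 0.3], [0.4, 0.6]]) @ S  # mixed "measurement" channels

# Center and whiten the mixtures
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(X))
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# Symmetric FastICA iterations with g = tanh
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W = (G @ Z.T) / Z.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W)             # symmetric decorrelation:
    W = u @ vt                              # W <- (W W^T)^(-1/2) W

Y = W @ Z  # estimated components (up to order, sign, and scale)
# Each recovered component should correlate strongly with one true source
corr = np.abs(np.corrcoef(np.vstack([Y, S]))[:2, 2:])
print(np.round(corr.max(axis=1), 2))
```

In the paper's setting the rows of the input matrix are cross-correlation-selected IMFs rather than raw mixtures, but the separation step itself works the same way.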

  7. Characterising the vertical separation of shale-gas source rocks and aquifers across England and Wales (UK)

    Science.gov (United States)

    Loveless, Sian E.; Bloomfield, John P.; Ward, Robert S.; Hart, Alwyn J.; Davey, Ian R.; Lewis, Melinda A.

    2018-03-01

    Shale gas is considered by many to have the potential to provide the UK with greater energy security, economic growth and jobs. However, development of a shale gas industry is highly contentious due to environmental concerns including the risk of groundwater pollution. Evidence suggests that the vertical separation between exploited shale units and aquifers is an important factor in the risk to groundwater from shale gas exploitation. A methodology is presented to assess the vertical separation between different pairs of aquifers and shales that are present across England and Wales. The application of the method is then demonstrated for two of these pairs—the Cretaceous Chalk Group aquifer and the Upper Jurassic Kimmeridge Clay Formation, and the Triassic sandstone aquifer and the Carboniferous Bowland Shale Formation. Challenges in defining what might be considered criteria for `safe separation' between a shale gas formation and an overlying aquifer are discussed, in particular with respect to uncertainties in geological properties, aquifer extents and determination of socially acceptable risk levels. Modelled vertical separations suggest that the risk of aquifer contamination from shale exploration will vary greatly between shale-aquifer pairs and between regions and this will need to be considered carefully as part of the risk assessment and management for any shale gas development.

  8. Development of an Ionization Scheme for Gold using the Selective Laser Ion Source at the On-Line Isotope Separator ISOLDE

    CERN Document Server

    Fedosseev, V; Marsh, B A; CERN. Geneva. AB Department

    2006-01-01

    At the ISOLDE on-line isotope separation facility, the resonance ionization laser ion source (RILIS) can be used to ionize reaction products as they effuse from the target. The RILIS process of laser step-wise resonance ionization of atoms in a hot metal cavity provides a highly element selective stage in the preparation of the radioactive ion beam. As a result, the ISOLDE mass separators can provide beams of a chosen isotope with greatly reduced isobaric contamination. The number of elements available at RILIS has been extended to 26, with the addition of a new three-step ionization scheme for gold. The optimal ionization scheme was determined during an extensive study of the atomic energy levels and auto-ionizing states of gold, carried out by means of in-source resonance ionization spectroscopy. Details of the ionization scheme and a summary of the spectroscopy study are presented.

  9. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  10. The 2004-2006 uplift episode at Campi Flegrei caldera (Italy): Constraints from SBAS-DInSAR ENVISAT data and Bayesian source inference

    Science.gov (United States)

    Trasatti, E.; Casu, F.; Giunchi, C.; Pepe, S.; Solaro, G.; Tagliaventi, S.; Berardino, P.; Manzo, M.; Pepe, A.; Ricciardi, G. P.; Sansosti, E.; Tizzani, P.; Zeni, G.; Lanari, R.

    2008-04-01

    We investigate the 2004-2006 uplift phase of Campi Flegrei caldera (Italy) by exploiting the archive of ascending and descending ENVISAT SAR data acquired from November 2002 to November 2006. The SBAS-DInSAR technique is applied to generate displacement mean velocity maps and time series. An appropriate post-processing step is subsequently applied to map the areas whose temporal deformation behavior is correlated with that of the maximum uplift zone. Our results show that the deformation also extends outside the volcanological limits of the Neapolitan Yellow Tuff caldera, without significant discontinuities. The DInSAR data are inverted by considering a finite spheroid and an isotropic point-source. The inversion results suggest that the new uplift is characterized by a source location similar to the previous small uplift event of 2000 and to the long term subsidence of the 1990's. In particular, the source is located at a depth of about 3.2 km and very close to the city of Pozzuoli (about 800 m offshore, to the SW); the associated volume variation is about 1.1 × 10⁶ m³/year.
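
    For intuition, the isotropic point-source (Mogi) model used in such inversions predicts surface uplift in closed form. The sketch below plugs in the source depth and volume-change rate quoted in the abstract (about 3.2 km and about 1.1e6 m³/year); Poisson's ratio 0.25 is our assumption, and this is an illustration, not the authors' inversion code.

```python
import math

def mogi_uplift(r, depth, dV, nu=0.25):
    """Vertical surface displacement (m) of a Mogi point source, for radial
    distance r (m), source depth (m), and volume change dV (m^3):
    u_z = (1 - nu)/pi * dV * depth / (r^2 + depth^2)^(3/2)."""
    return (1 - nu) / math.pi * dV * depth / (r ** 2 + depth ** 2) ** 1.5

# Peak uplift rate directly above the source (r = 0)
uz_max = mogi_uplift(r=0.0, depth=3200.0, dV=1.1e6)
print(f"predicted peak uplift rate: {uz_max * 100:.1f} cm/yr")
```

The predicted rate is on the order of a few centimetres per year, consistent in magnitude with the uplift episode the record describes.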

  11. Bayesian Decision Support

    Science.gov (United States)

    Berliner, M.

    2017-12-01

    Bayesian statistical decision theory offers a natural framework for decision-policy making in the presence of uncertainty. Key advantages of the approach include efficient incorporation of information and observations. However, in complicated settings it is very difficult, perhaps essentially impossible, to formalize the mathematical inputs needed in the approach. Nevertheless, using the approach as a template is useful for decision support; that is, for organizing and communicating our analyses. Bayesian hierarchical modeling is valuable in quantifying and managing uncertainty in such cases. I review some aspects of the idea, emphasizing statistical model development and use in the context of sea-level rise.

  12. Bayesian Exploratory Factor Analysis

    Science.gov (United States)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  13. Recommendations for developing the optimum method for DA installations to comply with 40 CFR 246 -- Source Separation for Materials Recovery Guidelines. Interim report

    Energy Technology Data Exchange (ETDEWEB)

    Donahue, B.A.; Gershman, H.W.; Gardner, W.P.; Price, J.D.

    1977-05-01

    This report is the end-product of a study which involved both an examination of source separation techniques for materials recovery and an evaluation of current Department of the Army (DA) operations, especially those related to solid waste management. The purpose of the study was to arrive at the optimum method or methods by which DA installations could integrate source separation procedures into their current solid waste management activities, as required by the U.S. Environmental Protection Agency's 40 CFR 246 -- Source Separation for Materials Recovery Guidelines. The report was written with the goal of providing assistance -- both background information and specific instructions for procedures -- to the managing activities faced with compliance with the guidelines at the installation level. The procedures recommended in the report are those which can most easily be included in current operations while still producing significant resource recovery results. It is considered advisable for DA to institute only the 'Required' procedures from the guidelines, acting on the 'Recommended' procedures only where they can be handled with existing equipment, personnel, and funds. Procedural changes at the installation level must be preceded by DA and Department of Defense policy and guidance alterations, for which Recommendations are also provided.

  14. A Novel Blind Source Separation Algorithm and Performance Analysis of Weak Signal against Strong Interference in Passive Radar Systems

    Directory of Open Access Journals (Sweden)

    Chengjie Li

    2016-01-01

    Full Text Available In passive radar systems, recovering the weak object signal mixed with a super-power interference (jamming) signal is still a challenging task. In this paper, a novel framework for weak object signal separation in passive radar systems is designed. Firstly, we propose an interference cancellation algorithm (IC-algorithm) to extract the mixed weak object signals from the strong jamming. Then, an improved FastICA algorithm with K-means clustering is designed to separate each weak signal from the mixed weak object signals. Finally, we discuss the performance of the proposed method and verify it with several simulations. The experimental results demonstrate the effectiveness of the proposed method.
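
    The separation stage can be illustrated with a minimal numpy-only FastICA sketch; this is not the paper's IC-algorithm or its improved variant, and the two synthetic sources below are hypothetical stand-ins for the mixed weak object signals:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic sources standing in for the mixed weak object signals.
t = np.linspace(0, 1, 2000)
S = np.vstack([np.sin(2 * np.pi * 13 * t),
               np.sign(np.sin(2 * np.pi * 7 * t))])  # sine + square wave

A = np.array([[1.0, 0.6], [0.5, 1.0]])   # hypothetical mixing matrix
X = A @ S                                 # observed mixtures

# Whitening: decorrelate and normalize the mixtures.
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / X.shape[1])
Xw = (E / np.sqrt(d)) @ E.T @ X

# Symmetric FastICA with a tanh nonlinearity.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(W @ Xw)
    W = G @ Xw.T / Xw.shape[1] - np.diag((1 - G ** 2).mean(axis=1)) @ W
    U, _, Vt = np.linalg.svd(W)
    W = U @ Vt                            # symmetric decorrelation step

Y = W @ Xw                                # recovered components
C = np.corrcoef(np.vstack([Y, S]))[:2, 2:]  # component-vs-source correlations
```

    Up to the usual sign and permutation ambiguity of ICA, each recovered component correlates almost perfectly with one of the true sources.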

  15. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate the synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainties integration. With this framework, we integrate data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of the uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle is effective for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtain a posterior distribution that evaluates the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
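
    The sequential Bayesian updating described above can be sketched on a grid. All numbers here are hypothetical: a stratum depth is inferred from noisy borehole picks, starting from a uniform (maximum-entropy) prior and folding in one observation at a time, which mimics the gradual integration of uncertainty sources:

```python
import numpy as np

# Hypothetical task: infer a stratum depth (m) from noisy borehole picks.
depths = np.linspace(90.0, 110.0, 401)            # candidate depths (grid)
prior = np.full_like(depths, 1.0 / depths.size)   # maximum-entropy (uniform) prior

observations = [99.2, 101.1, 100.4]               # noisy depth picks
sigma = 1.5                                       # assumed observation error (m)

posterior = prior.copy()
for obs in observations:                          # sequential Bayesian updating
    likelihood = np.exp(-0.5 * ((obs - depths) / sigma) ** 2)
    posterior *= likelihood
    posterior /= posterior.sum()                  # renormalize after each update

mean = (depths * posterior).sum()                 # posterior mean depth
sd = np.sqrt(((depths - mean) ** 2 * posterior).sum())  # posterior std. dev.
```

    With a uniform prior and Gaussian errors the posterior mean equals the sample mean of the picks, and the posterior spread shrinks as sigma divided by the square root of the number of observations, making the uncertainty reduction explicit at each step.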

  16. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  17. Problems in the fingerprints based polycyclic aromatic hydrocarbons source apportionment analysis and a practical solution

    International Nuclear Information System (INIS)

    Zou, Yonghong; Wang, Lixia; Christensen, Erik R.

    2015-01-01

    This work intended to explain the challenges of fingerprint-based source apportionment for polycyclic aromatic hydrocarbons (PAH) in the aquatic environment, and to illustrate a practical and robust solution. The PAH data detected in sediment cores from the Illinois River provide the basis of this study. Principal component analysis (PCA) separates PAH compounds into two groups reflecting their possible airborne transport patterns, but it is not able to suggest specific sources. Not all positive matrix factorization (PMF) determined sources are distinguishable due to the variability of source fingerprints; however, they constitute useful suggestions for inputs to a Bayesian chemical mass balance (CMB) analysis. The Bayesian CMB analysis takes into account the measurement errors as well as the variations of source fingerprints, and provides a credible source apportionment. Major PAH sources for Illinois River sediments are traffic (35%), coke oven (24%), coal combustion (18%), and wood combustion (14%). - Highlights: • Fingerprint variability poses challenges in PAH source apportionment analysis. • PCA can be used to group compounds or cluster measurements. • PMF requires results validation but is useful for source suggestion. • Bayesian CMB provides a practical and credible solution. - A Bayesian CMB model combined with PMF is a practical and credible fingerprint-based PAH source apportionment method
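
    The Bayesian CMB idea admits a toy sketch. The two fingerprints and the mixing fraction below are hypothetical, not the paper's actual profiles, and a single error scale sigma lumps together measurement error and fingerprint variability; the posterior over the traffic contribution is computed on a grid:

```python
import numpy as np

# Hypothetical normalized PAH fingerprints (fractions of 4 compounds per source).
traffic = np.array([0.40, 0.30, 0.20, 0.10])
coal    = np.array([0.10, 0.20, 0.30, 0.40])

# Simulated sediment profile: 70% traffic + 30% coal, plus measurement noise.
rng = np.random.default_rng(1)
measured = 0.7 * traffic + 0.3 * coal + rng.normal(0, 0.01, 4)

# Grid posterior over the traffic fraction f with a uniform prior; sigma
# represents combined measurement error and fingerprint variability.
f = np.linspace(0.0, 1.0, 1001)
sigma = 0.02
model = f[:, None] * traffic + (1 - f[:, None]) * coal     # (1001, 4) profiles
loglik = -0.5 * ((measured - model) ** 2).sum(axis=1) / sigma ** 2
post = np.exp(loglik - loglik.max())
post /= post.sum()

f_mean = (f * post).sum()   # posterior mean traffic contribution
```

    Because the error model enters the likelihood explicitly, the posterior width directly reflects how distinguishable the two fingerprints are, which is the property that lets the Bayesian CMB absorb fingerprint variability.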

  18. A new method for quantifying the performance of EEG blind source separation algorithms by referencing a simultaneously recorded ECoG signal.

    Science.gov (United States)

    Oosugi, Naoya; Kitajo, Keiichi; Hasegawa, Naomi; Nagasaka, Yasuo; Okanoya, Kazuo; Fujii, Naotaka

    2017-09-01

    Blind source separation (BSS) algorithms extract neural signals from electroencephalography (EEG) data. However, it is difficult to quantify source separation performance because there is no criterion to dissociate neural signals and noise in EEG signals. This study develops a method for evaluating BSS performance. The idea is that the neural signals in EEG can be estimated by comparison with simultaneously measured electrocorticography (ECoG), because the ECoG electrodes cover the majority of the lateral cortical surface and should capture most of the original neural sources in the EEG signals. We measured real EEG and ECoG data and developed an algorithm for evaluating BSS performance. First, EEG signals are separated into EEG components using the BSS algorithm. Second, the EEG components are ranked using the correlation coefficients of the ECoG regression, and the components are grouped into subsets based on their ranks. Third, canonical correlation analysis estimates how much information is shared between the subsets of the EEG components and the ECoG signals. We used our algorithm to compare the performance of BSS algorithms (PCA, AMUSE, SOBI, JADE, fastICA) via the EEG and ECoG data of anesthetized nonhuman primates. The results (best case > JADE = fastICA > AMUSE = SOBI ≥ PCA > random separation) were common to the two subjects. To encourage the further development of better BSS algorithms, our EEG and ECoG data are available on our Web site (http://neurotycho.org/) as a common testing platform. Copyright © 2017 The Author(s). Published by Elsevier Ltd. All rights reserved.
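
    The ranking step (step two of the procedure) can be sketched with synthetic data; the signals and variable names below are illustrative, with one "neural" component that shares variance with the ECoG reference and two pure-noise components:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
ecog = rng.standard_normal(n)          # stand-in for an ECoG reference trace

# Three hypothetical BSS components: one neural-like, two noise.
components = np.vstack([
    0.8 * ecog + 0.2 * rng.standard_normal(n),  # shares the ECoG signal
    rng.standard_normal(n),                     # noise
    rng.standard_normal(n),                     # noise
])

# Rank components by |correlation| with the ECoG reference.
r = np.array([abs(np.corrcoef(c, ecog)[0, 1]) for c in components])
ranking = np.argsort(r)[::-1]          # best-matching component first
```

    In the full method this ranking is followed by grouping into subsets and a canonical correlation analysis against the ECoG signals; the sketch stops at the correlation ranking.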

  19. Bayesian logistic regression analysis

    NARCIS (Netherlands)

    Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.

    2012-01-01

    In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuissance parameters, the Jacobian transformation is an

  20. Bayesian statistical inference

    Directory of Open Access Journals (Sweden)

    Bruno De Finetti

    2017-04-01

    Full Text Available This work was translated into English and published in the volume: Bruno De Finetti, Induction and Probability, Biblioteca di Statistica, eds. P. Monari, D. Cocchi, Clueb, Bologna, 1993. Bayesian statistical inference is one of the last fundamental philosophical papers in which we can find the essence of De Finetti's approach to statistical inference.

  1. Investigation of source-detector separation optimization for an implantable perfusion and oxygenation sensor for liver blood vessels

    Energy Technology Data Exchange (ETDEWEB)

    Baba, Justin S [ORNL; Akl, Tony [Texas A&M University; Cote, Gerard L. [Texas A&M University; Wilson, Mark A. [University of Pittsburgh School of Medicine, Pittsburgh PA; Ericson, Milton Nance [ORNL

    2011-01-01

    An implanted system is being developed to monitor transplanted liver health during the critical 7-10 day period posttransplantation. The unit will monitor organ perfusion and oxygen consumption using optically-based probes placed on both the inflow and outflow blood vessels, and on the liver parenchymal surface. Sensing probes are based on a 3-wavelength LED source and a photodiode detector. Sample diffuse reflectance is measured at 735, 805, and 940 nm. To ascertain optimal source-to-photodetector spacing for perfusion measurement in blood vessels, an ex vivo study was conducted. In this work, a dye mixture simulating 80% blood oxygen saturation was developed and perfused through excised porcine arteries while collecting data for various preset probe source-to-photodetector spacings. The results from this study demonstrate a decrease in the optical signal with decreasing LED drive current and a reduction in perfusion index signal with increasing probe spacing. They also reveal a 2- to 4-mm optimal range for blood vessel perfusion probe source-to-photodetector spacing that allows for sufficient perfusion signal modulation depth with maximized signal-to-noise ratio (SNR). These findings are currently being applied to guide electronic configuration and probe placement for in vivo liver perfusion porcine model studies.

  2. Entropy, Information Theory, Information Geometry and Bayesian Inference in Data, Signal and Image Processing and Inverse Problems

    Directory of Open Access Journals (Sweden)

    Ali Mohammad-Djafari

    2015-06-01

    Full Text Available The main content of this review article is first to review the main inference tools using Bayes rule, the maximum entropy principle (MEP, information theory, relative entropy and the Kullback–Leibler (KL divergence, Fisher information and its corresponding geometries. For each of these tools, the precise context of their use is described. The second part of the paper is focused on the ways these tools have been used in data, signal and image processing and in the inverse problems, which arise in different physical sciences and engineering applications. A few examples of the applications are described: entropy in independent components analysis (ICA and in blind source separation, Fisher information in data model selection, different maximum entropy-based methods in time series spectral estimation and in linear inverse problems and, finally, the Bayesian inference for general inverse problems. Some original materials concerning the approximate Bayesian computation (ABC and, in particular, the variational Bayesian approximation (VBA methods are also presented. VBA is used for proposing an alternative Bayesian computational tool to the classical Markov chain Monte Carlo (MCMC methods. We will also see that VBA encompasses joint maximum a posteriori (MAP, as well as the different expectation-maximization (EM algorithms as particular cases.
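
    Two of the central tools named above, Shannon entropy and the Kullback–Leibler divergence, admit a compact numerical sketch; the distributions are arbitrary examples, and the check exploits the identity D(p||q) = log K - H(p) when q is uniform over K cells:

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(p) in nats, ignoring zero-probability cells."""
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats (supports must overlap)."""
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / q[mask])).sum())

p = np.array([0.5, 0.3, 0.2])            # example distribution
q = np.full(3, 1.0 / 3.0)                # maximum-entropy (uniform) reference

# D(p||q) against a uniform q equals log(K) - H(p), the "information gained"
# relative to the maximum-entropy state.
gap = kl(p, q)
```
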

  3. Bayesian optimization for materials science

    CERN Document Server

    Packwood, Daniel

    2017-01-01

    This book provides a short and concise introduction to Bayesian optimization specifically for experimental and computational materials scientists. After explaining the basic idea behind Bayesian optimization and some applications to materials science in Chapter 1, the mathematical theory of Bayesian optimization is outlined in Chapter 2. Finally, Chapter 3 discusses an application of Bayesian optimization to a complicated structure optimization problem in computational surface science. Bayesian optimization is a promising global optimization technique that originates in the field of machine learning and is starting to gain attention in materials science. For the purpose of materials design, Bayesian optimization can be used to predict new materials with novel properties without extensive screening of candidate materials. For the purpose of computational materials science, Bayesian optimization can be incorporated into first-principles calculations to perform efficient, global structure optimizations. While re...
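
    The loop the book describes (a Gaussian-process surrogate plus an acquisition function) can be sketched in a few dozen lines; the one-dimensional objective below is a hypothetical stand-in for a material property, and the kernel, length scale, and expected-improvement acquisition are standard textbook choices rather than the book's specific setup:

```python
import numpy as np
from math import erf, sqrt, pi

def rbf(a, b, ls=0.3):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(xs, ys, xq, noise=1e-6):
    """GP posterior mean and std. dev. at query points xq."""
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xq, xs)
    mu = Ks @ np.linalg.solve(K, ys)
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sd, best):
    """EI for maximization under a Gaussian predictive distribution."""
    z = (mu - best) / sd
    Phi = 0.5 * (1 + np.array([erf(zi / sqrt(2)) for zi in z]))
    phi = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
    return (mu - best) * Phi + sd * phi

# Hypothetical objective: a property to maximize over one design parameter.
f = lambda x: np.exp(-(x - 0.7) ** 2 / 0.05)

xs = np.array([0.1, 0.5, 0.9])           # three initial "experiments"
ys = f(xs)
xq = np.linspace(0, 1, 200)              # candidate designs
for _ in range(5):                       # five Bayesian-optimization rounds
    mu, sd = gp_posterior(xs, ys, xq)
    x_next = xq[np.argmax(expected_improvement(mu, sd, ys.max()))]
    xs, ys = np.append(xs, x_next), np.append(ys, f(x_next))
```

    After a handful of evaluations the best sampled design sits near the true optimum at 0.7, illustrating why the technique suits expensive experiments and first-principles calculations.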

  4. Dissipation of pesticides during composting and anaerobic digestion of source-separated organic waste at full-scale plants.

    Science.gov (United States)

    Kupper, Thomas; Bucheli, Thomas D; Brändli, Rahel C; Ortelli, Didier; Edder, Patrick

    2008-11-01

    In the present study, concentration levels and dissipation of modern pesticides during composting and digestion at full-scale plants were followed. Of the 271 pesticides analyzed, 28 were detected. Within the three windrows studied, total concentrations were between 36 and 101 µg per kg of dry matter (d.m.) in input materials and between 8 and 20 µg per kg d.m. in composts after 112 days of treatment. Fungicides, and among them triazoles, clearly dominated over other pesticides. More than two-thirds of all pesticides detected in the input materials showed dissipation rates higher than 50% during composting, whilst levels of most triazoles decreased slightly or remained unchanged. The investigation of semi-dry thermophilic anaerobic digestion suggests that pesticides preferentially end up in the presswater after solid-liquid separation.

  5. A Novel Partial Discharge Ultra-High Frequency Signal De-Noising Method Based on a Single-Channel Blind Source Separation Algorithm

    Directory of Open Access Journals (Sweden)

    Liangliang Wei

    2018-02-01

    Full Text Available To effectively de-noise the Gaussian white noise and periodic narrow-band interference in the background noise of partial discharge ultra-high frequency (PD UHF) signals in field tests, a novel de-noising method, based on a single-channel blind source separation algorithm, is proposed. Compared with traditional methods, the proposed method effectively removes the noise interference, and the distortion of the de-noised PD signal is smaller. Firstly, the PD UHF signal is time-frequency analyzed by S-transform to obtain the number of source signals. Then, the single-channel detected PD signal is converted into multi-channel signals by singular value decomposition (SVD, and background noise is separated from multi-channel PD UHF signals by the joint approximate diagonalization of eigen-matrix method. Finally, the source PD signal is estimated and recovered by the l1-norm minimization method. The proposed de-noising method was applied to simulation test and field test detected signals, and the de-noising performance of the different methods was compared. The simulation and field test results demonstrate the effectiveness and correctness of the proposed method.
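
    The SVD step that turns one channel into a multichannel matrix can be sketched with a Hankel (lagged-copy) embedding. This is an illustrative sketch on a synthetic signal, not the paper's pipeline: the separation there is done by joint approximate diagonalization, while here a simple rank truncation stands in to show the de-noising effect:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400
t = np.arange(n)
clean = np.sin(2 * np.pi * 0.05 * t)             # stand-in for a PD UHF signal
noisy = clean + 0.5 * rng.standard_normal(n)     # Gaussian white background noise

# Embed the single channel into a Hankel matrix: each row is a lagged copy,
# giving the multichannel view required by the separation stage.
L = 50
H = np.array([noisy[i:i + n - L + 1] for i in range(L)])

# Rank truncation via SVD: one sinusoid occupies two singular components.
U, s, Vt = np.linalg.svd(H, full_matrices=False)
Hd = (U[:, :2] * s[:2]) @ Vt[:2]

# Reconstruct by averaging the anti-diagonals of the de-noised Hankel matrix.
den = np.zeros(n)
cnt = np.zeros(n)
for i in range(L):
    den[i:i + n - L + 1] += Hd[i]
    cnt[i:i + n - L + 1] += 1
den /= cnt

err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))  # RMSE before de-noising
err_den = np.sqrt(np.mean((den - clean) ** 2))      # RMSE after de-noising
```
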

  6. Bayesian information fusion networks for biosurveillance applications.

    Science.gov (United States)

    Mnatsakanyan, Zaruhi R; Burkom, Howard S; Coberly, Jacqueline S; Lombardo, Joseph S

    2009-01-01

    This study introduces new information fusion algorithms to enhance disease surveillance systems with Bayesian decision support capabilities. A detection system was built and tested using chief complaints from emergency department visits, International Classification of Diseases Revision 9 (ICD-9) codes from records of outpatient visits to civilian and military facilities, and influenza surveillance data from health departments in the National Capital Region (NCR). Data anomalies were identified, and distributions of time offsets between events in the multiple data streams were established. The Bayesian network was built to fuse data from multiple sources and identify influenza-like epidemiologically relevant events. Results showed increased specificity compared with the alerts generated by temporal anomaly detection algorithms currently deployed by NCR health departments. Further research should be done to investigate correlations between data sources for efficient fusion of the collected data.

  7. Separation of fission produced 106Ru from simulated high level nuclear wastes for production of brachytherapy sources

    International Nuclear Information System (INIS)

    Blicharska, Magdalena; Bartoś, Barbara; Krajewski, Seweryn; Bilewicz, Aleksander

    2014-01-01

    Brachytherapy is a common method for treating various tumors, and currently 106Ru and 125I applicators are the most frequently used. Considering that 106Ru is a β emitter with a maximum energy of 3.54 MeV, it is best indicated in the treatment of small melanomas, with up to 20 mm tissue range. 106Ru is commercially obtained from neutron-irradiated high-enrichment 235U targets in the process of 99Mo production. At present, there are only a handful of ageing reactors worldwide capable of producing 99Mo; therefore, alternative strategies for production of this key medical isotope are explored. In our work, we propose to use liquid high-level radioactive waste as a source of high-activity 106Ru. Simple calculations indicate that 1 dm3 of HLLW solution after 4 years of cooling contains about 500 GBq of 106Ru. This amount of activity is enough for production of a few thousand brachytherapy sources. The present communication reports results of our process development studies on the recovery of ruthenium radioisotopes from a simulated high level radioactive waste solution using an oxidation-extraction method

  8. Spherical mesh adaptive direct search for separating quasi-uncorrelated sources by range-based independent component analysis.

    Science.gov (United States)

    Selvan, S Easter; Borckmans, Pierre B; Chattopadhyay, A; Absil, P-A

    2013-09-01

    It may seem paradoxical, given the classical definition of independent component analysis (ICA), that in reality the true sources are often not strictly uncorrelated. With this in mind, this letter concerns a framework to extract quasi-uncorrelated sources with finite supports by optimizing a range-based contrast function under unit-norm constraints (to handle the inherent scaling indeterminacy of ICA) but without orthogonality constraints. Despite the appealing properties of the range-based contrast function (e.g., the absence of mixing local optima), the function is not differentiable everywhere. Unfortunately, there is a dearth of literature on derivative-free optimizers that effectively handle such a nonsmooth yet promising contrast function. This is the compelling reason for the design of a nonsmooth optimization algorithm on a manifold of matrices having unit-norm columns with the following objectives: to ascertain convergence to a Clarke stationary point of the contrast function and to adhere to the necessary unit-norm constraints more naturally. The proposed nonsmooth optimization algorithm crucially relies on the design and analysis of an extension of the mesh adaptive direct search (MADS) method to handle locally Lipschitz objective functions defined on the sphere. The applicability of the algorithm in the ICA domain is demonstrated with simulations involving natural, face, aerial, and texture images.

  9. Bayesian coronal seismology

    Science.gov (United States)

    Arregui, Iñigo

    2018-01-01

    In contrast to the situation in a laboratory, the study of the solar atmosphere has to be pursued without direct access to the physical conditions of interest. Information is therefore incomplete and uncertain and inference methods need to be employed to diagnose the physical conditions and processes. One of such methods, solar atmospheric seismology, makes use of observed and theoretically predicted properties of waves to infer plasma and magnetic field properties. A recent development in solar atmospheric seismology consists in the use of inversion and model comparison methods based on Bayesian analysis. In this paper, the philosophy and methodology of Bayesian analysis are first explained. Then, we provide an account of what has been achieved so far from the application of these techniques to solar atmospheric seismology and a prospect of possible future extensions.

  10. Bayesian community detection.

    Science.gov (United States)

    Mørup, Morten; Schmidt, Mikkel N

    2012-09-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not capture this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  11. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  12. Bayesian Hypothesis Testing

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Stephen A. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Sigeti, David E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-11-15

    These are a set of slides about Bayesian hypothesis testing, where many hypotheses are tested. The conclusions are the following: The value of the Bayes factor obtained when using the median of the posterior marginal is almost the minimum value of the Bayes factor. The value of τ² which minimizes the Bayes factor is a reasonable choice for this parameter. This allows a likelihood ratio to be computed which is the least favorable to H0.

  13. Bayesian networks in reliability

    Energy Technology Data Exchange (ETDEWEB)

    Langseth, Helge [Department of Mathematical Sciences, Norwegian University of Science and Technology, N-7491 Trondheim (Norway)]. E-mail: helgel@math.ntnu.no; Portinale, Luigi [Department of Computer Science, University of Eastern Piedmont ' Amedeo Avogadro' , 15100 Alessandria (Italy)]. E-mail: portinal@di.unipmn.it

    2007-01-15

    Over the last decade, Bayesian networks (BNs) have become a popular tool for modelling many kinds of statistical problems. We have also seen a growing interest for using BNs in the reliability analysis community. In this paper we will discuss the properties of the modelling framework that make BNs particularly well suited for reliability applications, and point to ongoing research that is relevant for practitioners in reliability.

  14. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental economics, with careful controls for the confounding effects of risk aversion. Our results show that risk aversion significantly alters inferences on deviations from Bayes’ Rule.

  15. Approximate Bayesian recursive estimation

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav

    2014-01-01

    Roč. 285, č. 1 (2014), s. 100-111 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf

  16. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  17. Fate of PCBs, PAHs and their source characteristic ratios during composting and digestion of source-separated organic waste in full-scale plants

    International Nuclear Information System (INIS)

    Braendli, Rahel C.; Bucheli, Thomas D.; Kupper, Thomas; Mayer, Jochen; Stadelmann, Franz X.; Tarradellas, Joseph

    2007-01-01

    Composting and digestion are important waste management strategies. However, the resulting products can contain significant amounts of organic pollutants such as polychlorinated biphenyls (PCBs) and polycyclic aromatic hydrocarbons (PAHs). In this study we followed the concentration changes of PCBs and PAHs during composting and digestion on field-scale for the first time. Concentrations of low-chlorinated PCBs increased during composting (about 30%), whereas a slight decrease was observed for the higher chlorinated congeners (about 10%). Enantiomeric fractions of atropisomeric PCBs were essentially racemic and stable over time. Levels of low-molecular-weight PAHs declined during composting (50-90% reduction), whereas high-molecular-weight compounds were stable. The PCBs and PAHs concentrations did not seem to vary during digestion. Source apportionment by applying characteristic PAH ratios and molecular markers in input material did not give any clear results. Some of these parameters changed considerably during composting. Hence, their diagnostic potential for finished compost must be questioned. - During field-scale composting, low molecular weight PCBs and PAHs increased and decreased, respectively, whereas high molecular weight compounds remained stable

  18. Bayesian analysis in plant pathology.

    Science.gov (United States)

    Mila, A L; Carriquiry, A L

    2004-09-01

    ABSTRACT Bayesian methods are currently much discussed and applied in several disciplines from molecular biology to engineering. Bayesian inference is the process of fitting a probability model to a set of data and summarizing the results via probability distributions on the parameters of the model and unobserved quantities such as predictions for new observations. In this paper, after a short introduction to Bayesian inference, we present the basic features of Bayesian methodology using examples from sequencing genomic fragments and analyzing microarray gene-expression levels, reconstructing disease maps, and designing experiments.
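
    The "fit a probability model, then summarize via posterior distributions" workflow has a classic conjugate example; the trial numbers below are hypothetical, not from the paper:

```python
import numpy as np

# Hypothetical plant-pathology trial: 18 of 60 plants infected.
infected, n = 18, 60

# Beta(1, 1) prior on the infection probability; a binomial likelihood then
# gives a conjugate Beta posterior: Beta(1 + infected, 1 + n - infected).
a, b = 1 + infected, 1 + n - infected

post_mean = a / (a + b)                      # posterior mean infection rate

# Summarize the unobserved quantity with a 95% credible interval via sampling.
rng = np.random.default_rng(0)
samples = rng.beta(a, b, 100_000)
lo, hi = np.quantile(samples, [0.025, 0.975])
```

    The same posterior can feed predictions for new observations, e.g. the expected number of infections in a future plot, which is the second half of the workflow the abstract describes.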

  19. Isotope separation

    International Nuclear Information System (INIS)

    Coleman, J.H.; Marks, T.J.

    1981-01-01

    A process for separating uranium isotopes is described which includes: preparing a volatile compound U-T, in which U is a mixture of uranium isotopes and T is a chemical moiety containing at least one organic or deuterated borohydride group, and which exhibits for at least one isotopic species thereof a fundamental, overtone or combination vibrational absorption excitation energy level at a frequency between 900 and 1100 cm -1 ; and irradiating the compound in the vapour phase with energy emitted by a radiation source at a frequency between 900 and 1100 cm -1 (e.g. a CO 2 laser). (author)

  20. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation from the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, along with a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the limited resources of Federal and State health agencies to address comparative risk, and regulatory agencies speaking out without the support of the scientific community

  1. Multisnapshot Sparse Bayesian Learning for DOA

    DEFF Research Database (Denmark)

    Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki

    2016-01-01

    The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source...... powers). For a complex Gaussian likelihood with hyperparameter, the unknown noise variance, the corresponding Gaussian posterior distribution is derived. The hyperparameters are automatically selected by maximizing the evidence and promoting sparse DOA estimates. The SBL scheme for DOA estimation...

  2. Bayesian long branch attraction bias and corrections.

    Science.gov (United States)

    Susko, Edward

    2015-03-01

    Previous work on the star-tree paradox has shown that Bayesian methods suffer from a long branch attraction bias. That work is extended to settings involving more taxa and partially resolved trees. The long branch attraction bias is confirmed to arise more broadly and an additional source of bias is found. A by-product of the analysis is methods that correct for biases toward particular topologies. The corrections can be easily calculated using existing Bayesian software. Posterior support for a set of two or more trees can thus be supplemented with corrected versions to cross-check or replace results. Simulations show the corrections to be highly effective. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  3. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  4. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  5. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    textabstractRecently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to

  6. Bayesian Data Analysis (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the bayesian approach, as well as some computational tools needed to perform a bayesian analysis.

  7. Bayesian Data Analysis (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    framework but we will also go into more detail and discuss for example the role of the prior. The second part of the lecture will cover further examples and applications that heavily rely on the bayesian approach, as well as some computational tools needed to perform a bayesian analysis.

  8. A compact electron beam ion source with integrated Wien filter providing mass and charge state separated beams of highly charged ions

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, M. [DREEBIT GmbH, Zur Wetterwarte 50, D-01109 Dresden (Germany); Peng, H. [School of Nuclear Science and Technology, Lanzhou University, Lanzhou 730000 (China); Zschornack, G. [Institut fuer Angewandte Physik, Technische Universitaet Dresden, Mommsenstr. 13, D-01062 Dresden (Germany); Sykora, S. [Institut fuer Theoretische Physik, Technische Universitaet Dresden, Mommsenstr. 13, D-01062 Dresden (Germany)

    2009-06-15

    A Wien filter was designed for and tested with a room-temperature electron beam ion source (EBIS). Xenon charge state spectra up to the charge state Xe{sup 46+} were resolved, as were the isotopes of krypton, using apertures of different sizes. The complete setup consisting of an EBIS and a Wien filter has a length of less than 1 m, substituting for a complete classical beamline setup. The Wien filter is equipped with removable permanent magnets; hence, total beam current measurements are possible via simple removal of the permanent magnets. Depending on the required resolution, a weak (0.2 T) or a strong (0.5 T) magnet setup can be used. In this paper the principle of operation and the design of the Wien filter meeting the requirements of an EBIS are briefly discussed. The first ion beam extraction and separation experiments with a Dresden EBIS are presented.

  9. A compact electron beam ion source with integrated Wien filter providing mass and charge state separated beams of highly charged ions

    Science.gov (United States)

    Schmidt, M.; Peng, H.; Zschornack, G.; Sykora, S.

    2009-06-01

    A Wien filter was designed for and tested with a room-temperature electron beam ion source (EBIS). Xenon charge state spectra up to the charge state Xe46+ were resolved, as were the isotopes of krypton, using apertures of different sizes. The complete setup consisting of an EBIS and a Wien filter has a length of less than 1 m, substituting for a complete classical beamline setup. The Wien filter is equipped with removable permanent magnets; hence, total beam current measurements are possible via simple removal of the permanent magnets. Depending on the required resolution, a weak (0.2 T) or a strong (0.5 T) magnet setup can be used. In this paper the principle of operation and the design of the Wien filter meeting the requirements of an EBIS are briefly discussed. The first ion beam extraction and separation experiments with a Dresden EBIS are presented.

  10. Vermicomposting of source-separated human faeces by Eisenia fetida: effect of stocking density on feed consumption rate, growth characteristics and vermicompost production.

    Science.gov (United States)

    Yadav, Kunwar D; Tare, Vinod; Ahammed, M Mansoor

    2011-06-01

    The main objective of the present study was to determine the optimum stocking density for feed consumption rate, biomass growth and reproduction of the earthworm Eisenia fetida, as well as to determine the quantity and characterise the product of vermicompost obtained during vermicomposting of source-separated human faeces. For this, a number of experiments spanning up to 3 months were conducted using soil and vermicompost as support materials. Stocking density in the range of 0.25-5.00 kg/m(2) was employed in different tests. The results showed that 0.40-0.45 kg-feed/kg-worm/day was the maximum feed consumption rate by E. fetida in human faeces. The optimum stocking densities were 3.00 kg/m(2) for bioconversion of human faeces to vermicompost, and 0.50 kg/m(2) for earthworm biomass growth and reproduction. Copyright © 2011 Elsevier Ltd. All rights reserved.

  11. The Bayesian Covariance Lasso.

    Science.gov (United States)

    Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G

    2013-04-01

    Estimation of sparse covariance matrices and their inverse subject to positive definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size ( n ) is less than the dimension ( d ), requires shrinkage estimation methods since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full rank data. Simulations show that the proposed BCLASSO performs similarly as frequentist methods for non-full rank data.
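
    BCLASSO itself is a fully Bayesian sampler, but the motivation stated in the abstract, that shrinkage restores positive definiteness and improves conditioning when the sample size n is not much larger than the dimension d, can be illustrated with a generic linear shrinkage of the sample covariance toward a scaled identity. This is a sketch of the general idea, not the authors' method.

```python
import numpy as np

def shrink_covariance(X, alpha):
    """Linear shrinkage of the sample covariance toward a scaled identity.

    S_shrunk = (1 - alpha) * S + alpha * (tr(S)/d) * I
    A generic stabilizer (not BCLASSO): for alpha > 0 the estimate is
    positive definite even when n < d, and its condition number improves.
    """
    n, d = X.shape
    S = np.cov(X, rowvar=False, bias=True)
    target = (np.trace(S) / d) * np.eye(d)
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 50))       # n = 30 < d = 50: S is rank-deficient
S_raw = np.cov(X, rowvar=False, bias=True)
S_shr = shrink_covariance(X, alpha=0.2)

# The raw sample covariance is singular; the shrunk one is invertible.
min_eig_raw = np.linalg.eigvalsh(S_raw).min()   # ~0 (rank-deficient)
min_eig_shr = np.linalg.eigvalsh(S_shr).min()   # strictly positive
```
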

  12. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity over the last years and in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology.
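
    A minimal likelihood-free rejection sampler illustrates the ABC scheme described above: draw parameters from the prior, simulate data, and keep only the draws whose summary statistic lands close to the observed one. The toy model (a Gaussian mean with a sample-mean summary) and the tolerance value are illustrative assumptions, not taken from the article.

```python
import random

def abc_rejection(observed_summary, prior_sample, simulate, summary,
                  tolerance, n_draws=10000):
    """Keep parameter draws whose simulated summary is close to the data."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()                 # draw from the prior
        sim = simulate(theta)                  # simulate data under theta
        if abs(summary(sim) - observed_summary) < tolerance:
            accepted.append(theta)             # approximate posterior draw
    return accepted

# Toy example: infer the mean of a Gaussian with known sd = 1.
random.seed(0)
data = [random.gauss(2.0, 1.0) for _ in range(100)]
obs = sum(data) / len(data)

posterior = abc_rejection(
    observed_summary=obs,
    prior_sample=lambda: random.uniform(-5, 5),   # flat prior on the mean
    simulate=lambda m: [random.gauss(m, 1.0) for _ in range(100)],
    summary=lambda xs: sum(xs) / len(xs),
    tolerance=0.1,
)
estimate = sum(posterior) / len(posterior)   # close to the true mean, 2.0
```
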

  13. Isotope-based hydrograph separation in large rivers: assessing flow sources and water quality controls in the oil sands region, Canada

    Science.gov (United States)

    Gibson, John; Yi, Yi; Birks, Jean

    2016-04-01

    Hydrograph separation using stable isotopes of water is used to partition streamflow sources in the Athabasca River and its tributaries in the oil sands region of northern Alberta, Canada. Snow, rain, groundwater and surface water contributions to total streamflow are estimated for multi-year records and provide considerable insight into runoff generation mechanisms operating in six tributaries and at four stations along the Athabasca River. Groundwater, found to be an important flow source at all stations, is the dominant component of the hydrograph in three tributaries (Steepbank R., Muskeg R., Firebag R.), accounting for 39 to 50% of annual streamflow. Surface water, mainly drainage from peatlands, is also found to be widely important, and dominant in three tributaries (Clearwater R., Mackay R., Ells R.), accounting for 45 to 81% of annual streamflow. Direct runoff of precipitation sources including rain (7-19%) and snowmelt (3-7%) accounts for the remainder of sources. Fairly limited contributions from direct precipitation illustrate that most snow and rain events result in indirect displacement of pre-event water (surface water and groundwater), due in part to the prevalence of fill and spill mechanisms and limited overland flow. Systematic shifts in the groundwater:surface-water ratios, noted for the main stem of the Athabasca River and in its tributaries, are an important control on the spatial and temporal distribution of major and minor ions, trace elements, dissolved organics and contaminants, as well as for evaluating the susceptibility of the rivers to climate and development-related impacts. Runoff partitioning is likely to be a useful monitoring tool for better understanding of flow drivers and water quality controls, and for determining the underlying causes of climate or industrial impacts.
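
    The mixing logic behind isotope-based hydrograph separation can be shown with the simplest two-component case; the delta-18O end-member values below are hypothetical, chosen only to demonstrate the arithmetic, and the study's multi-component partitioning generalizes the same mass balance.

```python
def event_water_fraction(delta_stream, delta_pre_event, delta_event):
    """Two-component isotope hydrograph separation.

    Solves the isotope mass balance
        delta_stream = f * delta_event + (1 - f) * delta_pre_event
    for f, the fraction of streamflow contributed by event water
    (e.g. snowmelt); the remainder is pre-event water
    (groundwater and surface storage).
    """
    return (delta_stream - delta_pre_event) / (delta_event - delta_pre_event)

# Hypothetical delta-18O values (permil): snowmelt is isotopically
# lighter than groundwater, and the stream plots between the two.
f_snow = event_water_fraction(delta_stream=-18.5,
                              delta_pre_event=-17.0,
                              delta_event=-22.0)
# f_snow = 0.3 -> 30% snowmelt, 70% pre-event water
```
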

  14. Bayesian inference with ecological applications

    CERN Document Server

    Link, William A

    2009-01-01

    This text is written to provide a mathematically sound but accessible and engaging introduction to Bayesian inference specifically for environmental scientists, ecologists and wildlife biologists. It emphasizes the power and usefulness of Bayesian methods in an ecological context. The advent of fast personal computers and easily available software has simplified the use of Bayesian and hierarchical models . One obstacle remains for ecologists and wildlife biologists, namely the near absence of Bayesian texts written specifically for them. The book includes many relevant examples, is supported by software and examples on a companion website and will become an essential grounding in this approach for students and research ecologists. Engagingly written text specifically designed to demystify a complex subject Examples drawn from ecology and wildlife research An essential grounding for graduate and research ecologists in the increasingly prevalent Bayesian approach to inference Companion website with analyt...

  15. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of the Bayes probability in eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for the realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data with an  overview of the Bayesian signal detection and estimation methods and demonstration by a couple of simplified examples.

  16. Bayesian methods in clinical trials: a Bayesian analysis of ECOG trials E1684 and E1690

    Directory of Open Access Journals (Sweden)

    Ibrahim Joseph G

    2012-11-01

    Full Text Available Abstract Background E1684 was the pivotal adjuvant melanoma trial for establishment of high-dose interferon (IFN as effective therapy of high-risk melanoma patients. E1690 was an intriguing effort to corroborate E1684, and the differences between the outcomes of these trials have embroiled the field in controversy over the past several years. The analyses of E1684 and E1690 were carried out separately when the results were published, and there were no further analyses trying to perform a single analysis of the combined trials. Method In this paper, we consider such a joint analysis by carrying out a Bayesian analysis of these two trials, thus providing us with a consistent and coherent methodology for combining the results from these two trials. Results The Bayesian analysis using power priors provided a more coherent flexible and potentially more accurate analysis than a separate analysis of these data or a frequentist analysis of these data. The methodology provides a consistent framework for carrying out a single unified analysis by combining data from two or more studies. Conclusions Such Bayesian analyses can be crucial in situations where the results from two theoretically identical trials yield somewhat conflicting or inconsistent results.
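
    The power-prior construction the abstract relies on can be sketched in its simplest conjugate form, a normal mean with known variance, where raising the historical likelihood to a power a0 is equivalent to discounting the historical trial to an effective sample size of a0 * n0. The numbers below are illustrative only, not data from E1684 or E1690.

```python
def power_prior_posterior(y_bar, n, y0_bar, n0, a0, sigma2, m0=0.0, tau2=1e6):
    """Posterior mean/variance for a normal mean with known variance sigma2.

    The historical likelihood is raised to the power a0 (0 <= a0 <= 1):
    a0 = 0 ignores the historical trial; a0 = 1 pools both trials fully.
    """
    # Precisions add: vague initial prior + discounted historical + current data
    prec = 1.0 / tau2 + a0 * n0 / sigma2 + n / sigma2
    mean = (m0 / tau2 + a0 * n0 * y0_bar / sigma2 + n * y_bar / sigma2) / prec
    return mean, 1.0 / prec

# Illustrative numbers: a current trial (n = 100, mean effect 0.30) borrows
# strength from a historical trial (n0 = 120, mean effect 0.10), half-weighted.
mean_half, var_half = power_prior_posterior(y_bar=0.30, n=100,
                                            y0_bar=0.10, n0=120,
                                            a0=0.5, sigma2=1.0)
mean_none, _ = power_prior_posterior(y_bar=0.30, n=100,
                                     y0_bar=0.10, n0=120,
                                     a0=0.0, sigma2=1.0)
# With a0 = 0 the posterior mean is essentially the current-trial mean;
# with a0 = 0.5 it is pulled partway toward the historical mean of 0.10.
```
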

  17. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study the simplified models of the ATM (Asynchronous Transfer Mode multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed by the different output service schemes.

  18. SAFETY RISK ASSESSMENT USING BAYESIAN BELIEF NETWORK

    Directory of Open Access Journals (Sweden)

    Victor M. Rukhlinskiy

    2017-01-01

    Full Text Available The problem of modelling and quantitative assessment of flight safety risk is considered in this paper. The article considers the main groups of mathematical models used to quantify flight safety risks, which can be used by providers of aviation services. The authors demonstrate and discuss risk modelling possibilities in the field of flight safety on the basis of Bayesian belief networks. In this paper a mathematical model is built on the basis of identified hazards, and this model allows the level of risk for each hazard, and the consequences of its occurrence, to be determined using Bayesian belief networks, consisting of a marginal probability distribution graph and conditional probability tables. Based on data on adverse events and hazard identification, this mathematical model allows the following to be determined: the probability of various adverse events given the occurrence of each hazard, the risk level for each of the identified hazards, and the most likely consequences of a given hazard's occurrence. For risk modelling in the field of flight safety on the basis of Bayesian belief networks, the open-source Bayes Net Toolbox for MATLAB was used. To determine the level of risk in the form specified in ICAO Doc 9859, the "Safety Management Manual" of the International Civil Aviation Organization, the authors wrote a MATLAB function that assigns each probability-severity pair an alphanumeric value and risk-category significance. The risk model in the field of flight safety on the basis of Bayesian belief networks corresponds to the definition of risk by Kaplan and Garrick. The advantage of the developed risk assessment method over other methods is shown in the paper.

  19. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...... represents the spatial coordinates of the grid nodes. Knowledge of how grid nodes are depicted in the observed image is described through the observation model. The prior consists of a node prior and an arc (edge) prior, both modeled as Gaussian MRFs. The node prior models variations in the positions of grid...... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...

  20. Bayesian supervised dimensionality reduction.

    Science.gov (United States)

    Gönen, Mehmet

    2013-12-01

    Dimensionality reduction is commonly used as a preprocessing step before training a supervised learner. However, coupled training of dimensionality reduction and supervised learning steps may improve the prediction performance. In this paper, we introduce a simple and novel Bayesian supervised dimensionality reduction method that combines linear dimensionality reduction and linear supervised learning in a principled way. We present both Gibbs sampling and variational approximation approaches to learn the proposed probabilistic model for multiclass classification. We also extend our formulation toward model selection using automatic relevance determination in order to find the intrinsic dimensionality. Classification experiments on three benchmark data sets show that the new model significantly outperforms seven baseline linear dimensionality reduction algorithms on very low dimensions in terms of generalization performance on test data. The proposed model also obtains the best results on an image recognition task in terms of classification and retrieval performances.

  1. Bayesian Geostatistical Design

    DEFF Research Database (Denmark)

    Diggle, Peter; Lophaven, Søren Nymand

    2006-01-01

    This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling...... locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...... parameter values are unknown. The results show that in this situation a wide range of interpoint distances should be included in the design, and the widely used regular design is often not the best choice....

  2. A combined evidence Bayesian method for human ancestry inference applied to Afro-Colombians.

    Science.gov (United States)

    Rishishwar, Lavanya; Conley, Andrew B; Vidakovic, Brani; Jordan, I King

    2015-12-15

    Uniparental genetic markers, mitochondrial DNA (mtDNA) and Y chromosomal DNA, are widely used for the inference of human ancestry. However, the resolution of ancestral origins based on mtDNA haplotypes is limited by the fact that such haplotypes are often found to be distributed across wide geographical regions. We have addressed this issue here by combining two sources of ancestry information that have typically been considered separately: historical records regarding population origins and genetic information on mtDNA haplotypes. To combine these distinct data sources, we applied a Bayesian approach that considers historical records, in the form of prior probabilities, together with data on the geographical distribution of mtDNA haplotypes, formulated as likelihoods, to yield ancestry assignments from posterior probabilities. This combined evidence Bayesian approach to ancestry assignment was evaluated for its ability to accurately assign sub-continental African ancestral origins to Afro-Colombians based on their mtDNA haplotypes. We demonstrate that the incorporation of historical prior probabilities via this analytical framework can provide for substantially increased resolution in sub-continental African ancestry assignment for members of this population. In addition, a personalized approach to ancestry assignment that involves the tuning of priors to individual mtDNA haplotypes yields even greater resolution for individual ancestry assignment. Despite the fact that Colombia has a large population of Afro-descendants, the ancestry of this community has been understudied relative to populations with primarily European and Native American ancestry. Thus, the application of the kind of combined evidence approach developed here to the study of ancestry in the Afro-Colombian population has the potential to be impactful. The formal Bayesian analytical framework we propose for combining historical and genetic information also has the potential to be widely applied
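
    At its core, the combined-evidence scheme described above is a discrete Bayes update: historical records supply the prior over candidate source regions, and the geographical frequency of the observed mtDNA haplotype supplies the likelihood. The region names and numbers below are invented solely to illustrate the computation.

```python
def posterior_ancestry(prior, haplotype_freq):
    """Combine historical priors with haplotype likelihoods via Bayes' rule.

    prior:          {region: P(region)} from historical records
    haplotype_freq: {region: P(haplotype | region)} from reference data
    Returns the normalized posterior {region: P(region | haplotype)}.
    """
    unnorm = {r: prior[r] * haplotype_freq.get(r, 0.0) for r in prior}
    total = sum(unnorm.values())
    return {r: v / total for r, v in unnorm.items()}

# Hypothetical example: a haplotype observed in two candidate regions,
# with historical records favouring region A as a source population.
prior = {"region_A": 0.7, "region_B": 0.3}
likelihood = {"region_A": 0.02, "region_B": 0.05}
post = posterior_ancestry(prior, likelihood)
# The historical prior keeps region_A competitive even though the
# haplotype is rarer there.
```
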

  3. EXONEST: The Bayesian Exoplanetary Explorer

    Directory of Open Access Journals (Sweden)

    Kevin H. Knuth

    2017-10-01

    Full Text Available The fields of astronomy and astrophysics are currently engaged in an unprecedented era of discovery as recent missions have revealed thousands of exoplanets orbiting other stars. While the Kepler Space Telescope mission has enabled most of these exoplanets to be detected by identifying transiting events, exoplanets often exhibit additional photometric effects that can be used to improve the characterization of exoplanets. The EXONEST Exoplanetary Explorer is a Bayesian exoplanet inference engine based on nested sampling and originally designed to analyze archived Kepler Space Telescope and CoRoT (Convection Rotation et Transits planétaires exoplanet mission data. We discuss the EXONEST software package and describe how it accommodates plug-and-play models of exoplanet-associated photometric effects for the purpose of exoplanet detection, characterization and scientific hypothesis testing. The current suite of models allows for both circular and eccentric orbits in conjunction with photometric effects, such as the primary transit and secondary eclipse, reflected light, thermal emissions, ellipsoidal variations, Doppler beaming and superrotation. We discuss our new efforts to expand the capabilities of the software to include more subtle photometric effects involving reflected and refracted light. We discuss the EXONEST inference engine design and introduce our plans to port the current MATLAB-based EXONEST software package over to the next generation Exoplanetary Explorer, which will be a Python-based open source project with the capability to employ third-party plug-and-play models of exoplanet-related photometric effects.

  4. Bayesian adaptive methods for clinical trials

    CERN Document Server

    Berry, Scott M; Muller, Peter

    2010-01-01

    Already popular in the analysis of medical device trials, adaptive Bayesian designs are increasingly being used in drug development for a wide variety of diseases and conditions, from Alzheimer's disease and multiple sclerosis to obesity, diabetes, hepatitis C, and HIV. Written by leading pioneers of Bayesian clinical trial designs, Bayesian Adaptive Methods for Clinical Trials explores the growing role of Bayesian thinking in the rapidly changing world of clinical trial analysis. The book first summarizes the current state of clinical trial design and analysis and introduces the main ideas and potential benefits of a Bayesian alternative. It then gives an overview of basic Bayesian methodological and computational tools needed for Bayesian clinical trials. With a focus on Bayesian designs that achieve good power and Type I error, the next chapters present Bayesian tools useful in early (Phase I) and middle (Phase II) clinical trials as well as two recent Bayesian adaptive Phase II studies: the BATTLE and ISP...

  5. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  6. Assessment of GHG Emission Reduction Potential from Source-separated Organic Waste (SOW) Management: Case Study in a Higher Educational Institution in Malaysia

    International Nuclear Information System (INIS)

    Ng, C.G.; Sumiani Yusoff

    2015-01-01

    In Malaysia, greenhouse gas (GHG) emission reductions via composting of source-separated organic waste (SOW) in municipal solid waste (MSW) have not been assessed. Assessment of GHG emission reductions via composting of SOW is important, as environmental impacts from waste management are waste-specific and location-specific. This study presents a case study of potential carbon reduction via composting of SOW in the University of Malaya (UM). In this study, a series of calculations was used to evaluate the GHG emissions of different SOW management scenarios. The calculations, based on IPCC calculation methods (AM0025), include GHG emissions from landfilling, fuel consumption in transportation, and SOW composting activity. The methods were applied to assess the GHG emissions from five alternative SOW management scenarios in UM. In the baseline scenario (S0), a total of 1,636.18 tCO2e was generated. In conjunction with the targeted 22% recycling rate, as shown in S1, a 14% reduction in potential GHG emissions can be achieved. The carbon reduction can be further enhanced by increasing the SOW composting capacity. The net GHG emissions for S1, S2, S3 and S4 were 1,399.52, 1,161.29, 857.70 and 1,060.48 tCO2e, respectively. In general, waste diversion for composting yielded a significant net GHG emission reduction, as shown in S3 (47 %), S4 (35 %) and S2 (29 %). Despite the emissions due to direct on-site activity, the significant reduction in methane generation at landfill reduced the net GHG emission. The emission source of each scenario was studied and analysed. (author)
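
    The percentage reductions quoted for the scenarios follow directly from the net-emission totals given in the abstract; a quick check of the arithmetic (values in tCO2e):

```python
baseline = 1636.18  # S0 net GHG emission, tCO2e
scenarios = {"S1": 1399.52, "S2": 1161.29, "S3": 857.70, "S4": 1060.48}

# Net GHG reduction of each scenario relative to the baseline S0.
reduction = {name: (baseline - net) / baseline
             for name, net in scenarios.items()}
# S1 ~ 14%, S2 ~ 29%, S3 ~ 48%, S4 ~ 35%
# (the abstract reports S3 as 47%; the totals compute to 47.6%)
```
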

  7. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management, and environmental studies, as well as students in advanced undergraduate statistics. The book opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  8. Bayesian image restoration, using configurations

    OpenAIRE

    Thorarinsdottir, Thordis

    2006-01-01

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the re...

  9. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    Science.gov (United States)

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models. It can be called from the command line, R, Python, Matlab, or Julia, and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  10. Life-Cycle Cost and Environmental Assessment of Decentralized Nitrogen Recovery Using Ion Exchange from Source-Separated Urine through Spatial Modeling.

    Science.gov (United States)

    Kavvada, Olga; Tarpeh, William A; Horvath, Arpad; Nelson, Kara L

    2017-11-07

    Nitrogen standards for discharge of wastewater effluent into aquatic bodies are becoming more stringent, requiring some treatment plants to reduce effluent nitrogen concentrations. This study aimed to assess, from a life-cycle perspective, an innovative decentralized approach to nitrogen recovery: ion exchange of source-separated urine. We modeled an approach in which nitrogen from urine at individual buildings is sorbed onto resins, then transported by truck to regeneration and fertilizer production facilities. To provide insight into impacts from transportation, we enhanced the traditional economic and environmental assessment approach by combining spatial analysis, system-scale evaluation, and detailed last-mile logistics modeling using the city of San Francisco as an illustrative case study. The major contributor to energy intensity and greenhouse gas (GHG) emissions was the production of sulfuric acid to regenerate resins, rather than transportation. Energy and GHG emissions were not significantly sensitive to the number of regeneration facilities. Cost, however, increased with decentralization as rental costs per unit area are higher for smaller areas. The metrics assessed (unit energy, GHG emissions, and cost) were not significantly influenced by facility location in this high-density urban area. We determined that this decentralized approach has lower cost, unit energy, and GHG emissions than centralized nitrogen management via nitrification-denitrification if fertilizer production offsets are taken into account.

  11. Accurate phenotyping: Reconciling approaches through Bayesian model averaging.

    Directory of Open Access Journals (Sweden)

    Carla Chia-Ming Chen

    Full Text Available Genetic research into complex diseases is frequently hindered by a lack of clear biomarkers for phenotype ascertainment. Phenotypes for such diseases are often identified on the basis of clinically defined criteria; however, such criteria may not be suitable for understanding the genetic composition of the diseases. Various statistical approaches have been proposed for phenotype definition; however, our previous studies have shown that differences in phenotypes estimated using different approaches have a substantial impact on subsequent analyses. Instead of obtaining results based upon a single model, we propose a new method, using Bayesian model averaging to overcome problems associated with phenotype definition. Although Bayesian model averaging has been used in other fields of research, this is the first study that uses Bayesian model averaging to reconcile phenotypes obtained using multiple models. We illustrate the new method by applying it to simulated genetic and phenotypic data for Kofendred personality disorder, an imaginary disease with several sub-types. Two separate statistical methods were used to identify clusters of individuals with distinct phenotypes: latent class analysis and grade of membership. Bayesian model averaging was then used to combine the two clusterings for the purpose of subsequent linkage analyses. We found that causative genetic loci for the disease produced higher LOD scores using model averaging than under either individual model separately. We attribute this improvement to consolidation of the cores of phenotype clusters identified using each individual method.
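
    A minimal sketch of the model-averaging step, assuming BIC-based posterior model weights (w_i proportional to exp(-BIC_i / 2)); the BIC values and membership probabilities below are hypothetical, and the paper's actual weighting scheme may differ:

```python
import math

# Combine two competing phenotype models by Bayesian model averaging,
# with posterior model weights approximated from BIC as w_i ∝ exp(-BIC_i / 2).
def bma_weights(bics):
    ref = min(bics)                       # subtract the minimum for numerical stability
    raw = [math.exp(-(b - ref) / 2.0) for b in bics]
    total = sum(raw)
    return [r / total for r in raw]

# Hypothetical BIC values: latent class analysis vs grade of membership
weights = bma_weights([1520.3, 1524.7])

# Model-averaged cluster-membership probability for one individual,
# given (hypothetical) per-model membership probabilities 0.9 and 0.7:
p_cluster = weights[0] * 0.9 + weights[1] * 0.7
```

    The better-scoring model dominates the average but the weaker model still contributes, which is the "consolidation" effect the abstract describes.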

  12. The Bayesian Approach to Association

    Science.gov (United States)

    Arora, N. S.

    2017-12-01

    The Bayesian approach to association focuses mainly on quantifying the physics of the domain. In the case of seismic association, for instance, let X be the set of all significant events (above some threshold) and their attributes, such as location, time, and magnitude; Y1 be the set of detections that are caused by significant events and their attributes, such as seismic phase, arrival time, amplitude, etc.; Y2 be the set of detections that are not caused by significant events; and finally Y be the set of observed detections. We would now define the joint distribution P(X, Y1, Y2, Y) = P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2), where the last term simply states that Y1 and Y2 are a partitioning of Y. Given the above joint distribution, the inference problem is simply to find the X, Y1, and Y2 that maximize the posterior probability P(X, Y1, Y2 | Y), which reduces to maximizing P(X) P(Y1 | X) P(Y2) I(Y = Y1 + Y2). In this expression P(X) captures our prior belief about event locations. P(Y1 | X) captures notions of travel time and residual error distributions as well as detection and mis-detection probabilities, while P(Y2) captures the false detection rate of our seismic network. The elegance of this approach is that all of the assumptions are stated clearly in the model for P(X), P(Y1 | X) and P(Y2). The implementation of the inference is merely a by-product of this model. In contrast, some of the other methods, such as GA, hide a number of assumptions in the implementation details of the inference, such as the so-called "driver cells." The other important aspect of this approach is that all seismic knowledge, including knowledge from other domains such as infrasound and hydroacoustics, can be included in the same model, so we don't need to separately account for misdetections or merge seismic and infrasound events as a separate step.
Finally, it should be noted that the objective of automatic association is to simplify the job of humans who are publishing seismic bulletins based on this
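
    The scoring implied by the factorization P(X) P(Y1 | X) P(Y2) can be sketched on a toy example. The Gaussian travel-time residual model, the per-detection false-alarm penalty, and all numeric values below are illustrative assumptions, not the operational models:

```python
import math

# Toy version of the association score: log P(X) + log P(Y1 | X) + log P(Y2).
def log_gauss(x, mu, sigma):
    return -0.5 * math.log(2 * math.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def association_score(event_time, assoc_arrivals, n_false,
                      travel_time=100.0, sigma=2.0, false_rate=0.1):
    log_prior = 0.0                                   # flat prior over event time
    # P(Y1 | X): Gaussian travel-time residuals for associated detections
    log_lik = sum(log_gauss(t, event_time + travel_time, sigma)
                  for t in assoc_arrivals)
    # P(Y2): per-detection penalty for unassociated (false) detections
    log_false = n_false * math.log(false_rate)
    return log_prior + log_lik + log_false

# A hypothesis that associates both arrivals with the event scores higher than
# one that labels the second arrival a false detection:
h1 = association_score(0.0, [100.5, 99.2], n_false=0)
h2 = association_score(0.0, [100.5], n_false=1)
```

    Maximizing this score over candidate partitions of the detections is exactly the MAP inference described above, with every modeling assumption visible in the three terms.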

  13. Bayesian Modeling of the Assimilative Capacity Component of Stream Nutrient Export

    Science.gov (United States)

    Implementing stream restoration techniques and best management practices to reduce nonpoint source nutrients implies enhancement of the assimilative capacity for the stream system. In this paper, a Bayesian method for evaluating this component of a TMDL load capacity is developed...

  14. Separating Underdetermined Convolutive Speech Mixtures

    DEFF Research Database (Denmark)

    Pedersen, Michael Syskind; Wang, DeLiang; Larsen, Jan

    2006-01-01

    A method for underdetermined blind source separation of convolutive mixtures is presented. The proposed framework is applicable for separation of instantaneous as well as convolutive speech mixtures. It is possible to iteratively extract each speech signal from the mixture by combining blind source separation...

  15. UWIS isotope separator

    International Nuclear Information System (INIS)

    Wojtasiewicz, A.

    1997-01-01

    Since 1995 the University of Warsaw Isotope Separator group has participated in the ISOL/IGISOL project at the Heavy Ion Cyclotron. This project consists of the installation of an isotope separator (on line with the cyclotron heavy ion beam) with a hot plasma ion source (ISOL system) and/or with an ion guide source (IGISOL system). In the report, a short description of the present status of the project is presented

  16. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
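
    The closed-form Gaussian posterior underlying such a linearized inversion can be sketched for a generic linear model d = Gm + e with Gaussian prior m ~ N(mu, Sigma_m) and noise e ~ N(0, Sigma_e); the matrices below are toy values, not AVO operators:

```python
import numpy as np

# Explicit posterior expectation and covariance for a linear-Gaussian model.
def gaussian_posterior(G, d, mu, Sigma_m, Sigma_e):
    S = G @ Sigma_m @ G.T + Sigma_e               # data covariance
    K = Sigma_m @ G.T @ np.linalg.inv(S)          # gain matrix
    mu_post = mu + K @ (d - G @ mu)
    Sigma_post = Sigma_m - K @ G @ Sigma_m
    return mu_post, Sigma_post

G = np.array([[1.0, 0.5], [0.2, 1.0]])            # toy forward operator
mu = np.zeros(2)
Sigma_m = np.eye(2)                               # prior covariance
Sigma_e = 0.01 * np.eye(2)                        # near-noise-free data
d = G @ np.array([0.3, -0.1])                     # data from a "true" model
mu_post, Sigma_post = gaussian_posterior(G, d, mu, Sigma_m, Sigma_e)
```

    As in the synthetic tests described above, the posterior mean approaches the true parameters as the noise level approaches zero, and Sigma_post yields exact prediction intervals under the specified model.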

  17. Bayesian microsaccade detection

    Science.gov (United States)

    Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji

    2017-01-01

    Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
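
    A generative sketch of the kind of model BMD inverts, assuming a two-state hidden process (drift vs microsaccade) driving a random walk with state-dependent step size; the switching rate and step scales are made-up values, and the zero-mean steps are a simplification of the paper's biased walk:

```python
import random

# Simulate an eye-position trace from a two-state hidden model:
# state 0 = drift (small steps), state 1 = microsaccade (large steps).
def simulate_eye_trace(n_samples, p_switch=0.02, drift_sd=0.01,
                       saccade_sd=0.5, seed=0):
    rng = random.Random(seed)
    state, pos = 0, 0.0
    states, positions = [], []
    for _ in range(n_samples):
        if rng.random() < p_switch:    # hidden state changes at random times
            state = 1 - state
        sd = drift_sd if state == 0 else saccade_sd
        pos += rng.gauss(0.0, sd)      # random-walk step, state-dependent scale
        states.append(state)
        positions.append(pos)
    return states, positions
```

    BMD's inference task is the reverse direction: given only `positions`, sample the posterior over the hidden `states` time series.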

  18. Case studies in Bayesian microbial risk assessments

    Directory of Open Access Journals (Sweden)

    Turner Joanne

    2009-12-01

    Full Text Available Abstract Background The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. Methods We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in two stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). Results We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0, 11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0, 11). In the second

  19. A Dynamic Bi-Orthogonal Field Equation Approach to Efficient Bayesian Inversion

    Directory of Open Access Journals (Sweden)

    Tagade Piyush M.

    2017-06-01

    Full Text Available This paper proposes a novel computationally efficient stochastic spectral projection based approach to Bayesian inversion of a computer simulator with high dimensional parametric and model structure uncertainty. The proposed method is based on the decomposition of the solution into its mean and a random field using a generic Karhunen-Loève expansion. The random field is represented as a convolution of separable Hilbert spaces in stochastic and spatial dimensions that are spectrally represented using respective orthogonal bases. In particular, the present paper investigates generalized polynomial chaos bases for the stochastic dimension and eigenfunction bases for the spatial dimension. Dynamic orthogonality is used to derive closed-form equations for the time evolution of the mean, spatial and stochastic fields. The resultant system of equations consists of a partial differential equation (PDE) that defines the dynamic evolution of the mean, a set of PDEs that define the time evolution of the eigenfunction bases, and a set of ordinary differential equations (ODEs) that define the dynamics of the stochastic field. This system of dynamic evolution equations efficiently propagates the prior parametric uncertainty to the system response. The resulting bi-orthogonal expansion of the system response is used to reformulate the Bayesian inference for efficient exploration of the posterior distribution. The efficacy of the proposed method is investigated for calibration of a 2D transient diffusion simulator with an uncertain source location and diffusivity. The computational efficiency of the method is demonstrated against a Monte Carlo method and a generalized polynomial chaos approach.

  20. A systematic approach to evaluate parameter consistency in the inlet stream of source separated biowaste composting facilities: A case study in Colombia.

    Science.gov (United States)

    Oviedo-Ocaña, E R; Torres-Lozada, P; Marmolejo-Rebellon, L F; Torres-López, W A; Dominguez, I; Komilis, D; Sánchez, A

    2017-04-01

    Biowaste is commonly the largest fraction of municipal solid waste (MSW) in developing countries. Although composting is an effective method to treat source separated biowaste (SSB), there are certain limitations in terms of operation, partly due to insufficient control of the variability of SSB quality, which affects process kinetics and product quality. This study assesses the variability of the SSB physicochemical quality in a composting facility located in a small town of Colombia, in which SSB collection was performed twice a week. Likewise, the influence of the SSB physicochemical variability on the variability of compost parameters was assessed. Parametric and non-parametric tests (i.e. Student's t-test and the Mann-Whitney test) showed no significant differences in the quality parameters of SSB among collection days, and therefore, it was unnecessary to establish specific operation and maintenance regulations for each collection day. Significant variability was found in eight of the twelve quality parameters analyzed in the inlet stream, with corresponding coefficients of variation (CV) higher than 23%. The CVs for the eight parameters analyzed in the final compost (i.e. pH, moisture, total organic carbon, total nitrogen, C/N ratio, total phosphorus, total potassium and ash) ranged from 9.6% to 49.4%, with significant variations in five of those parameters (CV>20%). These results indicate that variability in the inlet stream can affect the variability of the end-product, and suggest the need to consider variability of the inlet stream in the performance of composting facilities to achieve a compost of consistent quality. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Global warming potential of material fractions occurring in source-separated organic household waste treated by anaerobic digestion or incineration under different framework conditions.

    Science.gov (United States)

    Naroznova, Irina; Møller, Jacob; Scheutz, Charlotte

    2016-12-01

    This study compared the environmental profiles of anaerobic digestion (AD) and incineration, in relation to global warming potential (GWP), for treating individual material fractions that may occur in source-separated organic household waste (SSOHW). Different framework conditions representative of the European Union member countries were considered. For AD, biogas utilisation with a biogas engine was considered and two potential situations investigated: biogas combustion with (1) combined heat and power production (CHP) and (2) electricity production only. For incineration, four technology options currently available in Europe were covered: (1) an average incinerator with CHP production, (2) an average incinerator with mainly electricity production, (3) an average incinerator with mainly heat production and (4) a state-of-the-art incinerator with CHP working at high energy recovery efficiencies. The study was performed using a life cycle assessment in its consequential approach. Furthermore, the role of waste-sorting guidelines (defined by the material fractions allowed for SSOHW) in relation to the GWP of treating overall SSOHW with AD was investigated. A case study of treating 1 tonne of SSOHW under framework conditions in Denmark was conducted. Under the given assumptions, vegetable food waste was the only material fraction that was always better for AD compared to incineration. For animal food waste, kitchen tissue, vegetation waste and dirty paper, AD utilisation was better unless it was compared to a highly efficient incinerator. Material fractions such as moulded fibres and dirty cardboard were attractive for AD, albeit only when AD with CHP and incineration with mainly heat production were compared. Animal straw, in contrast, was always better to incinerate. Considering the total amounts of individual material fractions in waste generated within households in Denmark, food waste (both animal and vegetable derived) and kitchen tissue are the main material

  2. Isotope separation

    International Nuclear Information System (INIS)

    Ravoire, Jean

    1978-11-01

    The separation of isotopes is treated in a general way, with special reference to the production of enriched uranium. Uses of separated isotopes are presented briefly. Then basic definitions and theoretical concepts are explained: isotopic effects, non-statistical and statistical processes, reversible and irreversible processes, separation factor, enrichment, cascades, isotopic separative work, and thermodynamics. Afterwards the main processes and productions are reviewed. Finally, the economic and industrial aspects of uranium enrichment are summarized [fr]

  3. Kernel Bayesian ART and ARTMAP.

    Science.gov (United States)

    Masuyama, Naoki; Loo, Chu Kiong; Dawood, Farhan

    2018-02-01

    Adaptive Resonance Theory (ART) is one of the successful approaches to resolving "the plasticity-stability dilemma" in neural networks, and its supervised learning model, ARTMAP, is a powerful tool for classification. Among several improvements, such as Fuzzy- or Gaussian-based models, the state-of-the-art model is the Bayesian-based one, which resolves the drawbacks of the others. However, the Bayesian approach is known to require high computational cost for high-dimensional data and large numbers of data points, and the covariance matrix in the likelihood becomes unstable. This paper introduces Kernel Bayesian ART (KBA) and ARTMAP (KBAM) by integrating Kernel Bayes' Rule (KBR) and the Correntropy Induced Metric (CIM) into Bayesian ART (BA) and ARTMAP (BAM), respectively, while maintaining the properties of BA and BAM. The kernel frameworks in KBA and KBAM are able to avoid the curse of dimensionality. In addition, the covariance-free Bayesian computation by KBR provides efficient and stable computational capability to KBA and KBAM. Furthermore, correntropy-based similarity measurement improves the noise reduction ability even in high-dimensional space. Simulation experiments show that KBA exhibits better self-organizing capability than BA, and KBAM provides superior classification ability compared with BAM. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Bayesian Extraction of Deep UV Resonance Raman Signature of Fibrillar Cross-β Sheet Core based on H-D Exchange Data

    Science.gov (United States)

    Shashilov, V. A.; Lednev, I. K.

    2007-11-01

    Amyloid fibrils are associated with many neurodegenerative diseases. The application of conventional biophysical techniques including solution NMR and X-ray crystallography for structural characterization of fibrils is limited because they are neither crystalline nor soluble. The Bayesian approach was utilized for extracting the deep UV resonance Raman (DUVRR) spectrum of the lysozyme fibrillar β-sheet based on the hydrogen-deuterium exchange spectral data. The problem was shown to be unsolvable when using blind source separation or conventional chemometrics methods because of the 100% correlation of the concentration profiles of the species under study. Information about the mixing process was incorporated by forcing the columns of the concentration matrix to be proportional to the expected concentration profiles. The ill-conditioning of the matrix was removed by concatenating it to the diagonal matrix with entries corresponding to the known pure spectra (sources). Prior information about the spectral features and characteristic bands of the spectra was taken into account using the Bayesian signal dictionary approach. The extracted DUVRR spectrum of the cross-β sheet core exhibited sharp bands indicating the highly ordered structure. Well resolved sub-bands in Amide I and Amide III regions enabled us to assign the fibril core structure to anti-parallel β-sheet and estimate the amide group facial angle Ψ in the cross-β structure. The elaborated Bayesian approach was demonstrated to be applicable for studying correlated biochemical processes.

  5. BATSE gamma-ray burst line search. 2: Bayesian consistency methodology

    Science.gov (United States)

    Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.

    1994-01-01

    We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.

  6. Flood quantile estimation at ungauged sites by Bayesian networks

    Science.gov (United States)

    Mediero, L.; Santillán, D.; Garrote, L.

    2012-04-01

    Estimating flood quantiles at a site for which no observed measurements are available is essential for water resources planning and management. Ungauged sites have no observations of the magnitude of floods, but some site and basin characteristics are known. The most common technique used is multiple regression analysis, which relates physical and climatic basin characteristics to flood quantiles. Regression equations are fitted from flood frequency data and basin characteristics at gauged sites. Regression equations are a rigid technique that assumes linear relationships between variables and cannot take measurement errors into account. In addition, the prediction intervals are estimated in a very simplistic way from the variance of the residuals in the estimated model. Bayesian networks are a probabilistic computational structure taken from the field of Artificial Intelligence, which has been widely and successfully applied to many scientific fields like medicine and informatics, but application to the field of hydrology is recent. Bayesian networks infer the joint probability distribution of several related variables from observations through nodes, which represent random variables, and links, which represent causal dependencies between them. A Bayesian network is more flexible than regression equations, as it captures non-linear relationships between variables. In addition, the probabilistic nature of Bayesian networks allows the different sources of estimation uncertainty to be taken into account, as they give a probability distribution as the result. A homogeneous region in the Tagus Basin was selected as a case study. A regression equation was fitted taking the basin area, the annual maximum 24-hour rainfall for a given recurrence interval and the mean height as explanatory variables. Flood quantiles at ungauged sites were estimated by Bayesian networks. Bayesian networks need to be learnt from a sufficiently large data set. As observational data are reduced, a

  7. Bayesian analysis of CCDM models

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, J.F. [Universidade Estadual Paulista (Unesp), Câmpus Experimental de Itapeva, Rua Geraldo Alckmin 519, Vila N. Sra. de Fátima, Itapeva, SP, 18409-010 Brazil (Brazil); Valentim, R. [Departamento de Física, Instituto de Ciências Ambientais, Químicas e Farmacêuticas—ICAQF, Universidade Federal de São Paulo (UNIFESP), Unidade José Alencar, Rua São Nicolau No. 210, Diadema, SP, 09913-030 Brazil (Brazil); Andrade-Oliveira, F., E-mail: jfjesus@itapeva.unesp.br, E-mail: valentim.rodolfo@unifesp.br, E-mail: felipe.oliveira@port.ac.uk [Institute of Cosmology and Gravitation—University of Portsmouth, Burnaby Road, Portsmouth, PO1 3FX United Kingdom (United Kingdom)

    2017-09-01

    Creation of Cold Dark Matter (CCDM), in the context of the Einstein Field Equations, produces a negative pressure term which can be used to explain the accelerated expansion of the Universe. In this work we tested six different spatially flat models for matter creation using statistical criteria, in light of SNe Ia data: the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC) and the Bayesian Evidence (BE). These criteria allow models to be compared considering goodness of fit and the number of free parameters, penalizing excess complexity. We find that the JO model is slightly favoured over the LJO/ΛCDM model; however, neither of these, nor the Γ = 3αH_0 model, can be discarded from the current analysis. Three other scenarios are discarded either because of poor fitting or because of an excess of free parameters. A method of increasing the Bayesian evidence through reparameterization, in order to reduce parameter degeneracy, is also developed.
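
    The AIC and BIC comparisons can be sketched directly from their definitions (AIC = 2k - 2 lnL, BIC = k ln(n) - 2 lnL, lower is better); the log-likelihoods, parameter counts and sample size below are hypothetical, not the paper's fits:

```python
import math

# Information criteria for model comparison: k = number of free parameters,
# log_lik = maximized log-likelihood, n = number of data points.
def aic(log_lik, k):
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    return k * math.log(n) - 2 * log_lik

# Hypothetical fits: a 2-parameter model vs a marginally better 4-parameter one
m1 = {"log_lik": -280.1, "k": 2}
m2 = {"log_lik": -279.5, "k": 4}
n_sne = 580                       # e.g. the size of a SNe Ia sample
```

    For n > e^2, BIC penalizes each extra parameter more heavily than AIC does, which is how a small improvement in fit can still leave the more complex model disfavoured.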

  8. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  9. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  10. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary...... configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  11. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary...... configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  12. Bayesian variable selection in regression

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, T.J.; Beauchamp, J.J.

    1987-01-01

    This paper is concerned with the selection of subsets of ''predictor'' variables in a linear regression model for the prediction of a ''dependent'' variable. We take a Bayesian approach and assign a probability distribution to the dependent variable through a specification of prior distributions for the unknown parameters in the regression model. The appropriate posterior probabilities are derived for each submodel and methods are proposed for evaluating the family of prior distributions. Examples are given that show the application of the Bayesian methodology. 23 refs., 3 figs.
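
    A toy version of this subset-selection idea can be sketched by scoring each submodel with BIC as a stand-in for its marginal likelihood; the predictor names and residual sums of squares below are invented, and the paper's actual priors are more elaborate:

```python
import math

# Hypothetical residual sums of squares for each predictor subset.
n = 50
rss = {(): 120.0, ('x1',): 45.0, ('x2',): 110.0, ('x1', 'x2'): 44.0}

def bic_score(rss_m, k):
    # Gaussian BIC up to an additive constant: n ln(RSS/n) + (k+1) ln n
    return n * math.log(rss_m / n) + (k + 1) * math.log(n)

scores = {m: bic_score(r, len(m)) for m, r in rss.items()}
# Equal prior model probabilities; exp(-BIC/2) approximates the
# marginal likelihood, so normalizing gives posterior model probabilities.
weights = {m: math.exp(-0.5 * s) for m, s in scores.items()}
total = sum(weights.values())
post = {m: w / total for m, w in weights.items()}
best = max(post, key=post.get)   # here ('x1',): x2 adds little beyond x1
```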

  13. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees a...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....... and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last...

  14. Bayesian methods for proteomic biomarker development

    Directory of Open Access Journals (Sweden)

    Belinda Hernández

    2015-12-01

    In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
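
    The simplest worked example of the Bayesian updating such tutorials walk through is the conjugate Beta-Binomial model; a sketch with invented validation counts (not the review's data):

```python
# Prior Beta(a, b) over a biomarker's sensitivity, updated after
# observing s positives in n diseased cases (numbers are illustrative).
a, b = 1.0, 1.0            # uniform prior
s, n = 18, 20              # hypothetical validation outcomes
a_post, b_post = a + s, b + (n - s)
posterior_mean = a_post / (a_post + b_post)
# The posterior mean 19/22 shrinks the raw rate 18/20 toward the prior.
```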

  15. Bayesian variable order Markov models: Towards Bayesian predictive state representations

    NARCIS (Netherlands)

    Dimitrakakis, C.

    2009-01-01

    We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more

  16. The humble Bayesian : Model checking from a fully Bayesian perspective

    NARCIS (Netherlands)

    Morey, Richard D.; Romeijn, Jan-Willem; Rouder, Jeffrey N.

    Gelman and Shalizi (2012) criticize what they call the usual story in Bayesian statistics: that the distribution over hypotheses or models is the sole means of statistical inference, thus excluding model checking and revision, and that inference is inductivist rather than deductivist. They present

  17. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  18. Bayesian models in cognitive neuroscience: A tutorial

    NARCIS (Netherlands)

    O'Reilly, J.X.; Mars, R.B.

    2015-01-01

    This chapter provides an introduction to Bayesian models and their application in cognitive neuroscience. The central feature of Bayesian models, as opposed to other classes of models, is that Bayesian models represent the beliefs of an observer as probability distributions, allowing them to

  19. A Bayesian framework for risk perception

    NARCIS (Netherlands)

    van Erp, H.R.N.

    2017-01-01

    We present here a Bayesian framework of risk perception. This framework encompasses plausibility judgments, decision making, and question asking. Plausibility judgments are modeled by way of Bayesian probability theory, decision making is modeled by way of a Bayesian decision theory, and relevancy

  20. Optimal frequency separation of power sources by multivariable LPV/Hinf control: application to on-board energy management systems of electric vehicles

    OpenAIRE

    Nwesaty, Waleed; Bratcu, Antoneta Iuliana; Sename, Olivier

    2014-01-01

    In this paper, a multivariable LPV/Hinf control approach is applied to design a strategy for power source coordination within a multi-source energy system. Three different kinds of power sources - fuel cell, battery and ultracapacitor - compose the power supply system of an electric vehicle. All sources are current-controlled and paralleled together with their associated DC-DC converters on a common DC-link coupled to the vehicle's electrical motor and its converter. DC-lin...

  1. A separator

    Energy Technology Data Exchange (ETDEWEB)

    Prokopyuk, S.G.; Dyachenko, A.Ye.; Mukhametov, M.N.; Prokopov, O.I.

    1982-01-01

    A separator is proposed which contains slanted separating plates and baffle plates installed at a distance from them and at an acute angle to them. To increase the effectiveness of separating a gas and liquid stream and the throughput, by reducing the secondary entrainment of liquid drops, and to reduce the hydraulic resistance as well, openings are made in the plates. The horizontal projections of each opening from the lower and upper surfaces of the plate do not overlap each other.

  2. Spatiotemporal Bayesian inference dipole analysis for MEG neuroimaging data.

    Science.gov (United States)

    Jun, Sung C; George, John S; Paré-Blagoev, Juliana; Plis, Sergey M; Ranken, Doug M; Schmidt, David M; Wood, C C

    2005-10-15

    Recently, we described a Bayesian inference approach to the MEG/EEG inverse problem that used numerical techniques to estimate the full posterior probability distributions of likely solutions upon which all inferences were based [Schmidt, D.M., George, J.S., Wood, C.C., 1999. Bayesian inference applied to the electromagnetic inverse problem. Human Brain Mapping 7, 195; Schmidt, D.M., George, J.S., Ranken, D.M., Wood, C.C., 2001. Spatial-temporal bayesian inference for MEG/EEG. In: Nenonen, J., Ilmoniemi, R. J., Katila, T. (Eds.), Biomag 2000: 12th International Conference on Biomagnetism. Espoo, Finland, p. 671]. Schmidt et al. (1999) focused on the analysis of data at a single point in time employing an extended region source model. They subsequently extended their work to a spatiotemporal Bayesian inference analysis of the full spatiotemporal MEG/EEG data set. Here, we formulate spatiotemporal Bayesian inference analysis using a multi-dipole model of neural activity. This approach is faster than the extended region model, does not require use of the subject's anatomical information, does not require prior determination of the number of dipoles, and yields quantitative probabilistic inferences. In addition, we have incorporated the ability to handle much more complex and realistic estimates of the background noise, which may be represented as a sum of Kronecker products of temporal and spatial noise covariance components. This reduces the effects of undermodeling noise. In order to reduce the rigidity of the multi-dipole formulation which commonly causes problems due to multiple local minima, we treat the given covariance of the background as uncertain and marginalize over it in the analysis. Markov Chain Monte Carlo (MCMC) was used to sample the many possible likely solutions. The spatiotemporal Bayesian dipole analysis is demonstrated using simulated and empirical whole-head MEG data.
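
    Stripped of the dipole model and the structured noise covariances, the MCMC sampling such analyses rely on reduces in its simplest form to random-walk Metropolis; a one-dimensional toy sketch (nothing here corresponds to the actual MEG posterior):

```python
import math, random

random.seed(0)

def log_post(x):
    # Toy unnormalized log-posterior: a standard normal.
    return -0.5 * x * x

x, samples = 0.0, []
for _ in range(20000):
    prop = x + random.gauss(0.0, 1.0)          # symmetric proposal
    if math.log(random.random()) < log_post(prop) - log_post(x):
        x = prop                               # Metropolis accept step
    samples.append(x)

burned = samples[5000:]                        # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
# For this target the chain's mean should be near 0 and variance near 1.
```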

  3. MERGING DIGITAL SURFACE MODELS IMPLEMENTING BAYESIAN APPROACHES

    Directory of Open Access Journals (Sweden)

    H. Sadeq

    2016-06-01

    In this research, DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian approach. The input data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). A Bayesian approach is preferable when the data obtained from the sensors are limited and many measurements are difficult or costly to obtain; the lack of data can then be mitigated by introducing a priori estimates. To infer the prior data, the roofs of the buildings are assumed to be smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimates, GNSS RTK measurements have been collected in the field, which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West End of Glasgow, which contains different kinds of buildings, such as flat-roofed and hipped-roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was able to improve the quality of the DSMs and some characteristics such as the roof surfaces, which consequently led to better representations. In addition, the developed model has been compared with the well-established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied to DSMs derived from satellite imagery, it can be applied to DSMs from any other source.
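
    For Gaussian errors, the heart of such a Bayesian merge is precision-weighted averaging of the DSM heights with the prior; a sketch with invented heights and variances (the paper's entropy-based prior is more involved):

```python
# Fuse independent Gaussian estimates of one roof height with a prior.
def fuse(estimates):
    # estimates: list of (mean, variance) pairs; the posterior of a
    # product of Gaussians is precision-weighted.
    precision = sum(1.0 / v for _, v in estimates)
    mean = sum(m / v for m, v in estimates) / precision
    return mean, 1.0 / precision

prior = (12.0, 4.0)          # smooth-roof prior: 12 m, variance 4 m^2
dsm_a = (12.8, 0.25)         # heights from the two input DSMs
dsm_b = (12.2, 0.25)
height, var = fuse([prior, dsm_a, dsm_b])
# The merged height sits between the two DSM values, pulled slightly
# by the prior, and its variance is smaller than any single input's.
```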

  4. Separation and capture of CO2 from large stationary sources and sequestration in geological formations--coalbeds and deep saline aquifers.

    Science.gov (United States)

    White, Curt M; Strazisar, Brian R; Granite, Evan J; Hoffman, James S; Pennline, Henry W

    2003-06-01

    The topic of global warming as a result of increased atmospheric CO2 concentration is arguably the most important environmental issue that the world faces today. It is a global problem that will need to be solved on a global level. The link between anthropogenic emissions of CO2 with increased atmospheric CO2 levels and, in turn, with increased global temperatures has been well established and accepted by the world. International organizations such as the United Nations Framework Convention on Climate Change (UNFCCC) and the Intergovernmental Panel on Climate Change (IPCC) have been formed to address this issue. Three options are being explored to stabilize atmospheric levels of greenhouse gases (GHGs) and global temperatures without severely and negatively impacting standard of living: (1) increasing energy efficiency, (2) switching to less carbon-intensive sources of energy, and (3) carbon sequestration. To be successful, all three options must be used in concert. The third option is the subject of this review. Specifically, this review will cover the capture and geologic sequestration of CO2 generated from large point sources, namely fossil-fuel-fired power gasification plants. Sequestration of CO2 in geological formations is necessary to meet the President's Global Climate Change Initiative target of an 18% reduction in GHG intensity by 2012. Further, the best strategy to stabilize the atmospheric concentration of CO2 results from a multifaceted approach where sequestration of CO2 into geological formations is combined with increased efficiency in electric power generation and utilization, increased conservation, increased use of lower carbon-intensity fuels, and increased use of nuclear energy and renewables. This review covers the separation and capture of CO2 from both flue gas and fuel gas using wet scrubbing technologies, dry regenerable sorbents, membranes, cryogenics, pressure and temperature swing adsorption, and other advanced concepts. Existing

  5. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    textabstractPrevious conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  6. Bayesian networks in levee reliability

    NARCIS (Netherlands)

    Roscoe, K.; Hanea, A.

    2015-01-01

    We applied a Bayesian network to a system of levees for which the results of traditional reliability analysis showed high failure probabilities, which conflicted with the intuition and experience of those managing the levees. We made use of forty proven strength observations - high water levels with

  7. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic...... dimensionality. The built classifier is tested on standard and non-standard images...

  8. Computational Neuropsychology and Bayesian Inference.

    Science.gov (United States)

    Parr, Thomas; Rees, Geraint; Friston, Karl J

    2018-01-01

    Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine 'prior' beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology - optimal inference with suboptimal priors - and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient's behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.

  9. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device. We then compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). We then compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (reduced χ² = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in

  10. Isotope separation

    International Nuclear Information System (INIS)

    Eerkens, J.W.

    1979-01-01

    A method of isotope separation is described which involves the use of a laser photon beam to selectively induce energy level transitions of an isotope molecule containing the isotope to be separated. The use of the technique for 235 U enrichment is demonstrated. (UK)

  11. Topics in Bayesian statistics and maximum entropy

    International Nuclear Information System (INIS)

    Mutihac, R.; Cicuttin, A.; Cerdeira, A.; Stanciulescu, C.

    1998-12-01

    Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered the best justification of Bayesian analysis and the maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
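
    The maximum-entropy principle reviewed here has a classic worked example: the least-informative distribution on a die whose mean is constrained to 4.5, found by solving for the Lagrange multiplier by bisection:

```python
import math

faces = range(1, 7)

def mean_for(lam):
    # Mean of the max-ent distribution p_k proportional to exp(lam * k).
    w = [math.exp(lam * k) for k in faces]
    z = sum(w)
    return sum(k * wk for k, wk in zip(faces, w)) / z

# mean_for is increasing in lam; bisect until the mean hits 4.5.
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < 4.5:
        lo = mid
    else:
        hi = mid
lam = 0.5 * (lo + hi)
probs = [math.exp(lam * k) for k in faces]
z = sum(probs)
probs = [p / z for p in probs]   # tilted toward the high faces
```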

  12. A new Bayesian Earthquake Analysis Tool (BEAT)

    Science.gov (United States)

    Vasyura-Bathke, Hannes; Dutta, Rishabh; Jónsson, Sigurjón; Mai, Martin

    2017-04-01

    Modern earthquake source estimation studies increasingly use non-linear optimization strategies to estimate kinematic rupture parameters, often considering geodetic and seismic data jointly. However, the optimization process is complex and consists of several steps that need to be followed in the earthquake parameter estimation procedure. These include pre-describing or modeling the fault geometry, calculating the Green's Functions (often assuming a layered elastic half-space), and estimating the distributed final slip and possibly other kinematic source parameters. Recently, Bayesian inference has become popular for estimating posterior distributions of earthquake source model parameters given measured/estimated/assumed data and model uncertainties. For instance, some research groups consider uncertainties of the layered medium and propagate these to the source parameter uncertainties. Other groups make use of informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed that efficiently explore the often high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational demands of these methods are high and estimation codes are rarely distributed along with the published results. Even if codes are made available, it is often difficult to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results have become essentially impossible.
In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in earthquake source estimations, we undertook the effort of producing BEAT, a python package that comprises all the above-mentioned features in one

  13. Bayesian analysis of rare events

    Science.gov (United States)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
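
    The rejection-sampling reinterpretation behind BUS fits in a few lines; a Gaussian toy problem with a known analytic answer (real rare-event settings replace this loop with FORM, IS or SuS precisely because naive acceptance rates collapse):

```python
import math, random

random.seed(1)

def likelihood(theta, obs=1.0, sigma=0.5):
    # Gaussian likelihood of one observation, bounded above by 1.
    return math.exp(-0.5 * ((obs - theta) / sigma) ** 2)

c = 1.0  # constant with likelihood(theta) <= c, as the method requires
posterior = []
while len(posterior) < 5000:
    theta = random.gauss(0.0, 1.0)            # draw from the prior
    if random.random() * c < likelihood(theta):
        posterior.append(theta)               # accept -> posterior sample

post_mean = sum(posterior) / len(posterior)
# Conjugate answer: posterior mean = (1/0.25) / (1 + 1/0.25) * 1.0 = 0.8
```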

  14. Polytomies and Bayesian phylogenetic inference.

    Science.gov (United States)

    Lewis, Paul O; Holder, Mark T; Holsinger, Kent E

    2005-04-01

    Bayesian phylogenetic analyses are now very popular in systematics and molecular evolution because they allow the use of much more realistic models than currently possible with maximum likelihood methods. There are, however, a growing number of examples in which large Bayesian posterior clade probabilities are associated with very short branch lengths and low values for non-Bayesian measures of support such as nonparametric bootstrapping. For the four-taxon case when the true tree is the star phylogeny, Bayesian analyses become increasingly unpredictable in their preference for one of the three possible resolved tree topologies as data set size increases. This leads to the prediction that hard (or near-hard) polytomies in nature will cause unpredictable behavior in Bayesian analyses, with arbitrary resolutions of the polytomy receiving very high posterior probabilities in some cases. We present a simple solution to this problem involving a reversible-jump Markov chain Monte Carlo (MCMC) algorithm that allows exploration of all of tree space, including unresolved tree topologies with one or more polytomies. The reversible-jump MCMC approach allows prior distributions to place some weight on less-resolved tree topologies, which eliminates misleadingly high posteriors associated with arbitrary resolutions of hard polytomies. Fortunately, assigning some prior probability to polytomous tree topologies does not appear to come with a significant cost in terms of the ability to assess the level of support for edges that do exist in the true tree. Methods are discussed for applying arbitrary prior distributions to tree topologies of varying resolution, and an empirical example showing evidence of polytomies is analyzed and discussed.

  15. Using literature and data to learn Bayesian networks as clinical models of ovarian tumors

    DEFF Research Database (Denmark)

    Antal, P.; Fannes, G.; Timmerman, D.

    2004-01-01

    Thanks to its increasing availability, electronic literature has become a potential source of information for the development of complex Bayesian networks (BN), when human expertise is missing or data is scarce or contains much noise. This opportunity raises the question of how to integrate...... an expert reference and against data scores (the mutual information (MI) and a Bayesian score). Next, we transform the text-based dependency measures into informative text-based priors for Bayesian network structures. Finally, we report the benefit of such informative text-based priors on the performance...

  16. Implementing statistical learning methods through Bayesian networks (Part 2): Bayesian evaluations for results of black toner analyses in forensic document examination.

    Science.gov (United States)

    Biedermann, A; Taroni, F; Bozza, S; Mazzella, W D

    2011-01-30

    This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion will relate to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part in probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
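
    In its simplest Dirichlet-multinomial form, 'learning probabilities from data' is count smoothing; the toner counts below are invented for illustration, not the paper's casework data:

```python
# Posterior category probabilities under a symmetric Dirichlet prior:
# smoothed relative frequencies with one pseudo-count per category.
counts = {'toner_A': 12, 'toner_B': 3, 'toner_C': 0}
alpha = 1.0
total = sum(counts.values()) + alpha * len(counts)
post = {k: (c + alpha) / total for k, c in counts.items()}
# 'toner_C' was never observed yet keeps non-zero probability, so the
# Bayesian network never assigns a hard zero to an unseen source.
```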

  17. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...

  18. CENTRIFUGAL SEPARATORS

    Science.gov (United States)

    Skarstrom, C.

    1959-03-10

    A centrifugal separator is described for separating gaseous mixtures where the temperature gradients, both longitudinally and radially of the centrifuge, may be controlled effectively to produce a maximum separation of the process gases flowing through. The invention provides for the balancing of increases and decreases in temperature in various zones of the centrifuge chamber as the result of compression and expansion, respectively, of process gases, and may be employed effectively both to neutralize harmful temperature gradients and to utilize beneficial temperature gradients within the centrifuge.

  19. Bayesian Calibration of Simultaneity in Audiovisual Temporal Order Judgments

    Science.gov (United States)

    Yamamoto, Shinya; Miyazaki, Makoto; Iwano, Takayuki; Kitazawa, Shigeru

    2012-01-01

    After repeated exposures to two successive audiovisual stimuli presented in one frequent order, participants eventually perceive a pair separated by some lag time in the same order as occurring simultaneously (lag adaptation). In contrast, we previously found that perceptual changes occurred in the opposite direction in response to tactile stimuli, conforming to Bayesian integration theory (Bayesian calibration). We further showed, in theory, that the effect of Bayesian calibration cannot be observed when the lag adaptation was fully operational. This led to the hypothesis that Bayesian calibration affects judgments regarding the order of audiovisual stimuli, but that this effect is concealed behind the lag adaptation mechanism. In the present study, we showed that lag adaptation is pitch-insensitive using two sounds at 1046 and 1480 Hz. This enabled us to cancel lag adaptation by associating one pitch with sound-first stimuli and the other with light-first stimuli. When we presented each type of stimulus (high- or low-tone) in a different block, the point of simultaneity shifted to “sound-first” for the pitch associated with sound-first stimuli, and to “light-first” for the pitch associated with light-first stimuli. These results are consistent with lag adaptation. In contrast, when we delivered each type of stimulus in a randomized order, the point of simultaneity shifted to “light-first” for the pitch associated with sound-first stimuli, and to “sound-first” for the pitch associated with light-first stimuli. The results clearly show that Bayesian calibration is pitch-specific and is at work behind pitch-insensitive lag adaptation during temporal order judgment of audiovisual stimuli. PMID:22792297
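
    Given a fitted probit psychometric function, the point of subjective simultaneity and the JND drop out directly; a sketch with an invented fit (the 75% convention and all numbers are illustrative):

```python
import math

def probit(x, pss, sigma):
    # P('light first') = Phi((x - pss) / sigma), x = light-minus-sound lag.
    return 0.5 * (1.0 + math.erf((x - pss) / (sigma * math.sqrt(2.0))))

pss, sigma = -15.0, 40.0        # ms; hypothetical fitted parameters
z75 = 0.6744897501960817        # Phi^{-1}(0.75)
jnd = sigma * z75               # lag needed to reach 75% 'light first'
```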

  20. A Bayesian on-off analysis of cosmic ray data

    Science.gov (United States)

    Nosek, Dalibor; Nosková, Jana

    2017-09-01

    We deal with the analysis of on-off measurements designed for the confirmation of a weak source of events whose presence is hypothesized, based on former observations. The problem of a small number of source events that are masked by an imprecisely known background is addressed from a Bayesian point of view. We examine three closely related variables, the posterior distributions of which carry relevant information about various aspects of the investigated phenomena. This information is utilized for predictions of further observations, given actual data. Backed by details of detection, we propose how to quantify disparities between different measurements. The usefulness of the Bayesian inference is demonstrated on examples taken from cosmic ray physics.
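
    As a minimal sketch of this style of on-off inference (the counts, the on/off exposure ratio, the flat priors, and the grid bounds below are illustrative assumptions, not values from the paper), one can marginalize an imprecisely known background rate numerically and read off the posterior for the signal rate:

```python
import math

import numpy as np

# Illustrative on-off counts: n_on events in the source ("on") region,
# n_off in a background ("off") region; alpha is the on/off exposure ratio.
n_on, n_off, alpha = 12, 20, 0.5

def pois(k, mu):
    # Poisson probability mass function
    return math.exp(-mu) * mu ** k / math.factorial(k)

# Flat priors on the signal rate s and background rate b, evaluated on grids
s_grid = np.linspace(0.0, 30.0, 301)
b_grid = np.linspace(0.1, 60.0, 600)

post = np.array([
    # Marginalize b: L(s) = sum_b P(n_on | s + alpha*b) P(n_off | b)
    sum(pois(n_on, s + alpha * b) * pois(n_off, b) for b in b_grid)
    for s in s_grid
])
post /= post.sum()

s_mean = float((s_grid * post).sum())       # posterior mean signal rate
p_excess = float(post[s_grid > 2.0].sum())  # P(s > 2 | data)
print(f"E[s | data] = {s_mean:.2f}, P(s > 2 | data) = {p_excess:.2f}")
```

    The same posterior grid also supports the kind of predictive statements the paper discusses, since any function of the signal rate can be averaged over it.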

  1. Separated Shoulder

    Science.gov (United States)

    A separated shoulder is an injury to the ligaments that hold your collarbone (clavicle) to your shoulder blade. In a mild separated shoulder, the ligaments might be stretched rather than torn. Risk factors include participating in contact sports.

  2. Isotopic separation

    International Nuclear Information System (INIS)

    Chen, C.L.

    1979-01-01

    Isotopic species in an isotopic mixture including a first species having a first isotope and a second species having a second isotope are separated by selectively exciting the first species in preference to the second species and then reacting the selectively excited first species with an additional preselected radiation, an electron or another chemical species so as to form a product having a mass different from the original species, and separating the product from the balance of the mixture in a centrifugal separating device such as a centrifuge or an aerodynamic nozzle. In the centrifuge the isotopic mixture is passed into a rotor where it is irradiated through a window. Heavier and lighter components can be withdrawn. The irradiated mixture experiences a large centrifugal force and is separated in a deflection area into lighter and heavier components. (UK)

  3. Isotopic separation

    International Nuclear Information System (INIS)

    Castle, P.M.

    1979-01-01

    This invention relates to molecular and atomic isotope separation and is particularly applicable to the separation of 235U from other uranium isotopes including 238U. In the method described, a desired isotope is separated mechanically from an atomic or molecular beam formed from an isotope mixture, utilising the recoil momenta resulting from selective excitation of the desired isotope species by radiation, followed by ionization or dissociation by radiation or electron attachment. By forming a matrix of UF6 molecules in HBr molecules so as to collapse the ν3 vibrational mode of the UF6 molecule, the 235UF6 molecules are selectively excited to promote reduction of UF6 molecules containing 235U and facilitate separation. (UK)

  4. A computational Bayesian approach to dependency assessment in system reliability

    International Nuclear Information System (INIS)

    Yontay, Petek; Pan, Rong

    2016-01-01

    Due to the increasing complexity of engineered products, it is of great importance to develop a tool to assess reliability dependencies among components and systems under the uncertainty of system reliability structure. In this paper, a Bayesian network approach is proposed for evaluating the conditional probability of failure within a complex system, using a multilevel system configuration. Coupled with Bayesian inference, the posterior distributions of these conditional probabilities can be estimated by combining failure information and expert opinions at both system and component levels. Three data scenarios are considered in this study, and they demonstrate that, with the quantification of the stochastic relationship of reliability within a system, the dependency structure in system reliability can be gradually revealed by the data collected at different system levels.

    Highlights:
    • A Bayesian network representation of system reliability is presented.
    • Bayesian inference methods for assessing dependencies in system reliability are developed.
    • Complete and incomplete data scenarios are discussed.
    • The proposed approach is able to integrate reliability information from multiple sources at multiple levels of the system.
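
    The core computation, a conditional failure probability obtained by combining a system structure with component-level probabilities, can be sketched by exact enumeration over a toy two-component series system (the structure and the numbers are hypothetical, and this ignores the paper's multilevel data-fusion machinery):

```python
import itertools

# Hypothetical two-component series system: the system fails if either
# component fails. Marginal failure probabilities are illustrative.
p_fail = {"A": 0.05, "B": 0.10}

def p_state(state):
    # Joint probability of a component state vector, assuming independence;
    # state maps component -> True (failed) / False (working)
    p = 1.0
    for c, failed in state.items():
        p *= p_fail[c] if failed else 1.0 - p_fail[c]
    return p

def system_failed(state):
    return state["A"] or state["B"]  # series structure

# P(component A failed | system failed) by full enumeration (exact inference)
num = den = 0.0
for a, b in itertools.product([False, True], repeat=2):
    state = {"A": a, "B": b}
    p = p_state(state)
    if system_failed(state):
        den += p
        if a:
            num += p
print(f"P(A failed | system failed) = {num / den:.3f}")
```

    With these numbers the answer is 0.05/0.145 ≈ 0.345; a Bayesian network engine performs the same computation without enumerating every state.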

  5. Separations chemistry

    International Nuclear Information System (INIS)

    Anon.

    1976-01-01

    Results of studies on the photochemistry of aqueous Pu solutions and the stability of iodine in liquid and gaseous CO2 are reported. Progress is reported in studies on: the preparation of macroporous bodies filled with oxides and sulfides to be used as adsorbents; the beneficiation of photographic wastes; the anion exchange adsorption of transition elements from thiosulfate solutions; advanced filtration applications of energy significance; high-resolution separations; and the examination of the separation agents, octylphenylphosphoric acid (OPPA) and trihexyl phosphate (THP).

  6. Product separator

    International Nuclear Information System (INIS)

    Welsh, R.A.; Deurbrouck, A.W.

    1976-01-01

    A description is given of a secondary light sensitive photoelectric product separator for use with a primary product separator that concentrates a material so that it is visually distinguishable from adjacent materials. The concentrate separation is accomplished first by feeding the material onto a vibratory inclined surface with a liquid flow, such as a wet concentrating table. Vibrations generally perpendicular to the stream direction of flow cause the concentrate to separate from its mixture according to its color. When the concentrate and its surrounding stream reach the recovery end of the table, a detecting device notes the line of color demarcation and triggers a signal if it differs from a normal condition. If no difference is noted, nothing moves on the second separator. However, if a difference is detected in the constant monitoring of the color line's location, a product splitter and recovery unit normally positioned near the color line at the recovery end, moves to a new position. In this manner the selected separated concentrate is recovered at a maximum rate regardless of variations in the flow stream or other conditions present

  7. Bayesian non-parametric modelling of Higgs pair production

    Directory of Open Access Journals (Sweden)

    Scarpa Bruno

    2017-01-01

    Statistical classification models are commonly used to separate a signal from a background. In this talk we face the problem of isolating the signal of Higgs pair production using the decay channel in which each boson decays into a pair of b-quarks. Typically in this context non-parametric methods are used, such as Random Forests or different types of boosting tools. We remain in the same non-parametric framework, but we propose to face the problem following a Bayesian approach. A Dirichlet process is used as prior for the random effects in a logit model which is fitted by leveraging the Polya-Gamma data augmentation. Refinements of the model include the insertion in the simple model of P-splines to relate explanatory variables with the response and the use of Bayesian trees (BART) to describe the atoms in the Dirichlet process.

  8. Bayesian Model Averaging for Propensity Score Analysis.

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2014-01-01

    This article considers Bayesian model averaging as a means of addressing uncertainty in the selection of variables in the propensity score equation. We investigate an approximate Bayesian model averaging approach based on the model-averaged propensity score estimates produced by the R package BMA, but that ignores uncertainty in the propensity score. We also provide a fully Bayesian model averaging approach via Markov chain Monte Carlo (MCMC) sampling to account for uncertainty in both parameters and models. A detailed study of our approach examines the differences in the causal estimate when incorporating noninformative versus informative priors in the model averaging stage. We examine these approaches under common methods of propensity score implementation. In addition, we evaluate the impact of changing the size of Occam's window used to narrow down the range of possible models. We also assess the predictive performance of both Bayesian model averaging propensity score approaches and compare it with the case without Bayesian model averaging. Overall, results show that both Bayesian model averaging propensity score approaches recover the treatment effect estimates well and generally provide larger uncertainty estimates, as expected. Both Bayesian model averaging approaches offer slightly better prediction of the propensity score compared with the Bayesian approach with a single propensity score equation. Covariate balance checks for the case study show that both Bayesian model averaging approaches offer good balance. The fully Bayesian model averaging approach also provides posterior probability intervals of the balance indices.

  9. Pedestrian dynamics via Bayesian networks

    Science.gov (United States)

    Venkat, Ibrahim; Khader, Ahamad Tajudin; Subramanian, K. G.

    2014-06-01

    Studies on pedestrian dynamics have vital applications in crowd control management relevant to organizing safer large-scale gatherings, including pilgrimages. Reasoning about pedestrian motion via computational intelligence techniques can be posed as a potential research problem within the realms of Artificial Intelligence. In this contribution, we propose a "Bayesian Network Model for Pedestrian Dynamics" (BNMPD) to reason about the vast uncertainty imposed by pedestrian motion. With reference to key findings from the literature, which include simulation studies, we systematically identify the various factors that could contribute to the prediction of crowd flow status. The proposed model unifies these factors in a cohesive manner using Bayesian Networks (BNs) and serves as a sophisticated probabilistic tool to simulate vital cause-and-effect relationships entailed in the pedestrian domain.

  10. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods, nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples.

  11. BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS

    Directory of Open Access Journals (Sweden)

    Thordis Linda Thorarinsdottir

    2011-05-01

    In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise-free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt-and-pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations, and examples of the performance of the procedure are given.
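
    A minimal sketch of the Bayesian restoration idea (this replaces the paper's configuration-theory prior with a simple neighbor-agreement prior on a 1-D binary signal, so every ingredient below is an illustrative assumption, not the authors' method):

```python
import math
import random

random.seed(1)

# Hypothetical 1-D binary "image" made of runs, observed through
# salt-and-pepper noise that flips each pixel with probability eps.
eps, n, run = 0.15, 6000, 40
truth = [(i // run) % 2 for i in range(n)]
noisy = [1 - x if random.random() < eps else x for x in truth]

beta = 2.0                       # neighbor-agreement prior strength (assumed)
loglike = {                      # log P(y | x) under the flip-noise model
    (0, 0): math.log(1 - eps), (0, 1): math.log(eps),
    (1, 0): math.log(eps),     (1, 1): math.log(1 - eps),
}

# Iterated conditional modes: start at the data, then locally maximize
# log-likelihood plus prior agreement with the current neighbor estimates.
est = noisy[:]
for _ in range(3):
    for i in range(n):
        left, right = est[i - 1], est[(i + 1) % n]
        score = [loglike[(x, noisy[i])] + beta * ((x == left) + (x == right))
                 for x in (0, 1)]
        est[i] = 0 if score[0] >= score[1] else 1

acc = lambda a: sum(int(u == v) for u, v in zip(truth, a)) / n
print(f"accuracy: noisy {acc(noisy):.3f} -> restored {acc(est):.3f}")
```

    The configuration prior in the paper plays the role that the crude neighbor-agreement term plays here: it tells the posterior which local boundary patterns are plausible for a noise-free random set.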

  12. Bayesian Inference on Proportional Elections

    Science.gov (United States)

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
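
    The flavor of such a simulation can be sketched as follows (the party counts, the ten-seat district, and the plain D'Hondt divisor rule are illustrative assumptions; the actual Brazilian seat-distribution rules and the 2010 data differ):

```python
import random

random.seed(42)

# Hypothetical poll counts for four parties (not the 2010 data)
counts = {"A": 480, "B": 300, "C": 150, "D": 70}
seats = 10
n_sim = 2000

def dhondt(votes, seats):
    # Largest-averages (D'Hondt) seat allocation
    won = {p: 0 for p in votes}
    for _ in range(seats):
        p = max(votes, key=lambda q: votes[q] / (won[q] + 1))
        won[p] += 1
    return won

def sample_shares():
    # Posterior over vote shares: Dirichlet(counts + 1) via normalized Gammas
    g = {p: random.gammavariate(c + 1, 1.0) for p, c in counts.items()}
    tot = sum(g.values())
    return {p: v / tot for p, v in g.items()}

# Monte Carlo estimate of P(smallest party wins at least one seat)
hits = sum(dhondt(sample_shares(), seats)["D"] >= 1 for _ in range(n_sim))
print(f"P(party D wins at least one of {seats} seats) ~ {hits / n_sim:.2f}")
```

    The probability of representation is then just the fraction of posterior draws in which the allocation rule awards the party a seat.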

  13. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  14. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.

  15. Separating Gravitational Wave Signals from Instrument Artifacts

    Science.gov (United States)

    Littenberg, Tyson B.; Cornish, Neil J.

    2010-01-01

    Central to the gravitational wave detection problem is the challenge of separating features in the data produced by astrophysical sources from features produced by the detector. Matched filtering provides an optimal solution for Gaussian noise, but in practice, transient noise excursions or "glitches" complicate the analysis. Detector diagnostics and coincidence tests can be used to veto many glitches which may otherwise be misinterpreted as gravitational wave signals. The glitches that remain can lead to long tails in the matched filter search statistics and drive up the detection threshold. Here we describe a Bayesian approach that incorporates a more realistic model for the instrument noise allowing for fluctuating noise levels that vary independently across frequency bands, and deterministic "glitch fitting" using wavelets as "glitch templates", the number of which is determined by a trans-dimensional Markov chain Monte Carlo algorithm. We demonstrate the method's effectiveness on simulated data containing low amplitude gravitational wave signals from inspiraling binary black hole systems, and simulated non-stationary and non-Gaussian noise comprised of a Gaussian component with the standard LIGO/Virgo spectrum, and injected glitches of various amplitude, prevalence, and variety. Glitch fitting allows us to detect significantly weaker signals than standard techniques.

  16. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to selection the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. 
    In the second stage, multi-criteria trade-off analyses are performed between the scores.

  17. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects.

  18. Reliability analysis with Bayesian networks

    OpenAIRE

    Zwirglmaier, Kilian Martin

    2017-01-01

    Bayesian networks (BNs) represent a probabilistic modeling tool with large potential for reliability engineering. While BNs have been successfully applied to reliability engineering, there are remaining issues, some of which are addressed in this work. Firstly, a classification of BN elicitation approaches is proposed. Secondly, two approximate inference approaches, one of which is based on discretization and the other on sampling, are proposed. These approaches are applicable to hybrid/continuous BNs.

  19. Interim Bayesian Persuasion: First Steps

    OpenAIRE

    Perez, Eduardo

    2015-01-01

    This paper makes a first attempt at building a theory of interim Bayesian persuasion. I work in a minimalist model where a low or high type sender seeks validation from a receiver who is willing to validate high types exclusively. After learning her type, the sender chooses a complete conditional information structure for the receiver from a possibly restricted feasible set. I suggest a solution to this game that takes into account the signaling potential of the sender's choice.

  20. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    This allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion. It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years.

  1. Computational Neuropsychology and Bayesian Inference

    Directory of Open Access Journals (Sweden)

    Thomas Parr

    2018-02-01

    Computational theories of brain function have become very influential in neuroscience. They have facilitated the growth of formal approaches to disease, particularly in psychiatric research. In this paper, we provide a narrative review of the body of computational research addressing neuropsychological syndromes, and focus on those that employ Bayesian frameworks. Bayesian approaches to understanding brain function formulate perception and action as inferential processes. These inferences combine ‘prior’ beliefs with a generative (predictive) model to explain the causes of sensations. Under this view, neuropsychological deficits can be thought of as false inferences that arise due to aberrant prior beliefs (that are poor fits to the real world). This draws upon the notion of a Bayes optimal pathology – optimal inference with suboptimal priors – and provides a means for computational phenotyping. In principle, any given neuropsychological disorder could be characterized by the set of prior beliefs that would make a patient’s behavior appear Bayes optimal. We start with an overview of some key theoretical constructs and use these to motivate a form of computational neuropsychology that relates anatomical structures in the brain to the computations they perform. Throughout, we draw upon computational accounts of neuropsychological syndromes. These are selected to emphasize the key features of a Bayesian approach, and the possible types of pathological prior that may be present. They range from visual neglect through hallucinations to autism. Through these illustrative examples, we review the use of Bayesian approaches to understand the link between biology and computation that is at the heart of neuropsychology.

  2. Bayesian methods applied to GWAS.

    Science.gov (United States)

    Fernando, Rohan L; Garrick, Dorian

    2013-01-01

    Bayesian multiple-regression methods are being successfully used for genomic prediction and selection. These regression models simultaneously fit many more markers than the number of observations available for the analysis. Thus, Bayes' theorem is used to combine prior beliefs of marker effects, which are expressed in terms of prior distributions, with information from data for inference. Often, the analyses are too complex for closed-form solutions and Markov chain Monte Carlo (MCMC) sampling is used to draw inferences from posterior distributions. This chapter describes how these Bayesian multiple-regression analyses can be used for GWAS. In most GWAS, false positives are controlled by limiting the genome-wise error rate, which is the probability of one or more false-positive results, to a small value. As the number of tests in GWAS is very large, this results in very low power. Here we show how false positives in Bayesian GWAS can be controlled by limiting the proportion of false-positive results among all positives to some small value. The advantage of this approach is that the power of detecting associations is not inversely related to the number of markers.
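
    The decision rule can be sketched with hypothetical posterior inclusion probabilities (the PIP values and the 10% target below are made up; in a real analysis the PIPs would come from the MCMC samples):

```python
# Hypothetical posterior inclusion probabilities (PIPs) for candidate markers.
# If we declare a set of markers significant, the expected proportion of false
# positives among them is the mean of (1 - PIP); greedily grow the selection
# from the most probable marker down, keeping that expectation under a target.
pips = [0.99, 0.97, 0.90, 0.85, 0.60, 0.40, 0.20, 0.05]
target = 0.10

selected = []
for pip in sorted(pips, reverse=True):
    candidate = selected + [pip]
    bfdr = sum(1 - p for p in candidate) / len(candidate)  # expected FDP
    if bfdr <= target:
        selected = candidate
    else:
        break
print(f"selected {len(selected)} markers, expected false-discovery proportion "
      f"= {sum(1 - p for p in selected) / len(selected):.3f}")
```

    Unlike a genome-wise error rate, this threshold does not tighten as the number of markers grows, which is the power advantage the chapter describes.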

  3. Separable algebras

    CERN Document Server

    Ford, Timothy J

    2017-01-01

    This book presents a comprehensive introduction to the theory of separable algebras over commutative rings. After a thorough introduction to the general theory, the fundamental roles played by separable algebras are explored. For example, Azumaya algebras, the henselization of local rings, and Galois theory are rigorously introduced and treated. Interwoven throughout these applications is the important notion of étale algebras. Essential connections are drawn between the theory of separable algebras and Morita theory, the theory of faithfully flat descent, cohomology, derivations, differentials, reflexive lattices, maximal orders, and class groups. The text is accessible to graduate students who have finished a first course in algebra, and it includes necessary foundational material, useful exercises, and many nontrivial examples.

  4. Corruption of parameter behavior and regionalization by model and forcing data errors: a Bayesian example using the SNOW17 model

    NARCIS (Netherlands)

    He, M.; Hogue, T.S.; Franz, K.J.; Margulis, S.A.; Vrugt, J.A.

    2011-01-01

    The current study evaluates the impacts of various sources of uncertainty involved in hydrologic modeling on parameter behavior and regionalization utilizing different Bayesian likelihood functions and the Differential Evolution Adaptive Metropolis (DREAM) algorithm. The developed likelihood

  5. Isotope separation

    International Nuclear Information System (INIS)

    Bartlett, R.J.; Morrey, J.R.

    1978-01-01

    A method and apparatus is described for separating gas molecules containing one isotope of an element from gas molecules containing other isotopes of the same element in which all of the molecules of the gas are at the same electronic state in their ground state. Gas molecules in a gas stream containing one of the isotopes are selectively excited to a different electronic state while leaving the other gas molecules in their original ground state. Gas molecules containing one of the isotopes are then deflected from the other gas molecules in the stream and thus physically separated

  6. Isotopic separation

    International Nuclear Information System (INIS)

    Chen, C.

    1981-01-01

    Method and apparatus for separating isotopes in an isotopic mixture of atoms or molecules by increasing the mass differential among isotopic species. The mixture containing a particular isotope is selectively irradiated so as to selectively excite the isotope. This preferentially excited species is then reacted rapidly with an additional preselected radiation, an electron or another chemical species so as to form a product containing the specific isotope, but having a mass different than the original species initially containing the particular isotope. The product and the remaining balance of the mixture is then caused to flow through a device which separates the product from the mixture based upon the increased mass differential

  7. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesian foundations.

  8. A Comparative Study of Information-Based Source Number Estimation Methods and Experimental Validations on Mechanical Systems

    Directory of Open Access Journals (Sweden)

    Wei Cheng

    2014-04-01

    This paper investigates one eigenvalue decomposition-based source number estimation method and three information-based source number estimation methods, namely the Akaike Information Criterion (AIC), Minimum Description Length (MDL) and Bayesian Information Criterion (BIC), and improves BIC as an Improved BIC (IBIC) to make it more efficient and easier to calculate. The performances of the abovementioned source number estimation methods are studied comparatively with numerical case studies, which contain a linear superposition case and a case with both linear superposition and nonlinear modulation mixing. A test bed with three sound sources is constructed to test the performances of these methods on mechanical systems, and source separation is carried out to validate the effectiveness of the experimental studies. This work can benefit model order selection, complexity analysis of a system, and applications of source separation to mechanical systems for condition monitoring and fault diagnosis purposes.
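
    A sketch of the eigenvalue-based idea on simulated data (the mixing model, dimensions, and noise level are illustrative assumptions; shown here is the classic MDL rule applied to the sample-covariance spectrum, not the paper's IBIC variant):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 3 sources linearly mixed to 8 sensors with additive noise
m, d, n = 8, 3, 2000
A = rng.normal(size=(m, d))
S = rng.normal(size=(d, n))
X = A @ S + 0.1 * rng.normal(size=(m, n))

# Eigenvalues of the sample covariance, sorted in descending order
eig = np.sort(np.linalg.eigvalsh(X @ X.T / n))[::-1]

def mdl(eig, n):
    # MDL criterion: for each hypothesized source count k, compare the
    # geometric and arithmetic means of the presumed noise eigenvalues
    # and add a complexity penalty; the minimizer estimates the count.
    m = len(eig)
    crit = []
    for k in range(m):
        tail = eig[k:]
        ratio = np.log(tail).mean() - np.log(tail.mean())  # log(GM/AM) <= 0
        crit.append(-n * (m - k) * ratio + 0.5 * k * (2 * m - k) * np.log(n))
    return int(np.argmin(crit))

print("estimated number of sources:", mdl(eig, n))
```

    AIC and BIC follow the same template with different penalty terms, which is why the paper can compare the four criteria on identical spectra.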

  9. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  10. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose join trees have clusters...

  11. Bayesian Posterior Distributions Without Markov Chains

    OpenAIRE

    Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.

    2012-01-01

    Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
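    The transparent rejection sampler described above can be sketched in a few lines. The data below (7 events in 20 trials) are illustrative stand-ins, not the study's case-control counts: propose from the prior, accept in proportion to the likelihood, and the accepted draws are exact samples from the posterior.

```python
import numpy as np

rng = np.random.default_rng(0)
y, n = 7, 20  # illustrative data, not the study's counts

def likelihood(theta):
    return theta**y * (1 - theta)**(n - y)

# Accept with probability L(theta)/L_max, where L_max is the likelihood at the MLE.
L_max = likelihood(y / n)

# Propose from the Uniform(0,1) prior; accepted draws follow the
# Beta(y+1, n-y+1) posterior exactly, with no Markov chain involved.
proposals = rng.uniform(size=200_000)
accept = rng.uniform(size=proposals.size) < likelihood(proposals) / L_max
posterior = proposals[accept]

print(round(float(posterior.mean()), 2))  # exact posterior mean is (y+1)/(n+2) = 8/22 ≈ 0.36
```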

  12. 3rd Bayesian Young Statisticians Meeting

    CERN Document Server

    Lanzarone, Ettore; Villalobos, Isadora; Mattei, Alessandra

    2017-01-01

    This book is a selection of peer-reviewed contributions presented at the third Bayesian Young Statisticians Meeting, BAYSM 2016, Florence, Italy, June 19-21. The meeting provided a unique opportunity for young researchers, M.S. students, Ph.D. students, and postdocs dealing with Bayesian statistics to connect with the Bayesian community at large, to exchange ideas, and to network with others working in the same field. The contributions develop and apply Bayesian methods in a variety of fields, ranging from the traditional (e.g., biostatistics and reliability) to the most innovative ones (e.g., big data and networks).

  13. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned....... An automated procedure for specifying prior distributions for the parameters in a dynamic Bayesian network is presented. It is a simple extension of the procedure for ordinary Bayesian networks. Finally, Wölfer's sunspot numbers are analyzed....

  14. Bayesian network as a modelling tool for risk management in agriculture

    DEFF Research Database (Denmark)

    Rasmussen, Svend; Madsen, Anders Læsø; Lund, Mogens

    In this paper we use Bayesian networks as an integrated modelling approach for representing uncertainty and analysing risk management in agriculture. It is shown how historical farm account data may be efficiently used to estimate conditional probabilities, which are the core elements in Bayesian network models....... We further show how the Bayesian network model RiBay is used for stochastic simulation of farm income, and we demonstrate how RiBay can be used to simulate risk management at the farm level. It is concluded that the key strength of a Bayesian network is the transparency of assumptions......, and that it has the ability to link uncertainty from different external sources to budget figures and to quantify risk at the farm level....

  15. A Bayesian approach for integrating multilevel priors and data for aerospace system reliability assessment

    Directory of Open Access Journals (Sweden)

    Jian GUO

    2018-01-01

    Full Text Available This paper investigates Bayesian methods for aerospace system reliability analysis using various sources of test data and expert knowledge at both subsystem and system levels. Four scenarios based on available information for the priors and test data of a system and/or subsystems are studied using specific Bayesian inference techniques. This paper proposes the Bayesian melding method for integrating subsystem-level priors with system-level priors for both system- and subsystem-level reliability analysis. System and subsystem reliability outcomes are compared under different scenarios. Computational challenges for posterior inferences using the sophisticated Bayesian melding method are addressed using Markov Chain Monte Carlo (MCMC) and adaptive Sampling Importance Re-sampling (SIR) methods. A case study with simulation results illustrates the applications of the proposed methods and provides insights for aerospace system reliability analysis using available multilevel information.
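    The Sampling Importance Re-sampling step mentioned above can be illustrated with a minimal sketch. The Beta(2, 2) prior and the 18-of-20 test record are hypothetical stand-ins for a subsystem's prior and test data, not values from the paper: draw from the prior, weight by the likelihood, then resample in proportion to the weights.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical subsystem reliability problem: prior Beta(2, 2) on
# reliability R, with 18 successes observed in 20 tests.
prior_draws = rng.beta(2, 2, size=100_000)
log_w = 18 * np.log(prior_draws) + 2 * np.log1p(-prior_draws)
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Importance resampling: draw with replacement proportionally to the
# weights; the resampled set approximates the Beta(20, 4) posterior.
posterior = rng.choice(prior_draws, size=20_000, replace=True, p=w)
print(round(float(posterior.mean()), 2))  # exact posterior mean is 20/24 ≈ 0.83
```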

  16. Using literature and data to learn Bayesian networks as clinical models of ovarian tumors

    DEFF Research Database (Denmark)

    Antal, P.; Fannes, G.; Timmerman, D.

    2004-01-01

    Thanks to its increasing availability, electronic literature has become a potential source of information for the development of complex Bayesian networks (BN), when human expertise is missing or data is scarce or contains much noise. This opportunity raises the question of how to integrate...... information from free-text resources with statistical data in learning Bayesian networks. Firstly, we report on the collection of prior information resources in the ovarian cancer domain, which includes "kernel" annotations of the domain variables. We introduce methods based on the annotations and literature...... an expert reference and against data scores (the mutual information (MI) and a Bayesian score). Next, we transform the text-based dependency measures into informative text-based priors for Bayesian network structures. Finally, we report the benefit of such informative text-based priors on the performance...

  17. Evaluating Spatial Variability in Sediment and Phosphorus Concentration-Discharge Relationships Using Bayesian Inference and Self-Organizing Maps

    Science.gov (United States)

    Underwood, Kristen L.; Rizzo, Donna M.; Schroth, Andrew W.; Dewoolkar, Mandar M.

    2017-12-01

    Given the variable biogeochemical, physical, and hydrological processes driving fluvial sediment and nutrient export, the water science and management communities need data-driven methods to identify regions prone to production and transport under variable hydrometeorological conditions. We use Bayesian analysis to segment concentration-discharge linear regression models for total suspended solids (TSS) and particulate and dissolved phosphorus (PP, DP) using 22 years of monitoring data from 18 Lake Champlain watersheds. Bayesian inference was leveraged to estimate segmented regression model parameters and identify threshold position. The identified threshold positions demonstrated a considerable range below and above the median discharge—which has been used previously as the default breakpoint in segmented regression models to discern differences between pre- and post-threshold export regimes. We then applied a Self-Organizing Map (SOM), which partitioned the watersheds into clusters of TSS, PP, and DP export regimes using watershed characteristics, as well as Bayesian regression intercepts and slopes. A SOM defined two clusters of high-flux basins, one where PP flux was predominantly episodic and hydrologically driven; and another in which the sediment and nutrient sourcing and mobilization were more bimodal, resulting from both hydrologic processes at post-threshold discharges and reactive processes (e.g., nutrient cycling or lateral/vertical exchanges of fine sediment) at prethreshold discharges. A separate DP SOM defined two high-flux clusters exhibiting a bimodal concentration-discharge response, but driven by differing land use. Our novel framework shows promise as a tool with broad management application that provides insights into landscape drivers of riverine solute and sediment export.
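    The segmented concentration-discharge model can be sketched on synthetic data. This is a simple least-squares grid search over candidate breakpoints, not the paper's Bayesian estimator, and the slopes, threshold, and noise level are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic log C - log Q relationship with a slope change at an
# unknown threshold (here 0.5), mimicking a two-regime export model.
logQ = np.sort(rng.uniform(-2, 2, 300))
true_bp = 0.5
logC = 0.2 * logQ + 1.2 * np.maximum(logQ - true_bp, 0) + rng.normal(0, 0.1, 300)

def sse(bp):
    # Piecewise-linear (broken-stick) design: intercept, slope, slope change at bp.
    X = np.column_stack([np.ones_like(logQ), logQ, np.maximum(logQ - bp, 0)])
    coef, res, *_ = np.linalg.lstsq(X, logC, rcond=None)
    return res[0]

grid = np.linspace(-1.5, 1.5, 121)
bp_hat = grid[np.argmin([sse(b) for b in grid])]
print(abs(bp_hat - true_bp) < 0.1)  # True: the grid search recovers the threshold
```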

  18. Simultaneous Quantification of Free Cholesterol, Cholesteryl Esters, and Triglycerides without Ester Hydrolysis by UHPLC Separation and In-Source Collision Induced Dissociation Coupled MS/MS

    Science.gov (United States)

    Gardner, Michael S.; McWilliams, Lisa G.; Jones, Jeffrey I.; Kuklenyik, Zsuzsanna; Pirkle, James L.; Barr, John R.

    2017-08-01

    We demonstrate the application of in-source nitrogen collision-induced dissociation (CID) that eliminates the need for ester hydrolysis before simultaneous analysis of esterified cholesterol (EC) and triglycerides (TG) along with free cholesterol (FC) from human serum, using normal phase liquid chromatography (LC) coupled to atmospheric pressure chemical ionization (APCI) tandem mass spectrometry (MS/MS). The analysis requires only 50 μL of 1:100 dilute serum with a high-throughput, precipitation/evaporation/extraction protocol in one pot. Known representative mixtures of EC and TG species were used as calibrators with stable isotope labeled analogs as internal standards. The APCI MS source was operated with nitrogen source gas. Reproducible in-source CID was achieved with the use of optimal cone voltage (declustering potential), generating FC, EC, and TG lipid class-specific precursor fragment ions for multiple reaction monitoring (MRM). Using a representative mixture of purified FC, CE, and TG species as calibrators, the method accuracy was assessed with analysis of five inter-laboratory standardization materials, showing -10% bias for Total-C and -3% for Total-TG. Repeated duplicate analysis of a quality control pool showed intra-day and inter-day variation of 5% and 5.8% for FC, 5.2% and 8.5% for Total-C, and 4.1% and 7.7% for Total-TG. The applicability of the method was demonstrated on 32 serum samples and corresponding lipoprotein sub-fractions collected from normolipidemic, hypercholesterolemic, hypertriglyceridemic, and hyperlipidemic donors. The results show that in-source CID coupled with isotope dilution UHPLC-MS/MS is a viable high precision approach for translational research studies where samples are substantially diluted or the amounts of archived samples are limited. [Figure not available: see fulltext.]

  19. Using isotopes of dissolved inorganic carbon species and water to separate sources of recharge in a cave spring, northwestern Arkansas, USA Blowing Spring Cave

    Science.gov (United States)

    Knierim, Katherine J.; Pollock, Erik; Hays, Phillip D.

    2013-01-01

    Blowing Spring Cave in northwestern Arkansas is representative of cave systems in the karst of the Ozark Plateaus, and stable isotopes of water (δ18O and δ2H) and inorganic carbon (δ13C) were used to quantify soil-water, bedrock-matrix water, and precipitation contributions to cave-spring flow during storm events to understand controls on cave water quality. Water samples from recharge-zone soils and the cave were collected from March to May 2012 to implement a multicomponent hydrograph separation approach using δ18O and δ2H of water and dissolved inorganic carbon (δ13C–DIC). During baseflow, median δ2H and δ18O compositions were –41.6‰ and –6.2‰ for soil water and were –37.2‰ and –5.9‰ for cave water, respectively. Median DIC concentrations for soil and cave waters were 1.8 mg/L and 25.0 mg/L, respectively, and median δ13C–DIC compositions were –19.9‰ and –14.3‰, respectively. During a March storm event, 12.2 cm of precipitation fell over 82 h and discharge increased from 0.01 to 0.59 m3/s. The isotopic composition of precipitation varied throughout the storm event because of rainout: a change of 50‰ and 10‰ was observed for δ2H and δ18O, respectively, whereas at the spring δ2H and δ18O changed by only approximately 3‰ and 1‰, respectively. The isotopic compositions of precipitation and pre-event (i.e., soil and bedrock matrix) water were isotopically similar and the two-component hydrograph separation was inaccurate, either overestimating (>100%) or underestimating (<0%) the precipitation contribution to the spring. During the storm event, spring DIC and δ13C–DIC decreased to a minimum of 8.6 mg/L and –16.2‰, respectively. If the contribution from precipitation was assumed to be zero, soil water was found to contribute between 23 and 72% of the total volume of discharge. Although the assumption of negligible contributions from precipitation is unrealistic, especially in karst systems where rapid flow
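    The two-component separation underlying this analysis is a simple isotopic mass balance. In the sketch below the pre-event and stream δ2H values loosely echo the reported medians, while the event (precipitation) values are invented to show how isotopically similar end-members make the estimated fraction unstable, as the abstract describes.

```python
def event_fraction(delta_stream, delta_pre, delta_event):
    """Two-component isotopic hydrograph separation: fraction of
    streamflow supplied by event (precipitation) water."""
    return (delta_stream - delta_pre) / (delta_event - delta_pre)

# Well-separated end-members give a plausible mixing fraction:
print(round(event_fraction(-37.2, -41.6, -30.0), 2))  # → 0.38
# Event water isotopically similar to (here slightly below) pre-event water
# makes the denominator small and the fraction nonphysical (<0%):
print(round(event_fraction(-37.2, -41.6, -43.0), 2))  # → -3.14
```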

  20. Approximate Bayesian computation for spatial SEIR(S) epidemic models.

    Science.gov (United States)

    Brown, Grant D; Porter, Aaron T; Oleson, Jacob J; Hinman, Jessica A

    2018-02-01

    Approximate Bayesian Computation (ABC) provides an attractive approach to estimation in complex Bayesian inferential problems for which evaluation of the kernel of the posterior distribution is impossible or computationally expensive. These highly parallelizable techniques have been successfully applied to many fields, particularly in cases where more traditional approaches such as Markov chain Monte Carlo (MCMC) are impractical. In this work, we demonstrate the application of approximate Bayesian inference to spatially heterogeneous Susceptible-Exposed-Infectious-Removed (SEIR) stochastic epidemic models. These models have a tractable posterior distribution; however, MCMC techniques become computationally infeasible for moderately sized problems. We discuss the practical implementation of these techniques via the open source ABSEIR package for R. The performance of ABC relative to traditional MCMC methods in a small problem is explored under simulation, as well as in the spatially heterogeneous context of the 2014 epidemic of Chikungunya in the Americas. Copyright © 2017 Elsevier Ltd. All rights reserved.
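    ABC rejection, the simplest member of the family discussed above, can be sketched on a toy chain-binomial (Reed-Frost) epidemic rather than the full spatial SEIR model. The observed outbreak size, uniform prior, and tolerance below are all invented for illustration: draw a parameter from the prior, simulate, and keep draws whose simulated summary statistic lands near the observation.

```python
import numpy as np

rng = np.random.default_rng(2)

def final_size(p, n=100, i0=1):
    """Reed-Frost chain-binomial epidemic: return the final number infected."""
    s, i, total = n - i0, i0, i0
    while i > 0 and s > 0:
        new = rng.binomial(s, 1 - (1 - p) ** i)
        s, i, total = s - new, new, total + new
    return total

observed = 30  # hypothetical observed outbreak size
tol = 10       # ABC tolerance on the summary statistic

# ABC rejection: draw p from a Uniform(0, 0.05) prior, simulate an epidemic,
# and keep draws whose final size falls within the tolerance of the observation.
accepted = [p for p in rng.uniform(0, 0.05, size=20_000)
            if abs(final_size(p) - observed) <= tol]
print(len(accepted), round(float(np.mean(accepted)), 3))
```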

  1. Applying Bayesian belief networks in rapid response situations

    Energy Technology Data Exchange (ETDEWEB)

    Gibson, William L. [Los Alamos National Laboratory]; Leishman, Deborah A. [Los Alamos National Laboratory]; Van Eeckhout, Edward [Los Alamos National Laboratory]

    2008-01-01

    The authors have developed an enhanced Bayesian analysis tool called the Integrated Knowledge Engine (IKE) for monitoring and surveillance. The enhancements are suited for Rapid Response Situations where decisions must be made based on uncertain and incomplete evidence from many diverse and heterogeneous sources. The enhancements extend the probabilistic results of the traditional Bayesian analysis by (1) better quantifying uncertainty arising from model parameter uncertainty and uncertain evidence, (2) optimizing the collection of evidence to reach conclusions more quickly, and (3) allowing the analyst to determine the influence of the remaining evidence that cannot be obtained in the time allowed. These extended features give the analyst and decision maker a better comprehension of the adequacy of the acquired evidence and hence the quality of the hurried decisions. They also describe two example systems where the above features are highlighted.

  2. Fast Bayesian optimal experimental design and its applications

    KAUST Repository

    Long, Quan

    2015-01-07

    We summarize our Laplace method and multilevel method for accelerating the computation of the expected information gain in Bayesian Optimal Experimental Design (OED). The Laplace method is widely used to approximate integrals in statistics. We analyze this method in the context of optimal Bayesian experimental design and extend it from the classical scenario, where a single dominant mode of the parameters can be completely determined by the experiment, to scenarios where a non-informative parametric manifold exists. We show that by carrying out this approximation the estimation of the expected Kullback-Leibler divergence can be significantly accelerated. While the Laplace method requires a concentration of measure, the multilevel Monte Carlo method can be used to tackle the problem when there is a lack of measure concentration. We show some initial results on this approach. The developed methodologies have been applied to various sensor deployment problems, e.g., impedance tomography and seismic source inversion.
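    The Laplace approximation at the heart of this method replaces the integrand by a Gaussian centered at its mode. A minimal sketch on a binomial evidence integral (with made-up data, chosen so the exact answer is a Beta function) shows the idea; this is the generic approximation, not the paper's OED-specific machinery.

```python
import math

# Approximate the evidence integral int_0^1 L(t) dt for a binomial
# likelihood under a Uniform(0,1) prior; exactly, this is B(y+1, n-y+1).
y, n = 30, 100  # hypothetical data

def loglik(t):
    return y * math.log(t) + (n - y) * math.log(1 - t)

t_hat = y / n                                        # mode of the integrand
curvature = y / t_hat**2 + (n - y) / (1 - t_hat)**2  # -(d2/dt2) loglik at the mode

# Laplace: value at the mode times the Gaussian normalizing width.
laplace = math.exp(loglik(t_hat)) * math.sqrt(2 * math.pi / curvature)
exact = math.exp(math.lgamma(y + 1) + math.lgamma(n - y + 1) - math.lgamma(n + 2))

print(round(laplace / exact, 2))  # close to 1 for moderately large n
```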

  3. Isotope separation

    International Nuclear Information System (INIS)

    Coleman, G.H.; Bett, R.; Cuninghame, J.G.; Sims, H.

    1982-01-01

    In the separation of short-lived isotopes for medical usage, a solution containing ¹⁹⁵ᵐHg is contacted with vicinal dithiol cellulose, which adsorbs and retains the ¹⁹⁵ᵐHg. ¹⁹⁵ᵐAu, which arises from the radioactive decay of the ¹⁹⁵ᵐHg, is eluted from the vicinal dithiol cellulose using a suitable eluant. The preferred eluant is a solution containing the CN⁻ ion. (author)

  4. Gas separating

    Science.gov (United States)

    Gollan, Arye Z.

    1990-12-25

    Feed gas is directed tangentially along the non-skin surface of gas separation membrane modules comprising a cylindrical bundle of parallel contiguous hollow fibers supported to allow feed gas to flow from an inlet at one end of a cylindrical housing through the bores of the bundled fibers to an outlet at the other end while a component of the feed gas permeates through the fibers, each having the skin side on the outside, through a permeate outlet in the cylindrical casing.

  5. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology and can be easily generalized to infer biogeography from genetic data for many organisms.

  6. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....

  8. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    , and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...... settings, where cues shape expectations about a small number of upcoming stimuli and thus convey "prior" information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its...

  9. BEAST: Bayesian evolutionary analysis by sampling trees.

    Science.gov (United States)

    Drummond, Alexei J; Rambaut, Andrew

    2007-11-08

    The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.

  10. BEAST: Bayesian evolutionary analysis by sampling trees

    Directory of Open Access Journals (Sweden)

    Drummond Alexei J

    2007-11-01

    Full Text Available Abstract Background The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based models suitable for both within- and between-species sequence data are implemented. Results BEAST version 1.4.6 consists of 81000 lines of Java source code, 779 classes and 81 packages. It provides models for DNA and protein sequence evolution, highly parametric coalescent analysis, relaxed clock phylogenetics, non-contemporaneous sequence data, statistical alignment and a wide range of options for prior distributions. BEAST source code is object-oriented, modular in design and freely available at http://beast-mcmc.googlecode.com/ under the GNU LGPL license. Conclusion BEAST is a powerful and flexible evolutionary analysis package for molecular sequence variation. It also provides a resource for the further development of new models and statistical methods of evolutionary analysis.

  11. Seeded Bayesian Networks: Constructing genetic networks from microarray data

    Directory of Open Access Journals (Sweden)

    Quackenbush John

    2008-07-01

    Full Text Available Abstract Background DNA microarrays and other genomics-inspired technologies provide large datasets that often include hidden patterns of correlation between genes reflecting the complex processes that underlie cellular metabolism and physiology. The challenge in analyzing large-scale expression data has been to extract biologically meaningful inferences regarding these processes – often represented as networks – in an environment where the datasets are often imperfect and biological noise can obscure the actual signal. Although many techniques have been developed in an attempt to address these issues, to date their ability to extract meaningful and predictive network relationships has been limited. Here we describe a method that draws on prior information about gene-gene interactions to infer biologically relevant pathways from microarray data. Our approach consists of using preliminary networks derived from the literature and/or protein-protein interaction data as seeds for a Bayesian network analysis of microarray results. Results Through a bootstrap analysis of gene expression data derived from a number of leukemia studies, we demonstrate that seeded Bayesian Networks have the ability to identify high-confidence gene-gene interactions which can then be validated by comparison to other sources of pathway data. Conclusion The use of network seeds greatly improves the ability of Bayesian Network analysis to learn gene interaction networks from gene expression data. We demonstrate that the use of seeds derived from the biomedical literature or high-throughput protein-protein interaction data, or the combination, provides improvement over a standard Bayesian Network analysis, allowing networks involving dynamic processes to be deduced from the static snapshots of biological systems that represent the most common source of microarray data. Software implementing these methods has been included in the widely used TM4 microarray analysis package.

  12. Accounting for model error in Bayesian solutions to hydrogeophysical inverse problems using a local basis approach

    Science.gov (United States)

    Irving, J.; Koepke, C.; Elsheikh, A. H.

    2017-12-01

    Bayesian solutions to geophysical and hydrological inverse problems are dependent upon a forward process model linking subsurface parameters to measured data, which is typically assumed to be known perfectly in the inversion procedure. However, in order to make the stochastic solution of the inverse problem computationally tractable using, for example, Markov-chain-Monte-Carlo (MCMC) methods, fast approximations of the forward model are commonly employed. This introduces model error into the problem, which has the potential to significantly bias posterior statistics and hamper data integration efforts if not properly accounted for. Here, we present a new methodology for addressing the issue of model error in Bayesian solutions to hydrogeophysical inverse problems that is geared towards the common case where these errors cannot be effectively characterized globally through some parametric statistical distribution or locally based on interpolation between a small number of computed realizations. Rather than focusing on the construction of a global or local error model, we instead work towards identification of the model-error component of the residual through a projection-based approach. In this regard, pairs of approximate and detailed model runs are stored in a dictionary that grows at a specified rate during the MCMC inversion procedure. At each iteration, a local model-error basis is constructed for the current test set of model parameters using the K-nearest neighbour entries in the dictionary, which is then used to separate the model error from the other error sources before computing the likelihood of the proposed set of model parameters. We demonstrate the performance of our technique on the inversion of synthetic crosshole ground-penetrating radar traveltime data for three different subsurface parameterizations of varying complexity. The synthetic data are generated using the eikonal equation, whereas a straight-ray forward model is assumed in the inversion.
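    The local-basis idea can be sketched on a toy problem (this hypothetical setup stands in for the paper's GPR solver): store model-error vectors for many parameter draws, select the K nearest dictionary entries to a new parameter vector, build an orthonormal basis with an SVD, and project the model-error component out of the residual.

```python
import numpy as np

rng = np.random.default_rng(4)

n_dict, n_data, K = 200, 50, 8
params = rng.uniform(-1, 1, size=(n_dict, 2))

def model_error(theta):
    # Smooth, parameter-dependent discrepancy between the detailed and
    # approximate forward models (an invented toy discrepancy).
    t = np.linspace(0, 1, n_data)
    return theta[0] * np.sin(2 * np.pi * t) + theta[1] * t

# Dictionary of (detailed minus approximate) run differences.
dictionary = np.array([model_error(th) for th in params])

# K-nearest-neighbour entries in parameter space define a local error basis.
theta_new = np.array([0.3, -0.4])
idx = np.argsort(np.linalg.norm(params - theta_new, axis=1))[:K]
U, _, _ = np.linalg.svd(dictionary[idx].T, full_matrices=False)

# Residual = data minus approximate model: model error plus observation noise.
residual = model_error(theta_new) + 0.01 * rng.standard_normal(n_data)
cleaned = residual - U @ (U.T @ residual)  # project out the model-error component

print(np.linalg.norm(cleaned) < 0.2 * np.linalg.norm(residual))  # True
```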

  13. Temporal and spatial variabilities of Antarctic ice mass changes inferred by GRACE in a Bayesian framework

    Science.gov (United States)

    Wang, L.; Davis, J. L.; Tamisiea, M. E.

    2017-12-01

    The Antarctic ice sheet (AIS) holds about 60% of all fresh water on the Earth, an amount equivalent to about 58 m of sea-level rise. Observation of AIS mass change is thus essential in determining and predicting its contribution to sea level. While the ice mass loss estimates for West Antarctica (WA) and the Antarctic Peninsula (AP) are in good agreement, the mass balance of East Antarctica (EA), and whether it compensates for that mass loss, remains under debate. Besides the different error sources and sensitivities of different measurement types, complex spatial and temporal variabilities are another factor complicating the accurate estimation of the AIS mass balance. Therefore, a model that allows for variabilities in both melting rate and seasonal signals would seem appropriate in the estimation of present-day AIS melting. We present a stochastic filter technique, which enables the Bayesian separation of the systematic stripe noise and mass signal in decade-length GRACE monthly gravity series, and allows the estimation of time-variable seasonal and inter-annual components in the signals. One of the primary advantages of this Bayesian method is that it yields statistically rigorous uncertainty estimates reflecting the inherent spatial resolution of the data. By applying the stochastic filter to the decade-long GRACE observations, we present the temporal variabilities of the AIS mass balance at basin scale, particularly over East Antarctica, and decipher the EA mass variations in the past decade, and their role in affecting overall AIS mass balance and sea level.

  14. Bayesian LASSO, scale space and decision making in association genetics.

    Science.gov (United States)

    Pasanen, Leena; Holmström, Lasse; Sillanpää, Mikko J

    2015-01-01

    LASSO is a penalized regression method that facilitates model fitting in situations where there are as many, or even more explanatory variables than observations, and only a few variables are relevant in explaining the data. We focus on the Bayesian version of LASSO and consider four problems that need special attention: (i) controlling false positives, (ii) multiple comparisons, (iii) collinearity among explanatory variables, and (iv) the choice of the tuning parameter that controls the amount of shrinkage and the sparsity of the estimates. The particular application considered is association genetics, where LASSO regression can be used to find links between chromosome locations and phenotypic traits in a biological organism. However, the proposed techniques are relevant also in other contexts where LASSO is used for variable selection. We separate the true associations from false positives using the posterior distribution of the effects (regression coefficients) provided by Bayesian LASSO. We propose to solve the multiple comparisons problem by using simultaneous inference based on the joint posterior distribution of the effects. Bayesian LASSO also tends to distribute an effect among collinear variables, making detection of an association difficult. We propose to solve this problem by considering not only individual effects but also their functionals (i.e. sums and differences). Finally, whereas in Bayesian LASSO the tuning parameter is often regarded as a random variable, we adopt a scale space view and consider a whole range of fixed tuning parameters, instead. The effect estimates and the associated inference are considered for all tuning parameters in the selected range and the results are visualized with color maps that provide useful insights into data and the association problem considered. The methods are illustrated using two sets of artificial data and one real data set, all representing typical settings in association genetics.
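    The scale-space idea of examining estimates over a whole range of fixed tuning parameters, rather than one, can be sketched with plain coordinate descent: the posterior mode under a Laplace prior coincides with the LASSO solution, so the MAP path over a penalty grid is the LASSO path. The data, penalty grid, and two "true" effects below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic regression: 10 candidate effects, only the first two relevant.
n, p = 80, 10
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:2] = [2.0, -1.5]
y = X @ beta_true + rng.standard_normal(n)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/2)||y - Xb||^2 + lam * ||b||_1."""
    b = np.zeros(X.shape[1])
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            r = y - X @ b + X[:, j] * b[j]        # partial residual excluding j
            rho = X[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b

# MAP/LASSO estimates across a grid of penalties: the scale-space view.
paths = np.array([lasso_cd(X, y, lam) for lam in [1.0, 10.0, 100.0]])
print(paths.shape)  # → (3, 10); true effects persist across scales, noise shrinks to zero
```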

  15. Robust bayesian inference of generalized Pareto distribution ...

    African Journals Online (AJOL)

    Abstract. In this work, robust Bayesian estimation of the generalized Pareto distribution is proposed. The methodology is presented in terms of oscillation of posterior risks of the Bayesian estimators. By using a Monte Carlo simulation study, we show that, under a suitable generalized loss function, we can obtain a robust ...

  16. Bayesian Decision Theoretical Framework for Clustering

    Science.gov (United States)

    Chen, Mo

    2011-01-01

    In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view addresses two important questions: what is a cluster, and what should a clustering algorithm optimize? We prove that the spectral clustering (to be specific, the…

  17. Using Bayesian belief networks in adaptive management.

    Science.gov (United States)

    J.B. Nyberg; B.G. Marcot; R. Sulyma

    2006-01-01

    Bayesian belief and decision networks are relatively new modeling methods that are especially well suited to adaptive-management applications, but they appear not to have been widely used in adaptive management to date. Bayesian belief networks (BBNs) can serve many purposes for practitioners of adaptive management, from illustrating system relations conceptually to...

  18. Particle identification in ALICE: a Bayesian approach

    NARCIS (Netherlands)

    Adam, J.; Adamova, D.; Aggarwal, M. M.; Rinella, G. Aglieri; Agnello, M.; Agrawal, N.; Ahammed, Z.; Ahn, S. U.; Aiola, S.; Akindinov, A.; Alam, S. N.; Albuquerque, D. S. D.; Aleksandrov, D.; Alessandro, B.; Alexandre, D.; Alfaro Molina, R.; Alici, A.; Alkin, A.; Almaraz, J. R. M.; Alme, J.; Alt, T.; Altinpinar, S.; Altsybeev, I.; Alves Garcia Prado, C.; Andrei, C.; Andronic, A.; Anguelov, V.; Anticic, T.; Antinori, F.; Antonioli, P.; Aphecetche, L.; Appelshaeuser, H.; Arcelli, S.; Arnaldi, R.; Arnold, O. W.; Arsene, I. C.; Arslandok, M.; Audurier, B.; Augustinus, A.; Averbeck, R.; Azmi, M. D.; Badala, A.; Baek, Y. W.; Bagnasco, S.; Bailhache, R.; Bala, R.; Balasubramanian, S.; Baldisseri, A.; Baral, R. C.; Barbano, A. M.; Barbera, R.; Barile, F.; Barnafoeldi, G. G.; Barnby, L. S.; Barret, V.; Bartalini, P.; Barth, K.; Bartke, J.; Bartsch, E.; Basile, M.; Bastid, N.; Bathen, B.; Batigne, G.; Camejo, A. Batista; Batyunya, B.; Batzing, P. C.; Bearden, I. G.; Beck, H.; Bedda, C.; Behera, N. K.; Belikov, I.; Bellini, F.; Bello Martinez, H.; Bellwied, R.; Belmont, R.; Belmont-Moreno, E.; Belyaev, V.; Benacek, P.; Bencedi, G.; Beole, S.; Berceanu, I.; Bercuci, A.; Berdnikov, Y.; Berenyi, D.; Bertens, R. A.; Berzano, D.; Betev, L.; Bhasin, A.; Bhat, I. R.; Bhati, A. K.; Bhattacharjee, B.; Bhom, J.; Bianchi, L.; Bianchi, N.; Bianchin, C.; Bielcik, J.; Bielcikova, J.; Bilandzic, A.; Biro, G.; Biswas, R.; Biswas, S.; Bjelogrlic, S.; Blair, J. T.; Blau, D.; Blume, C.; Bock, F.; Bogdanov, A.; Boggild, H.; Boldizsar, L.; Bombara, M.; Book, J.; Borel, H.; Borissov, A.; Borri, M.; Bossu, F.; Botta, E.; Bourjau, C.; Braun-Munzinger, P.; Bregant, M.; Breitner, T.; Broker, T. A.; Browning, T. A.; Broz, M.; Brucken, E. J.; Bruna, E.; Bruno, G. E.; Budnikov, D.; Buesching, H.; Bufalino, S.; Buncic, P.; Busch, O.; Buthelezi, Z.; Butt, J. B.; Buxton, J. T.; Cabala, J.; Caffarri, D.; Cai, X.; Caines, H.; Diaz, L. 
Calero; Caliva, A.; Calvo Villar, E.; Camerini, P.; Carena, F.; Carena, W.; Carnesecchi, F.; Castellanos, J. Castillo; Castro, A. J.; Casula, E. A. R.; Sanchez, C. Ceballos; Cepila, J.; Cerello, P.; Cerkala, J.; Chang, B.; Chapeland, S.; Chartier, M.; Charvet, J. L.; Chattopadhyay, S.; Chattopadhyay, S.; Chauvin, A.; Chelnokov, V.; Cherney, M.; Cheshkov, C.; Cheynis, B.; Barroso, V. Chibante; Chinellato, D. D.; Cho, S.; Chochula, P.; Choi, K.; Chojnacki, M.; Choudhury, S.; Christakoglou, P.; Christensen, C. H.; Christiansen, P.; Chujo, T.; Cicalo, C.; Cifarelli, L.; Cindolo, F.; Cleymans, J.; Colamaria, F.; Colella, D.; Collu, A.; Colocci, M.; Balbastre, G. Conesa; del Valle, Z. Conesa; Connors, M. E.; Contreras, J. G.; Cormier, T. M.; Morales, Y. Corrales; Cortes Maldonado, I.; Cortese, P.; Cosentino, M. R.; Costa, F.; Crochet, P.; Cruz Albino, R.; Cuautle, E.; Cunqueiro, L.; Dahms, T.; Dainese, A.; Danisch, M. C.; Danu, A.; Das, I.; Das, S.; Dash, A.; Dash, S.; De, S.; De Caro, A.; de Cataldo, G.; de Conti, C.; de Cuveland, J.; De Falco, A.; De Gruttola, D.; De Marco, N.; De Pasquale, S.; Deisting, A.; Deloff, A.; Denes, E.; Deplano, C.; Dhankher, P.; Di Bari, D.; Di Mauro, A.; Di Nezza, P.; Corchero, M. A. Diaz; Dietel, T.; Dillenseger, P.; Divia, R.; Djuvsland, O.; Dobrin, A.; Gimenez, D. Domenicis; Doenigus, B.; Dordic, O.; Drozhzhova, T.; Dubey, A. K.; Dubla, A.; Ducroux, L.; Dupieux, P.; Ehlers, R. J.; Elia, D.; Endress, E.; Engel, H.; Epple, E.; Erazmus, B.; Erdemir, I.; Erhardt, F.; Espagnon, B.; Estienne, M.; Esumi, S.; Eum, J.; Evans, D.; Evdokimov, S.; Eyyubova, G.; Fabbietti, L.; Fabris, D.; Faivre, J.; Fantoni, A.; Fasel, M.; Feldkamp, L.; Feliciello, A.; Feofilov, G.; Ferencei, J.; Fernandez Tellez, A.; Ferreiro, E. G.; Ferretti, A.; Festanti, A.; Feuillard, V. J. G.; Figiel, J.; Figueredo, M. A. S.; Filchagin, S.; Finogeev, D.; Fionda, F. M.; Fiore, E. M.; Fleck, M. 
G.; Floris, M.; Foertsch, S.; Foka, P.; Fokin, S.; Fragiacomo, E.; Francescon, A.; Frankenfeld, U.; Fronze, G. G.; Fuchs, U.; Furget, C.; Furs, A.; Girard, M. Fusco; Gaardhoje, J. J.; Gagliardi, M.; Gago, A. M.; Gallio, M.; Gangadharan, D. R.; Ganoti, P.; Gao, C.; Garabatos, C.; Garcia-Solis, E.; Gargiulo, C.; Gasik, P.; Gauger, E. F.; Germain, M.; Gheata, A.; Gheata, M.; Gianotti, P.; Giubellino, P.; Giubilato, P.; Gladysz-Dziadus, E.; Glaessel, P.; Gomez Coral, D. M.; Ramirez, A. Gomez; Gonzalez, A. S.; Gonzalez, V.; Gonzalez-Zamora, P.; Gorbunov, S.; Goerlich, L.; Gotovac, S.; Grabski, V.; Grachov, O. A.; Graczykowski, L. K.; Graham, K. L.; Grelli, A.; Grigoras, A.; Grigoras, C.; Grigoriev, V.; Grigoryan, A.; Grigoryan, S.; Grinyov, B.; Grion, N.; Gronefeld, J. M.; Grosse-Oetringhaus, J. F.; Grosso, R.; Guber, F.; Guernane, R.; Guerzoni, B.; Gulbrandsen, K.; Gunji, T.; Gupta, A.; Haake, R.; Haaland, O.; Hadjidakis, C.; Haiduc, M.; Hamagaki, H.; Hamar, G.; Hamon, J. C.; Harris, J. W.; Harton, A.; Hatzifotiadou, D.; Hayashi, S.; Heckel, S. T.; Hellbaer, E.; Helstrup, H.; Herghelegiu, A.; Herrera Corral, G.; Hess, B. A.; Hetland, K. F.; Hillemanns, H.; Hippolyte, B.; Horak, D.; Hosokawa, R.; Hristov, P.; Humanic, T. J.; Hussain, N.; Hussain, T.; Hutter, D.; Hwang, D. S.; Ilkaev, R.; Inaba, M.; Incani, E.; Ippolitov, M.; Irfan, M.; Ivanov, M.; Ivanov, V.; Izucheev, V.; Jacazio, N.; Jadhav, M. B.; Jadlovska, S.; Jadlovsky, J.; Jahnke, C.; Jakubowska, M. J.; Jang, H. J.; Janik, M. A.; Jayarathna, P. H. S. Y.; Jena, C.; Jena, S.; Bustamante, R. T. Jimenez; Jones, P. G.; Jusko, A.; Kalinak, P.; Kalweit, A.; Kamin, J.; Kaplin, V.; Kar, S.; Uysal, A. Karasu; Karavichev, O.; Karavicheva, T.; Karayan, L.; Karpechev, E.; Kebschull, U.; Keidel, R.; Keijdener, D. L. D.; Keil, M.; Khan, M. Mohisin; Khan, P.; Khan, S. A.; Khanzadeev, A.; Kharlov, Y.; Kileng, B.; Kim, D. W.; Kim, D. J.; Kim, D.; Kim, J. 
S.; Kim, M.; Kim, T.; Kirsch, S.; Kisel, I.; Kiselev, S.; Kisiel, A.; Kiss, G.; Klay, J. L.; Klein, C.; Klein-Boesing, C.; Klewin, S.; Kluge, A.; Knichel, M. L.; Knospe, A. G.; Kobdaj, C.; Kofarago, M.; Kollegger, T.; Kolojvari, A.; Kondratiev, V.; Kondratyeva, N.; Kondratyuk, E.; Konevskikh, A.; Kopcik, M.; Kostarakis, P.; Kour, M.; Kouzinopoulos, C.; Kovalenko, O.; Kovalenko, V.; Kowalski, M.; Meethaleveedu, G. Koyithatta; Kralik, I.; Kravcakova, A.; Krivda, M.; Krizek, F.; Kryshen, E.; Krzewicki, M.; Kubera, A. M.; Kucera, V.; Kuijer, P. G.; Kumar, J.; Kumar, L.; Kumar, S.; Kurashvili, P.; Kurepin, A.; Kurepin, A. B.; Kuryakin, A.; Kweon, M. J.; Kwon, Y.; La Pointe, S. L.; La Rocca, P.; Ladron de Guevara, P.; Lagana Fernandes, C.; Lakomov, I.; Langoy, R.; Lara, C.; Lardeux, A.; Lattuca, A.; Laudi, E.; Lea, R.; Leardini, L.; Lee, G. R.; Lee, S.; Lehas, F.; Lemmon, R. C.; Lenti, V.; Leogrande, E.; Monzon, I. Leon; Leon Vargas, H.; Leoncino, M.; Levai, P.; Lien, J.; Lietava, R.; Lindal, S.; Lindenstruth, V.; Lippmann, C.; Lisa, M. A.; Ljunggren, H. M.; Lodato, D. F.; Loenne, P. I.; Loginov, V.; Loizides, C.; Lopez, X.; Torres, E. Lopez; Lowe, A.; Luettig, P.; Lunardon, M.; Luparello, G.; Lutz, T. H.; Maevskaya, A.; Mager, M.; Mahajan, S.; Mahmood, S. M.; Maire, A.; Majka, R. D.; Malaev, M.; Maldonado Cervantes, I.; Malinina, L.; Mal'Kevich, D.; Malzacher, P.; Mamonov, A.; Manko, V.; Manso, F.; Manzari, V.; Marchisone, M.; Mares, J.; Margagliotti, G. V.; Margotti, A.; Margutti, J.; Marin, A.; Markert, C.; Marquard, M.; Martin, N. A.; Blanco, J. Martin; Martinengo, P.; Martinez, M. I.; Garcia, G. Martinez; Pedreira, M. Martinez; Mas, A.; Masciocchi, S.; Masera, M.; Masoni, A.; Mastroserio, A.; Matyja, A.; Mayer, C.; Mazer, J.; Mazzoni, M. A.; Mcdonald, D.; Meddi, F.; Melikyan, Y.; Menchaca-Rocha, A.; Meninno, E.; Perez, J. Mercado; Meres, M.; Miake, Y.; Mieskolainen, M. M.; Mikhaylov, K.; Milano, L.; Milosevic, J.; Mischke, A.; Mishra, A. 
N.; Miskowiec, D.; Mitra, J.; Mitu, C. M.; Mohammadi, N.; Mohanty, B.; Molnar, L.; Montano Zetina, L.; Montes, E.; De Godoy, D. A. Moreira; Moreno, L. A. P.; Moretto, S.; Morreale, A.; Morsch, A.; Muccifora, V.; Mudnic, E.; Muehlheim, D.; Muhuri, S.; Mukherjee, M.; Mulligan, J. D.; Munhoz, M. G.; Munzer, R. H.; Murakami, H.; Murray, S.; Musa, L.; Musinsky, J.; Naik, B.; Nair, R.; Nandi, B. K.; Nania, R.; Nappi, E.; Naru, M. U.; Natal da Luz, H.; Nattrass, C.; Navarro, S. R.; Nayak, K.; Nayak, R.; Nayak, T. K.; Nazarenko, S.; Nedosekin, A.; Nellen, L.; Ng, F.; Nicassio, M.; Niculescu, M.; Niedziela, J.; Nielsen, B. S.; Nikolaev, S.; Nikulin, S.; Nikulin, V.; Noferini, F.; Nomokonov, P.; Nooren, G.; Noris, J. C. C.; Norman, J.; Nyanin, A.; Nystrand, J.; Oeschler, H.; Oh, S.; Oh, S. K.; Ohlson, A.; Okatan, A.; Okubo, T.; Olah, L.; Oleniacz, J.; Oliveira Da Silva, A. C.; Oliver, M. H.; Onderwaater, J.; Oppedisano, C.; Orava, R.; Oravec, M.; Ortiz Velasquez, A.; Oskarsson, A.; Otwinowski, J.; Oyama, K.; Ozdemir, M.; Pachmayer, Y.; Pagano, D.; Pagano, P.; Paic, G.; Pal, S. K.; Pan, J.; Papikyan, V.; Pappalardo, G. S.; Pareek, P.; Park, W. J.; Parmar, S.; Passfeld, A.; Paticchio, V.; Patra, R. N.; Paul, B.; Pei, H.; Peitzmann, T.; Da Costa, H. Pereira; Peresunko, D.; Lara, C. E. Perez; Lezama, E. Perez; Peskov, V.; Pestov, Y.; Petracek, V.; Petrov, V.; Petrovici, M.; Petta, C.; Piano, S.; Pikna, M.; Pillot, P.; Pimentel, L. O. D. L.; Pinazza, O.; Pinsky, L.; Piyarathna, D. B.; Ploskon, M.; Planinic, M.; Pluta, J.; Pochybova, S.; Podesta-Lerma, P. L. M.; Poghosyan, M. G.; Polichtchouk, B.; Poljak, N.; Poonsawat, W.; Pop, A.; Porteboeuf-Houssais, S.; Porter, J.; Pospisil, J.; Prasad, S. K.; Preghenella, R.; Prino, F.; Pruneau, C. A.; Pshenichnov, I.; Puccio, M.; Puddu, G.; Pujahari, P.; Punin, V.; Putschke, J.; Qvigstad, H.; Rachevski, A.; Raha, S.; Rajput, S.; Rak, J.; Rakotozafindrabe, A.; Ramello, L.; Rami, F.; Raniwala, R.; Raniwala, S.; Raesaenen, S. S.; Rascanu, B. 
T.; Rathee, D.; Read, K. F.; Redlich, K.; Reed, R. J.; Reichelt, P.; Reidt, F.; Ren, X.; Renfordt, R.; Reolon, A. R.; Reshetin, A.; Reygers, K.; Riabov, V.; Ricci, R. A.; Richert, T.; Richter, M.; Riedler, P.; Riegler, W.; Riggi, F.; Ristea, C.; Rocco, E.; Rodriguez Cahuantzi, M.; Manso, A. Rodriguez; Roed, K.; Rogochaya, E.; Rohr, D.; Roehrich, D.; Ronchetti, F.; Ronflette, L.; Rosnet, P.; Rossi, A.; Roukoutakis, F.; Roy, A.; Roy, C.; Roy, P.; Montero, A. J. Rubio; Rui, R.; Russo, R.; Ryabinkin, E.; Ryabov, Y.; Rybicki, A.; Saarinen, S.; Sadhu, S.; Sadovsky, S.; Safarik, K.; Sahlmuller, B.; Sahoo, P.; Sahoo, R.; Sahoo, S.; Sahu, P. K.; Saini, J.; Sakai, S.; Saleh, M. A.; Salzwedel, J.; Sambyal, S.; Samsonov, V.; Sandor, L.; Sandoval, A.; Sano, M.; Sarkar, D.; Sarkar, N.; Sarma, P.; Scapparone, E.; Scarlassara, F.; Schiaua, C.; Schicker, R.; Schmidt, C.; Schmidt, H. R.; Schuchmann, S.; Schukraft, J.; Schulc, M.; Schutz, Y.; Schwarz, K.; Schweda, K.; Scioli, G.; Scomparin, E.; Scott, R.; Sefcik, M.; Seger, J. E.; Sekiguchi, Y.; Sekihata, D.; Selyuzhenkov, I.; Senosi, K.; Senyukov, S.; Serradilla, E.; Sevcenco, A.; Shabanov, A.; Shabetai, A.; Shadura, O.; Shahoyan, R.; Shahzad, M. I.; Shangaraev, A.; Sharma, M.; Sharma, M.; Sharma, N.; Sheikh, A. I.; Shigaki, K.; Shou, Q.; Shtejer, K.; Sibiriak, Y.; Siddhanta, S.; Sielewicz, K. M.; Siemiarczuk, T.; Silvermyr, D.; Silvestre, C.; Simatovic, G.; Simonetti, G.; Singaraju, R.; Singh, R.; Singha, S.; Singhal, V.; Sinha, B. C.; Sinha, T.; Sitar, B.; Sitta, M.; Skaali, T. B.; Slupecki, M.; Smirnov, N.; Snellings, R. J. M.; Snellman, T. W.; Song, J.; Song, M.; Song, Z.; Soramel, F.; Sorensen, S.; de Souza, R. D.; Sozzi, F.; Spacek, M.; Spiriti, E.; Sputowska, I.; Spyropoulou-Stassinaki, M.; Stachel, J.; Stan, I.; Stankus, P.; Stenlund, E.; Steyn, G.; Stiller, J. H.; Stocco, D.; Strmen, P.; Suaide, A. A. 
P.; Sugitate, T.; Suire, C.; Suleymanov, M.; Suljic, M.; Sultanov, R.; Sumbera, M.; Sumowidagdo, S.; Szabo, A.; Szanto de Toledo, A.; Szarka, I.; Szczepankiewicz, A.; Szymanski, M.; Tabassam, U.; Takahashi, J.; Tambave, G. J.; Tanaka, N.; Tarhini, M.; Tariq, M.; Tarzila, M. G.; Tauro, A.; Tejeda Munoz, G.; Telesca, A.; Terasaki, K.; Terrevoli, C.; Teyssier, B.; Thaeder, J.; Thakur, D.; Thomas, D.; Tieulent, R.; Timmins, A. R.; Toia, A.; Trogolo, S.; Trombetta, G.; Trubnikov, V.; Trzaska, W. H.; Tsuji, T.; Tumkin, A.; Turrisi, R.; Tveter, T. S.; Ullaland, K.; Uras, A.; Usai, G. L.; Utrobicic, A.; Vala, M.; Palomo, L. Valencia; Vallero, S.; Van Der Maarel, J.; Van Hoorne, J. W.; van Leeuwen, M.; Vanat, T.; Vyvre, P. Vande; Varga, D.; Vargas, A.; Vargyas, M.; Varma, R.; Vasileiou, M.; Vasiliev, A.; Vauthier, A.; Vechernin, V.; Veen, A. M.; Veldhoen, M.; Velure, A.; Vercellin, E.; Vergara Limon, S.; Vernet, R.; Verweij, M.; Vickovic, L.; Viesti, G.; Viinikainen, J.; Vilakazi, Z.; Baillie, O. Villalobos; Villatoro Tello, A.; Vinogradov, A.; Vinogradov, L.; Vinogradov, Y.; Virgili, T.; Vislavicius, V.; Viyogi, Y. P.; Vodopyanov, A.; Voelkl, M. A.; Voloshin, K.; Voloshin, S. A.; Volpe, G.; von Haller, B.; Vorobyev, I.; Vranic, D.; Vrlakova, J.; Vulpescu, B.; Wagner, B.; Wagner, J.; Wang, H.; Watanabe, D.; Watanabe, Y.; Weiser, D. F.; Westerhoff, U.; Whitehead, A. M.; Wiechula, J.; Wikne, J.; Wilk, G.; Wilkinson, J.; Williams, M. C. S.; Windelband, B.; Winn, M.; Yang, H.; Yano, S.; Yasin, Z.; Yokoyama, H.; Yoo, I. -K.; Yoon, J. H.; Yurchenko, V.; Yushmanov, I.; Zaborowska, A.; Zaccolo, V.; Zaman, A.; Zampolli, C.; Zanoli, H. J. C.; Zaporozhets, S.; Zardoshti, N.; Zarochentsev, A.; Zavada, P.; Zaviyalov, N.; Zbroszczyk, H.; Zgura, I. S.; Zhalov, M.; Zhang, C.; Zhao, C.; Zhigareva, N.; Zhou, Y.; Zhou, Z.; Zhu, H.; Zichichi, A.; Zimmermann, A.; Zimmermann, M. B.; Zinovjev, G.; Zyzak, M.; Collaboration, ALICE

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian

  19. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    Zajdel, W.P.; Kröse, B.J.A.; Blockeel, H.; Denecker, M.

    2002-01-01

    For a flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a

  20. Bayesian learning theory applied to human cognition.

    Science.gov (United States)

    Jacobs, Robert A; Kruschke, John K

    2011-01-01

    Probabilistic models based on Bayes' rule are an increasingly popular approach to understanding human cognition. Bayesian models allow immense representational latitude and complexity. Because they use normative Bayesian mathematics to process those representations, they define optimal performance on a given task. This article focuses on key mechanisms of Bayesian information processing, and provides numerous examples illustrating Bayesian approaches to the study of human cognition. We start by providing an overview of Bayesian modeling and Bayesian networks. We then describe three types of information processing operations-inference, parameter learning, and structure learning-in both Bayesian networks and human cognition. This is followed by a discussion of the important roles of prior knowledge and of active learning. We conclude by outlining some challenges for Bayesian models of human cognition that will need to be addressed by future research. WIREs Cogn Sci 2011 2 8-21 DOI: 10.1002/wcs.80 For further resources related to this article, please visit the WIREs website. Copyright © 2010 John Wiley & Sons, Ltd.
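
    The inference operation described above, computing a posterior over hidden variables by Bayes' rule, can be illustrated with exact enumeration in a toy Bayesian network; the network and its probabilities are invented for this sketch and do not come from the article.

```python
# Exact inference by enumeration in a toy Bayesian network
# (Rain -> WetGrass <- Sprinkler), illustrating the "inference" operation.
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.05,
}

def posterior_rain_given_wet():
    """P(Rain=True | WetGrass=True) via Bayes' rule and marginalization."""
    joint = {r: sum(P_rain[r] * P_sprinkler[s] * P_wet[(r, s)]
                    for s in (True, False))
             for r in (True, False)}
    return joint[True] / (joint[True] + joint[False])

print(round(posterior_rain_given_wet(), 3))   # prints 0.645
```

    Observing wet grass raises the probability of rain from the 0.2 prior to about 0.645, the normative belief update that Bayesian models of cognition take as a benchmark.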

  1. Properties of the Bayesian Knowledge Tracing Model

    Science.gov (United States)

    van de Sande, Brett

    2013-01-01

    Bayesian Knowledge Tracing is used very widely to model student learning. It comes in two different forms: The first form is the Bayesian Knowledge Tracing "hidden Markov model" which predicts the probability of correct application of a skill as a function of the number of previous opportunities to apply that skill and the model…

  2. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    and secondly, to gain efficiency during modification of an object oriented Bayesian network. To accomplish these two goals we have exploited a mechanism allowing local triangulation of instances to develop a method for updating the junction trees associated with object oriented Bayesian networks in highly...

  3. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  4. Bayesian models: A statistical primer for ecologists

    Science.gov (United States)

    Hobbs, N. Thompson; Hooten, Mevin B.

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods—in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probability and develops a step-by-step sequence of connected ideas, including basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and inference from single and multiple models. This unique book places less emphasis on computer coding, favoring instead a concise presentation of the mathematical statistics needed to understand how and why Bayesian analysis works. It also explains how to write out properly formulated hierarchical Bayesian models and use them in computing, research papers, and proposals. This primer enables ecologists to understand the statistical principles behind Bayesian modeling and apply them to research, teaching, policy, and management. It presents the mathematical and statistical foundations of Bayesian modeling in language accessible to non-statisticians, covers basic distribution theory, network diagrams, hierarchical models, Markov chain Monte Carlo, and more, deemphasizes computer coding in favor of basic principles, and explains how to write out properly factored statistical expressions representing Bayesian models.

  5. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  6. Study of the applicability of Markov chain Monte Carlo methods to the statistical separation of electron sources via the impact parameter for ALICE

    Energy Technology Data Exchange (ETDEWEB)

    Wittner, Manuel [Physikalisches Institut, Universitaet Heidelberg, Heidelberg (Germany); Collaboration: ALICE-Collaboration

    2015-07-01

    One particularly interesting measurement performed with the ALICE setup at the LHC is that of electrons from charm and beauty hadron decays. Heavy quarks originate from initial hard scattering processes and thus experience the whole history of a heavy-ion collision. Therefore, they are valuable probes to study the mechanisms of energy loss and hadronization in the hot and dense state of matter that is expected to be formed in a heavy-ion collision at the LHC. One important task is the distinction of the different electron sources, for which a method was developed. In this method, the measured impact parameter distribution is compared with impact parameter distributions for the individual sources, which are created through Monte Carlo simulations, and a maximum likelihood fit is applied. However, constructing a posterior distribution from the likelihood according to Bayes' theorem and sampling it with Markov chain Monte Carlo algorithms provides several advantages, e.g. a mathematically correct estimation of the uncertainties or the use of prior knowledge. Hence, for the first time in this particular problem, a Markov chain Monte Carlo algorithm, namely the Metropolis algorithm, was implemented and investigated for its applicability in heavy-flavor physics. First studies indicate its great usefulness in this field of physics.
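
    A minimal sketch of the Metropolis algorithm the abstract refers to, applied to a toy one-parameter posterior (a signal fraction with a flat prior) rather than the actual impact-parameter template fit; all numbers are illustrative.

```python
import math
import random

random.seed(1)

# Toy stand-in for the template fit: k "signal-like" counts out of n tracks;
# we sample the posterior of the signal fraction f under a flat prior.
n, k = 200, 60

def log_posterior(f):
    if not 0.0 < f < 1.0:
        return -math.inf
    return k * math.log(f) + (n - k) * math.log(1.0 - f)  # flat prior

# Metropolis: random-walk proposal, accept with prob min(1, posterior ratio)
f, chain = 0.5, []
for _ in range(20000):
    prop = f + random.gauss(0.0, 0.05)
    if math.log(random.random()) < log_posterior(prop) - log_posterior(f):
        f = prop
    chain.append(f)

burned = chain[2000:]                 # discard burn-in
mean = sum(burned) / len(burned)
print(round(mean, 2))                 # posterior mean near k/n = 0.30
```

    Unlike a point estimate from a maximum likelihood fit, the chain itself is the uncertainty estimate: credible intervals come directly from its quantiles, which is the "mathematically correct estimation of the uncertainties" the abstract highlights.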

  7. Flexible Bayesian Human Fecundity Models.

    Science.gov (United States)

    Kim, Sungduk; Sundaram, Rajeshwari; Buck Louis, Germaine M; Pyper, Cecilia

    2012-12-01

    Human fecundity is an issue of considerable interest for both epidemiological and clinical audiences, and is dependent upon a couple's biologic capacity for reproduction coupled with behaviors that place a couple at risk for pregnancy. Bayesian hierarchical models have been proposed to better model the conception probabilities by accounting for the acts of intercourse around the day of ovulation, i.e., during the fertile window. These models can be viewed in the framework of a generalized nonlinear model with an exponential link. However, a fixed choice of link function may not always provide the best fit, leading to potentially biased estimates for probability of conception. Motivated by this, we propose a general class of models for fecundity by relaxing the choice of the link function under the generalized nonlinear model framework. We use a sample from the Oxford Conception Study (OCS) to illustrate the utility and fit of this general class of models for estimating human conception. Our findings reinforce the need for attention to be paid to the choice of link function in modeling conception, as it may bias the estimation of conception probabilities. Various properties of the proposed models are examined and a Markov chain Monte Carlo sampling algorithm was developed for implementing the Bayesian computations. The deviance information criterion measure and logarithm of pseudo marginal likelihood are used for guiding the choice of links. The supplemental material section contains technical details of the proof of the theorem stated in the paper, and contains further simulation results and analysis.
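
    The exponential-link model mentioned above can be sketched as follows; the day effects and intercourse pattern are invented for illustration and are not the OCS estimates.

```python
import math

# Day-specific model over the fertile window: x[j] = 1 if intercourse occurred
# on day j, lam[j] = hypothetical day-specific rate (illustrative numbers).
lam = [0.05, 0.15, 0.30, 0.20, 0.08]
x = [0, 1, 1, 0, 1]

def p_exp_link(lam, x):
    """Conception probability under the exponential link: 1 - exp(-sum)."""
    eta = sum(l * xi for l, xi in zip(lam, x))
    return 1.0 - math.exp(-eta)

print(round(p_exp_link(lam, x), 3))   # 1 - exp(-0.53) = 0.411
```

    The paper's point is that this particular link is itself a modeling choice: replacing the map from `eta` to probability with another monotone link can change the fitted conception probabilities, so the link should be selected (e.g. by DIC), not fixed a priori.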

  8. Bayesian Nonparametric Longitudinal Data Analysis.

    Science.gov (United States)

    Quintana, Fernando A; Johnson, Wesley O; Waetjen, Elaine; Gold, Ellen

    2016-01-01

    Practical Bayesian nonparametric methods have been developed across a wide variety of contexts. Here, we develop a novel statistical model that generalizes standard mixed models for longitudinal data to include flexible mean functions as well as combined compound symmetry (CS) and autoregressive (AR) covariance structures. AR structure is often specified through the use of a Gaussian process (GP) with covariance functions that allow longitudinal data to be more correlated if they are observed closer in time than if they are observed farther apart. We allow for AR structure by considering a broader class of models that incorporates a Dirichlet process mixture (DPM) over the covariance parameters of the GP. We are able to take advantage of modern Bayesian statistical methods in making full predictive inferences about characteristics of longitudinal profiles and their differences across covariate combinations. We also take advantage of the generality of our model, which provides for estimation of a variety of covariance structures. We observe that models that fail to incorporate CS or AR structure can result in very poor estimation of a covariance or correlation matrix. In our illustration using hormone data observed on women through the menopausal transition, biology dictates the use of a generalized family of sigmoid functions as a model for time trends across subpopulation categories.

  9. BELM: Bayesian extreme learning machine.

    Science.gov (United States)

    Soria-Olivas, Emilio; Gómez-Sanchis, Juan; Martín, José D; Vila-Francés, Joan; Martínez, Marcelino; Magdalena, José R; Serrano, Antonio J

    2011-03-01

    The theory of extreme learning machine (ELM) has become very popular in the last few years. ELM is a new approach for learning the parameters of the hidden layers of a multilayer neural network (such as the multilayer perceptron or the radial basis function neural network). Its main advantage is the lower computational cost, which is especially relevant when dealing with many patterns defined in a high-dimensional space. This brief proposes a Bayesian approach to ELM, which presents some advantages over other approaches: it allows the introduction of a priori knowledge; obtains the confidence intervals (CIs) without the need of applying methods that are computationally intensive, e.g., bootstrap; and presents high generalization capabilities. Bayesian ELM is benchmarked against classical ELM in several artificial and real datasets that are widely used for the evaluation of machine learning algorithms. Achieved results show that the proposed approach produces a competitive accuracy with some additional advantages, namely, automatic production of CIs, reduction of probability of model overfitting, and use of a priori knowledge.

  10. Particle separation

    International Nuclear Information System (INIS)

    Baker, C.A.

    1990-01-01

    Solid particles are separated from a liquid which also contains ferric hydroxide by subjecting the liquid to ultrasonic agitation from a transducer in order to break up the flocs so that they will pass with the liquid through a filter belt. The belt thus retains the solid particles without interference from the flocs. As shown the woven nylon belt collects rare radioactive solid particles from liquid and carries them under sensors. The belt is washed clean, with further ultrasonic agitation in a trough on its return run. (author)

  11. Isotope separation

    International Nuclear Information System (INIS)

    Rosevear, A.; Sims, H.E.

    1985-01-01

    195mAu for medical usage is separated from 195mHg in a solution containing ions of 195mHg by contacting the solution with an adsorbing agent to adsorb the 195mHg(II) thereon, followed by selective elution of the 195mAu generated by radioactive decay of the 195mHg. The adsorbing agent comprises a composite material in the form of an inert porous inorganic substrate (e.g. Kieselguhr), the pores of which are occupied by a hydrogel of a polysaccharide (e.g. agarose) carrying terminal thiol groups for binding Hg(II) ions. (author)

  12. Gas separating

    Science.gov (United States)

    Gollan, A.

    1988-03-29

    Feed gas is directed tangentially along the non-skin surface of gas separation membrane modules comprising a cylindrical bundle of parallel contiguous hollow fibers, supported to allow feed gas to flow from an inlet at one end of a cylindrical housing through the bores of the bundled fibers to an outlet at the other end, while a component of the feed gas permeates through the fibers (each having the skin side on the outside) and exits through a permeate outlet in the cylindrical casing. 3 figs.

  13. Time independent seismic hazard analysis of Greece deduced from Bayesian statistics

    Directory of Open Access Journals (Sweden)

    T. M. Tsapanos

    2003-01-01

    A Bayesian statistics approach is applied to the seismogenic sources of Greece and the surrounding area in order to assess seismic hazard, assuming that earthquake occurrence follows the Poisson process. The Bayesian approach applied supplies the probability that a certain cut-off magnitude of Ms = 6.0 will be exceeded in time intervals of 10, 20 and 75 years. We also produced graphs which present the different seismic hazard in the seismogenic sources examined in terms of varying probability, which is useful for engineering and civil protection purposes, allowing the designation of priority sources for earthquake-resistant design. It is shown that within the above time intervals the seismogenic source 4, called Igoumenitsa (in NW Greece and west Albania), has the highest probability of experiencing an earthquake with magnitude M > 6.0. High probabilities are found also for Ochrida (source 22), Samos (source 53) and Chios (source 56).
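
    Under the Poisson assumption used above, exceedance probabilities have a simple closed form, and a conjugate Gamma posterior on the occurrence rate gives a Bayesian predictive version. The catalog numbers below are invented for illustration; they are not the paper's Greek data.

```python
import math

# Poisson occurrence: with rate lam (events of Ms >= 6.0 per year), the
# probability of at least one such event in T years is 1 - exp(-lam * T).
def exceedance_mle(lam, T):
    return 1.0 - math.exp(-lam * T)

# Bayesian version: a conjugate Gamma(a, b) posterior on lam (a0 + n events,
# b0 + t years of catalog) gives a closed-form predictive exceedance.
def exceedance_bayes(n_events, t_years, T, a0=0.5, b0=0.0):
    a, b = a0 + n_events, b0 + t_years
    return 1.0 - (b / (b + T)) ** a      # predictive P(at least one event)

# Illustrative catalog: 8 events of Ms >= 6.0 observed in 100 years
for T in (10, 20, 75):
    print(T, round(exceedance_bayes(8, 100.0, T), 2))
```

    The Bayesian predictive form accounts for the uncertainty in the rate itself, which matters most for the short catalogs typical of individual seismogenic sources.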

  14. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  15. Bayesian natural language semantics and pragmatics

    CERN Document Server

    Zeevat, Henk

    2015-01-01

    The contributions in this volume focus on the Bayesian interpretation of natural languages, which is widely used in areas of artificial intelligence, cognitive science, and computational linguistics. This is the first volume to take up topics in Bayesian Natural Language Interpretation and make proposals based on information theory, probability theory, and related fields. The methodologies offered here extend to the target semantic and pragmatic analyses of computational natural language interpretation. Bayesian approaches to natural language semantics and pragmatics are based on methods from signal processing and the causal Bayesian models pioneered especially by Pearl. In signal processing, the Bayesian method finds the most probable interpretation by finding the one that maximizes the product of the prior probability and the likelihood of the interpretation. It thus stresses the importance of a production model for interpretation, as in Grice's contributions to pragmatics or in interpretation by abduction.
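
    The signal-processing recipe described here, choosing the interpretation that maximizes prior times likelihood, fits in a few lines; the senses and probabilities below are made up for the example.

```python
# MAP interpretation: pick the reading that maximizes prior * likelihood.
# Utterance: "bank" -- which sense did the speaker intend? (toy numbers)
prior = {"river_bank": 0.3, "money_bank": 0.7}        # sense frequency
likelihood = {"river_bank": 0.8, "money_bank": 0.1}   # P(context | sense)

best = max(prior, key=lambda s: prior[s] * likelihood[s])
print(best)   # river_bank: 0.3 * 0.8 = 0.24 beats 0.7 * 0.1 = 0.07
```

    The likelihood term is exactly the production model the abstract mentions: how probable the observed utterance and context are, given that the speaker intended each candidate interpretation.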

  16. Crystal structure prediction accelerated by Bayesian optimization

    Science.gov (United States)

    Yamashita, Tomoki; Sato, Nobuya; Kino, Hiori; Miyake, Takashi; Tsuda, Koji; Oguchi, Tamio

    2018-01-01

    We propose a crystal structure prediction method based on Bayesian optimization. Our method is a selection-type algorithm, in contrast to evolution-type algorithms such as evolutionary algorithms and particle swarm optimization. Crystal structure prediction with Bayesian optimization can efficiently select the most stable structure from a large number of candidate structures with fewer search trials, using a machine learning technique. Crystal structure prediction using Bayesian optimization combined with random search is applied to known systems such as NaCl and Y2Co17 to assess the efficiency of Bayesian optimization. The results demonstrate that Bayesian optimization reduces the number of search trials required to find the global minimum structure by 30-40% in comparison with pure random search, which leads to much lower computational cost.
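
    The selection-type search pattern can be sketched as follows. This is an illustration, not the authors' implementation: it substitutes a crude distance-based surrogate for the Gaussian process a real Bayesian optimization method would fit over structural descriptors, and a made-up energy function for the expensive first-principles calculation:

```python
import math

# Toy stand-in for a first-principles (e.g. DFT) total-energy evaluation.
def energy(x):
    return (x - 0.7) ** 2 + 0.05 * math.sin(25 * x)

def select_next(pool, evaluated, kappa=0.5):
    """Pick the unevaluated candidate minimizing a lower confidence bound:
    predicted energy (from the nearest evaluated point) minus an
    exploration bonus proportional to the distance to that point."""
    def lcb(x):
        dist, mu = min((abs(x - xe), fe) for xe, fe in evaluated.items())
        return mu - kappa * dist      # far-away candidates get explored
    return min((x for x in pool if x not in evaluated), key=lcb)

pool = [i / 100 for i in range(101)]                  # candidate "structures"
evaluated = {x: energy(x) for x in (0.0, 0.5, 1.0)}   # initial trials
for _ in range(12):                   # 12 further trials vs. 101 candidates
    x = select_next(pool, evaluated)
    evaluated[x] = energy(x)

best = min(evaluated, key=evaluated.get)
```

    The point of the pattern is that each trial is chosen where the surrogate predicts low energy or high uncertainty, so far fewer candidates need the expensive evaluation than under pure random search.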

  17. Separation of 103Ru from a proton irradiated thorium matrix: A potential source of Auger therapy radionuclide 103mRh.

    Directory of Open Access Journals (Sweden)

    Tara Mastren

    Ruthenium-103 is the parent isotope of 103mRh (t1/2 = 56.1 min), an isotope of interest for Auger electron therapy. During the proton irradiation of thorium targets, large amounts of 103Ru are generated through proton-induced fission. The development of a two-part chemical separation process to isolate 103Ru in high yield and purity from a proton-irradiated thorium matrix on an analytical scale is described herein. The first part employed an anion exchange column to remove cationic actinide/lanthanide impurities along with the majority of the transition metal fission products. The second part used an extraction chromatographic column with diglycolamide functional groups to decontaminate 103Ru from the remaining impurities. This method resulted in a final radiochemical yield of 83 ± 5% of 103Ru with a purity of 99.9%. Additionally, measured nuclear reaction cross sections for the formation of 103Ru and 106Ru via the 232Th(p,f)103,106Ru reactions are reported within.

  18. A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.

    Science.gov (United States)

    Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing

    2015-01-01

    This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurement are decomposed by the EEMD method and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data is used. A case study has demonstrated that some information from human observation or system repair records is very helpful to the fault diagnosis. It is effective and efficient in diagnosing faults based on uncertain, incomplete information.
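
    The feature-layer step, turning IMF energies into evidence for a discrete fault node, can be sketched as below. The IMFs, probability tables, and node names are invented for illustration, and the full three-layer network is reduced to a naive-Bayes fusion of one signal feature and one maintenance-record node:

```python
def imf_energies(imfs):
    """Relative energy of each intrinsic mode function (fault feature layer)."""
    e = [sum(x * x for x in imf) for imf in imfs]
    total = sum(e)
    return [v / total for v in e]

def naive_bayes_posterior(prior, cpt, evidence):
    """P(state | evidence), treating evidence nodes as conditionally
    independent given the fault state (a simplification of the network)."""
    score = dict(prior)
    for node, val in evidence.items():
        for s in score:
            score[s] *= cpt[s][node][val]
    z = sum(score.values())
    return {s: v / z for s, v in score.items()}

# Toy signals: pretend EEMD returned two IMFs for one vibration channel.
imfs = [[0.9, -1.1, 1.0, -0.8], [0.1, 0.2, -0.1, 0.05]]
feat = "high" if imf_energies(imfs)[0] > 0.8 else "low"

prior = {"fault": 0.1, "healthy": 0.9}
cpt = {   # made-up conditional probability tables
    "fault":   {"imf1_energy": {"high": 0.8, "low": 0.2},
                "recent_repair": {"yes": 0.6, "no": 0.4}},
    "healthy": {"imf1_energy": {"high": 0.3, "low": 0.7},
                "recent_repair": {"yes": 0.1, "no": 0.9}},
}
post = naive_bayes_posterior(prior, cpt,
                             {"imf1_energy": feat, "recent_repair": "yes"})
```

    Note how the repair-record node shifts the posterior: with the numbers above, the fault hypothesis overtakes the healthy one even though its prior is only 0.1, which mirrors the paper's observation that maintenance records materially help the diagnosis.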

  19. Bayesian Multi-Energy Computed Tomography reconstruction approaches based on decomposition models

    International Nuclear Information System (INIS)

    Cai, Caifang

    2013-01-01

    Multi-Energy Computed Tomography (MECT) makes it possible to obtain multiple fractions of basis materials without segmentation. In medical applications, one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical MECT measurements are usually obtained with polychromatic X-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in Beam-Hardening Artifacts (BHA). Existing BHA correction approaches either require calibration measurements or suffer from the noise amplification caused by the negative-log pre-processing and from the water/bone separation problem. To overcome these problems, statistical reconstruction approaches based on non-linear forward models that account for beam polychromaticity show great potential for producing accurate fraction images. This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. The approach is based on a Gaussian noise model with unknown variance assigned directly to the projections, without taking the negative log. Following Bayesian inference, the decomposition fractions and observation variance are estimated jointly using Maximum A Posteriori (MAP) estimation. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is simplified into a single estimation problem, transforming the joint MAP estimation into a minimization problem with a non-quadratic cost function. To solve it, the use of a monotone Conjugate Gradient (CG) algorithm with suboptimal descent steps is proposed. The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also
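
    As a hedged sketch of the final optimization step: the thesis minimizes a non-quadratic MAP cost with a monotone CG and suboptimal steps, but the mechanics of conjugate gradient can be illustrated on a quadratic stand-in J(x) = ||y - Ax||^2 + lam*||x||^2, whose minimizer solves the normal equations (A^T A + lam*I) x = A^T y. The operator and data below are toy values, not the tomographic system:

```python
def matvec(M, v):
    """Dense matrix-vector product."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def cg(apply_H, b, iters=50, tol=1e-10):
    """Linear conjugate gradient for H x = b, H symmetric positive definite."""
    x = [0.0] * len(b)
    r = b[:]                              # residual b - Hx with x = 0
    p = r[:]
    rs = sum(ri * ri for ri in r)
    for _ in range(iters):
        Hp = apply_H(p)
        alpha = rs / sum(pi * hi for pi, hi in zip(p, Hp))
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * hi for ri, hi in zip(r, Hp)]
        rs_new = sum(ri * ri for ri in r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

A = [[2.0, 1.0], [1.0, 3.0], [0.0, 1.0]]   # toy 3x2 projection operator
y = [5.0, 10.0, 3.0]                       # toy measurements
lam = 0.1                                  # prior (regularization) weight

def apply_H(v):
    """H v = (A^T A + lam I) v, applied without forming H explicitly."""
    Av = matvec(A, v)
    AtAv = [sum(A[i][j] * Av[i] for i in range(len(A))) for j in range(len(v))]
    return [AtAv[j] + lam * v[j] for j in range(len(v))]

At_y = [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]
x_map = cg(apply_H, At_y)
```

    In the non-quadratic setting of the thesis, the step length alpha can no longer be computed in closed form, which is where the suboptimal (but monotone, i.e. cost-decreasing) descent steps come in.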

  20. Separation of tautomeric forms of [2-nitrophloroglucinol-H]- by an in-electrospray ionization source hydrogen/deuterium exchange approach.

    Science.gov (United States)

    Kostyukevich, Yury; Kononikhin, Alexey; Popov, Igor; Starodubtseva, Natalia; Kukaev, Eugene; Nikolaev, Eugene

    2014-01-01

    Here we report the observation that, depending on the solvent used for electrospray, 2-nitrophloroglucinol undergoes deprotonation from different sites, forming two tautomeric gas-phase ions. These ions differ in their collision-induced dissociation (CID) spectra and in their gas-phase hydrogen/deuterium (H/D) exchange kinetics. We performed H/D exchange in the electrospray ionization (ESI) source by saturating the ESI region with vapors of a deuterated solvent (D2O). It was observed that [2-nitrophloroglucinol-H]- exchanges two -OH hydrogens when MeOD is used as the spray solvent, but when the spray solvent is 50:50 MeOD/D2O we observe two additional H/D exchanges at the aromatic ring. We propose that the reaction occurs via a keto-enol tautomerization mechanism, which was found to be energetically favorable.