WorldWideScience

Sample records for bayesian aerosol release

  1. A Bayesian Analysis of the Radioactive Releases of Fukushima

    DEFF Research Database (Denmark)

    Tomioka, Ryota; Mørup, Morten

    2012-01-01

    The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to know… the types of nuclides and their levels of concentration from the recorded mixture of radiations in order to take necessary measures. We presently formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. From the sparsely sampled… the Fukushima Daiichi plant we establish that the model is able to account for the data. We further demonstrate how the model extends to include all the available measurements recorded throughout Japan. The model can be considered a first attempt to apply unsupervised Bayesian learning in order to give a more…

  2. Status of the ORNL Aerosol Release and Transport Project

    Energy Technology Data Exchange (ETDEWEB)

    Adams, R.E.

    1985-01-01

    The behavior of aerosols assumed to be characteristic of those generated during light water reactor (LWR) accident sequences and released into containment is being studied. Recent activities in the ORNL Aerosol Release and Transport Project include studies of (1) the thermal hydraulic conditions existing during Nuclear Safety Pilot Plant (NSPP) aerosol tests in steam-air environments, (2) the thermal output and aerosol mass generation rates for plasma torch aerosol generators, and (3) the influence of humidity on the shape of agglomerated aerosols of various materials. A new Aerosol-Moisture Interaction Test (AMIT) facility was prepared at the NSPP site to accommodate the aerosol shape studies; several tests with Fe2O3 aerosol have been conducted. In addition to the above activities, a special study was conducted to determine the suitability of the technique of aerosol production by plasma torch under the operating conditions of future tests of the LWR Aerosol Containment Experiments (LACE) at the Hanford Engineering Development Laboratory. 3 refs., 2 figs., 7 tabs.

  3. Bayesian inference of synaptic quantal parameters from correlated vesicle release

    Directory of Open Access Journals (Sweden)

    Alexander D Bird

    2016-11-01

    Synaptic transmission is both history-dependent and stochastic, resulting in varying responses to presentations of the same presynaptic stimulus. This complicates attempts to infer synaptic parameters and has led to the proposal of a number of different strategies for their quantification. Recently, Bayesian approaches have been applied to make more efficient use of the data collected in paired intracellular recordings. Methods have been developed that either provide a complete model of the distribution of amplitudes for isolated responses or approximate the amplitude distributions of a train of post-synaptic potentials, with correct short-term synaptic dynamics but neglecting correlations. In both cases the methods provided significantly improved inference of model parameters as compared to existing mean-variance fitting approaches. However, for synapses with high release probability, low vesicle number, or relatively low restock rate, and for data in which only one or a few repeats of the same pattern are available, correlations between serial events can allow for the extraction of significantly more information from experiment: a more complete Bayesian approach would also take this into account. This has not been possible previously because of the technical difficulty in calculating the likelihood of amplitudes seen in correlated post-synaptic potential trains; however, recent theoretical advances have now rendered the likelihood calculation tractable for a broad class of synaptic dynamics models. Here we present a compact mathematical form for the likelihood in terms of a matrix product and demonstrate how marginals of the posterior provide information on the covariance of parameter distributions. The associated computer code for Bayesian parameter inference for a variety of models of synaptic dynamics is provided in the supplementary material, allowing quantal and dynamical parameters to be readily inferred from experimental data sets.
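    The "compact mathematical form for the likelihood in terms of a matrix product" described above can be illustrated with a toy forward-algorithm computation over a hidden vesicle-occupancy state. The sketch below is a generic hidden-Markov formulation of this class of synaptic dynamics models, not the authors' published code; the parameters (site count n, release probability p, refill probability r, quantal size q, noise sigma) and their default values are illustrative assumptions.

```python
import numpy as np
from math import comb, sqrt, pi

def correlated_release_loglik(amps, n=5, p=0.5, r=0.6, q=1.0, sigma=0.2):
    """Forward-algorithm log-likelihood for a train of PSP amplitudes under a
    hidden-Markov model of vesicle occupancy: each of the k docked vesicles
    releases independently with probability p, empty sites refill with
    probability r between stimuli, and the amplitude is q * (number released)
    plus Gaussian noise. The total likelihood is a product of
    emission-weighted transition matrices."""
    states = np.arange(n + 1)
    # release[k, j]: probability that j of k docked vesicles release
    release = np.array([[comb(k, j) * p**j * (1 - p)**(k - j) if j <= k else 0.0
                         for j in range(n + 1)] for k in states])
    # refill[m, k2]: probability of going from m docked (post-release) to k2
    refill = np.array([[comb(n - m, k2 - m) * r**(k2 - m) * (1 - r)**(n - k2)
                        if k2 >= m else 0.0 for k2 in states] for m in states])
    alpha = np.zeros(n + 1)
    alpha[n] = 1.0                        # assume a fully docked synapse at t=0
    loglik = 0.0
    for a in amps:
        # emission density of amplitude a given j vesicles released
        emit = np.exp(-0.5 * ((a - q * states) / sigma) ** 2) / (sigma * sqrt(2 * pi))
        post = np.zeros(n + 1)            # distribution over post-release counts
        for k in states:
            for j in range(k + 1):
                post[k - j] += alpha[k] * release[k, j] * emit[j]
        total = post.sum()
        loglik += np.log(total)
        alpha = (post / total) @ refill   # refill before the next stimulus
    return loglik
```

    Because successive amplitudes share the hidden occupancy state, a train of large responses is penalized by depletion: the model automatically accounts for the serial correlations the abstract emphasizes.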

  4. Bayesian Inference of Synaptic Quantal Parameters from Correlated Vesicle Release

    Science.gov (United States)

    Bird, Alex D.; Wall, Mark J.; Richardson, Magnus J. E.

    2016-01-01

    Synaptic transmission is both history-dependent and stochastic, resulting in varying responses to presentations of the same presynaptic stimulus. This complicates attempts to infer synaptic parameters and has led to the proposal of a number of different strategies for their quantification. Recently, Bayesian approaches have been applied to make more efficient use of the data collected in paired intracellular recordings. Methods have been developed that either provide a complete model of the distribution of amplitudes for isolated responses or approximate the amplitude distributions of a train of post-synaptic potentials, with correct short-term synaptic dynamics but neglecting correlations. In both cases the methods provided significantly improved inference of model parameters as compared to existing mean-variance fitting approaches. However, for synapses with high release probability, low vesicle number, or relatively low restock rate, and for data in which only one or a few repeats of the same pattern are available, correlations between serial events can allow for the extraction of significantly more information from experiment: a more complete Bayesian approach would also take this into account. This has not been possible previously because of the technical difficulty in calculating the likelihood of amplitudes seen in correlated post-synaptic potential trains; however, recent theoretical advances have now rendered the likelihood calculation tractable for a broad class of synaptic dynamics models. Here we present a compact mathematical form for the likelihood in terms of a matrix product and demonstrate how marginals of the posterior provide information on the covariance of parameter distributions. The associated computer code for Bayesian parameter inference for a variety of models of synaptic dynamics is provided in the Supplementary Material, allowing quantal and dynamical parameters to be readily inferred from experimental data sets. PMID:27932970

  5. Small-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, Lenna A.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, Garrett N.; Kurath, Dean E.; Buchmiller, William C.; Smith, Dennese M.; Blanchard, Jeremy; Song, Chen; Daniel, Richard C.; Wells, Beric E.; Tran, Diana N.; Burns, Carolyn A.

    2012-11-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  6. Large-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  7. Small-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Mahoney, Lenna A.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, Garrett N.; Kurath, Dean E.; Buchmiller, William C.; Smith, Dennese M.; Blanchard, Jeremy; Song, Chen; Daniel, Richard C.; Wells, Beric E.; Tran, Diana N.; Burns, Carolyn A.

    2013-05-29

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and net generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of antifoam agents was assessed with most of the simulants. Orifices included round holes and

  8. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  9. Small-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, G. N.; Mahoney, Lenna A.; Tran, Diana N.; Burns, Carolyn A.; Kurath, Dean E.

    2013-08-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are largely absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale. The small-scale testing and resultant data are described in Mahoney et al. (2012b) and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used to mimic the

  10. Bayesian estimation of a source term of radiation release with approximately known nuclide ratios

    Science.gov (United States)

    Tichý, Ondřej; Šmídl, Václav; Hofman, Radek

    2016-04-01

    We are concerned with estimation of a source term in the case of an accidental release from a known location, e.g. a power plant. Usually, the source term of an accidental release of radiation comprises a mixture of nuclides. The gamma dose rate measurements do not provide direct information on the source term composition. However, physical properties of the respective nuclides (deposition properties, decay half-life) can be used when uncertain information on nuclide ratios is available, e.g. from a known reactor inventory. The proposed method is based on a linear inverse model where the observation vector y arises as a linear combination y = Mx of a source-receptor-sensitivity (SRS) matrix M and the source term x. The task is to estimate the unknown source term x. The problem is ill-conditioned, and further regularization is needed to obtain a reasonable solution. In this contribution, we assume that the nuclide ratios of the release are known with some degree of uncertainty. This knowledge is used to form the prior covariance matrix of the source term x. Due to uncertainty in the ratios, the diagonal elements of the covariance matrix are considered to be unknown. Positivity of the source term estimate is guaranteed by using a multivariate truncated Gaussian distribution. Following the Bayesian approach, we estimate all parameters of the model from the data, so that y, M, and the known ratios are the only inputs of the method. Since the inference of the model is intractable, we follow the Variational Bayes method, yielding an iterative algorithm for estimation of all model parameters. Performance of the method is studied on a simulated 6-hour power plant release in which 3 nuclides are released and 2 nuclide ratios are approximately known. A comparison with a method assuming unknown nuclide ratios is given to demonstrate the usefulness of the proposed approach.
This research is supported by EEA/Norwegian Financial Mechanism under project MSMT-28477/2014 Source-Term Determination of Radionuclide Releases
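    The linear inverse step y = Mx with a ratio-informed Gaussian prior can be sketched as a simple MAP estimate. This is a hypothetical illustration, not the Variational Bayes algorithm of the abstract: the prior covariance is fixed rather than estimated, positivity is imposed by clipping instead of a truncated Gaussian, and the SRS matrix and nuclide values are synthetic.

```python
import numpy as np

def map_source_term(y, M, prior_mean, prior_cov, noise_var):
    """MAP estimate of x for y ~ N(Mx, noise_var * I) with the Gaussian
    prior x ~ N(prior_mean, prior_cov); positivity is imposed here by
    clipping, a crude stand-in for the paper's truncated Gaussian."""
    P = np.linalg.inv(prior_cov)                # prior precision
    A = M.T @ M / noise_var + P                 # posterior precision
    b = M.T @ y / noise_var + P @ prior_mean
    return np.clip(np.linalg.solve(A, b), 0.0, None)

rng = np.random.default_rng(0)
x_true = np.array([3.0, 1.5, 0.5])              # three nuclides, true ratios 6:3:1
M = rng.uniform(0.1, 1.0, size=(20, 3))         # stand-in SRS matrix, 20 observations
y = M @ x_true + rng.normal(0.0, 0.05, 20)      # simulated dose-rate measurements

ratio = np.array([5.0, 3.0, 2.0])               # only approximately known ratios
prior_mean = ratio / ratio.sum() * 5.0          # rough guess of the total release
x_hat = map_source_term(y, M, prior_mean, prior_cov=np.eye(3), noise_var=0.05**2)
```

    With well-conditioned data the likelihood dominates and the estimate lands near the true source term even though the prior ratios are only approximate; the prior becomes decisive when M is ill-conditioned, which is the regime the abstract targets.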

  11. Large methane releases lead to strong aerosol forcing and reduced cloudiness

    DEFF Research Database (Denmark)

    Kurten, T.; Zhou, L.; Makkonen, R.;

    2011-01-01

    The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller… contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH4) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations…

  12. Release the BEESTS: Bayesian Estimation of Ex-Gaussian STop-Signal Reaction Time Distributions

    Directory of Open Access Journals (Sweden)

    Dora Matzke

    2013-12-01

    The stop-signal paradigm is frequently used to study response inhibition. In this paradigm, participants perform a two-choice response time task where the primary task is occasionally interrupted by a stop-signal that prompts participants to withhold their response. The primary goal is to estimate the latency of the unobservable stop response (stop-signal reaction time, or SSRT). Recently, Matzke, Dolan, Logan, Brown, and Wagenmakers (in press) have developed a Bayesian parametric approach that allows for the estimation of the entire distribution of SSRTs. The Bayesian parametric approach assumes that SSRTs are ex-Gaussian distributed and uses Markov chain Monte Carlo sampling to estimate the parameters of the SSRT distribution. Here we present an efficient and user-friendly software implementation of the Bayesian parametric approach, BEESTS, which can be applied to individual as well as hierarchical stop-signal data. BEESTS comes with an easy-to-use graphical user interface and provides users with summary statistics of the posterior distribution of the parameters as well as various diagnostic tools to assess the quality of the parameter estimates. The software is open source and runs on Windows and OS X operating systems. In sum, BEESTS allows experimental and clinical psychologists to estimate entire distributions of SSRTs and hence facilitates the more rigorous analysis of stop-signal data.
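    The ex-Gaussian assumption at the heart of BEESTS is easy to state in code: an SSRT is modeled as a Gaussian component plus an exponential tail. The sketch below shows the density and a sampler for that distribution; the parameter values are illustrative, and this is not the BEESTS implementation itself (which embeds these pieces in hierarchical MCMC).

```python
import numpy as np
from math import erfc, exp, sqrt

def ex_gaussian_pdf(x, mu, sigma, tau):
    """Density of the ex-Gaussian distribution: a Normal(mu, sigma)
    convolved with an Exponential of mean tau, the SSRT model assumed
    by the Bayesian parametric approach implemented in BEESTS."""
    a = (2 * mu + sigma**2 / tau - 2 * x) / (2 * tau)
    return exp(a) * erfc((mu + sigma**2 / tau - x) / (sigma * sqrt(2))) / (2 * tau)

def ex_gaussian_sample(mu, sigma, tau, size, rng):
    """Sample by construction: Gaussian component plus exponential tail.
    The mean is mu + tau and the variance is sigma**2 + tau**2."""
    return rng.normal(mu, sigma, size) + rng.exponential(tau, size)

rng = np.random.default_rng(1)
mu, sigma, tau = 0.20, 0.03, 0.08        # seconds; illustrative SSRT-scale values
ssrt = ex_gaussian_sample(mu, sigma, tau, 100_000, rng)
```

    The convenient split of the mean into a Gaussian part (mu) and a skewed tail part (tau) is precisely what lets BEESTS report interpretable summaries of the SSRT distribution rather than a single point estimate.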

  13. A Bayesian method for characterizing distributed micro-releases: II. inference under model uncertainty with short time-series data.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef; Fast, P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M. (Peterson AFB, CO); Ray, J. P.

    2006-01-01

    Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error, that is, situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than a random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
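    The characterization task (number infected, time of infection) can be illustrated with a toy grid-based Bayesian inversion of a short symptom-onset time series. This is a schematic stand-in for the paper's method: the lognormal incubation parameters, the common exposure time, and the flat priors are all simplifying assumptions made here for illustration.

```python
import numpy as np
from math import lgamma, log
from statistics import NormalDist

def onset_prob(day, t0, med=4.0, sdlog=0.5):
    """P(symptom onset falls in [day, day+1)) for exposure at time t0,
    assuming a lognormal incubation period (median med days, log-sd sdlog)."""
    def F(t):
        return NormalDist().cdf((np.log(t) - np.log(med)) / sdlog) if t > 0 else 0.0
    return F(day + 1 - t0) - F(day - t0)

def grid_log_posterior(counts, N_grid, t0_grid):
    """Unnormalized log-posterior over (N infected, exposure time t0) with
    flat priors; daily onset counts are Poisson with mean N * onset_prob."""
    logp = np.zeros((len(N_grid), len(t0_grid)))
    for i, N in enumerate(N_grid):
        for j, t0 in enumerate(t0_grid):
            for day, c in enumerate(counts):
                lam = max(N * onset_prob(day, t0), 1e-12)
                logp[i, j] += c * log(lam) - lam - lgamma(c + 1)
    return logp

counts = [0, 8, 20, 22, 17]              # five days of onset data (days 0..4)
N_grid = np.arange(40, 201, 5)
t0_grid = np.arange(-2.0, 2.01, 0.5)
logp = grid_log_posterior(counts, N_grid, t0_grid)
i, j = np.unravel_index(np.argmax(logp), logp.shape)
N_map, t0_map = N_grid[i], t0_grid[j]
```

    Even with only five days of data, the rising edge of the incubation curve pins down the exposure time, and the total case count then fixes the outbreak size, mirroring the abstract's finding that 3-5 days of data are usually sufficient.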

  14. Sparse Bayesian learning machine for real-time management of reservoir releases

    Science.gov (United States)

    Khalil, Abedalrazq; McKee, Mac; Kemblowski, Mariush; Asefa, Tirusew

    2005-11-01

    Water scarcity and uncertainties in forecasting future water availabilities present serious problems for basin-scale water management. These problems create a need for intelligent prediction models that learn and adapt to their environment in order to provide water managers with decision-relevant information related to the operation of river systems. This manuscript presents examples of state-of-the-art techniques for forecasting that combine excellent generalization properties and sparse representation within a Bayesian paradigm. The techniques are demonstrated as decision tools to enhance real-time water management. A relevance vector machine, which is a probabilistic model, has been used in an online fashion to provide confident forecasts given knowledge of some state and exogenous conditions. In practical applications, online algorithms should recognize changes in the input space and account for drift in system behavior. Support vector machines lend themselves particularly well to the detection of drift and hence to the initiation of adaptation in response to a recognized shift in system structure. The resulting model will normally have a structure and parameterization that suits the information content of the available data. The utility and practicality of this proposed approach have been demonstrated with an application in a real case study involving real-time operation of a reservoir in a river basin in southern Utah.
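    The sparse Bayesian machinery behind the relevance vector machine can be sketched as an automatic relevance determination (ARD) loop: each weight carries its own prior precision, re-estimated from the data so that irrelevant inputs are pruned away. This is a generic textbook-style sketch in the spirit of Tipping's formulation, not the authors' reservoir model; the data below are synthetic.

```python
import numpy as np

def ard_regression(Phi, t, n_iter=50):
    """Sparse Bayesian linear regression via ARD: per-weight prior
    precisions alpha and noise precision beta are re-estimated from the
    evidence; weights of irrelevant inputs are driven toward zero."""
    N, M = Phi.shape
    alpha = np.ones(M)                    # per-weight prior precisions
    beta = 1.0                            # noise precision
    for _ in range(n_iter):
        Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
        mu = beta * Sigma @ Phi.T @ t     # posterior mean of the weights
        gamma = 1.0 - alpha * np.diag(Sigma)   # "well-determined" measure
        alpha = gamma / np.maximum(mu**2, 1e-12)
        beta = max(N - gamma.sum(), 1e-12) / np.sum((t - Phi @ mu)**2)
    return mu, alpha

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
t = 2.0 * X[:, 0] - 1.0 * X[:, 2] + rng.normal(0, 0.1, 200)  # only inputs 0 and 2 matter
w, alpha = ard_regression(X, t)
```

    The sparsity arises automatically: for inputs that do not help predict the target, the re-estimated precision alpha grows very large and the corresponding weight collapses to zero, which is what makes such models cheap enough for the real-time, online use the abstract describes.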

  15. Correlation Between Hierarchical Bayesian and Aerosol Optical Depth PM2.5 Data and Respiratory-Cardiovascular Chronic Diseases

    Science.gov (United States)

    Tools to estimate PM2.5 mass have expanded in recent years, and now include: 1) stationary monitor readings, 2) Community Multi-Scale Air Quality (CMAQ) model estimates, 3) Hierarchical Bayesian (HB) estimates from combined stationary monitor readings and CMAQ model output; and, ...

  16. Sensitivity of depositions to the size and hygroscopicity of Cs-bearing aerosols released from the Fukushima nuclear accident

    Science.gov (United States)

    Kajino, Mizuo; Adachi, Kouji; Sekiyama, Tsuyoshi; Zaizen, Yuji; Igarashi, Yasuhito

    2014-05-01

    We recently revealed that the microphysical properties of aerosols carrying the radioactive Cs released from the Fukushima Daiichi Nuclear Power Plant (FDNPP) at an early stage (March 14-15, 2011) of the accident could be very different from what we previously assumed: super-micron and non-hygroscopic at the early stage, whereas sub-micron and hygroscopic afterwards (at least later than March 20-22). In this study, two sensitivity simulations with the two different aerosol microphysical properties were conducted using a regional-scale meteorology-chemical transport model (NHM-Chem). The impact of the difference was quite significant. Under the assumption that the Cs-bearing aerosols are non-hygroscopic and super-micron, 17% (0.001%) of the radioactive Cs fell onto the ground by dry (wet) deposition processes, and the rest was deposited into the ocean or transported out of the model domain, which covers the central and northern parts of the mainland of Japan. On the other hand, under the assumption that the Cs-bearing aerosols are hygroscopic and sub-micron, 5.7% (11.3%) fell onto the ground by dry (wet) deposition. For accurate simulation of the deposition of radionuclides, knowledge of the aerosol microphysical properties is as essential as the accuracy of the simulated wind fields and precipitation patterns.

  17. Deployable Plume and Aerosol Release Prediction and Tracking System. Nuclear Non-Proliferation Task 1. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kleppe, John; Norris, William; Etezadi, Mehdi

    2006-07-19

    This contract was awarded in response to a proposal in which a deployable plume and aerosol release prediction and tracking system would be designed, fabricated, and tested. The system would gather real-time atmospheric data and input it into a real-time atmospheric model that could be used for plume prediction and tracking. The system could be quickly deployed by aircraft to points of interest or positioned for deployment by vehicles. It would provide three-dimensional (u, v, and w) wind vector data, inversion height measurements, surface wind information, classical weather station data, and solar radiation. The on-board real-time computer model would provide the prediction of the behavior of plumes and released aerosols.

  18. Distribution of sulfur aerosol precursors in the SPCZ released by continuous volcanic degassing at Ambrym, Vanuatu

    Science.gov (United States)

    Lefèvre, Jérôme; Menkes, Christophe; Bani, Philipson; Marchesiello, Patrick; Curci, Gabriele; Grell, Georg A.; Frouin, Robert

    2016-08-01

    The Melanesian Volcanic Arc (MVA) emits about 12 kt d-1 of sulfur dioxide (SO2) to the atmosphere from continuous passive (non-explosive) volcanic degassing, which contributes 20% of the global SO2 emission from volcanoes. Here we assess, from up-to-date and long-term observations, the SO2 emission of the Ambrym volcano, one of the dominant volcanoes in the MVA, and we investigate its role as a sulfate precursor in the regional distribution of aerosols, using both satellite observations and model results at 1° × 1° spatial resolution from WRF-Chem/GOCART. Without considering aerosol forcing on clouds, our model parameterizations for convection, vertical mixing, and cloud properties provide a reliable chemical weather representation, making possible a cross-examination of model solution and observations. This preliminary work enables the identification of biases and limitations affecting both the model (missing sources) and satellite sensors and algorithms (for aerosol detection and classification), and leads to the implementation of improved transport and aerosol processes in the modeling system. On the one hand, the model confirms a 50% underestimation of SO2 emissions due to the satellite swath sampling of the Ozone Monitoring Instrument (OMI), consistent with field studies. The OMI irregular sampling also produces a level of noise that impairs its monitoring capacity during short-term volcanic events. On the other hand, the model reveals a large sensitivity of aerosol composition and Aerosol Optical Depth (AOD) to the choice of both the source function in WRF-Chem and the size parameters for sea salt in FlexAOD, the post-processor used to compute the simulated AOD offline. We then proceed to diagnose the role of SO2 volcanic emission in the regional aerosol composition. The model shows that both the dynamics and cloud properties associated with the South Pacific Convergence Zone (SPCZ) have a large influence on the oxidation of SO2 and on the transport pathways of

  19. Effect of phytoplankton-released organic matter on the production and properties of the primary marine aerosol (Invited)

    Science.gov (United States)

    Fuentes, E.; Coe, H.; Green, D.; de Leeuw, G.; McFiggans, G.

    2010-12-01

    This study investigates the effect of the biogenic matter exuded by marine biota on the production and properties of the submicron primary sea spray, based on laboratory simulation of marine aerosol formation from seawater enriched with organic matter released by laboratory-grown algal cultures. Primary aerosol formation by bubble bursting was reproduced by using a plunging-water-jet generation system. Particle production experiments were performed with seawater enriched in marine exudate organics, and an increase in particle production was observed in these experiments. Estimates of the relationship between Chl-a biomass and seawater OC concentration indicated that effects on particle fluxes due to biological activity are likely to occur in diatom blooms with Chl-a diatom biomass >0.35-2 mg/m3 (OC >175 µM), depending on the primary organic production conditions in the algal bloom. Analysis of the hygroscopicity and cloud condensation nuclei (CCN) activity of the organics-enriched primary aerosol indicated a suppression of both the water uptake and the CCN activity with increasing amounts of organic exudate in the source seawater. The increase in CCN number likely to occur in algal bloom areas, due to the potential increase in particle production, would therefore be counteracted by the reduction of the particle CCN activity induced by the incorporation of organic matter. Calculations of the primary particle composition using a mixing rule yielded organic mass fractions in the range 5-37%, with the organic particle enrichment proportional to the seawater organic content. This level of organic mass fraction contrasts with values of up to 80% reported from atmospheric measurements, suggesting the presence of organics of secondary origin in the atmospheric marine aerosol.
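A particle-composition calculation of the kind mentioned above can be sketched with a simple two-component volume mixing rule (kappa-Köhler style). All parameter values below (hygroscopicities, densities, the measured kappa) are illustrative assumptions, not the study's actual numbers:

```python
# Illustrative two-component mixing-rule calculation: infer the organic
# volume and mass fractions of a sea-spray particle from its measured
# hygroscopicity parameter kappa, assuming linear volume mixing:
#   kappa = eps_org * KAPPA_ORG + (1 - eps_org) * KAPPA_SALT
# All numerical values are assumed placeholders for illustration.

KAPPA_SALT = 1.1   # assumed hygroscopicity of sea salt
KAPPA_ORG = 0.05   # assumed hygroscopicity of the organic exudate
RHO_SALT = 2.2     # g/cm^3, assumed sea-salt density
RHO_ORG = 1.2      # g/cm^3, assumed organic density

def organic_fractions(kappa_measured):
    """Return (volume fraction, mass fraction) of organics for a
    particle with the given measured hygroscopicity parameter."""
    eps_org = (KAPPA_SALT - kappa_measured) / (KAPPA_SALT - KAPPA_ORG)
    eps_org = min(max(eps_org, 0.0), 1.0)  # clip to the physical range
    m_org = eps_org * RHO_ORG
    m_salt = (1.0 - eps_org) * RHO_SALT
    return eps_org, m_org / (m_org + m_salt)

vol_frac, mass_frac = organic_fractions(kappa_measured=0.9)
print(f"organic volume fraction: {vol_frac:.2f}, mass fraction: {mass_frac:.2f}")
```

Because the organic phase is assumed less dense than sea salt, the inferred organic mass fraction comes out smaller than the volume fraction, which is the direction of the enrichment effect the abstract describes.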

  20. Final Report: Safety of Plasma Components and Aerosol Transport During Hard Disruptions and Accidental Energy Release in Fusion Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Bourham, Mohamed A.; Gilligan, John G.

    1999-08-14

    Safety considerations in large future fusion reactors like ITER are important before licensing the reactor. Several scenarios are considered hazardous, including safety of plasma-facing components during hard disruptions, high heat fluxes and thermal stresses during normal operation, accidental energy release, and aerosol formation and transport. Disruption events in large tokamaks like ITER are expected to produce local heat fluxes on plasma-facing components that may exceed 100 GW/m{sup 2} over a period of about 0.1 ms. As a result, the surface temperature increases dramatically, which results in surface melting and vaporization and produces thermal stresses and surface erosion. Plasma-facing component safety issues extend to cover a wide range of possible scenarios, including disruption severity and the impact of plasma-facing components on disruption parameters, accidental energy release and short/long-term LOCAs, and formation of airborne particles by convective current transport during a LOVA (water/air ingress disruption) accident scenario. Study and evaluation of disruption-induced aerosol generation and mobilization is essential to characterize a database on particulate formation and distribution for a large future fusion tokamak reactor like ITER. In order to provide a database relevant to ITER, the SIRENS electrothermal plasma facility at NCSU has been modified to closely simulate the heat fluxes expected in ITER.

  1. Aerosol-Assisted Fast Formulating Uniform Pharmaceutical Polymer Microparticles with Variable Properties toward pH-Sensitive Controlled Drug Release

    Directory of Open Access Journals (Sweden)

    Hong Lei

    2016-05-01

    Full Text Available Microencapsulation is highly attractive for oral drug delivery, and microparticles are a common form of drug carrier for this purpose. There is still a high demand for efficient methods to fabricate microparticles with uniform sizes and well-controlled particle properties. In this paper, uniform hydroxypropyl methylcellulose phthalate (HPMCP-based pharmaceutical microparticles loaded with either hydrophobic or hydrophilic model drugs have been directly formulated by using a unique aerosol technique, i.e., the microfluidic spray drying technology. A series of microparticles of controllable particle sizes, shapes, and structures are fabricated by tuning the solvent composition and drying temperature. It is found that a more volatile solvent and a higher drying temperature can result in fast evaporation rates that form microparticles of larger lateral size, more irregular shape, and denser matrix. The nature of the model drugs also plays an important role in determining particle properties. The drug release behaviors of the pharmaceutical microparticles are dependent on their structural properties and the nature of a specific drug, as well as sensitive to the pH value of the release medium. Most importantly, drugs in the microparticles obtained by using a more volatile solvent or a higher drying temperature can be well protected from degradation in harsh simulated gastric fluids due to the dense structures of the microparticles, while they can be fast-released in simulated intestinal fluids through particle dissolution. These pharmaceutical microparticles are potentially useful for site-specific (enteric delivery of orally-administered drugs.

  2. Large methane releases lead to strong aerosol forcing and reduced cloudiness

    DEFF Research Database (Denmark)

    Kurten, T.; Zhou, L.; Makkonen, R.

    2011-01-01

    contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH4) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations...... is predicted to significantly decrease hydroxyl radical (OH) concentrations, while moderately increasing ozone (O-3). These changes lead to a 70% increase in the atmospheric lifetime of methane, and an 18% decrease in global mean cloud droplet number concentrations (CDNC). The CDNC change causes a radiative...

  3. Release of Reactive Halogen Species from Sea-Salt Aerosols under Tropospheric Conditions with/without the Influence of Organic Matter in Smog-Chamber Experiments

    Science.gov (United States)

    Balzer, N.; Behnke, W.; Bleicher, S.; Krueger, H.; Ofner, J.; Siekmann, F.; Zetzsch, C.

    2008-12-01

    Experiments to investigate the release of reactive halogen species from sea-salt aerosol and the influence of organic matter were performed in an aerosol smog-chamber (3500 l), made of Teflon film (FEP 200A, Dupont). Smog chamber facilities at lowered temperature (coolable down to -25°C) enable us to simulate these reactions under polar, tropospheric conditions. First experiments were performed to investigate the production of atomic Br and Cl without the impact of organic aerosol. Br and Cl play an important role in atmospheric ozone depletion, particularly regarding ozone depletion events (bromine explosion) during polar spring. In these studies, the aerosol was generated by atomizing salt solutions containing the typical Br/Cl ratio of 1/660 in seawater by an ultrasonic nebulizer and increasing the Br content up to sixfold. To ensure the aqueous surface of the aerosol, the experiments were performed at relative humidities above 76%. We determined the atomic Cl and OH-radical concentrations from the simultaneous consumption of four reference hydrocarbons. The Br-radical concentration was calculated on the basis of ozone depletion. Organic aerosol may take part in these reaction cycles by halogenation and production of volatile organic halogens. Further experiments are planned to add organic aerosol for mechanistic and kinetic studies on the influence of secondary organic aerosols (SOA) and humic-like substances (HULIS) on bromine explosion. The formation of the secondary organic aerosol and the determination of possible halogenated gaseous and solid organic products will be studied using longpath-FTIR, DRIFTS, ATR-FTIR, GC-FID, GC-ECD, GC-MS, TPD-MS and DMA-CNC.
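The relative-rate determination described above (atomic Cl and OH concentrations inferred from the simultaneous consumption of several reference hydrocarbons) amounts to a small linear least-squares problem: each hydrocarbon's logarithmic decay is a weighted sum of the time-integrated Cl and OH exposures. A minimal sketch, with invented rate constants and exposures rather than the chamber's actual values:

```python
# Sketch of a relative-rate analysis: recover the time-integrated Cl and
# OH exposures from the simultaneous decay of four reference hydrocarbons.
#   ln(c0/ct) = k_Cl * INT(Cl dt) + k_OH * INT(OH dt)
# Rate constants and exposures below are hypothetical placeholders.
import numpy as np

# Assumed rate constants (k_Cl, k_OH), cm^3 molecule^-1 s^-1, for four
# hypothetical reference hydrocarbons.
K = np.array([
    [3.0e-10, 8.0e-12],
    [2.0e-10, 2.5e-12],
    [6.0e-11, 1.0e-11],
    [1.5e-10, 5.0e-12],
])

# Simulated measured decays ln(c0/ct) for a chosen pair of exposures.
true_exposure = np.array([1.0e10, 5.0e11])  # [Cl, OH], molecule cm^-3 s
ln_decay = K @ true_exposure

# Four equations, two unknowns: least-squares solve for both exposures.
exposure, *_ = np.linalg.lstsq(K, ln_decay, rcond=None)
print(exposure)
```

Dividing each recovered exposure by the experiment duration gives the average radical concentration, which is how chamber studies typically report Cl and OH levels.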

  4. Development of a sampling method for carbonyl compounds released due to the use of electronic cigarettes and quantitation of their conversion from liquid to aerosol.

    Science.gov (United States)

    Jo, Sang-Hee; Kim, Ki-Hyun

    2016-01-15

    In this study, an experimental method for the collection and analysis of carbonyl compounds (CCs) released due to the use of electronic cigarettes (e-cigarettes or ECs) was developed and validated through a series of laboratory experiments. As part of this work, the conversion of CCs from a refill solution (e-solution) to aerosol was also investigated based on a mass change tracking (MCT) approach. Aerosol samples generated from an e-cigarette were collected manually using 2,4-dinitrophenylhydrazine (DNPH) cartridges at a constant sampling (puffing) velocity of 1 L min(-1) with the following puff conditions: puff duration (2 s), interpuff interval (10 s), and puff number (5, 10, and 15 times). The MCT approach allowed us to improve the sampling of CCs through critical evaluation of the puff conditions in relation to the consumed quantities of refill solution. The emission concentrations of CCs remained constant when e-cigarettes were sampled at or above 10 puffs. Upon aerosolization, the concentrations of formaldehyde and acetaldehyde increased 6.23- and 58.4-fold, respectively, relative to their concentrations in e-solution. Furthermore, a number of CCs were found to be present in the aerosol samples which were not detected in the initial e-solution (e.g., acetone, butyraldehyde, and o-tolualdehyde).
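The fold-increase reported above is essentially a ratio of mass-normalized concentrations: carbonyl mass collected in the aerosol, divided by the mass of e-solution consumed, compared with the carbonyl concentration of the liquid itself. A minimal sketch of that arithmetic, using invented numbers rather than the study's data:

```python
# Minimal sketch of the mass change tracking (MCT) comparison: normalize
# the carbonyl mass collected in the aerosol by the mass of e-solution
# consumed, then compare with the liquid-phase concentration.
# All numbers are hypothetical placeholders, not the study's data.

def fold_increase(mass_in_aerosol_ug, solution_consumed_g, conc_in_liquid_ug_per_g):
    """Ratio of aerosol-phase concentration (ug per g of consumed
    e-solution) to liquid-phase concentration (ug/g)."""
    aerosol_conc = mass_in_aerosol_ug / solution_consumed_g
    return aerosol_conc / conc_in_liquid_ug_per_g

# Example: 1.2 ug of a carbonyl collected from 0.05 g of consumed
# liquid that contained 3.85 ug/g of that carbonyl.
print(round(fold_increase(1.2, 0.05, 3.85), 2))
```

A ratio above 1 indicates carbonyls formed during aerosolization (e.g., by thermal degradation of the liquid) rather than simple carry-over from the e-solution.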

  5. Aerosol distribution apparatus

    Science.gov (United States)

    Hanson, W.D.

    An apparatus for uniformly distributing an aerosol to a plurality of filters mounted in a plenum, wherein the aerosol and air are forced through a manifold system by means of a jet pump and released into the plenum through orifices in the manifold. The apparatus allows for the simultaneous aerosol-testing of all the filters in the plenum.

  6. Bayesian biostatistics

    CERN Document Server

    Lesaffre, Emmanuel

    2012-01-01

    The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd

  7. Bayesian statistics

    OpenAIRE

    新家, 健精

    2013-01-01

    © 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography

  8. Bayesian Theory

    CERN Document Server

    Bernardo, Jose M

    2000-01-01

    This highly acclaimed text, now available in paperback, provides a thorough account of key concepts and theoretical results, with particular emphasis on viewing statistical inference as a special case of decision theory. Information-theoretic concepts play a central role in the development of the theory, which provides, in particular, a detailed discussion of the problem of specification of so-called "prior ignorance". The work is written from the authors' committed Bayesian perspective, but an overview of non-Bayesian theories is also provided, and each chapter contains a wide-ranging critica

  9. Bayesian SPLDA

    OpenAIRE

    Villalba, Jesús

    2015-01-01

    In this document we are going to derive the equations needed to implement a Variational Bayes estimation of the parameters of the simplified probabilistic linear discriminant analysis (SPLDA) model. This can be used to adapt SPLDA from one database to another with few development data or to implement the fully Bayesian recipe. Our approach is similar to Bishop's VB PPCA.

  10. Bayesian signaling

    OpenAIRE

    Hedlund, Jonas

    2014-01-01

    This paper introduces private sender information into a sender-receiver game of Bayesian persuasion with monotonic sender preferences. I derive properties of increasing differences related to the precision of signals and use these to fully characterize the set of equilibria robust to the intuitive criterion. In particular, all such equilibria are either separating, i.e., the sender's choice of signal reveals his private information to the receiver, or fully disclosing, i.e., the outcome of th...

  11. Bayesian Monitoring.

    OpenAIRE

    Kirstein, Roland

    2005-01-01

    This paper presents a modification of the inspection game: The "Bayesian Monitoring" model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...

  12. Bayesian programming

    CERN Document Server

    Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel

    2013-01-01

    Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean

  13. The Plinius/Colima CA-U3 test on fission-product aerosol release over a VVER-type corium pool; L'essai Plinius/Colima CA-U3 sur le relachement des aerosols de produits de fission au-dessus d'un bain de corium de type VVER

    Energy Technology Data Exchange (ETDEWEB)

    Journeau, Ch.; Piluso, P.; Correggio, P.; Godin-Jacqmin, L

    2007-07-01

    In a hypothetical case of severe accident in a PWR type VVER-440, a complex corium pool could be formed and fission products could be released. In order to study aerosol release in terms of mechanisms, kinetics, nature, and quantity, and to better specify the source term of the VVER-440, a series of experiments has been performed in the Colima facility, and the test Colima CA-U3 has been successfully performed thanks to technological modifications to melt a prototypical corium at 2760 °C. Specific instrumentation has allowed us to follow the evolution of the corium melt and the release, transport and deposition of the fission products. The main conclusions are: -) there is a large release of Cr, Te, Sr, Pr and Rh (>95%w), -) there is a significant release of Fe (50%w), -) there is a small release of Ba, Ce, La, Nb, Nd and Y (<90%w), -) there is a very small release of U in proportion (<5%w) but it is one of the major released species by mass, and -) there is no release of Zr. The Colima experimental results are consistent with previous experiments on irradiated fuels except for the Ba, Fe and U releases. (A.C.)

  14. CATS Aerosol Typing and Future Directions

    Science.gov (United States)

    McGill, Matt; Yorks, John; Scott, Stan; Palm, Stephen; Hlavka, Dennis; Hart, William; Nowottnick, Ed; Selmer, Patrick; Kupchock, Andrew; Midzak, Natalie; Trepte, Chip; Vaughan, Mark; Colarco, Peter; da Silva, Arlindo

    2016-01-01

    The Cloud Aerosol Transport System (CATS), launched in January of 2015, is a lidar remote sensing instrument that will provide range-resolved profile measurements of atmospheric aerosols and clouds from the International Space Station (ISS). CATS is intended to operate on-orbit for at least six months, and up to three years. Status of CATS Level 2 processing and plans for the future: Version 1 aerosol typing (ongoing): Mode 1: L1B data released later this summer, with L2 data released shortly after; identify algorithm biases (e.g., striping, field-of-view biases). Mode 2: processed and released; currently working on correcting algorithm issues. Version 2 aerosol typing (fall 2016): implementation of version 1 modifications; integration of GEOS-5 aerosols for typing guidance for non-spherical aerosols. Version 3 aerosol typing (2017): implementation of 1-D Var assimilation into GEOS-5; a dynamic lidar ratio that evolves in conjunction with simulated aerosol mixtures.

  15. Introduction to Bayesian statistics

    CERN Document Server

    Bolstad, William M

    2017-01-01

    There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...

  16. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2003-01-01

    As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian net technology and Bayesian net learning technology, and apply both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.

  17. Bayesian artificial intelligence

    CERN Document Server

    Korb, Kevin B

    2010-01-01

    Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente

  18. Applied Bayesian Hierarchical Methods

    CERN Document Server

    Congdon, Peter D

    2010-01-01

    Bayesian methods facilitate the analysis of complex models and data structures. Emphasizing data applications, alternative modeling specifications, and computer implementation, this book provides a practical overview of methods for Bayesian analysis of hierarchical models.

  19. Bayesian data analysis

    CERN Document Server

    Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B

    2013-01-01

    FUNDAMENTALS OF BAYESIAN INFERENCEProbability and InferenceSingle-Parameter Models Introduction to Multiparameter Models Asymptotics and Connections to Non-Bayesian ApproachesHierarchical ModelsFUNDAMENTALS OF BAYESIAN DATA ANALYSISModel Checking Evaluating, Comparing, and Expanding ModelsModeling Accounting for Data Collection Decision AnalysisADVANCED COMPUTATION Introduction to Bayesian Computation Basics of Markov Chain Simulation Computationally Efficient Markov Chain Simulation Modal and Distributional ApproximationsREGRESSION MODELS Introduction to Regression Models Hierarchical Linear

  20. Bayesian Games with Intentions

    Directory of Open Access Journals (Sweden)

    Adam Bjorndahl

    2016-06-01

    Full Text Available We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.

  1. Bayesian statistics an introduction

    CERN Document Server

    Lee, Peter M

    2012-01-01

    Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel

  2. Understanding Computational Bayesian Statistics

    CERN Document Server

    Bolstad, William M

    2011-01-01

    A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic

  3. Bayesian Mediation Analysis

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  4. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  5. Konstruksi Bayesian Network Dengan Algoritma Bayesian Association Rule Mining Network

    OpenAIRE

    Octavian

    2015-01-01

    In recent years, the Bayesian Network has become a popular concept used in many areas of life, such as making a decision and determining the probability that an event can occur. Unfortunately, constructing the structure of a Bayesian Network is itself not a simple matter. Therefore, this research attempts to introduce the Bayesian Association Rule Mining Network algorithm to make it easier for us to construct a Bayesian Network based on data ...

  6. Model Diagnostics for Bayesian Networks

    Science.gov (United States)

    Sinharay, Sandip

    2006-01-01

    Bayesian networks are frequently used in educational assessments primarily for learning about students' knowledge and skills. There is a lack of works on assessing fit of Bayesian networks. This article employs the posterior predictive model checking method, a popular Bayesian model checking tool, to assess fit of simple Bayesian networks. A…

  7. Bayesian Lensing Shear Measurement

    CERN Document Server

    Bernstein, Gary M

    2013-01-01

    We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...

  8. Bayesian psychometric scaling

    NARCIS (Netherlands)

    Fox, G.J.A.; Berg, van den S.M.; Veldkamp, B.P.; Irwing, P.; Booth, T.; Hughes, D.

    2015-01-01

    In educational and psychological studies, psychometric methods are involved in the measurement of constructs, and in constructing and validating measurement instruments. Assessment results are typically used to measure student proficiency levels and test characteristics. Recently, Bayesian item resp

  9. Noncausal Bayesian Vector Autoregression

    DEFF Research Database (Denmark)

    Lanne, Markku; Luoto, Jani

    We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...

  10. Practical Bayesian Tomography

    CERN Document Server

    Granade, Christopher; Cory, D G

    2015-01-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  11. Aerosolized Antibiotics.

    Science.gov (United States)

    Restrepo, Marcos I; Keyt, Holly; Reyes, Luis F

    2015-06-01

    Administration of medications via aerosolization is potentially an ideal strategy to treat airway diseases. This delivery method ensures high concentrations of the medication in the targeted tissues, the airways, with generally lower systemic absorption and systemic adverse effects. Aerosolized antibiotics have been tested as treatment for bacterial infections in patients with cystic fibrosis (CF), non-CF bronchiectasis (NCFB), and ventilator-associated pneumonia (VAP). The most successful application of this to date is treatment of infections in patients with CF. It has been hypothesized that similar success would be seen in NCFB and in difficult-to-treat hospital-acquired infections such as VAP. This review summarizes the available evidence supporting the use of aerosolized antibiotics and addresses the specific considerations that clinicians should recognize when prescribing an aerosolized antibiotic for patients with CF, NCFB, and VAP.

  12. Bayesian Face Sketch Synthesis.

    Science.gov (United States)

    Wang, Nannan; Gao, Xinbo; Sun, Leiyu; Li, Jie

    2017-03-01

    Exemplar-based face sketch synthesis has been widely applied to both digital entertainment and law enforcement. In this paper, we propose a Bayesian framework for face sketch synthesis, which provides a systematic interpretation for understanding the common properties and intrinsic difference in different methods from the perspective of probabilistic graphical models. The proposed Bayesian framework consists of two parts: the neighbor selection model and the weight computation model. Within the proposed framework, we further propose a Bayesian face sketch synthesis method. The essential rationale behind the proposed Bayesian method is that we take the spatial neighboring constraint between adjacent image patches into consideration for both aforementioned models, while the state-of-the-art methods neglect the constraint either in the neighbor selection model or in the weight computation model. Extensive experiments on the Chinese University of Hong Kong face sketch database demonstrate that the proposed Bayesian method could achieve superior performance compared with the state-of-the-art methods in terms of both subjective perceptions and objective evaluations.

  13. Bayesian least squares deconvolution

    Science.gov (United States)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
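The core of the approach described above is a linear observation model with a Gaussian-process prior, for which the posterior is Gaussian in closed form. A toy sketch under stated assumptions (the weight matrix, kernel length scale, noise level, and problem sizes below are invented for illustration, not the paper's actual setup):

```python
# Sketch of the Bayesian LSD idea: model the observed line samples as
#   y = W z + noise,
# where z is the common (LSD) profile over velocity bins and W encodes
# per-line weights.  A Gaussian-process prior z ~ N(0, K) yields a
# Gaussian posterior with closed-form mean and covariance.
# All sizes, kernels, and weights are toy assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_vel, n_lines = 20, 50      # velocity bins, spectral-line samples
sigma_n = 0.1                # assumed per-pixel noise std

# Toy weight matrix mapping the LSD profile to observed line samples.
W = rng.normal(size=(n_lines, n_vel))

# Squared-exponential GP prior covariance over velocity bins
# (length scale 2 bins), with a small jitter for numerical stability.
v = np.arange(n_vel, dtype=float)
K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / 2.0 ** 2)

# Simulate an observation from a smooth "true" profile.
z_true = np.sin(v / 3.0)
y = W @ z_true + sigma_n * rng.normal(size=n_lines)

# Gaussian posterior:
#   cov  = (W^T W / sigma^2 + K^{-1})^{-1}
#   mean = cov @ W^T y / sigma^2
A = W.T @ W / sigma_n**2 + np.linalg.inv(K + 1e-6 * np.eye(n_vel))
cov_post = np.linalg.inv(A)
z_mean = cov_post @ (W.T @ y) / sigma_n**2
err = np.sqrt(np.diag(cov_post))   # per-bin uncertainty, as in the paper
print(z_mean.shape, err.shape)
```

The per-bin posterior standard deviation `err` is what lets the user judge, bin by bin, whether a detected signal is reliable; the paper's GP prior plays the role of `K` here.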

  14. Hybrid Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2012-01-01

    Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...
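    A toy illustration of the sequential setting, assuming a squared-exponential GP surrogate and an upper-confidence-bound selection rule (both assumptions; the paper's hybrid algorithm is more involved): fit the GP posterior to the evaluations so far and pick one next input, where a batch variant would instead return several inputs per iteration.

    ```python
    import numpy as np

    # One sequential Bayesian-optimization step with a GP surrogate.
    # Objective, kernel lengthscale, and UCB weight are toy assumptions.
    def f(x):
        return -(x - 0.3) ** 2  # "unknown" objective (toy)

    X = np.array([0.0, 0.5, 1.0])  # inputs evaluated so far
    y = f(X)

    def k(a, b):  # squared-exponential kernel, unit variance, lengthscale 0.2
        return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / 0.2 ** 2)

    grid = np.linspace(0.0, 1.0, 101)
    Kxx = k(X, X) + 1e-6 * np.eye(len(X))
    Kgx = k(grid, X)
    mu = Kgx @ np.linalg.solve(Kxx, y)
    var = np.clip(1.0 - np.sum(Kgx * np.linalg.solve(Kxx, Kgx.T).T, axis=1),
                  0.0, None)
    ucb = mu + 2.0 * np.sqrt(var)
    x_next = grid[np.argmax(ucb)]  # next point to evaluate
    ```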

  15. Bayesian least squares deconvolution

    CERN Document Server

    Ramos, A Asensio

    2015-01-01

    Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.

  16. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  17. Bayesian Visual Odometry

    Science.gov (United States)

    Center, Julian L.; Knuth, Kevin H.

    2011-03-01

    Visual odometry refers to tracking the motion of a body using an onboard vision system. Practical visual odometry systems combine the complementary accuracy characteristics of vision and inertial measurement units. The Mars Exploration Rovers, Spirit and Opportunity, used this type of visual odometry. The visual odometry algorithms in Spirit and Opportunity were based on Bayesian methods, but a number of simplifying approximations were needed to deal with onboard computer limitations. Furthermore, the allowable motion of the rover had to be severely limited so that computations could keep up. Recent advances in computer technology make it feasible to implement a fully Bayesian approach to visual odometry. This approach combines dense stereo vision, dense optical flow, and inertial measurements. As with all true Bayesian methods, it also determines error bars for all estimates. This approach also offers the possibility of using Micro-Electro Mechanical Systems (MEMS) inertial components, which are more economical, weigh less, and consume less power than conventional inertial components.

  18. Probabilistic Inferences in Bayesian Networks

    OpenAIRE

    Ding, Jianguo

    2010-01-01

    This chapter summarizes the popular inference methods in Bayesian networks. The results demonstrate that evidence can be propagated across a Bayesian network along any link, whether in a forward, backward, or intercausal style. The belief updating of Bayesian networks can be obtained by various available inference techniques. Theoretically, exact inference in Bayesian networks is feasible and manageable. However, exact inference is NP-hard in general. That means, in applications, in ...
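    The forward and backward propagation mentioned above can be seen in a minimal example: exact inference by enumeration in a toy Rain → WetGrass ← Sprinkler network (all probabilities below are made up for illustration):

    ```python
    # Exact inference by enumeration in a toy Bayesian network
    # Rain -> WetGrass <- Sprinkler; probabilities are illustrative only.
    P_rain = {True: 0.2, False: 0.8}
    P_sprinkler = {True: 0.1, False: 0.9}
    P_wet = {  # P(WetGrass = True | Rain, Sprinkler)
        (True, True): 0.99, (True, False): 0.9,
        (False, True): 0.8, (False, False): 0.0,
    }

    def posterior_rain_given_wet():
        """Backward ('diagnostic') propagation: P(Rain=True | WetGrass=True)."""
        num = sum(P_rain[True] * P_sprinkler[s] * P_wet[(True, s)]
                  for s in (True, False))
        den = sum(P_rain[r] * P_sprinkler[s] * P_wet[(r, s)]
                  for r in (True, False) for s in (True, False))
        return num / den
    ```

    Enumeration like this is exponential in the number of variables, which is the practical face of the NP-hardness noted above.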

  19. Bayesian multiple target tracking

    CERN Document Server

    Streit, Roy L

    2013-01-01

    This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements

  20. Toxicity of atmospheric aerosols on marine phytoplankton

    Science.gov (United States)

    Paytan, A.; Mackey, K.R.M.; Chen, Y.; Lima, I.D.; Doney, S.C.; Mahowald, N.; Labiosa, R.; Post, A.F.

    2009-01-01

    Atmospheric aerosol deposition is an important source of nutrients and trace metals to the open ocean that can enhance ocean productivity and carbon sequestration and thus influence atmospheric carbon dioxide concentrations and climate. Using aerosol samples from different back trajectories in incubation experiments with natural communities, we demonstrate that the response of phytoplankton growth to aerosol additions depends on specific components in aerosols and differs across phytoplankton species. Aerosol additions enhanced growth by releasing nitrogen and phosphorus, but not all aerosols stimulated growth. Toxic effects were observed with some aerosols, where the toxicity affected picoeukaryotes and Synechococcus but not Prochlorococcus.We suggest that the toxicity could be due to high copper concentrations in these aerosols and support this by laboratory copper toxicity tests preformed with Synechococcus cultures. However, it is possible that other elements present in the aerosols or unknown synergistic effects between these elements could have also contributed to the toxic effect. Anthropogenic emissions are increasing atmospheric copper deposition sharply, and based on coupled atmosphere-ocean calculations, we show that this deposition can potentially alter patterns of marine primary production and community structure in high aerosol, low chlorophyll areas, particularly in the Bay of Bengal and downwind of South and East Asia.

  1. Aerosol Emission during Human Speech

    Science.gov (United States)

    Asadi, Sima; Ristenpart, William

    2016-11-01

    The traditional emphasis for airborne disease transmission has been on coughing and sneezing, which are dramatic expiratory events that yield easily visible droplets. Recent research suggests that normal speech can release even larger quantities of aerosols that are too small to see with the naked eye, but are nonetheless large enough to carry a variety of pathogens (e.g., influenza A). This observation raises an important question: what types of speech emit the most aerosols? Here we show that the concentration of aerosols emitted during healthy human speech is positively correlated with both the amplitude (loudness) and fundamental frequency (pitch) of the vocalization. Experimental measurements with an aerodynamic particle sizer (APS) indicate that speaking in a loud voice (95 decibels) yields up to fifty times more aerosols than in a quiet voice (75 decibels), and that sounds associated with certain phonemes (e.g., [a] or [o]) release more aerosols than others. We interpret these results in terms of the egressive airflow rate associated with each phoneme and the corresponding fundamental frequency, which is known to vary significantly with gender and age. The results suggest that individual speech patterns could affect the probability of airborne disease transmission.

  2. Bayesian methods for hackers probabilistic programming and Bayesian inference

    CERN Document Server

    Davidson-Pilon, Cameron

    2016-01-01

    Bayesian methods of inference are deeply natural and extremely powerful. However, most discussions of Bayesian inference rely on intensely complex mathematical analyses and artificial examples, making it inaccessible to anyone without a strong mathematical background. Now, though, Cameron Davidson-Pilon introduces Bayesian inference from a computational perspective, bridging theory to practice–freeing you to get results using computing power. Bayesian Methods for Hackers illuminates Bayesian inference through probabilistic programming with the powerful PyMC language and the closely related Python tools NumPy, SciPy, and Matplotlib. Using this approach, you can reach effective solutions in small increments, without extensive mathematical intervention. Davidson-Pilon begins by introducing the concepts underlying Bayesian inference, comparing it with other techniques and guiding you through building and training your first Bayesian model. Next, he introduces PyMC through a series of detailed examples a...

  3. Bayesian Graphical Models

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Nielsen, Thomas Dyhre

    2016-01-01

    and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The wide-spread use of Bayesian networks...

  4. Subjective Bayesian Beliefs

    DEFF Research Database (Denmark)

    Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;

    2015-01-01

    A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...
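    The classic updating task replicated in such experiments is easy to state as a computation; a hedged sketch with assumed urn compositions (not the study's actual parameters):

    ```python
    from math import prod

    # Toy "bookbag and poker chips" Bayesian updating task: two urns with
    # different red-chip fractions; update P(urn A) after a sequence of
    # draws with Bayes' rule. Urn compositions here are assumptions.
    p_red = {"A": 0.7, "B": 0.3}  # P(red chip | urn)
    prior_A = 0.5

    def posterior_A(draws):
        like_A = prod(p_red["A"] if d == "red" else 1 - p_red["A"] for d in draws)
        like_B = prod(p_red["B"] if d == "red" else 1 - p_red["B"] for d in draws)
        return prior_A * like_A / (prior_A * like_A + (1 - prior_A) * like_B)
    ```

    Three red draws already push the posterior above 0.9, which is the normative benchmark against which subjects' typically conservative updates are compared.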

  5. Code Development on Aerosol Behavior under Severe Accident-Aerosol Coagulation

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Kwang Soon; Kim, Sung Il; Ryu, Eun Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The behaviors of the larger aerosol particles are usually described by continuum mechanics. The smallest particles have diameters less than the mean free path of gas phase molecules, and the behavior of these particles can often be described well by free molecular physics. The vast majority of aerosol particles arising in reactor accident analyses have behaviors in the very complicated regime intermediate between the continuum mechanics and free molecular limits. The MELCOR radionuclide package includes initial inventories, release from fuel and debris, aerosol dynamics with vapor condensation and revaporization, deposition on structure surfaces, transport through flow paths, and removal by engineered safety features. Aerosol dynamic processes and the condensation and evaporation of fission product vapors after release from fuel are considered within each MELCOR control volume. The aerosol dynamics models are based on MAEROS, a multi-section, multicomponent aerosol dynamics code, but without calculation of condensation. Aerosols can deposit directly on surfaces such as heat structures and water pools, or can agglomerate and eventually fall out once they exceed the largest size specified by the user for the aerosol size distribution. Aerosols deposited on surfaces cannot currently be resuspended.
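    The agglomeration process described above is governed by the Smoluchowski coagulation equation, which MAEROS solves in multi-section form. A minimal discrete sketch with a constant kernel (the kernel value, bin count, and time step are toy assumptions, and particles growing past the largest bin are simply dropped):

    ```python
    import numpy as np

    # Minimal discrete Smoluchowski coagulation sketch (constant kernel),
    # a toy stand-in for the multi-section MAEROS treatment in MELCOR.
    K = 1e-9            # coagulation kernel, cm^3/s (constant, assumed)
    n = np.zeros(20)    # number concentration per bin; bin b holds (b+1)-mers
    n[0] = 1e7          # start as monomers only, cm^-3
    dt, steps = 0.1, 1000

    for _ in range(steps):
        # gain: pairs whose sizes sum to this bin; 0.5 avoids double counting
        gain = 0.5 * K * np.array(
            [sum(n[i] * n[b - 1 - i] for i in range(b)) for b in range(len(n))]
        )
        # loss: coagulation of this bin with any particle
        loss = K * n * n.sum()
        n = n + dt * (gain - loss)
    ```

    For a constant kernel the total number follows dN/dt = -K N²/2, so after 100 s the concentration should fall to roughly two thirds of its initial value while mass shifts into larger bins.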

  6. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are included especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  7. Bayesian community detection

    DEFF Research Database (Denmark)

    Mørup, Morten; Schmidt, Mikkel N

    2012-01-01

    Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.

  8. Bayesian Independent Component Analysis

    DEFF Research Database (Denmark)

    Winther, Ole; Petersen, Kaare Brandt

    2007-01-01

    In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...... in a Matlab toolbox, is demonstrated for non-negative decompositions and compared with non-negative matrix factorization....

  9. Bayesian theory and applications

    CERN Document Server

    Dellaportas, Petros; Polson, Nicholas G; Stephens, David A

    2013-01-01

    The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...

  10. Bayesian Network Based XP Process Modelling

    Directory of Open Access Journals (Sweden)

    Mohamed Abouelela

    2010-07-01

    Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP Project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model especially in predicting the project finish time.

  11. TOMS Absorbing Aerosol Index

    Data.gov (United States)

    Washington University St Louis — TOMS_AI_G is an aerosol related dataset derived from the Total Ozone Monitoring Satellite (TOMS) Sensor. The TOMS aerosol index arises from absorbing aerosols such...

  12. Electrically Driven Technologies for Radioactive Aerosol Abatement

    Energy Technology Data Exchange (ETDEWEB)

    David W. DePaoli; Ofodike A. Ezekoye; Costas Tsouris; Valmor F. de Almeida

    2003-01-28

    The purpose of this research project was to develop an improved understanding of how electrically driven processes, including electrocoalescence, acoustic agglomeration, and electric filtration, may be employed to efficiently treat problems caused by the formation of aerosols during DOE waste treatment operations. The production of aerosols during treatment and retrieval operations in radioactive waste tanks and during thermal treatment operations such as calcination presents a significant problem of cost, worker exposure, potential for release, and increased waste volume.

  13. Aerosol Observation System

    Data.gov (United States)

    Oak Ridge National Laboratory — The aerosol observation system (AOS) is the primary Atmospheric Radiation Measurement (ARM) platform for in situ aerosol measurements at the surface. The principal...

  14. Bayesian grid matching

    DEFF Research Database (Denmark)

    Hartelius, Karsten; Carstensen, Jens Michael

    2003-01-01

    A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...

  15. Applied Bayesian modelling

    CERN Document Server

    Congdon, Peter

    2014-01-01

    This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU

  16. Bayesian nonparametric data analysis

    CERN Document Server

    Müller, Peter; Jara, Alejandro; Hanson, Tim

    2015-01-01

    This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.

  17. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    textabstractRecently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to neura

  18. Bayesian Intersubjectivity and Quantum Theory

    Science.gov (United States)

    Pérez-Suárez, Marcos; Santos, David J.

    2005-02-01

    Two of the major approaches to probability, namely, frequentism and (subjectivistic) Bayesian theory, are discussed, together with the replacement of frequentist objectivity by Bayesian intersubjectivity. This discussion is then expanded to Quantum Theory, as quantum states and operations can be seen as structural elements of a subjective nature.

  19. Bayesian Approach for Inconsistent Information.

    Science.gov (United States)

    Stein, M; Beer, M; Kreinovich, V

    2013-10-01

    In engineering situations, we usually have a large amount of prior knowledge that needs to be taken into account when processing data. Traditionally, the Bayesian approach is used to process data in the presence of prior knowledge. Sometimes, when we apply the traditional Bayesian techniques to engineering data, we get inconsistencies between the data and prior knowledge. These inconsistencies are usually caused by the fact that in the traditional approach, we assume that we know the exact sample values, that the prior distribution is exactly known, etc. In reality, the data is imprecise due to measurement errors, the prior knowledge is only approximately known, etc. So, a natural way to deal with the seemingly inconsistent information is to take this imprecision into account in the Bayesian approach - e.g., by using fuzzy techniques. In this paper, we describe several possible scenarios for fuzzifying the Bayesian approach. Particular attention is paid to the interaction between the estimated imprecise parameters. In this paper, to implement the corresponding fuzzy versions of the Bayesian formulas, we use straightforward computations of the related expression - which makes our computations reasonably time-consuming. Computations in the traditional (non-fuzzy) Bayesian approach are much faster - because they use algorithmically efficient reformulations of the Bayesian formulas. We expect that similar reformulations of the fuzzy Bayesian formulas will also drastically decrease the computation time and thus, enhance the practical use of the proposed methods.
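    One of the scenarios sketched above, imprecise prior parameters, can be imitated crudely by sweeping the prior mean over an interval (an alpha-cut of a fuzzy prior) and propagating each value through the exact conjugate-normal update; all numbers below are illustrative assumptions:

    ```python
    import numpy as np

    # Crude "imprecise prior" Bayesian updating: instead of one exact prior
    # mean, sweep an interval of prior means and report the resulting
    # interval of posterior means. Data and variances are toy values.
    data = np.array([9.8, 10.2, 10.1, 9.9])
    sigma2, tau2 = 0.04, 1.0                   # data variance, prior variance
    prior_means = np.linspace(9.0, 11.0, 41)   # imprecisely known prior mean

    n, xbar = len(data), data.mean()
    post_means = (xbar * n / sigma2 + prior_means / tau2) / (n / sigma2 + 1 / tau2)
    interval = (post_means.min(), post_means.max())
    ```

    With precise, consistent data, the two-unit-wide prior interval collapses to a posterior interval only about 0.02 units wide, illustrating how the data can dominate an imprecise prior.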

  20. Inference in hybrid Bayesian networks

    DEFF Research Database (Denmark)

    Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael

    2009-01-01

    Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability-techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....

  1. Aerosol Physics Considerations for Using Cerium Oxide CeO2 as a Surrogate for Plutonium Oxide PuO2 in Airborne Release Fraction Measurements for Storage Container Investigations

    Energy Technology Data Exchange (ETDEWEB)

    Moore, Murray E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Tao, Yong [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-02-16

    Cerium oxide (CeO2) dust is recommended as a surrogate for plutonium oxide (PuO2) in airborne release fraction experiments. The total range of applicable particle sizes for PuO2 extends from 0.0032 μm (the diameter of a single PuO2 molecule) to 10 μm (the defined upper boundary for respirable particles). For particulates with a physical particle diameter of 1.0 μm, the corresponding aerodynamic diameters for CeO2 and PuO2 are 2.7 μm and 3.4 μm, respectively. Cascade impactor air samplers are capable of measuring the size distributions of CeO2 or PuO2 particulates. In this document, the aerodynamic diameters for CeO2 and PuO2 were calculated for seven different physical diameters (0.0032, 0.02, 0.11, 0.27, 1.0, 3.2, and 10 μm). For cascade impactor measurements, CeO2 and PuO2 particulates with the same physical diameter would be collected onto the same or adjacent collection substrates. The difference between the aerodynamic diameter of CeO2 and PuO2 particles (that have the same physical diameter) is 39% of the resolution of a twelve-stage MSP Inc. 125 cascade impactor, and 34% for an eight-stage Andersen impactor. An approach is given to calculate the committed effective dose (CED) coefficient for PuO2 aerosol particles, compared to a corresponding aerodynamic diameter of CeO2 particles. With this approach, use of CeO2 as a surrogate for PuO2 material would follow a direct conversion based on a molar equivalent. In addition to the analytical information developed for this document, several US national labs have published articles about the use of CeO2 as a PuO2 surrogate. Different physical and chemical aspects were considered by these investigators, including thermal properties, ceramic formulations, cold pressing, sintering, molecular reactions, and mass loss in high temperature gas flows. All of those US national lab studies recommended the use of CeO2 as a surrogate material for PuO2.
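    The aerodynamic diameters quoted above are consistent with the standard density scaling d_ae ≈ d_p · sqrt(ρ_p/ρ_0) when shape factor and slip correction are neglected; a sketch with assumed bulk densities (≈7.2 g/cm³ for CeO2 and ≈11.5 g/cm³ for PuO2, both assumptions, not values stated in the report):

    ```python
    from math import sqrt

    # Aerodynamic diameter from physical diameter, neglecting the dynamic
    # shape factor and slip correction (a simplification consistent with
    # the report's 2.7 um and 3.4 um values for 1.0 um particles).
    RHO_0 = 1.0       # unit density, g/cm^3
    RHO_CEO2 = 7.2    # assumed bulk density of CeO2, g/cm^3
    RHO_PUO2 = 11.5   # assumed bulk density of PuO2, g/cm^3

    def aerodynamic_diameter(d_phys_um, rho):
        return d_phys_um * sqrt(rho / RHO_0)

    d_ce = aerodynamic_diameter(1.0, RHO_CEO2)  # ~2.7 um
    d_pu = aerodynamic_diameter(1.0, RHO_PUO2)  # ~3.4 um
    ```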

  2. Bayesian Inference on Gravitational Waves

    Directory of Open Access Journals (Sweden)

    Asad Ali

    2015-12-01

    Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.
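    In the spirit of the article's simplified examples, a toy Metropolis sampler can recover the amplitude of a known-shape signal buried in Gaussian noise (the template, noise level, and chain settings are all assumptions for illustration):

    ```python
    import numpy as np

    # Toy Metropolis MCMC: estimate the amplitude of a known sinusoidal
    # template in Gaussian noise, with a flat prior on the amplitude.
    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 200)
    template = np.sin(2 * np.pi * 5 * t)
    a_true, sigma = 2.0, 0.5
    y = a_true * template + sigma * rng.normal(size=t.size)

    def log_post(a):  # flat prior, Gaussian likelihood
        return -0.5 * np.sum((y - a * template) ** 2) / sigma ** 2

    a, chain = 0.0, []
    for _ in range(5000):
        prop = a + 0.1 * rng.normal()  # random-walk proposal
        if np.log(rng.uniform()) < log_post(prop) - log_post(a):
            a = prop
        chain.append(a)

    a_hat = np.mean(chain[1000:])  # posterior mean after burn-in
    ```

    The spread of the post-burn-in chain also gives a credible interval for the amplitude, which is the kind of parameter-estimation output the article demonstrates.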

  3. Approximate Bayesian computation.

    Directory of Open Access Journals (Sweden)

    Mikael Sunnåker

    Full Text Available Approximate Bayesian computation (ABC) constitutes a class of computational methods rooted in Bayesian statistics. In all model-based statistical inference, the likelihood function is of central importance, since it expresses the probability of the observed data under a particular statistical model, and thus quantifies the support data lend to particular values of parameters and to choices among different models. For simple models, an analytical formula for the likelihood function can typically be derived. However, for more complex models, an analytical formula might be elusive or the likelihood function might be computationally very costly to evaluate. ABC methods bypass the evaluation of the likelihood function. In this way, ABC methods widen the realm of models for which statistical inference can be considered. ABC methods are mathematically well-founded, but they inevitably make assumptions and approximations whose impact needs to be carefully assessed. Furthermore, the wider application domain of ABC exacerbates the challenges of parameter estimation and model selection. ABC has rapidly gained popularity in recent years, in particular for the analysis of complex problems arising in biological sciences (e.g., in population genetics, ecology, epidemiology, and systems biology).
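    A minimal rejection-ABC sketch of the idea described above, inferring a Gaussian mean without ever evaluating the likelihood (the prior range, tolerance, and sample sizes are arbitrary choices):

    ```python
    import numpy as np

    # Rejection ABC: keep prior draws whose forward-simulated data have a
    # summary statistic close to the observed one. All values are toy.
    rng = np.random.default_rng(2)
    observed = rng.normal(3.0, 1.0, size=100)
    obs_stat = observed.mean()                  # summary statistic

    accepted = []
    for _ in range(20000):
        mu = rng.uniform(-10, 10)               # draw from the prior
        sim = rng.normal(mu, 1.0, size=100)     # simulate, no likelihood used
        if abs(sim.mean() - obs_stat) < 0.1:    # tolerance epsilon
            accepted.append(mu)

    posterior_mean = np.mean(accepted)
    ```

    Shrinking the tolerance makes the accepted sample approximate the true posterior more closely, at the cost of a lower acceptance rate, which is the parameter-estimation trade-off the abstract alludes to.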

  4. Simultaneous aerosol measurements of unusual aerosol enhancement in troposphere over Syowa Station, Antarctica

    Directory of Open Access Journals (Sweden)

    K. Hara

    2013-10-01

    Full Text Available Unusual aerosol enhancement is often observed at Syowa Station, Antarctica during winter through spring. Simultaneous aerosol measurements near the surface and in the upper atmosphere were conducted twice using a ground-based optical particle counter, a balloon-borne optical particle counter, and micro-pulse LIDAR (MPL) in August and September 2012. During 13–15 August, aerosol enhancement occurred immediately after a storm condition. A high backscatter ratio and high aerosol concentrations were observed from the surface to ca. 2.5 km over Syowa Station. Clouds appeared occasionally at the top of the aerosol-enhanced layer during the episode. Aerosol enhancement was terminated on 15 August by strong winds caused by a cyclone's approach. In the second case on 5–7 September, aerosol number concentrations for Dp > 0.3 μm near the surface reached > 10⁴ L⁻¹ at about 15:00 UT on 5 September in spite of calm wind conditions, whereas MPL measurements showed that aerosols were enhanced at about 04:00 UT at 1000–1500 m above Syowa Station. The aerosol enhancement occurred from near the surface to ca. 4 km. In both cases, air masses with high aerosol enhancement below 2.5–3 km were transported mostly from the boundary layer over the sea-ice area. In addition, air masses at 3–4 km in the second case came from the boundary layer over the open-sea area. This air-mass history strongly suggests that dispersion of sea-salt particles from the sea-ice surface contributes considerably to the aerosol enhancement in the lower free troposphere (about 3 km) and that the release of sea-salt particles from the ocean surface engenders high aerosol concentrations in the free troposphere (3–4 km).

  5. How to practise Bayesian statistics outside the Bayesian church: What philosophy for Bayesian statistical modelling?

    NARCIS (Netherlands)

    Borsboom, D.; Haig, B.D.

    2013-01-01

    Unlike most other statistical frameworks, Bayesian statistical inference is wedded to a particular approach in the philosophy of science (see Howson & Urbach, 2006); this approach is called Bayesianism. Rather than being concerned with model fitting, this position in the philosophy of science primar

  6. MORPHOLOGY OF BLACK CARBON AEROSOLS AND UBIQUITY OF 50-NANOMETER BLACK CARBON AEROSOLS IN THE ATMOSPHERE

    Institute of Scientific and Technical Information of China (English)

    Fengfu Fu; Liangjun Xu; Wei Ye; Yiquan Chen; Mingyu Jiang; Xueqin Xu

    2006-01-01

    Different-sized aerosols were collected by an Andersen air sampler to observe the detailed morphology of the black carbon (BC) aerosols which were separated chemically from the other accompanying aerosols, using a Scanning Electron Microscope equipped with an Energy Dispersive X-ray Spectrometer (SEM-EDX). The results indicate that most BC aerosols are spherical particles of about 50 nm in diameter and with a homogeneous surface. Results also show that these particles aggregate with other aerosols or with themselves to form larger agglomerates in the micrometer range. The shape of these 50-nm BC spherical particles was found to be very similar to that of BC particles released from petroleum-powered vehicular internal combustion engines. These spherical BC particles were shown to be different from the previously reported fullerenes found using Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight Mass Spectrometry (MALDI-TOF-MS).

  7. Implementing Bayesian Vector Autoregressions

    Directory of Open Access Journals (Sweden)

    Richard M. Todd

    1988-03-01

    Full Text Available This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain the propositions of Doan, Litterman, and Sims (1984) on how to estimate a BVAR based on a certain family of prior probability distributions, indexed by a fairly small set of hyperparameters. There is also a discussion of how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
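
    The conjugate updating behind such a hyperparameter-indexed prior can be illustrated in miniature. The sketch below is a hypothetical univariate AR(1) with known noise variance, not the full BVAR machinery: a Minnesota-style prior shrinks the lag coefficient toward a random walk (prior mean 1), with the prior variance acting as the tightness hyperparameter.

```python
import random

def posterior_ar1(y, prior_mean=1.0, prior_var=0.04, sigma2=1.0):
    """Conjugate Gaussian posterior for phi in y_t = phi * y_{t-1} + e_t.

    Minnesota-style prior: shrink phi toward a random walk (mean 1);
    prior_var is the tightness hyperparameter. Returns (mean, variance)."""
    sxx = sum(x * x for x in y[:-1])                       # sum of y_{t-1}^2
    sxy = sum(a * b for a, b in zip(y[:-1], y[1:]))        # sum of y_{t-1} y_t
    post_prec = 1.0 / prior_var + sxx / sigma2
    post_mean = (prior_mean / prior_var + sxy / sigma2) / post_prec
    return post_mean, 1.0 / post_prec

# Simulate a near-random-walk series and update the prior with it.
random.seed(0)
y = [0.0]
for _ in range(200):
    y.append(0.95 * y[-1] + random.gauss(0, 1))
mean, var = posterior_ar1(y)
print(round(mean, 3), round(var, 5))
```

With enough data the posterior concentrates near the true coefficient; with little data it stays close to the random-walk prior mean, which is exactly the shrinkage behavior the Minnesota prior is designed to provide.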

  8. Book review: Bayesian analysis for population ecology

    Science.gov (United States)

    Link, William A.

    2011-01-01

    Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)

  9. Bayesian Causal Induction

    CERN Document Server

    Ortega, Pedro A

    2011-01-01

    Discovering causal relationships is a hard task, often hindered by the need for intervention, and often requiring large amounts of data to resolve statistical uncertainty. However, humans quickly arrive at useful causal relationships. One possible reason is that humans use strong prior knowledge; and rather than encoding hard causal relationships, they encode beliefs over causal structures, allowing for sound generalization from the observations they obtain from directly acting in the world. In this work we propose a Bayesian approach to causal induction which allows modeling beliefs over multiple causal hypotheses and predicting the behavior of the world under causal interventions. We then illustrate how this method extracts causal information from data containing interventions and observations.
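
    The core mechanism, comparing marginal likelihoods of causal structures under interventions, can be sketched with two binary variables. The data and Beta(1,1) priors below are illustrative; the key point is that do(X) severs any arrow into X, so the two structures assign different likelihoods to interventional data.

```python
from math import lgamma, exp

def beta_binom_logml(k, n, a=1.0, b=1.0):
    """Log marginal likelihood of k successes in n Bernoulli trials, Beta(a, b) prior."""
    return (lgamma(a + b) - lgamma(a) - lgamma(b)
            + lgamma(a + k) + lgamma(b + n - k) - lgamma(a + b + n))

# Hypothetical interventional data: we force X and record Y, as (x, y) pairs.
data = [(1, 1)] * 40 + [(1, 0)] * 10 + [(0, 0)] * 35 + [(0, 1)] * 15

# Hypothesis A (X -> Y): Y responds to the value we set X to,
# so Y has a separate Bernoulli rate under each intervention.
n1 = sum(1 for x, _ in data if x == 1)
k1 = sum(y for x, y in data if x == 1)
n0 = len(data) - n1
k0 = sum(y for x, y in data if x == 0)
logml_a = beta_binom_logml(k1, n1) + beta_binom_logml(k0, n0)

# Hypothesis B (Y -> X): do(X) severs the only arrow into X,
# so Y ignores the intervention and follows a single Bernoulli rate.
logml_b = beta_binom_logml(k1 + k0, len(data))

# Posterior over the two structures under a uniform prior.
p_a = 1.0 / (1.0 + exp(logml_b - logml_a))
print(round(p_a, 4))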

  10. Bayesian Rose Trees

    CERN Document Server

    Blundell, Charles; Heller, Katherine A

    2012-01-01

    Hierarchical structure is ubiquitous in data across many domains. There are many hierarchical clustering methods, frequently used by domain experts, which strive to discover this structure. However, most of these methods limit discoverable hierarchies to those with binary branching structure. This limitation, while computationally convenient, is often undesirable. In this paper we explore a Bayesian hierarchical clustering algorithm that can produce trees with arbitrary branching structure at each node, known as rose trees. We interpret these trees as mixtures over partitions of a data set, and use a computationally efficient, greedy agglomerative algorithm to find the rose trees which have high marginal likelihood given the data. Lastly, we perform experiments which demonstrate that rose trees are better models of data than the typical binary trees returned by other hierarchical clustering algorithms.
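
    The greedy, marginal-likelihood-driven agglomeration can be caricatured in a few lines. The sketch below merges flat Beta-Bernoulli clusters of binary items whenever a merge raises the total log marginal likelihood; the actual algorithm additionally records the merge history as a (rose) tree and scores mixtures over partitions. Everything here, including the Beta(1,1) prior, is illustrative.

```python
from math import lgamma

def cluster_logml(ones, n):
    """Beta(1,1)-Bernoulli log marginal likelihood of a cluster:
    n binary items, 'ones' of them equal to 1."""
    return lgamma(1 + ones) + lgamma(1 + n - ones) - lgamma(2 + n)

def greedy_agglomerate(items):
    """Merge clusters while some merge raises the total log marginal likelihood."""
    clusters = [(x, 1) for x in items]          # each cluster is (count of ones, size)
    while True:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                (oi, ni), (oj, nj) = clusters[i], clusters[j]
                gain = (cluster_logml(oi + oj, ni + nj)
                        - cluster_logml(oi, ni) - cluster_logml(oj, nj))
                if best is None or gain > best[0]:
                    best = (gain, i, j)
        if best is None or best[0] <= 0:        # no merge improves the likelihood
            return clusters
        _, i, j = best
        merged = (clusters[i][0] + clusters[j][0], clusters[i][1] + clusters[j][1])
        clusters = [c for k, c in enumerate(clusters) if k not in (i, j)] + [merged]

clusters = greedy_agglomerate([1, 1, 1, 0, 0, 0])
print(clusters)
```

On this toy input the procedure merges the ones together and the zeros together, then stops, because merging the two dissimilar clusters would lower the marginal likelihood.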

  11. Bayesian inference in geomagnetism

    Science.gov (United States)

    Backus, George E.

    1988-01-01

    The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.
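
    The role of the damping parameter lambda can be seen in a one-parameter caricature of the linear inverse problem (numbers are purely illustrative). In the Bayesian reading, lambda is the ratio of data-noise variance to prior model variance, so a larger lambda pulls the estimate toward the prior (here zero), which is why its selection matters so much.

```python
def damped_solution(G, d, lam):
    """Damped least squares for one model parameter:
    m = (G^T G + lam)^(-1) G^T d."""
    gtg = sum(g * g for g in G)
    gtd = sum(g * x for g, x in zip(G, d))
    return gtd / (gtg + lam)

# Toy kernel and data; the undamped fit is near m = 1.
G = [1.0, 2.0, 3.0]
d = [1.1, 1.9, 3.2]
for lam in (0.0, 1.0, 100.0):
    print(lam, round(damped_solution(G, d, lam), 4))
```

As lambda grows, the estimate is shrunk away from the data fit toward zero; choosing lambda too large (the concern raised about the reviewed studies) over-smooths the recovered field.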

  12. Current trends in Bayesian methodology with applications

    CERN Document Server

    Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia

    2015-01-01

    Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics. Each chapter is self-contained and focuses on

  13. Aerosol typing - key information from aerosol studies

    Science.gov (United States)

    Mona, Lucia; Kahn, Ralph; Papagiannopoulos, Nikolaos; Holzer-Popp, Thomas; Pappalardo, Gelsomina

    2016-04-01

    Aerosol typing is a key source of aerosol information from ground-based and satellite-borne instruments. Depending on the specific measurement technique, aerosol typing can be used as input for retrievals or represents an output for other applications. Typically, aerosol retrievals require some a priori or external aerosol type information. The accuracy of the derived aerosol products strongly depends on the reliability of these assumptions. Different sensors can make use of different aerosol type inputs. A critical review and harmonization of these procedures could significantly reduce the related uncertainties. On the other hand, satellite measurements in recent years are providing valuable information about the global distribution of aerosol types, showing for example the main source regions and typical transport paths. Climatological studies of the aerosol load at global and regional scales often rely on inferred aerosol type. There is still a high degree of inhomogeneity among satellite aerosol typing schemes, which makes it difficult to use different sensor datasets in a consistent way. Knowledge of the 4-D aerosol type distribution at these scales is essential for understanding the impact of different aerosol sources on climate, precipitation and air quality. All this information is needed for planning upcoming aerosol emissions policies. The exchange of expertise and communication between the satellite and ground-based measurement communities is fundamental for improving long-term dataset consistency and for reducing aerosol type distribution uncertainties. Aerosol typing has been recognized as one of the high-priority activities of the AEROSAT (International Satellite Aerosol Science Network, http://aero-sat.org/) initiative. In the AEROSAT framework, a first critical review of aerosol typing procedures has been carried out. The review underlines the high heterogeneity in many aspects: approach, nomenclature, assumed number of components and parameters used for the

  14. Effect of inhaled salbutamol sulfate aerosol and sustained-release theophylline in the treatment of elderly asthma

    Institute of Scientific and Technical Information of China (English)

    禤肇泉; 谭力雄

    2016-01-01

    Objective: To analyze the clinical effect of inhaled salbutamol sulfate aerosol combined with sustained-release theophylline in the emergency treatment of elderly patients with asthma. Methods: 120 elderly asthma patients admitted to the emergency department of our hospital were randomly divided into a study group and a control group, with 60 cases in each. Patients in the control group were treated with inhaled salbutamol sulfate aerosol alone, while patients in the study group were treated with inhaled salbutamol sulfate aerosol combined with sustained-release theophylline. Results: The time to disappearance of asthma symptoms, the improvement of pulmonary function, the treatment response rate, the quality of life, and the incidence of adverse reactions in the study group were all better than in the control group (P < 0.05). Conclusion: Inhaled salbutamol sulfate aerosol combined with sustained-release theophylline in the emergency treatment of elderly asthma patients can markedly improve pulmonary function, raise the quality of life and treatment response rate, and reduce adverse reactions.

  15. Irregular-Time Bayesian Networks

    CERN Document Server

    Ramati, Michael

    2012-01-01

    In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...

  16. Bayesian Inference: with ecological applications

    Science.gov (United States)

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  17. Bayesian Methods for Statistical Analysis

    OpenAIRE

    Puza, Borek

    2015-01-01

    Bayesian methods for statistical analysis is a book on statistical methods for analysing a wide variety of data. The book consists of 12 chapters, starting with basic concepts and covering numerous topics, including Bayesian estimation, decision theory, prediction, hypothesis testing, hierarchical models, Markov chain Monte Carlo methods, finite population inference, biased sampling and nonignorable nonresponse. The book contains many exercises, all with worked solutions, including complete c...

  18. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

     Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification......, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...

  19. Microphysical processing of aerosol particles in orographic clouds

    Directory of Open Access Journals (Sweden)

    S. Pousse-Nottelmann

    2015-01-01

    Full Text Available An explicit and detailed treatment of cloud-borne particles allowing for the consideration of aerosol cycling in clouds has been implemented in the regional weather forecast and climate model COSMO. The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results allowed us to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener-Bergeron-Findeisen process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation and is connected to the first cycle via the riming process which transfers aerosol mass from cloud droplets to snow flakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snow flakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. However, the processes not only impact the total aerosol number and mass, but also the shape of the aerosol size distributions by enhancing the internally mixed/soluble accumulation mode and generating coarse mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases the cloud droplet number concentration with possible

  20. Microphysical processing of aerosol particles in orographic clouds

    Directory of Open Access Journals (Sweden)

    S. Pousse-Nottelmann

    2015-08-01

    aerosol cycling in clouds has been implemented into COSMO-Model, the regional weather forecast and climate model of the Consortium for Small-scale Modeling (COSMO. The effects of aerosol scavenging, cloud microphysical processing and regeneration upon cloud evaporation on the aerosol population and on subsequent cloud formation are investigated. For this, two-dimensional idealized simulations of moist flow over two bell-shaped mountains were carried out varying the treatment of aerosol scavenging and regeneration processes for a warm-phase and a mixed-phase orographic cloud. The results allowed us to identify different aerosol cycling mechanisms. In the simulated non-precipitating warm-phase cloud, aerosol mass is incorporated into cloud droplets by activation scavenging and released back to the atmosphere upon cloud droplet evaporation. In the mixed-phase cloud, a first cycle comprises cloud droplet activation and evaporation via the Wegener–Bergeron–Findeisen (WBF process. A second cycle includes below-cloud scavenging by precipitating snow particles and snow sublimation and is connected to the first cycle via the riming process which transfers aerosol mass from cloud droplets to snowflakes. In the simulated mixed-phase cloud, only a negligible part of the total aerosol mass is incorporated into ice crystals. Sedimenting snowflakes reaching the surface remove aerosol mass from the atmosphere. The results show that aerosol processing and regeneration lead to a vertical redistribution of aerosol mass and number. Thereby, the processes impact the total aerosol number and mass and additionally alter the shape of the aerosol size distributions by enhancing the internally mixed/soluble Aitken and accumulation mode and generating coarse-mode particles. Concerning subsequent cloud formation at the second mountain, accounting for aerosol processing and regeneration increases the cloud droplet number concentration with possible implications for the ice crystal number

  1. Protection of air in premises and environment against beryllium aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Bitkolov, N.Z.; Vishnevsky, E.P.; Krupkin, A.V. [Research Inst. of Industrial and Marine Medicine, St. Petersburg (Russian Federation)

    1998-01-01

    First and foremost, the danger of beryllium aerosols concerns the possibility of their inhalation. The situation is aggravated by the high biological activity of beryllium in the human lung. The small allowable concentration of beryllium aerosols in air poses a rather complex and expensive problem of pollution prevention and air cleaning. The delivery and transport of beryllium aerosols from the sites of their formation are determined by the ventilation circuit, which shapes the aerodynamics of air flows within premises and the aerodynamic links between premises. Aerosols can be released into the air of premises from hoods and from isolated, hermetically sealed vessels as a result of vibrations, as well as pulses of temperature and pressure. Furthermore, redispersion of aerosols from contaminated surfaces is possible. Effective protection of air against beryllium aerosols at industrial plants is provided by a complex of hygienic measures, from individual means of breath protection up to collective means of preventing air pollution. (J.P.N.)

  2. Aerosol mobility size spectrometer

    Science.gov (United States)

    Wang, Jian; Kulkarni, Pramod

    2007-11-20

    A device for measuring aerosol size distribution within a sample containing aerosol particles. The device generally includes a spectrometer housing defining an interior chamber and a camera for recording aerosol size streams exiting the chamber. The housing includes an inlet for introducing a flow medium into the chamber in a flow direction, an aerosol injection port adjacent the inlet for introducing a charged aerosol sample into the chamber, a separation section for applying an electric field to the aerosol sample across the flow direction and an outlet opposite the inlet. In the separation section, the aerosol sample becomes entrained in the flow medium and the aerosol particles within the aerosol sample are separated by size into a plurality of aerosol flow streams under the influence of the electric field. The camera is disposed adjacent the housing outlet for optically detecting a relative position of at least one aerosol flow stream exiting the outlet and for optically detecting the number of aerosol particles within the at least one aerosol flow stream.
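
    The size separation described here rests on electrical mobility: in the applied field, smaller charged particles drift faster across the flow and so exit at different positions. A minimal sketch of the standard mobility relation (with an approximate Cunningham slip correction and illustrative air properties; this is textbook aerosol physics, not the patented device's internals):

```python
import math

E_CHARGE = 1.602e-19   # elementary charge, C
MU_AIR = 1.81e-5       # dynamic viscosity of air at ~20 C, Pa*s
MFP = 68e-9            # mean free path of air molecules, m (approximate)

def cunningham(d):
    """Cunningham slip correction for a particle of diameter d (m),
    using a common empirical fit."""
    kn = 2 * MFP / d
    return 1 + kn * (1.257 + 0.4 * math.exp(-1.1 / kn))

def electrical_mobility(d, n_charges=1):
    """Electrical mobility Z = n e Cc / (3 pi mu d), in m^2 / (V s)."""
    return n_charges * E_CHARGE * cunningham(d) / (3 * math.pi * MU_AIR * d)

# Smaller particles have higher mobility, so they are deflected more
# strongly by the field and land in different flow streams.
for d in (50e-9, 100e-9, 500e-9):
    print(d, electrical_mobility(d))
```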

  3. A Bayesian ensemble of sensitivity measures for severe accident modeling

    Energy Technology Data Exchange (ETDEWEB)

    Hoseyni, Seyed Mohsen [Department of Basic Sciences, East Tehran Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Di Maio, Francesco, E-mail: francesco.dimaio@polimi.it [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Vagnoli, Matteo [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Zio, Enrico [Energy Department, Politecnico di Milano, Via La Masa 34, 20156 Milano (Italy); Chair on System Science and Energetic Challenge, Fondation EDF – Electricite de France Ecole Centrale, Paris, and Supelec, Paris (France); Pourgol-Mohammad, Mohammad [Department of Mechanical Engineering, Sahand University of Technology, Tabriz (Iran, Islamic Republic of)

    2015-12-15

    Highlights: • We propose a sensitivity analysis (SA) method based on a Bayesian updating scheme. • The Bayesian updating scheme adjusts an ensemble of sensitivity measures. • Bootstrap replicates of a severe accident code output are fed to the Bayesian scheme. • The MELCOR code simulates the fission products release of the LOFT LP-FP-2 experiment. • Results are compared with those of traditional SA methods. - Abstract: In this work, a sensitivity analysis framework is presented to identify the relevant input variables of a severe accident code, based on an incremental Bayesian ensemble updating method. The proposed methodology entails: (i) the propagation of the uncertainty in the input variables through the severe accident code; (ii) the collection of bootstrap replicates of the input and output of a limited number of simulations for building a set of finite mixture models (FMMs) for approximating the probability density function (pdf) of the severe accident code output of the replicates; (iii) for each FMM, the calculation of an ensemble of sensitivity measures (i.e., input saliency, Hellinger distance and Kullback–Leibler divergence) and its updating, when a new piece of evidence arrives, by a Bayesian scheme based on the Bradley–Terry model for ranking the most relevant input model variables. An application is given with respect to a limited number of simulations of a MELCOR severe accident model describing the fission products release in the LP-FP-2 experiment of the loss of fluid test (LOFT) facility, which is a scaled-down facility of a pressurized water reactor (PWR).
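
    One member of such an ensemble, the Hellinger distance between output distributions, has a closed form under a Gaussian approximation. The sketch below substitutes a toy linear function for the severe accident code (the function, inputs, and sample sizes are all illustrative) and ranks two inputs by how far perturbing each one moves the output pdf:

```python
import math
import random
import statistics

def hellinger_gauss(m1, s1, m2, s2):
    """Closed-form Hellinger distance between N(m1, s1^2) and N(m2, s2^2)."""
    bc = (math.sqrt(2 * s1 * s2 / (s1**2 + s2**2))
          * math.exp(-((m1 - m2) ** 2) / (4 * (s1**2 + s2**2))))
    return math.sqrt(1 - bc)

def model(x1, x2):
    """Toy stand-in for the severe accident code: x1 dominates the output."""
    return 3.0 * x1 + 0.1 * x2

random.seed(1)
base = [model(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(2000)]

sens = {}
for name, idx in (("x1", 0), ("x2", 1)):
    shifted = []
    for _ in range(2000):
        u = [random.gauss(0, 1), random.gauss(0, 1)]
        u[idx] += 1.0                      # perturb one input at a time
        shifted.append(model(*u))
    sens[name] = hellinger_gauss(statistics.mean(base), statistics.stdev(base),
                                 statistics.mean(shifted), statistics.stdev(shifted))
print(sens)
```

The dominant input produces a much larger distance between the baseline and perturbed output distributions, which is the ordering the ensemble of measures is meant to recover and refine as new evidence arrives.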

  4. Dynamic Batch Bayesian Optimization

    CERN Document Server

    Azimi, Javad; Fern, Xiaoli

    2011-01-01

    Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This method can be time-inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t, where the batch size p_t is dynamically determined in each step. Our algorithm is based on the observation that the sequence of experiments selected by the sequential policy can sometimes be almost independent from each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...

  5. On the implications of aerosol liquid water and phase separation for organic aerosol mass

    Science.gov (United States)

    Pye, Havala O. T.; Murphy, Benjamin N.; Xu, Lu; Ng, Nga L.; Carlton, Annmarie G.; Guo, Hongyu; Weber, Rodney; Vasilakos, Petros; Wyat Appel, K.; Hapsari Budisulistiorini, Sri; Surratt, Jason D.; Nenes, Athanasios; Hu, Weiwei; Jimenez, Jose L.; Isaacman-VanWertz, Gabriel; Misztal, Pawel K.; Goldstein, Allen H.

    2017-01-01

    Organic compounds and liquid water are major aerosol constituents in the southeast United States (SE US). Water associated with inorganic constituents (inorganic water) can contribute to the partitioning medium for organic aerosol when relative humidities or organic matter to organic carbon (OM / OC) ratios are high such that separation relative humidities (SRH) are below the ambient relative humidity (RH). As OM / OC ratios in the SE US are often between 1.8 and 2.2, organic aerosol experiences both mixing with inorganic water and separation from it. Regional chemical transport model simulations including inorganic water (but excluding water uptake by organic compounds) in the partitioning medium for secondary organic aerosol (SOA) when RH > SRH led to increased SOA concentrations, particularly at night. Water uptake to the organic phase resulted in even greater SOA concentrations as a result of a positive feedback in which water uptake increased SOA, which further increased aerosol water and organic aerosol. Aerosol properties, such as the OM / OC and hygroscopicity parameter (κorg), were captured well by the model compared with measurements during the Southern Oxidant and Aerosol Study (SOAS) 2013. Organic nitrates from monoterpene oxidation were predicted to be the least water-soluble semivolatile species in the model, but most biogenically derived semivolatile species in the Community Multiscale Air Quality (CMAQ) model were highly water soluble and expected to contribute to water-soluble organic carbon (WSOC). Organic aerosol and SOA precursors were abundant at night, but additional improvements in daytime organic aerosol are needed to close the model-measurement gap. When taking into account deviations from ideality, including both inorganic (when RH > SRH) and organic water in the organic partitioning medium reduced the mean bias in SOA for routine monitoring networks and improved model performance compared to observations from SOAS. 

  6. Bayesian seismic AVO inversion

    Energy Technology Data Exchange (ETDEWEB)

    Buland, Arild

    2002-07-01

    A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
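
    The closed-form Gaussian posterior that underlies such a linearized inversion can be written down directly for a scalar toy problem (not the Zoeppritz-based model itself; the kernel and data below are illustrative). As in the synthetic tests described above, the parameter is recovered almost exactly as the noise variance approaches zero:

```python
def gaussian_posterior(g, d, prior_mean, prior_var, noise_var):
    """Posterior for scalar m in d_i = g_i * m + e_i, e_i ~ N(0, noise_var),
    with prior m ~ N(prior_mean, prior_var). Returns (mean, variance)."""
    prec = 1.0 / prior_var + sum(gi * gi for gi in g) / noise_var
    mean = (prior_mean / prior_var
            + sum(gi * di for gi, di in zip(g, d)) / noise_var) / prec
    return mean, 1.0 / prec

g = [1.0, 2.0, 3.0]
d = [2.0, 4.0, 6.0]          # noise-free data generated with m = 2
m_lo, v_lo = gaussian_posterior(g, d, 0.0, 1.0, 1e-6)   # low noise: data dominates
m_hi, v_hi = gaussian_posterior(g, d, 0.0, 1.0, 10.0)   # high noise: prior pulls back
print(m_lo, m_hi)
```

Because the posterior mean and covariance are explicit, exact prediction intervals follow immediately; no iterative optimization or sampling is needed for the linear Gaussian case.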

  7. Bayesian microsaccade detection

    Science.gov (United States)

    Mihali, Andra; van Opheusden, Bas; Ma, Wei Ji

    2017-01-01

    Microsaccades are high-velocity fixational eye movements, with special roles in perception and cognition. The default microsaccade detection method is to determine when the smoothed eye velocity exceeds a threshold. We have developed a new method, Bayesian microsaccade detection (BMD), which performs inference based on a simple statistical model of eye positions. In this model, a hidden state variable changes between drift and microsaccade states at random times. The eye position is a biased random walk with different velocity distributions for each state. BMD generates samples from the posterior probability distribution over the eye state time series given the eye position time series. Applied to simulated data, BMD recovers the “true” microsaccades with fewer errors than alternative algorithms, especially at high noise. Applied to EyeLink eye tracker data, BMD detects almost all the microsaccades detected by the default method, but also apparent microsaccades embedded in high noise—although these can also be interpreted as false positives. Next we apply the algorithms to data collected with a Dual Purkinje Image eye tracker, whose higher precision justifies defining the inferred microsaccades as ground truth. When we add artificial measurement noise, the inferences of all algorithms degrade; however, at noise levels comparable to EyeLink data, BMD recovers the “true” microsaccades with 54% fewer errors than the default algorithm. Though unsuitable for online detection, BMD has other advantages: It returns probabilities rather than binary judgments, and it can be straightforwardly adapted as the generative model is refined. We make our algorithm available as a software package. PMID:28114483
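
    A much-reduced version of the underlying idea can be sketched with forward filtering in a two-state hidden Markov model on eye velocity, in place of BMD's full posterior sampling over state time series. All parameters below (switch probability, velocity spreads) are illustrative, not the paper's fitted values:

```python
import math

def gauss_logpdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def filter_states(velocities, p_switch=0.05, sigma_drift=0.5, sigma_ms=5.0):
    """Forward-filtered P(state_t = microsaccade | v_1..t) for a 2-state HMM.
    State 0 = drift (small velocity spread), state 1 = microsaccade (large spread)."""
    trans = [[1 - p_switch, p_switch], [p_switch, 1 - p_switch]]
    prob = [0.99, 0.01]                      # prior: the eye is almost always drifting
    out = []
    for v in velocities:
        like = [math.exp(gauss_logpdf(v, 0.0, sigma_drift)),
                math.exp(gauss_logpdf(v, 0.0, sigma_ms))]
        pred = [sum(prob[s] * trans[s][t] for s in (0, 1)) for t in (0, 1)]
        post = [pred[t] * like[t] for t in (0, 1)]
        z = post[0] + post[1]
        prob = [p / z for p in post]
        out.append(prob[1])
    return out

# A burst of high velocity in the middle of low-velocity drift.
v = [0.2, -0.1, 0.3, 6.0, 7.5, 5.0, 0.1, -0.2]
p_ms = filter_states(v)
print([round(p, 2) for p in p_ms])
```

Unlike a velocity threshold, the output is a probability at every time point, which is the property that lets BMD grade ambiguous, noise-embedded events rather than forcing a binary call.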

  8. Maximum margin Bayesian network classifiers.

    Science.gov (United States)

    Pernkopf, Franz; Wohlmayr, Michael; Tschiatschek, Sebastian

    2012-03-01

    We present a maximum margin parameter learning algorithm for Bayesian network classifiers using a conjugate gradient (CG) method for optimization. In contrast to previous approaches, we maintain the normalization constraints on the parameters of the Bayesian network during optimization, i.e., the probabilistic interpretation of the model is not lost. This enables us to handle missing features in discriminatively optimized Bayesian networks. In experiments, we compare the classification performance of maximum margin parameter learning to conditional likelihood and maximum likelihood learning approaches. Discriminative parameter learning significantly outperforms generative maximum likelihood estimation for naive Bayes and tree augmented naive Bayes structures on all considered data sets. Furthermore, maximizing the margin dominates the conditional likelihood approach in terms of classification performance in most cases. We provide results for a recently proposed maximum margin optimization approach based on convex relaxation. While the classification results are highly similar, our CG-based optimization is computationally up to orders of magnitude faster. Margin-optimized Bayesian network classifiers achieve classification performance comparable to support vector machines (SVMs) using fewer parameters. Moreover, we show that unanticipated missing feature values during classification can be easily processed by discriminatively optimized Bayesian network classifiers, a case where discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.

  9. Bayesian modeling using WinBUGS

    CERN Document Server

    Ntzoufras, Ioannis

    2009-01-01

    A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...

  10. Bayesian Methods and Universal Darwinism

    CERN Document Server

    Campbell, John

    2010-01-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...

  11. Attention in a bayesian framework

    DEFF Research Database (Denmark)

    Whiteley, Louise Emma; Sahani, Maneesh

    2012-01-01

    The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models...... of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments......, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...

  12. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    André C. R. Martins

    2006-11-01

    Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated to them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  13. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    Science.gov (United States)

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies, where there is a considerable amount of data from historical control groups which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study, or to using a predictive distribution to replace a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.
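
    The meta-analytic predictive approach described above can be sketched with a toy calculation. All numbers below are hypothetical (`historical_means`, `sigma_within` and `n_per_group` are illustrative assumptions, not data from the paper), and a full analysis would estimate the between-study variance hierarchically rather than by the method-of-moments shortcut used here:

```python
from statistics import mean, variance

# Hypothetical means of five historical control groups.
historical_means = [10.2, 9.1, 11.8, 10.5, 9.6]
sigma_within = 2.0   # assumed within-study SD per animal
n_per_group = 8      # animals per historical control group
K = len(historical_means)

# Split the observed spread of study means into between-study variance
# tau2 and the sampling noise of each study mean.
se2 = sigma_within ** 2 / n_per_group
tau2 = max(variance(historical_means) - se2, 0.0)

# Predictive variance for a new study's control-group mean, and the
# prior's approximate worth expressed as an effective number of animals.
pred_mean = mean(historical_means)
pred_var = tau2 + (tau2 + se2) / K
ess = sigma_within ** 2 / pred_var
```

    Here the historical data are worth roughly five animals' worth of information about the new control mean, which is the kind of "effective number of animals" summary the authors report to scientists.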

  14. Recent Improvements to CALIOP Level 3 Aerosol Profile Product for Global 3-D Aerosol Extinction Characterization

    Science.gov (United States)

    Tackett, J. L.; Getzewich, B. J.; Winker, D. M.; Vaughan, M. A.

    2015-12-01

    With nine years of retrievals, the CALIOP level 3 aerosol profile product provides an unprecedented synopsis of aerosol extinction in three dimensions and the potential to quantify changes in aerosol distributions over time. The CALIOP level 3 aerosol profile product, initially released as a beta product in 2011, reports monthly averages of quality-screened aerosol extinction profiles on a uniform latitude/longitude grid for different cloud-cover scenarios, called "sky conditions". This presentation demonstrates improvements to the second version of the product which will be released in September 2015. The largest improvements are the new sky condition definitions which parse the atmosphere into "cloud-free" views accessible to passive remote sensors, "all-sky" views accessible to active remote sensors and "cloudy-sky" views for opaque and transparent clouds which were previously inaccessible to passive remote sensors. Taken together, the new sky conditions comprehensively summarize CALIOP aerosol extinction profiles for a broad range of scientific queries. In addition to dust-only extinction profiles, the new version will include polluted-dust and smoke-only extinction averages. A new method is adopted for averaging dust-only extinction profiles to reduce high biases which exist in the beta version of the level 3 aerosol profile product. This presentation justifies the new averaging methodology and demonstrates vertical profiles of dust and smoke extinction over Africa during the biomass burning season. Another crucial advancement demonstrated in this presentation is a new approach for computing monthly mean aerosol optical depth which removes low biases reported in the beta version - a scenario unique to lidar datasets.

  15. Bayesian Missile System Reliability from Point Estimates

    Science.gov (United States)

    2014-10-28

    This report uses the Maximum Entropy Principle (MEP) to convert point estimates to probability distributions to be used as priors for Bayesian reliability analysis of missile data, and illustrates this approach by applying the priors to a Bayesian reliability model of a missile system. Subject terms: priors, Bayesian, missile.

  16. Perception, illusions and Bayesian inference.

    Science.gov (United States)

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
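
    The Gaussian case of such perceptual inference is simple enough to state in a few lines. The sketch below (function name and numbers are illustrative, not from the article) combines a prior expectation with a noisy sensory measurement by precision weighting:

```python
def perceive(mu_prior, sd_prior, x_obs, sd_obs):
    # Bayesian percept for a Gaussian prior N(mu_prior, sd_prior^2) and a
    # Gaussian likelihood N(x_obs, sd_obs^2): the posterior mean is a
    # precision-weighted average of expectation and sensory evidence.
    w = sd_prior ** 2 / (sd_prior ** 2 + sd_obs ** 2)   # weight on the data
    mu_post = (1 - w) * mu_prior + w * x_obs
    var_post = (sd_prior ** 2 * sd_obs ** 2) / (sd_prior ** 2 + sd_obs ** 2)
    return mu_post, var_post
```

    With a weak prior the percept tracks the stimulus (veridical perception); with a strong prior and a noisy signal the percept is pulled toward the expectation, which is one way the same inferential mechanism can generate an illusion.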

  17. Bayesian test and Kuhn's paradigm

    Institute of Scientific and Technical Information of China (English)

    Chen Xiaoping

    2006-01-01

    Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific test in his pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.

  18. 3D Bayesian contextual classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    2000-01-01

    We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.

  19. A Bayesian Nonparametric Approach to Test Equating

    Science.gov (United States)

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  20. Bayesian Model Averaging for Propensity Score Analysis

    Science.gov (United States)

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  1. Bayesian networks and food security - An introduction

    NARCIS (Netherlands)

    Stein, A.

    2004-01-01

    This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup

  2. Plug & Play object oriented Bayesian networks

    DEFF Research Database (Denmark)

    Bangsø, Olav; Flores, J.; Jensen, Finn Verner

    2003-01-01

    Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have b...

  3. Bayesian stable isotope mixing models

    Science.gov (United States)

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...

  4. Naive Bayesian for Email Filtering

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The paper presents a method of email filtering based on Naive Bayesian theory that can effectively filter junk mail and illegal mail. Furthermore, the keys of implementation are discussed in detail. The filtering model is obtained from a training set of email. The filtering can be done without the user's specification of filtering rules.
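
    The abstract gives no implementation details, but the core of such a filter fits in a few lines. The sketch below is a minimal multinomial naive Bayes classifier with Laplace smoothing (the training data and function names are invented for illustration):

```python
from collections import Counter
import math

def train(spam_docs, ham_docs):
    # Count word occurrences per class; the model is just these counts.
    spam, ham = Counter(), Counter()
    for d in spam_docs:
        spam.update(d.lower().split())
    for d in ham_docs:
        ham.update(d.lower().split())
    vocab = set(spam) | set(ham)
    return spam, ham, vocab, len(spam_docs), len(ham_docs)

def is_spam(msg, model):
    spam, ham, vocab, n_spam, n_ham = model
    v = len(vocab)
    # Log prior odds from class frequencies, then add Laplace-smoothed
    # log likelihoods word by word.
    log_s = math.log(n_spam / (n_spam + n_ham))
    log_h = math.log(n_ham / (n_spam + n_ham))
    t_spam, t_ham = sum(spam.values()), sum(ham.values())
    for w in msg.lower().split():
        log_s += math.log((spam[w] + 1) / (t_spam + v))
        log_h += math.log((ham[w] + 1) / (t_ham + v))
    return log_s > log_h
```

    Trained on a handful of hand-labeled messages, the model classifies new mail without any user-written rules, which is the point the abstract makes.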

  5. Bayesian analysis of binary sequences

    Science.gov (United States)

    Torney, David C.

    2005-03-01

    This manuscript details Bayesian methodology for "learning by example", with binary n-sequences encoding the objects under consideration. Priors prove influential; conformable priors are described. Laplace approximation of Bayes integrals yields posterior likelihoods for all n-sequences. This involves the optimization of a definite function over a convex domain--efficiently effectuated by the sequential application of the quadratic program.

  6. Bayesian NL interpretation and learning

    NARCIS (Netherlands)

    Zeevat, H.

    2011-01-01

    Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by very high degree of ambiguity and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language

  7. ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY

    Directory of Open Access Journals (Sweden)

    Felipe Schneider Costa

    2013-01-01

    Full Text Available The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable-selection process based on the chi-squared test to verify the existence of dependence between variables in the data model, in order to identify the reasons which prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights, calculated with the mutual information function, are used in the computation of the a posteriori probabilities. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in error rate was observed: the naïve Bayesian network's error rate dropped from twenty-five percent to five percent relative to the initial classification results, while in the hierarchical network the error rate not only dropped by fifteen percent but reached zero in the final result.
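
    The chi-squared screen for dependence between variables mentioned above can be illustrated with a small contingency-table computation (a sketch; the study's actual pipeline, weights and thresholds are not given in the abstract):

```python
def chi_squared(table):
    # Pearson chi-squared statistic for an r x c table of observed counts;
    # compare against a chi-squared critical value with (r-1)(c-1) degrees
    # of freedom to decide whether the two variables are dependent.
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            exp = rows[i] * cols[j] / total
            stat += (obs - exp) ** 2 / exp
    return stat
```

    A statistic near zero suggests independence (the naïve assumption holds for that pair of variables); large values flag dependent pairs as candidates for removal or special handling.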

  8. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...

  9. Bayesian Classification of Image Structures

    DEFF Research Database (Denmark)

    Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert

    2009-01-01

    In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...

  10. 3-D contextual Bayesian classifiers

    DEFF Research Database (Denmark)

    Larsen, Rasmus

    In this paper we will consider extensions of a series of Bayesian 2-D contextual classification pocedures proposed by Owen (1984) Hjort & Mohn (1984) and Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...

  11. Bayesian Networks and Influence Diagrams

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders Læsø

    Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...

  12. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed...

  13. Bayesian image restoration, using configurations

    DEFF Research Database (Denmark)

    Thorarinsdottir, Thordis Linda

    2006-01-01

    configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...

  14. Bayesian Evidence and Model Selection

    CERN Document Server

    Knuth, Kevin H; Malakar, Nabin K; Mubeen, Asim M; Placek, Ben

    2014-01-01

    In this paper we review the concept of the Bayesian evidence and its application to model selection. The theory is presented along with a discussion of analytic, approximate and numerical techniques. Application to several practical examples within the context of signal processing are discussed.

  15. Differentiated Bayesian Conjoint Choice Designs

    NARCIS (Netherlands)

    Z. Sándor (Zsolt); M. Wedel (Michel)

    2003-01-01

    textabstractPrevious conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about

  16. Bayesian Alternation During Tactile Augmentation

    Directory of Open Access Journals (Sweden)

    Caspar Mathias Goeke

    2016-10-01

    Full Text Available A large number of studies suggest that the integration of multisensory signals by humans is well described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. For the purpose of this study we built a tactile augmentation device. We then compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole-body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). Then we compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (χ²_red = 1.67) than the Bayesian integration model (χ²_red = 4.34). A non-Bayesian winner-takes-all model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (χ²_red = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (χ²_red = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in ...
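
    The Bayesian integration benchmark that the competing models are compared against has a closed form for Gaussian noise: the predicted bimodal JND follows from the two unimodal JNDs. This is the standard cue-combination result, not code from the study:

```python
import math

def integration_jnd(jnd_native, jnd_augmented):
    # Optimal Bayesian integration combines cues by inverse-variance
    # weighting, so the predicted bimodal variance is the harmonic
    # combination of the unimodal variances.
    v_n = jnd_native ** 2
    v_a = jnd_augmented ** 2
    return math.sqrt(v_n * v_a / (v_n + v_a))
```

    Integration always predicts a bimodal JND no worse than the best single cue; alternation and winner-takes-all models do not, which is what makes the three hypotheses distinguishable in the data.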

  17. Bayesian analysis of rare events

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
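
    The rejection-sampling reinterpretation at the heart of BUS can be sketched in a few lines (a didactic toy, not the FORM/IS/SuS machinery the paper actually couples it with): draw from the prior and accept with probability proportional to the likelihood.

```python
import math
import random

def bus_posterior_samples(sample_prior, likelihood, c, n):
    # Rejection view of Bayesian updating: theta ~ prior is accepted with
    # probability likelihood(theta)/c, where c >= max likelihood.
    # Accepted samples are distributed according to the posterior.
    accepted = []
    for _ in range(n):
        theta = sample_prior()
        if random.random() < likelihood(theta) / c:
            accepted.append(theta)
    return accepted

# Toy check: a N(0, 1) prior with one observation x = 1 under unit
# Gaussian noise gives a N(0.5, 0.5) posterior.
random.seed(1)
draws = bus_posterior_samples(lambda: random.gauss(0.0, 1.0),
                              lambda t: math.exp(-0.5 * (1.0 - t) ** 2),
                              1.0, 20000)
```

    The acceptance event plays the role of a "failure" event in structural reliability terms, which is why rare-event estimators such as subset simulation can be reused to perform the updating efficiently when acceptance is rare.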

  18. Aerosol MTF revisited

    Science.gov (United States)

    Kopeika, Norman S.; Zilberman, Arkadi; Yitzhaky, Yitzhak

    2014-05-01

    Different views of the significance of aerosol MTF have been reported. For example, one recent paper [OE, 52(4)/2013, pp. 046201] claims that the aerosol MTF "contrast reduction is approximately independent of spatial frequency, and image blur is practically negligible". On the other hand, another recent paper [JOSA A, 11/2013, pp. 2244-2252] claims that aerosols "can have a non-negligible effect on the atmospheric point spread function". We present clear experimental evidence of common significant aerosol blur and evidence that aerosol contrast reduction can be extremely significant. In the IR, it is more appropriate to refer to such phenomena as aerosol-absorption MTF. The role of imaging system instrumentation on such MTF is addressed too.

  19. LMFBR source term experiments in the Fuel Aerosol Simulant Test (FAST) facility

    Energy Technology Data Exchange (ETDEWEB)

    Petrykowski, J.C.; Longest, A.W.

    1985-01-01

    The transport of uranium dioxide (UO2) aerosol through liquid sodium was studied in a series of ten experiments in the Fuel Aerosol Simulant Test (FAST) facility at Oak Ridge National Laboratory (ORNL). The experiments were designed to provide a mechanistic basis for evaluating the radiological source term associated with a postulated, energetic core disruptive accident (CDA) in a liquid metal fast breeder reactor (LMFBR). Aerosol was generated by capacitor discharge vaporization of UO2 pellets which were submerged in a sodium pool under an argon cover gas. Measurements of the pool and cover gas pressures were used to study the transport of aerosol contained by vapor bubbles within the pool. Samples of cover gas were filtered to determine the quantity of aerosol released from the pool. The depth at which the aerosol was generated was found to be the most critical parameter affecting release. The largest release was observed in the baseline experiment where the sample was vaporized above the sodium pool. In the nine "undersodium" experiments aerosol was generated beneath the surface of the pool at depths varying from 30 to 1060 mm. The mass of aerosol released from the pool was found to be a very small fraction of the original specimen. It appears that the bulk of aerosol was contained by bubbles which collapsed within the pool. 18 refs., 11 figs., 4 tabs.

  20. STATISTICAL BAYESIAN ANALYSIS OF EXPERIMENTAL DATA.

    Directory of Open Access Journals (Sweden)

    AHLAM LABDAOUI

    2012-12-01

    Full Text Available The Bayesian researcher should know the basic ideas underlying Bayesian methodology and the computational tools used in modern Bayesian econometrics.  Some of the most important methods of posterior simulation are Monte Carlo integration, importance sampling, Gibbs sampling and the Metropolis- Hastings algorithm. The Bayesian should also be able to put the theory and computational tools together in the context of substantive empirical problems. We focus primarily on recent developments in Bayesian computation. Then we focus on particular models. Inevitably, we combine theory and computation in the context of particular models. Although we have tried to be reasonably complete in terms of covering the basic ideas of Bayesian theory and the computational tools most commonly used by the Bayesian, there is no way we can cover all the classes of models used in econometrics. We propose to the user of analysis of variance and linear regression model.
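
    Of the posterior simulators listed above, the Metropolis-Hastings algorithm is the most generic; a minimal random-walk variant (illustrative only, with a symmetric Gaussian proposal) is:

```python
import math
import random

def metropolis(log_post, x0, n, step=1.0):
    # Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    # probability min(1, post(x') / post(x)); otherwise keep the current
    # state. The recorded chain converges to the target distribution.
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n):
        xp = x + random.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(random.random()) < lpp - lp:
            x, lp = xp, lpp
        chain.append(x)
    return chain
```

    Run against `log_post = lambda x: -0.5 * x * x`, the chain's histogram converges to the standard normal. Gibbs sampling replaces the proposal with draws from full conditionals, while importance sampling and Monte Carlo integration avoid the chain altogether.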

  1. Bayesian methods for measures of agreement

    CERN Document Server

    Broemeling, Lyle D

    2009-01-01

    Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
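
    Among the agreement measures the book covers, the Kappa coefficient is the easiest to state. For orientation, a plain (non-Bayesian) point estimate for two raters is sketched below; the book itself works with posterior distributions of such measures rather than point estimates:

```python
def cohens_kappa(rater_a, rater_b):
    # Kappa = (observed agreement - chance agreement) / (1 - chance agreement),
    # where chance agreement comes from each rater's marginal frequencies.
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_chance = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (p_obs - p_chance) / (1 - p_chance)
```

    Kappa is 1 for perfect agreement and 0 when the raters agree no more often than chance; a Bayesian treatment would place a prior on the cell probabilities of the agreement table and report the posterior of this quantity.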

  2. Comparative Efficacy of Salbutamol Aerosol Inhalation and Theophylline Controlled-Release Tablets in the Treatment of Acute Asthma in the Elderly

    Institute of Scientific and Technical Information of China (English)

    宋丽军

    2016-01-01

    Objective To compare the curative effects of Salbutamol Aerosol inhalation alone and in combination with theophylline controlled-release tablets in the treatment of acute asthma in the elderly. Methods 78 elderly patients with acute asthma treated from January 2013 to January 2016 were conveniently selected as subjects and randomly divided into an inhalation group and a combination group, 39 cases each. The inhalation group received salbutamol sulfate aerosol inhalation therapy, while the combination group received salbutamol sulfate aerosol combined with theophylline controlled-release tablets. Pulmonary function, arterial blood gas values, adverse reactions and therapeutic effect were observed in the two groups after treatment. Results FEV1/FVC, FEV1%, PaCO2 and PaO2 in the combination group were better than in the inhalation group; the total effective rate of the combination group was 94.87%, significantly higher than the 82.05% of the inhalation group (P < 0.05). Conclusion Salbutamol sulfate aerosol combined with theophylline controlled-release tablets can effectively improve arterial blood gas values and lung function in elderly patients with acute asthma, relieve dyspnea, improve quality of life and enhance the clinical curative effect, and is worthy of active promotion by respiratory physicians.

  3. Aerosols Science and Technology

    CERN Document Server

    Agranovski, Igor

    2011-01-01

    This self-contained handbook and ready reference examines aerosol science and technology in depth, providing a detailed insight into this progressive field. As such, it covers fundamental concepts, experimental methods, and a wide variety of applications, ranging from aerosol filtration to biological aerosols, and from the synthesis of carbon nanotubes to aerosol reactors.Written by a host of internationally renowned experts in the field, this is an essential resource for chemists and engineers in the chemical and materials disciplines across multiple industries, as well as ideal supplementary

  4. Airborne studies of aerosol emissions from savanna fires in southern Africa: 2. Aerosol chemical composition

    Science.gov (United States)

    Andreae, M. O.; Andreae, T. W.; Annegarn, H.; Beer, J.; Cachier, H.; Le Canut, P.; Elbert, W.; Maenhaut, W.; Salma, I.; Wienhold, F. G.; Zenker, T.

    1998-12-01

    We investigated smoke emissions from fires in savanna, forest, and agricultural ecosystems by airborne sampling of plumes close to prescribed burns and incidental fires in southern Africa. Aerosol samples were collected on glass fiber filters and on stacked filter units, consisting of a Nuclepore prefilter for particles larger than ˜1-2 μm and a Teflon second filter stage for the submicron fraction. The samples were analyzed for soluble ionic components, organic carbon, and black carbon. Onboard the research aircraft, particle number and volume distributions as a function of size were determined with a laser-optical particle counter and the black carbon content of the aerosol with an aethalometer. We determined the emission ratios (relative to CO2 and CO) and emission factors (relative to the amount of biomass burnt) for the various aerosol constituents. The smoke aerosols were rich in organic and black carbon, the latter representing 10-30% of the aerosol mass. K+ and NH4+ were the dominant cationic species in the smoke of most fires, while Cl- and SO42- were the most important anions. The aerosols were unusually rich in Cl-, probably due to the high Cl content of the semiarid vegetation. Comparison of the element budget of the fuel before and after the fires shows that the fraction of the elements released during combustion is highly variable between elements. In the case of the halogen elements, almost the entire amount released during the fire is present in the aerosol phase, while in the case of C, N, and S, only a small proportion ends up as particulate matter. This suggests that the latter elements are present predominantly as gaseous species in the fresh fire plumes studied here.
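
    The emission ratios and emission factors referred to above follow from excess mixing ratios in the plume. The sketch below uses the standard carbon-balance form with invented numbers; the fuel carbon fraction `fc = 0.45` and the assumption that essentially all burnt carbon is released as CO2 are illustrative simplifications, not values from the study:

```python
def emission_ratio(x_plume, x_background, co2_plume, co2_background):
    # Excess mixing ratio of species X over excess CO2 in the plume,
    # relative to out-of-plume background air.
    return (x_plume - x_background) / (co2_plume - co2_background)

def emission_factor(er_to_co2, molar_mass_x, fc=0.45):
    # Carbon-balance emission factor in g of X per kg of dry fuel burnt,
    # assuming a fuel carbon mass fraction fc emitted (almost) entirely
    # as CO2; 12 g/mol is the molar mass of carbon.
    return er_to_co2 * (molar_mass_x / 12.0) * fc * 1000.0
```

    For example, a molar emission ratio of 0.04 for CO (molar mass 28 g/mol) corresponds to an emission factor of about 42 g CO per kg of dry fuel under these assumptions.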

  5. Bayesian versus 'plain-vanilla Bayesian' multitarget statistics

    Science.gov (United States)

    Mahler, Ronald P. S.

    2004-08-01

    Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. I then demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian, that it denigrates FISST concepts while unwittingly assuming them, and that it has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."

  6. Climate implications of carbonaceous aerosols: An aerosol microphysical study using the GISS/MATRIX climate model

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Susanne E.; Menon, Surabi; Koch, Dorothy; Bond, Tami; Tsigaridis, Kostas

    2010-04-09

    Recently, attention has been drawn towards black carbon aerosols as a likely short-term climate warming mitigation candidate. However, the global and regional impacts of the direct, cloud-indirect and semi-direct forcing effects are highly uncertain, due to the complex nature of aerosol evolution and its climate interactions. Black carbon is directly released as particles into the atmosphere, but then interacts with other gases and particles through condensation and coagulation processes, leading to further aerosol growth, aging and internal mixing. A detailed aerosol microphysical scheme, MATRIX, embedded within the global GISS modelE includes the above processes that determine the lifecycle and climate impact of aerosols. This study presents a quantitative assessment of the impact of microphysical processes involving black carbon, such as emission size distributions and optical properties, on aerosol cloud activation and radiative forcing. Our best estimate for the net direct and indirect aerosol radiative forcing change is -0.56 W/m² between 1750 and 2000. However, the direct and indirect aerosol effects are very sensitive to the black and organic carbon size distribution and the consequent mixing state. The net radiative forcing change can vary between -0.32 and -0.75 W/m² depending on these carbonaceous particle properties. Assuming that sulfates, nitrates and secondary organics form a coating shell around a black carbon core, rather than forming uniformly mixed particles, changes the overall net radiative forcing from a negative to a positive number. Black carbon mitigation scenarios generally showed a benefit when predominantly black carbon sources, such as diesel emissions, are reduced; reducing combined organic and black carbon sources, such as bio-fuels, does not lead to reduced warming.

  7. Bayesian priors for transiting planets

    CERN Document Server

    Kipping, David M

    2016-01-01

    As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...

  8. Bayesian approach to rough set

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes an approach to training rough set models within a Bayesian framework using the Markov chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov chain Monte Carlo sampling is conducted by sampling in the rough set granule space, with the Metropolis algorithm used as the acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
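The MCMC scheme described in this record (Metropolis acceptance over a granule space, with a parsimony prior favouring fewer rules) can be illustrated generically. The posterior below is a toy stand-in over the number of rules, not the paper's HIV model:

```python
import math
import random

def metropolis(log_post, init, propose, n_steps=1000, seed=0):
    """Generic Metropolis sampler: accept a proposal with probability
    min(1, exp(log_post(candidate) - log_post(current)))."""
    rng = random.Random(seed)
    state = init
    chain = [state]
    for _ in range(n_steps):
        cand = propose(state, rng)
        # log(u) < log-posterior difference <=> u < posterior ratio
        if math.log(rng.random() + 1e-300) < log_post(cand) - log_post(state):
            state = cand
        chain.append(state)
    return chain

# Toy posterior over the number of rules k: a likelihood peaking at
# k = 5 plus a parsimony prior penalising extra rules (both invented).
def log_post(k):
    if k < 1 or k > 20:
        return float("-inf")
    return -0.5 * (k - 5) ** 2 - 0.2 * k

chain = metropolis(log_post, init=10,
                   propose=lambda k, rng: k + rng.choice([-1, 1]),
                   n_steps=5000)
```

After burn-in the chain concentrates on small rule counts near the likelihood peak, which is the qualitative behaviour the paper's parsimony prior is designed to produce.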

  9. Deep Learning and Bayesian Methods

    Directory of Open Access Journals (Sweden)

    Prosper Harrison B.

    2017-01-01

    Full Text Available A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  10. Deep Learning and Bayesian Methods

    Science.gov (United States)

    Prosper, Harrison B.

    2017-03-01

    A revolution is underway in which deep neural networks are routinely used to solve difficult problems such as face recognition and natural language understanding. Particle physicists have taken notice and have started to deploy these methods, achieving results that suggest a potentially significant shift in how data might be analyzed in the not too distant future. We discuss a few recent developments in the application of deep neural networks and then indulge in speculation about how such methods might be used to automate certain aspects of data analysis in particle physics. Next, the connection to Bayesian methods is discussed and the paper ends with thoughts on a significant practical issue, namely, how, from a Bayesian perspective, one might optimize the construction of deep neural networks.

  11. Bayesian Source Separation and Localization

    CERN Document Server

    Knuth, K H

    1998-01-01

    The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...

  12. Bayesian Inference for Radio Observations

    CERN Document Server

    Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin

    2015-01-01

    (Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...

  13. Bayesian inference on proportional elections.

    Directory of Open Access Journals (Sweden)

    Gabriel Hideki Vatanabe Brunello

    Full Text Available Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee that the candidate with the highest percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology was developed to estimate the probability that a given party will have representation in the chamber of deputies. Inferences were made in a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.
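A minimal version of the described Monte Carlo approach: sample vote shares from a Dirichlet posterior implied by poll counts, push each draw through a seat-allocation rule, and count the fraction of draws in which the party gains representation. The D'Hondt highest-averages rule is used here as a simplified stand-in for the actual Brazilian seat-distribution law, and the poll counts are invented:

```python
import random

def dhondt(votes, n_seats):
    """Allocate seats by the D'Hondt highest-averages rule (used here
    as an illustrative stand-in for the Brazilian seat rules)."""
    seats = [0] * len(votes)
    for _ in range(n_seats):
        quotients = [v / (s + 1) for v, s in zip(votes, seats)]
        seats[quotients.index(max(quotients))] += 1
    return seats

def prob_representation(poll_counts, n_seats, party, n_sim=2000, seed=0):
    """Monte Carlo estimate of P(party wins >= 1 seat): draw vote
    shares from the Dirichlet posterior implied by the poll counts
    (uniform prior), allocate seats, and count the hits."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sim):
        # Dirichlet(poll + 1) draw via normalized Gamma variates
        g = [rng.gammavariate(c + 1, 1.0) for c in poll_counts]
        total = sum(g)
        shares = [x / total for x in g]
        if dhondt(shares, n_seats)[party] >= 1:
            hits += 1
    return hits / n_sim

# Invented poll: four parties, 1000 respondents, 10 seats at stake
p = prob_representation([500, 300, 150, 50], n_seats=10, party=3)
```

Swapping `dhondt` for the exact electoral-quotient rules would make the simulation match the Brazilian system; the posterior-sampling skeleton is unchanged.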

  14. The Effect of Water Injection on the Fission Product Aerosol Behavior in Fukushima Unit 1

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sung Il; Ha, Kwang Soon; Kim, Dong Ha; Kim, Tae Woon [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

    The factor most important to human health is the fission products released from the plant. Fission products are usually released in the form of aerosols and vapors. The amount of aerosol released from the plant is crucial, because it can be inhaled by people. In this study, the best-estimate scenario of the Fukushima unit 1 accident was modeled with MELCOR, and the amount of released fission product aerosol was estimated as a function of the amount of water added to the reactor pressure vessel (RPV). The analysis of the Fukushima unit 1 accident was conducted from the viewpoint of fission product aerosol release using MELCOR. First, the thermodynamic results for the plant were compared to the measured data; then the fission product aerosol (CsOH) behavior was calculated while varying the amount of water injection. Water injection affects the amount of aerosol released into the reactor building, because it decreases the temperature of the deposition surfaces. In this study, only aerosol behavior was considered; a further study will be conducted including a hygroscopic model.

  15. Bayesian analysis for kaon photoproduction

    Energy Technology Data Exchange (ETDEWEB)

    Marsainy, T., E-mail: tmart@fisika.ui.ac.id; Mart, T., E-mail: tmart@fisika.ui.ac.id [Department Fisika, FMIPA, Universitas Indonesia, Depok 16424 (Indonesia)

    2014-09-25

    We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision-making method, i.e. the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities of contributing to the process.

  16. Bayesian priors and nuisance parameters

    CERN Document Server

    Gupta, Sourendu

    2016-01-01

    Bayesian techniques are widely used to obtain spectral functions from correlators. We suggest a technique to rid the results of nuisance parameters, i.e., parameters which are needed for the regularization but cannot be determined from data. We give examples where the method works, including a pion mass extraction with two flavours of staggered quarks at a lattice spacing of about 0.07 fm. We also give an example where the method does not work.

  17. Space Shuttle RTOS Bayesian Network

    Science.gov (United States)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems, such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, a joint venture between Boeing and Lockheed Martin and the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest, while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and prior probabilities of the network are elicited from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
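Stage one of the methodology, scoring via a Bayesian network, reduces to repeated applications of Bayes' rule over the network's conditional probability tables. A two-node toy network (structure and numbers hypothetical, not the NASA model) shows the mechanics:

```python
# Toy two-node discrete Bayesian network: a latent "reliability" node
# and an observed "score" node. All probabilities are invented.
p_reliab = {"high": 0.6, "low": 0.4}
p_score_given = {
    ("good", "high"): 0.9, ("poor", "high"): 0.1,
    ("good", "low"): 0.3, ("poor", "low"): 0.7,
}

def posterior_reliability(observed_score):
    """Bayes' rule on the tiny network: P(reliability | score)."""
    joint = {r: p_reliab[r] * p_score_given[(observed_score, r)]
             for r in p_reliab}
    z = sum(joint.values())  # marginal P(score)
    return {r: v / z for r, v in joint.items()}

post = posterior_reliability("good")  # favours "high" reliability
```

Real networks with many competing measures chain such updates through a larger graph, but each local computation is exactly this table lookup and normalization.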

  18. Elements of Bayesian experimental design

    Energy Technology Data Exchange (ETDEWEB)

    Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.

  19. Bayesian Sampling using Condition Indicators

    DEFF Research Database (Denmark)

    Faber, Michael H.; Sørensen, John Dalsgaard

    2002-01-01

    This allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion... It is shown how half-cell potential measurements may be utilized to update the probability of excessive repair after 50 years...

  20. Aerosol processing in stratiform clouds in ECHAM6-HAM

    Science.gov (United States)

    Neubauer, David; Lohmann, Ulrike; Hoose, Corinna

    2013-04-01

    Aerosol processing in stratiform clouds by uptake into cloud particles, collision-coalescence, chemical processing inside the cloud particles and release back into the atmosphere has important effects on aerosol concentration, size distribution, chemical composition and mixing state. Aerosol particles can act as cloud condensation nuclei. Cloud droplets can take up further aerosol particles by collisions. Atmospheric gases may also be transferred into the cloud droplets and undergo chemical reactions, e.g. the production of atmospheric sulphate. Aerosol particles are also processed in ice crystals. They may be taken up by homogeneous freezing of cloud droplets below -38° C or by heterogeneous freezing above -38° C. This includes immersion freezing of already immersed aerosol particles in the droplets and contact freezing of particles colliding with a droplet. Many clouds do not form precipitation and also much of the precipitation evaporates before it reaches the ground. The water soluble part of the aerosol particles concentrates in the hydrometeors and together with the insoluble part forms a single, mixed, larger particle, which is released. We have implemented aerosol processing into the current version of the general circulation model ECHAM6 (Stevens et al., 2013) coupled to the aerosol module HAM (Stier et al., 2005). ECHAM6-HAM solves prognostic equations for the cloud droplet number and ice crystal number concentrations. In the standard version of HAM, seven modes are used to describe the total aerosol. The modes are divided into soluble/mixed and insoluble modes and the number concentrations and masses of different chemical components (sulphate, black carbon, organic carbon, sea salt and mineral dust) are prognostic variables. We extended this by an explicit representation of aerosol particles in cloud droplets and ice crystals in stratiform clouds similar to Hoose et al. (2008a,b). 
Aerosol particles in cloud droplets are represented by 5 tracers for the

  1. 12th Brazilian Meeting on Bayesian Statistics

    CERN Document Server

    Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo

    2015-01-01

    Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...

  2. Bayesian Inversion of Seabed Scattering Data

    Science.gov (United States)

    2014-09-30

    Bayesian Inversion of Seabed Scattering Data (Special Research Award in Ocean Acoustics). Gavin A.M.W. Steininger, School of Earth & Ocean... project are to carry out joint Bayesian inversion of scattering and reflection data to estimate the in-situ seabed scattering and geoacoustic parameters...

  3. Anomaly Detection and Attribution Using Bayesian Networks

    Science.gov (United States)

    2014-06-01

    Anomaly Detection and Attribution Using Bayesian Networks. Andrew Kirk, Jonathan Legg and Edwin El-Mahassni, National Security and... detection in Bayesian networks, enabling both the detection and explanation of anomalous cases in a dataset. By exploiting the structure of a... Bayesian network, our algorithm is able to efficiently search for local maxima of data conflict between closely related variables. Benchmark tests using

  4. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan

    2004-01-01

    We describe a system for exact inference with relational Bayesian networks as defined in the publicly available Primula tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose Primula-generated propositional instances have thousands of variables, and whose jointrees have clusters...

  5. DARE : Dedicated Aerosols Retrieval Experiment

    NARCIS (Netherlands)

    Smorenburg, K.; Courrèges-Lacoste, G.B.; Decae, R.; Court, A.J.; Leeuw, G. de; Visser, H.

    2004-01-01

    At present there is an increasing interest in remote sensing of aerosols from space because of the large impact of aerosols on climate, earth observation and health. TNO has performed a study aimed at improving aerosol characterisation using a space based instrument and state-of-the-art aerosol retr

  6. Interactions of fission product vapours with aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Benson, C.G.; Newland, M.S. [AEA Technology, Winfrith (United Kingdom)

    1996-12-01

    Reactions between structural and reactor material aerosols and fission product vapours released during a severe accident in a light water reactor (LWR) will influence the magnitude of the radiological source term ultimately released to the environment. The interaction of cadmium aerosol with iodine vapour at different temperatures has been examined in a programme of experiments designed to characterise the kinetics of the system. Laser-induced fluorescence (LIF) is a technique that is particularly amenable to the study of systems involving elemental iodine because of the high intensity of the fluorescence lines. This technique was therefore used in the experiments to measure the decrease in the concentration of iodine vapour as the reaction with cadmium proceeded. Experiments were conducted over a range of temperatures (20-350°C), using calibrated iodine vapour and cadmium aerosol generators that gave well-quantified sources. The LIF results provided information on the kinetics of the process, whilst examination of filter samples gave data on the composition and morphology of the aerosol particles that were formed. The results showed that the reaction of cadmium with iodine was relatively fast, giving reaction half-lives of approximately 0.3 s. This suggests that the assumption used by primary circuit codes such as VICTORIA, that reaction rates are mass-transfer limited, is justified for the cadmium-iodine reaction. The reaction was first order with respect to both cadmium and iodine, and was assigned as pseudo second order overall. However, there appeared to be a dependence of the overall rate constant on aerosol surface area, making the precise order of the reaction difficult to assign. The relatively high volatility of the cadmium iodide formed in the reaction played an important role in determining the composition of the particles. (author) 23 figs., 7 tabs., 22 refs.
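Under pseudo-first-order conditions (iodine decaying in excess cadmium), the reported half-life maps to rate constants as sketched below; the cadmium concentration used in the example is hypothetical:

```python
import math

def pseudo_first_order_k(t_half):
    """Observed rate constant from a measured half-life, assuming
    pseudo-first-order decay of iodine vapour in excess cadmium:
    k_obs = ln(2) / t_half."""
    return math.log(2) / t_half

def second_order_k(k_obs, cd_conc):
    """Divide out the (roughly constant) cadmium concentration to
    recover the underlying second-order rate constant; the cadmium
    concentration here is an illustrative input, not a measured value."""
    return k_obs / cd_conc

k_obs = pseudo_first_order_k(0.3)  # from the ~0.3 s half-life above
```

The surface-area dependence noted in the abstract would show up as `second_order_k` varying with aerosol loading, which is why the overall order was difficult to assign.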

  7. Detection of biological aerosols by luminescence techniques

    Science.gov (United States)

    Stopa, Peter J.; Tieman, Darlene; Coon, Phillip A.; Paterno, Dorothea A.; Milton, Maurice M.

    1999-12-01

    Luciferin-luciferase (L-L) luminescence techniques were used successfully to measure adenosine triphosphate (ATP) (pg/ml) in concentrated aerosol samples containing either vegetative bacterial cells or spores. Aerosols were collected with wet and dry sampling devices. Evaluation for the presence of total biomass from bacterial and non-bacterial sources of ATP was achieved by suspending the collected aerosol samples in phosphate buffered saline (PBS), pipetting a 50-μl aliquot of the PBS suspension into a Filtravette™, and then adding bacterial releasing agent (BRA). The sample was then reacted with L-L, and the resulting relative luminescence units (RLUs), indicative of ATP from all sources, were measured. Bacterial cells were enumerated with the additional application of a wash with somatic cell releasing agent (SRA) prior to BRA application; this step removes interfering substances and non-bacterial sources of ATP. For spore analysis, an equi-volume sample of the PBS suspension was added to an equi-volume of trypticase soy broth (TSB), incubated at 37°C for 15 minutes, and processed using methods identical to bacterial cell analysis. Using these techniques we were able to detect Bacillus subtilis var. niger, formerly known as Bacillus globigii (BG), in aerosol samples at concentrations greater than or equal to 10⁵ colony forming units (CFU) per ml. Results of field and chamber trials show that one can detect the presence of bacterial and non-bacterial sources of ATP, and differentiate spores from vegetative bacterial cells. These techniques may be appropriate to situations where the measurement of bacterial aerosols is needed.

  8. SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE

    Institute of Scientific and Technical Information of China (English)

    Ming HAN; Yuanyao DING

    2004-01-01

    This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and estimates of the failure probability and failure rate are provided. After some failure information is introduced by making an extra test, a synthesized expected Bayesian method is defined and used to estimate the failure probability, failure rate and some other parameters of exponentially and Weibull distributed populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
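The zero-failure setting described here can be sketched with one common expected-Bayes construction: a Beta(1, b) prior on the failure probability p, averaged over a hyperprior on b. The uniform U(0, c) hyperprior and the value of c below are assumptions for illustration, not necessarily the paper's choices:

```python
import math

def e_bayes_failure_prob(n, c=5.0):
    """Expected-Bayes estimate of the failure probability after n
    tests with zero failures. A Beta(1, b) prior on p combined with a
    zero-failure binomial likelihood gives posterior mean 1/(n + b + 1);
    averaging over b ~ U(0, c) yields the closed form
    (1/c) * ln((n + c + 1) / (n + 1)). Both the prior family and the
    hyperprior are illustrative assumptions."""
    return math.log((n + c + 1) / (n + 1)) / c

p_hat = e_bayes_failure_prob(n=20)
```

The estimate shrinks toward zero as the number of failure-free tests grows, which is the qualitative behaviour any sensible zero-failure estimator must have.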

  9. Learning dynamic Bayesian networks with mixed variables

    DEFF Research Database (Denmark)

    Bøttcher, Susanne Gammelgaard

    This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned... An automated procedure for specifying prior distributions for the parameters in a dynamic Bayesian network is presented. It is a simple extension of the procedure for ordinary Bayesian networks. Finally, Wölfer's sunspot numbers are analyzed...

  10. Variational bayesian method of estimating variance components.

    Science.gov (United States)

    Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi

    2016-07-01

    We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and low population size, and less bias was detected with larger population sizes in both methods examined. The differences in the estimates of variance components between the variational Bayesian and the Gibbs sampling were not found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with the Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances of the variational Bayesian method were lower than those of the method using Gibbs sampling.  The computing time required was much shorter with the variational Bayesian method than with the method using Gibbs sampling.
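For contrast with the variational approach, a single Gibbs-sampling update for one variance component (the alternative method compared in this record) looks like the following. The one-component model and the prior hyperparameters are simplifications for illustration:

```python
import random

def draw_variance(residuals, nu0=1.0, s0=1.0, rng=random):
    """One Gibbs step for a residual variance: with a scaled
    inverse-chi-square(nu0, s0) prior, the full conditional given the
    residuals is scaled inverse-chi-square with df = nu0 + n and scale
    sum nu0*s0 + SSE. Sampled as scale_sum / chi2, with the chi-square
    variate built from squared standard normals."""
    n = len(residuals)
    sse = sum(r * r for r in residuals)
    df = int(nu0 + n)
    scale_sum = nu0 * s0 + sse
    chi2 = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(df))
    return scale_sum / chi2

# Illustrative run: residuals simulated with true variance 4
rng = random.Random(0)
residuals = [rng.gauss(0.0, 2.0) for _ in range(500)]
draws = [draw_variance(residuals, rng=rng) for _ in range(200)]
sigma2_hat = sum(draws) / len(draws)
```

A full animal-model sampler alternates such draws with updates of the genetic effects; a variational scheme instead iterates deterministic updates of an approximating inverse-gamma density, which is what buys the reported speed-up at the cost of under-dispersed posteriors.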

  11. MSA in Beijing aerosol

    Institute of Scientific and Technical Information of China (English)

    YUAN Hui; WANG Ying; ZHUANG Guoshun

    2004-01-01

    Methane sulphonate (MSA) and sulfate (SO42-), the main oxidation products of dimethyl sulfide (DMS), are targets of atmospheric chemistry study, as sulfate aerosol has an important impact on global climate change. It is widely believed that DMS is mainly emitted from phytoplankton production in the marine boundary layer (MBL), and MSA is usually used as the tracer of non-sea-salt sulfate (nss-SO42-) in marine and coastal areas (MSA/SO42- = 1/18). Most observations of MSA have been in marine and coastal aerosols. To our surprise, MSA was frequently (>60%) detected in Beijing TSP, PM10, and PM2.5 aerosols, even in the samples collected during the dust storm period. The concentrations of MSA were higher than those measured in marine aerosols. Factor analysis, correlation analysis and meteorological analysis indicated that there was no obvious marine influence on Beijing aerosols. DMS from terrestrial emissions and dimethyl sulphoxide (DMSO) from industrial wastes could be the two possible precursors of MSA. Warm, low-pressure air masses and long periods of radiation were beneficial to the formation of MSA. Anthropogenic pollution from regional and local sources might be the dominant contributor to MSA in Beijing aerosol. This is the first report of MSA in aerosols collected at an inland site in China. This new finding should lead to further study of the balance of sulfur in inland cities and its global biogeochemical cycle.
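The tracer arithmetic behind statements like "MSA/SO42- = 1/18" is simple; the sketch below uses the standard seawater SO4/Na mass ratio (~0.252) to split off the sea-salt sulfate contribution, with both ratios treated as approximations rather than values from this study:

```python
def nss_sulfate(so4_total, na, seawater_ratio=0.252):
    """Non-sea-salt sulfate: subtract the sea-salt contribution
    inferred from sodium via the seawater SO4/Na mass ratio (~0.252).
    All concentrations in the same mass units, e.g. ug/m3."""
    return so4_total - seawater_ratio * na

def msa_implied_biogenic_so4(msa, msa_to_nss=1.0 / 18.0):
    """Biogenic nss-sulfate implied by MSA, using the approximate
    MSA/nss-SO4 ratio of 1/18 cited above for marine/coastal air."""
    return msa / msa_to_nss
```

In the Beijing samples the MSA-implied biogenic sulfate would be a small fraction of the measured nss-sulfate, consistent with the paper's conclusion that anthropogenic sources dominate.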

  12. Modal aerosol dynamics modeling

    Energy Technology Data Exchange (ETDEWEB)

    Whitby, E.R.; McMurry, P.H.; Shankar, U.; Binkowski, F.S.

    1991-02-01

    The report presents the governing equations for representing aerosol dynamics, based on several different representations of the aerosol size distribution. Analytical and numerical solution techniques for these governing equations are also reviewed. Described in detail is a computationally efficient numerical technique for simulating aerosol behavior in systems undergoing simultaneous heat transfer, fluid flow, and mass transfer in and between the gas and condensed phases. The technique belongs to a general class of models known as modal aerosol dynamics (MAD) models. These models solve for the temporal and spatial evolution of the particle size distribution function. Computational efficiency is achieved by representing the complete aerosol population as a sum of additive overlapping populations (modes), and solving for the time rate of change of integral moments of each mode. Applications of MAD models for simulating aerosol dynamics in continuous stirred tank aerosol reactors and flow aerosol reactors are provided. For the application to flow aerosol reactors, the discussion is developed in terms of considerations for merging a MAD model with the SIMPLER routine described by Patankar (1980). Considerations for incorporating a MAD model into the U.S. Environmental Protection Agency's Regional Particulate Model are also described. Numerical and analytical techniques for evaluating the size-space integrals of the modal dynamics equations (MDEs) are described. For multimodal lognormal distributions, an analytical expression for the coagulation integrals of the MDEs, applicable for all size regimes, is derived, and is within 20% of accurate numerical evaluation of the same moment coagulation integrals. A computationally efficient integration technique, based on Gauss-Hermite numerical integration, is also derived.
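The moment bookkeeping at the heart of a MAD model reduces, for a lognormal mode, to a closed-form expression for any diameter moment; the sketch below evaluates it for hypothetical mode parameters:

```python
import math

def lognormal_moment(k, n_total, dg, ln_sigma_g):
    """k-th diameter moment of a lognormal mode:
    M_k = N * dg**k * exp(k**2 / 2 * ln(sigma_g)**2),
    where N is the number concentration, dg the geometric median
    diameter and sigma_g the geometric standard deviation. Modal
    models track a few such moments per mode (typically M0 and M3,
    the latter proportional to volume) instead of the full
    distribution."""
    return n_total * dg ** k * math.exp(0.5 * k * k * ln_sigma_g ** 2)

# Volume concentration of one (hypothetical) mode via its 3rd moment:
# N = 1000 cm^-3, dg = 0.1 um, sigma_g = 1.8
m3 = lognormal_moment(3, n_total=1e3, dg=0.1, ln_sigma_g=math.log(1.8))
volume = math.pi / 6.0 * m3
```

The coagulation and growth integrals the report evaluates are built from products of such moments, which is what makes the modal formulation cheap compared with sectional schemes.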

  13. Preliminary results of the aerosol optical depth retrieval in Johor, Malaysia

    Science.gov (United States)

    Lim, H. Q.; Kanniah, K. D.; Lau, A. M. S.

    2014-02-01

    Monitoring of atmospheric aerosols over urban areas is important, as tremendous amounts of pollutants are released by industrial activities and heavy traffic flow. Air quality monitoring by satellite observation provides better spatial coverage; however, detailed aerosol property retrieval remains a challenge. This is due to the limitation of aerosol retrieval algorithms over high-reflectance (bright surface) areas. The aim of this study is to retrieve aerosol optical depth over urban areas of Iskandar Malaysia, the main southern development zone in Johor state, using Moderate Resolution Imaging Spectroradiometer (MODIS) 500 m resolution data. One of the important steps in the aerosol optical depth retrieval is to characterise the different types of aerosols in the study area. This information is used to construct a look-up table containing the simulated aerosol reflectance and the corresponding aerosol optical depth. Thus, in this study we have characterised different aerosol types in the study area using Aerosol Robotic Network (AERONET) data. These data were processed using cluster analysis, and the preliminary results show that the area's aerosol consists of coastal urban (65%), polluted urban (27.5%), dust particle (6%) and heavy pollution (1.5%) types.
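    The look-up-table step described above amounts to a simple inversion: given simulated (AOD, reflectance) pairs for one aerosol type and viewing geometry, an observed reflectance is interpolated back to an optical depth. The table values below are hypothetical placeholders, not MODIS or AERONET results:

```python
import bisect

# Hypothetical look-up table for one aerosol type: each entry pairs an
# aerosol optical depth (AOD) with the simulated aerosol reflectance.
LUT = [(0.0, 0.010), (0.25, 0.035), (0.5, 0.058), (1.0, 0.098), (2.0, 0.160)]

def retrieve_aod(observed_reflectance):
    """Linearly interpolate the LUT to invert reflectance -> AOD."""
    taus = [t for t, _ in LUT]
    rhos = [r for _, r in LUT]
    if not rhos[0] <= observed_reflectance <= rhos[-1]:
        raise ValueError("reflectance outside LUT range")
    i = bisect.bisect_left(rhos, observed_reflectance)
    if rhos[i] == observed_reflectance:
        return taus[i]
    frac = (observed_reflectance - rhos[i - 1]) / (rhos[i] - rhos[i - 1])
    return taus[i - 1] + frac * (taus[i] - taus[i - 1])
```

    In practice one such table is built per characterised aerosol type, which is why the cluster analysis of AERONET data is a prerequisite for the retrieval.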

  14. Bayesian Methods and Universal Darwinism

    Science.gov (United States)

    Campbell, John

    2009-12-01

    Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science, including Karl Popper and Donald Campbell, have interpreted the evolution of science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints, in the form of adaptations, imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.

  15. Bayesian phylogeography finds its roots.

    Directory of Open Access Journals (Sweden)

    Philippe Lemey

    2009-09-01

    Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the most parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analyses in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can easily be generalized to infer biogeography from genetic data for many organisms.

  16. Bayesian Query-Focused Summarization

    CERN Document Server

    Daumé, Hal

    2009-01-01

    We present BayeSum (for ``Bayesian summarization''), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.

  17. Numeracy, frequency, and Bayesian reasoning

    Directory of Open Access Journals (Sweden)

    Gretchen B. Chapman

    2009-02-01

    Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
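    The frequency-format advantage is easy to see in a standard screening example (the numbers below are illustrative, not from the questionnaire study): both formats are mathematically equivalent, but counting cases out of 1000 people is typically easier than chaining single-event probabilities.

```python
# Illustrative numbers (not from the study): disease base rate 1%,
# test sensitivity 80%, false-positive rate 9.6%.
p_d, sens, fpr = 0.01, 0.80, 0.096

# Single-event probability format: Bayes' rule directly.
p_pos = p_d * sens + (1 - p_d) * fpr
p_d_given_pos = p_d * sens / p_pos

# Natural frequency format: imagine 1000 people.
n = 1000
sick_pos = n * p_d * sens            # 8 of the 10 sick people test positive
healthy_pos = n * (1 - p_d) * fpr    # ~95 of the 990 healthy people test positive
freq_answer = sick_pos / (sick_pos + healthy_pos)

# Both routes give the same, surprisingly low answer (about 7.8%),
# but the counting version requires no explicit probability algebra.
assert abs(p_d_given_pos - freq_answer) < 1e-12
```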

  18. Bayesian inference for Hawkes processes

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Gulddahl

    2013-01-01

    The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
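    The clustering and branching structure underlying the second approach also yields a direct simulation recipe: immigrant points arrive as a homogeneous Poisson process, and each point spawns offspring according to the excitation kernel. A minimal sketch for an exponential kernel (parameter names and values are illustrative; stability requires alpha < beta):

```python
import math
import random

def poisson_sample(rng, lam):
    """Poisson sample via inversion (Knuth's method; fine for small lam)."""
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    """Simulate a Hawkes process on [0, t_max] via its branching (cluster)
    representation: immigrants arrive as a Poisson process with rate mu,
    and each point spawns Poisson(alpha/beta) offspring whose delays follow
    the Exponential(beta) excitation kernel."""
    rng = random.Random(seed)
    times, queue = [], []
    t = 0.0
    while True:                          # generate immigrant points
        t += rng.expovariate(mu)
        if t > t_max:
            break
        queue.append(t)
    while queue:                         # let every point breed offspring
        parent = queue.pop()
        times.append(parent)
        for _ in range(poisson_sample(rng, alpha / beta)):
            child = parent + rng.expovariate(beta)
            if child <= t_max:
                queue.append(child)
    return sorted(times)

times = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, t_max=50.0)
```

    This cluster view is exactly what makes the second inference approach tractable: conditioning on the (latent) branching structure turns the likelihood into a product of simple Poisson-cluster terms.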

  20. Bayesian homeopathy: talking normal again.

    Science.gov (United States)

    Rutten, A L B

    2007-04-01

    Homeopathy has a communication problem: important homeopathic concepts are not understood by conventional colleagues. Homeopathic terminology seems to be comprehensible only after practical experience of homeopathy. The main problem lies in the different handling of diagnosis. In conventional medicine, diagnosis is the starting point for randomised controlled trials to determine the effect of treatment. In homeopathy, diagnosis is combined with other symptoms and personal traits of the patient to guide treatment and predict response. Broadening our scope to include diagnostic as well as treatment research opens the possibility of multifactorial reasoning. Adopting Bayesian methodology opens the possibility of investigating homeopathy in everyday practice and of describing some aspects of homeopathy in conventional terms.

  1. Aerosols from biomass combustion

    Energy Technology Data Exchange (ETDEWEB)

    Nussbaumer, T.

    2001-07-01

    This report is the proceedings of a seminar on biomass combustion and aerosol production organised jointly by the International Energy Agency's (IEA) Task 32 on bioenergy and the Swiss Federal Office of Energy (SFOE). This collection of 16 papers discusses the production of aerosols and fine particles by the burning of biomass and their effects. Expert knowledge on the environmental impact of aerosols, formation mechanisms, measurement technologies, methods of analysis and measures to be taken to reduce such emissions is presented. The seminar, attended by 50 participants from 11 countries, shows, according to the authors, that the reduction of aerosol emissions resulting from biomass combustion will remain a challenge for the future.

  2. Bayesian credible interval construction for Poisson statistics

    Institute of Scientific and Technical Information of China (English)

    ZHU Yong-Sheng

    2008-01-01

    The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing the conditional probability satisfying the requirement that the background not be larger than the observed events in order to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
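    The simplest case (flat prior on the signal, known background, no systematic uncertainties) can be sketched numerically: the posterior for signal s given n observed events over an expected background b is proportional to (s + b)^n e^(-(s + b)), and a central credible interval follows from the posterior CDF. This is a generic illustration, not the BPOCI routine:

```python
import math

def posterior_unnorm(s, n, b):
    """Unnormalised posterior for signal s >= 0 under a flat prior:
    p(s | n) proportional to (s + b)**n * exp(-(s + b))."""
    return (s + b) ** n * math.exp(-(s + b))

def credible_interval(n, b, cl=0.68, s_max=50.0, steps=20000):
    """Central Bayesian credible interval via a simple grid over s."""
    ds = s_max / steps
    weights = [posterior_unnorm(i * ds, n, b) for i in range(steps + 1)]
    total = sum(weights)
    lo_q, hi_q = (1.0 - cl) / 2.0, 1.0 - (1.0 - cl) / 2.0
    acc, lo, hi = 0.0, None, None
    for i, w in enumerate(weights):          # walk the discretised CDF
        acc += w
        if lo is None and acc / total >= lo_q:
            lo = i * ds
        if hi is None and acc / total >= hi_q:
            hi = i * ds
    return lo, hi

# 68% central interval for 10 observed events over an expected background of 3.
lo, hi = credible_interval(n=10, b=3.0)
```

    The background constraint discussed in the abstract would further condition this posterior; the grid approach above extends to that case by restricting the likelihood before normalising.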

  3. Advances in Bayesian Modeling in Educational Research

    Science.gov (United States)

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  4. Nonparametric Bayesian Modeling of Complex Networks

    DEFF Research Database (Denmark)

    Schmidt, Mikkel Nørgaard; Mørup, Morten

    2013-01-01

    Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....

  5. Modeling Diagnostic Assessments with Bayesian Networks

    Science.gov (United States)

    Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego

    2007-01-01

    This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…

  6. Using Bayesian Networks to Improve Knowledge Assessment

    Science.gov (United States)

    Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra

    2013-01-01

    In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…

  7. The Bayesian Revolution Approaches Psychological Development

    Science.gov (United States)

    Shultz, Thomas R.

    2007-01-01

    This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…

  8. Bayesian analysis of exoplanet and binary orbits

    OpenAIRE

    Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas

    2012-01-01

    We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.

  9. Bayesian Network for multiple hypothesis tracking

    NARCIS (Netherlands)

    W.P. Zajdel; B.J.A. Kröse

    2002-01-01

    For a flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a product ...

  10. Emergency Protection from Aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Cristy, G.A.

    2001-11-13

    Expedient methods were developed that could be used by an average person, using only materials readily available, to protect himself and his family from injury by toxic (e.g., radioactive) aerosols. The most effective means of protection was the use of a household vacuum cleaner to maintain a small positive pressure on a closed house during passage of the aerosol cloud. Protection factors of 800 and above were achieved.

  11. Hepatitis disease detection using Bayesian theory

    Science.gov (United States)

    Maseleno, Andino; Hidayati, Rohmah Zahroh

    2017-02-01

    This paper presents hepatitis disease diagnosis using Bayesian theory, with the aim of better illustrating the theory. In this research we used Bayesian theory to detect hepatitis disease and display the result of the diagnosis process. The Bayesian approach, rediscovered and perfected by Laplace, rests on a simple idea: use the known prior probability and conditional probability density parameters to calculate, via Bayes' theorem, the corresponding posterior probability, and then use that posterior probability to infer and make decisions. Bayesian methods combine existing knowledge (prior probabilities) with additional knowledge derived from new data (the likelihood function). The initial symptoms of hepatitis include malaise, fever and headache, and the model computes the probability of hepatitis given the presence of these symptoms. The results show that Bayesian theory successfully identified the presence of hepatitis disease.
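    The diagnosis step can be sketched as a naive-Bayes computation over the three symptoms; all probabilities below are made-up placeholders for illustration, not clinical estimates from the paper:

```python
# Hypothetical numbers for illustration only (not clinical data).
prior = {"hepatitis": 0.01, "no_hepatitis": 0.99}

# Naive-Bayes likelihoods P(symptom | class); symptoms are assumed
# conditionally independent given the class.
likelihood = {
    "hepatitis":    {"malaise": 0.8, "fever": 0.7, "headache": 0.6},
    "no_hepatitis": {"malaise": 0.1, "fever": 0.05, "headache": 0.2},
}

def posterior(symptoms):
    """P(class | symptoms) by Bayes' theorem with naive independence."""
    unnorm = {}
    for cls, p in prior.items():
        for s in symptoms:
            p *= likelihood[cls][s]
        unnorm[cls] = p
    total = sum(unnorm.values())
    return {cls: p / total for cls, p in unnorm.items()}

post = posterior(["malaise", "fever", "headache"])
```

    With these placeholder numbers, observing all three symptoms raises the hepatitis probability from the 1% prior to roughly 77%, which is the posterior-updating pattern the abstract describes.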

  12. 2nd Bayesian Young Statisticians Meeting

    CERN Document Server

    Bitto, Angela; Kastner, Gregor; Posekany, Alexandra

    2015-01-01

    The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...

  13. RACORO aerosol data processing

    Energy Technology Data Exchange (ETDEWEB)

    Elisabeth Andrews

    2011-10-31

    The RACORO aerosol data (cloud condensation nuclei (CCN), condensation nuclei (CN) and aerosol size distributions) need further processing to be useful for model evaluation (e.g., GCM droplet nucleation parameterizations) and other investigations. These tasks include: (1) identification and flagging of 'splash'-contaminated Twin Otter aerosol data; (2) calculation of actual supersaturation (SS) values in the two CCN columns flown on the Twin Otter; (3) interpolation of CCN spectra from SGP and the Twin Otter to 0.2% SS; (4) processing of data for spatial variability studies; (5) calculation of light scattering from measured aerosol size distributions. Below we first briefly describe the measurements and then describe the results of several data processing tasks that have been completed, paving the way for the scientific analyses for which the campaign was designed. The end result of this research will be several aerosol data sets which can be used to achieve some of the goals of the RACORO mission, including enhanced understanding of cloud-aerosol interactions and improved cloud simulations in climate models.

  14. BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.

    Science.gov (United States)

    Khakabimamaghani, Sahand; Ester, Martin

    2016-01-01

    The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are provided in the literature; however, there are still significant open issues. First, it is still unclear whether integrating different datatypes will help in detecting disease subtypes more accurately and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.

  15. Patient's Guide to Aerosol Drug Delivery

    Science.gov (United States)

    [Table of contents: Introduction; 1. Aerosol Drug Delivery: The Basics; 2. Aerosol Drugs: The Major Categories; 3. Aerosol Drug Delivery Devices: Small-Volume Nebulizers; 4. Aerosol Drug ...]

  16. Characterisation of Aerosols from Simulated Radiological Dispersion Events

    NARCIS (Netherlands)

    Di Lemma, F.G.

    2015-01-01

    The research described in this thesis aims at improving the evaluation of the radioactive aerosol release from different Radiological Dispersion Events (RDEs), such as accidents and sabotage involving radioactive and nuclear materials. These studies help in a better assessment of the source term as

  17. Organic aerosol sources and partitioning in CMAQv5.2

    Science.gov (United States)

    We describe a major CMAQ update, available in version 5.2, which explicitly treats the semivolatile mass transfer of primary organic aerosol compounds, in agreement with available field and laboratory observations. Until this model release, CMAQ has considered these compounds to ...

  18. Aerosol characteristics and sources for the Amazon Basin during the wet season

    Science.gov (United States)

    Artaxo, Paulo; Maenhaut, Willy; Stroms, Hedwig; van Grieken, Rene

    1990-09-01

    An attempt is made to obtain the main chemical and morphological characteristics of particles released to the atmosphere by the tropical rain forest during the wet season. Fine and coarse aerosol fractions were collected and analyzed for elemental composition by bulk and individual particle analysis techniques. Elemental concentrations were measured by the particle-induced X-ray emission technique. Absolute principal factor analysis was applied to the elemental concentrations in order to derive source profiles for the identified sources of aerosols. Total mass source apportionment was used to obtain a quantitative assessment of the biogenic aerosol component and of the other aerosol sources. Total aerosol mass source apportionment showed that biogenic particles account for 55-95 percent of the airborne concentrations, indicating that biogenic particles can play an important role in the global aerosol budget and in the global biogeochemical cycles of various elements.

  19. Physical metrology of aerosols; Metrologie physique des aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Boulaud, D.; Vendel, J. [CEA Saclay, 91 - Gif-sur-Yvette (France). Inst. de Protection et de Surete Nucleaire

    1996-12-31

    The various detection and measuring methods for aerosols are presented, and their selection is related to aerosol characteristics (size range, concentration or mass range), thermo-hydraulic conditions (carrier fluid temperature, pressure and flow rate) and to the measuring system conditions (measuring frequency, data collection speed, cost...). Methods based on aerosol dynamic properties (inertial, diffusional and electrical methods) and aerosol optical properties (localized and integral methods) are described, and their performances and applications are compared.

  20. Quantum Bayesianism at the Perimeter

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.

  1. On Bayesian System Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Soerensen Ringi, M.

    1995-05-01

    The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and they overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model encloses the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs.

  2. Hedging Strategies for Bayesian Optimization

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
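    The core Hedge rule (select each acquisition function with probability proportional to the exponential of its cumulative gain) can be sketched with a toy stand-in for the GP: fixed proposers and a known objective replace the surrogate model, and all parameter values are illustrative:

```python
import math
import random

def hedge_choose(gains, eta, rng):
    """Sample index i with probability proportional to exp(eta * gains[i]),
    the Hedge (exponential weighting) rule."""
    weights = [math.exp(eta * g) for g in gains]
    r = rng.random() * sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

def objective(x):
    # Hypothetical objective standing in for the GP posterior mean.
    return -(x - 0.3) ** 2

# Three fixed 'acquisition functions', each nominating a point on [0, 1].
proposers = [lambda: 0.1, lambda: 0.3, lambda: 0.9]
rng = random.Random(1)
gains = [0.0, 0.0, 0.0]
chosen = [0, 0, 0]
for _ in range(50):
    i = hedge_choose(gains, eta=5.0, rng=rng)   # arm whose point is evaluated
    chosen[i] += 1
    for j, propose in enumerate(proposers):      # Hedge credits every arm
        gains[j] += objective(propose())
```

    In GP-Hedge the per-round gain for each arm is the GP posterior mean at that arm's nominated point; here the known objective plays that role, so the selection probability quickly concentrates on the best proposer.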

  3. Nonparametric Bayesian inference in biostatistics

    CERN Document Server

    Müller, Peter

    2015-01-01

    As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Nonparametric Bayesian approaches (BNP) play an ever expanding role in biostatistical inference from use in proteomics to clinical trials. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival Analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...

  4. Bayesian Inference with Optimal Maps

    CERN Document Server

    Moselhy, Tarek A El

    2011-01-01

    We present a new approach to Bayesian inference that entirely avoids Markov chain simulation, by constructing a map that pushes forward the prior measure to the posterior measure. Existence and uniqueness of a suitable measure-preserving map is established by formulating the problem in the context of optimal transport theory. We discuss various means of explicitly parameterizing the map and computing it efficiently through solution of an optimization problem, exploiting gradient information from the forward model when possible. The resulting algorithm overcomes many of the computational bottlenecks associated with Markov chain Monte Carlo. Advantages of a map-based representation of the posterior include analytical expressions for posterior moments and the ability to generate arbitrary numbers of independent posterior samples without additional likelihood evaluations or forward solves. The optimization approach also provides clear convergence criteria for posterior approximation and facilitates model selectio...

  5. Multiview Bayesian Correlated Component Analysis

    DEFF Research Database (Denmark)

    Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai

    2015-01-01

    Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks are iden...... we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects....

  6. Bayesian networks in educational assessment

    CERN Document Server

    Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M

    2015-01-01

    Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...

  7. Bayesian anti-sparse coding

    CERN Document Server

    Elvira, Clément; Dobigeon, Nicolas

    2015-01-01

    Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\ell_{\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...

  8. Bayesian Inference in Queueing Networks

    CERN Document Server

    Sutton, Charles

    2010-01-01

    Modern Web services, such as those at Google, Yahoo!, and Amazon, handle billions of requests per day on clusters of thousands of computers. Because these services operate under strict performance requirements, a statistical understanding of their performance is of great practical interest. Such services are modeled by networks of queues, where one queue models each of the individual computers in the system. A key challenge is that the data is incomplete, because recording detailed information about every request to a heavily used system can require unacceptable overhead. In this paper we develop a Bayesian perspective on queueing models in which the arrival and departure times that are not observed are treated as latent variables. Underlying this viewpoint is the observation that a queueing model defines a deterministic transformation between the data and a set of independent variables called the service times. With this viewpoint in hand, we sample from the posterior distribution over missing data and model...
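The latent-variable viewpoint ultimately reduces inference to the service times; once those are in hand (observed or imputed), estimating an exponential service rate is a textbook conjugate update. A minimal sketch under an assumed Gamma(a, b) prior (an illustration only, not the paper's full missing-data sampler):

```python
import random

# Conjugate update for an exponential service-rate parameter mu:
# with a Gamma(a, b) prior and n observed service times, the
# posterior is Gamma(a + n, b + sum of service times).
def service_rate_posterior(service_times, a=1.0, b=1.0):
    n = len(service_times)
    return a + n, b + sum(service_times)

# Simulate service times at a true rate mu = 2.0 and check that
# the posterior mean a'/b' concentrates near the truth.
random.seed(0)
times = [random.expovariate(2.0) for _ in range(5000)]
a_post, b_post = service_rate_posterior(times)
posterior_mean = a_post / b_post
```

In the paper's setting the service times are not directly observed, so a step like this would sit inside an outer sampler over the latent arrival and departure times.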

  9. A Bayesian Reflection on Surfaces

    Directory of Open Access Journals (Sweden)

    David R. Wolf

    1999-10-01

    Full Text Available Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.
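The scale-to-scale update described above generalizes the elementary scalar Kalman (normal-normal) update, which can be sketched as follows; this is the textbook building block only, not the paper's continuous-basis construction:

```python
def kalman_update(m, v, y, r):
    """Bayesian update of scalar knowledge N(m, v) given a datum
    y ~ N(truth, r). Returns the posterior mean and variance --
    the elementary step that the Generalized Kalman Filter extends
    to continuous-basis fields."""
    k = v / (v + r)                  # Kalman gain
    return m + k * (y - m), (1 - k) * v
```

For example, a vague prior N(0, 1) updated with an observation y = 2 of noise variance 1 yields the posterior N(1, 0.5), halfway between prior and datum with halved uncertainty.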

  10. Generation of aerosolized drugs.

    Science.gov (United States)

    Wolff, R K; Niven, R W

    1994-01-01

    The expanding use of inhalation therapy has placed demands on aerosol generation systems that are difficult to meet with current inhalers. The desire to deliver novel drug entities such as proteins and peptides, as well as complex formulations including liposomes and microspheres, requires delivery systems of improved efficiency that will target the lung in a reproducible manner. These efforts have also been spurred by the phase-out of chlorofluorocarbons (CFCs), which has included a directed search for alternative propellants. Consequently, a variety of new aerosol devices and methods of generating aerosols are being studied. These include the use of Freon-replacement propellants, dry powder generation systems, aqueous unit spray systems and microprocessor-controlled technologies. Each approach has advantages and disadvantages depending upon its principle of action and set of design variables. In addition, specific drugs may be better suited to one type of inhaler device than another. The extent to which aerosol generation systems achieve their goals is discussed, together with a summary of selected papers presented at the recent International Congress of Aerosols in Medicine.

  11. Bayesian models a statistical primer for ecologists

    CERN Document Server

    Hobbs, N Thompson

    2015-01-01

    Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili

  12. Compiling Relational Bayesian Networks for Exact Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark

    2006-01-01

    We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...... by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA--generated propositional instances have thousands of variables, and whose jointrees have clusters...

  13. Combustion aerosols from potassium-containing fuels

    Energy Technology Data Exchange (ETDEWEB)

    Balzer Nielsen, Lars

    1998-12-31

    The scope of the work presented in this thesis is the formation and evolution of aerosol particles in the submicron range during combustion processes, in particular where biomass is used alone or co-fired with coal. An introduction to the formation processes of fly ash in general, and submicron aerosol in particular, during combustion is presented, along with some known problems related to the combustion of biomass for power generation. The work falls into two parts. The first is the design of a laboratory setup for the investigation of homogeneous nucleation and particle dynamics at high temperature. The central unit of the setup is a laminar flow aerosol condenser (LFAC), which essentially is a 173 cm long tubular furnace with an externally cooled wall. A mathematical model is presented which describes the formation and evolution of the aerosol in the LFAC, where the rate of formation of new nuclei is calculated using the so-called classical theory. The model includes mass and energy conservation equations and an expression for the description of particle growth by diffusion. The resulting set of nonlinear second-order partial differential equations is solved numerically using the method of orthogonal collocation. The model is implemented in the FORTRAN code MONAERO. The second part of this thesis describes a comprehensive investigation of submicron aerosol formation during co-firing of coal and straw carried out at a 380 MW{sub Th} pulverized coal unit at Studstrup Power Plant, Aarhus. Three types of coal are used, and total boiler load and straw input are varied systematically. Straw contains large amounts of potassium, which is released during combustion. Submicron aerosol is sampled between the two banks of the economizer at a flue gas temperature of 350 deg. C using a novel ejector probe. The aerosol is characterized using the SMPS system and a Berner-type low pressure impactor. The chemical composition of the particles collected in the impactor is determined using

  14. Chemical aerosol Raman detector

    Science.gov (United States)

    Aggarwal, R. L.; Farrar, L. W.; Di Cecca, S.; Amin, M.; Perkins, B. G.; Clark, M. L.; Jeys, T. H.; Sickenberger, D. W.; D'Amico, F. M.; Emmons, E. D.; Christesen, S. D.; Kreis, R. J.; Kilper, G. K.

    2017-03-01

    A sensitive chemical aerosol Raman detector (CARD) has been developed for the trace detection and identification of chemical particles in the ambient atmosphere. CARD includes an improved aerosol concentrator with a concentration factor of about 40 and a CCD camera for improved detection sensitivity. Aerosolized isovanillin, which is relatively safe, has been used to characterize the performance of the CARD. The limit of detection (SNR = 10) for isovanillin in 15 s has been determined to be 1.6 pg/cm³, which corresponds to 6.3 × 10⁹ molecules/cm³ or 0.26 ppb. While less sensitive, CARD can also detect gases. This paper provides a more detailed description of the CARD hardware and detection algorithm than has previously been published.
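The quoted detection limits are unit conversions of one another, which can be checked directly. The snippet below assumes isovanillin's molar mass (~152.15 g/mol for C8H8O3) and an ambient air number density of ~2.46 × 10¹⁹ molecules/cm³ (1 atm, 25 °C); neither value is stated in the abstract.

```python
# Convert the CARD limit of detection, 1.6 pg/cm^3 of isovanillin,
# into molecules/cm^3 and a ppb mixing ratio.
AVOGADRO = 6.022e23          # molecules/mol
M_ISOVANILLIN = 152.15       # g/mol for C8H8O3 (assumed)
N_AIR = 2.46e19              # molecules/cm^3 of air at 1 atm, 25 C (assumed)

mass_conc = 1.6e-12          # g/cm^3 (= 1.6 pg/cm^3)
n = mass_conc * AVOGADRO / M_ISOVANILLIN   # molecules/cm^3
ppb = n / N_AIR * 1e9                      # parts per billion by number
```

Both results land on the paper's figures of 6.3 × 10⁹ molecules/cm³ and 0.26 ppb.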

  15. The Diagnosis of Reciprocating Machinery by Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    A Bayesian network is a reasoning tool based on probability theory, with many advantages that other reasoning tools lack. This paper discusses the basic theory of Bayesian networks and studies the problems that arise in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.

  16. MODIS 3km Aerosol Product: Algorithm and Global Perspective

    Science.gov (United States)

    Remer, L. A.; Mattoo, S.; Levy, R. C.; Munchak, L.

    2013-01-01

    After more than a decade of producing a nominal 10 km aerosol product based on the dark target method, the MODIS aerosol team will be releasing a nominal 3 km product as part of their Collection 6 release. The new product differs from the original 10 km product only in the manner in which reflectance pixels are ingested, organized and selected by the aerosol algorithm. Overall, the 3 km product closely mirrors the 10 km product. However, the finer resolution product is able to retrieve over ocean closer to islands and coastlines, and is better able to resolve fine aerosol features such as smoke plumes over both ocean and land. In some situations, it provides retrievals over entire regions that the 10 km product barely samples. In situations traditionally difficult for the dark target algorithm, such as over bright or urban surfaces, the 3 km product introduces isolated spikes of artificially high aerosol optical depth (AOD) that the 10 km algorithm avoids. Over land, globally, the 3 km product appears to be 0.01 to 0.02 higher than the 10 km product, while over ocean, the 3 km algorithm is retrieving a proportionally greater number of very low aerosol loading situations. Based on collocations with ground-based observations for only six months, expected errors associated with the 3 km land product are determined to be greater than for the 10 km product: ±(0.05 + 0.25·AOD). Over ocean, the suggestion is for expected errors to be the same as the 10 km product: ±(0.03 + 0.05·AOD). The advantage of the product is on the local scale, which will require continued evaluation not addressed here. Nevertheless, the new 3 km product is expected to provide important information complementary to existing satellite-derived products and become an important tool for the aerosol community.

  17. Stratospheric Aerosol Measurements

    Science.gov (United States)

    Pueschel, Rudolf, F.; Gore, Warren J. (Technical Monitor)

    1998-01-01

    Stratospheric aerosols affect the atmospheric energy balance by scattering and absorbing solar and terrestrial radiation. They can also alter stratospheric chemical cycles by catalyzing heterogeneous reactions which markedly perturb odd nitrogen, chlorine and ozone levels. Aerosol measurements by satellites began in NASA in 1975 with the Stratospheric Aerosol Measurement (SAM) program, followed by the Stratospheric Aerosol and Gas Experiment (SAGE) starting in 1979. Both programs employ the solar occultation, or Earth limb extinction, technique. Major results of these activities include the discovery of polar stratospheric clouds (PSCs) in both hemispheres in winter, illustrations of the impacts of major eruptions (El Chichon 1982 and Pinatubo 1991), and detection of a negative global trend in lower stratospheric/upper tropospheric aerosol extinction. This latter result can be considered a triumph of successful worldwide sulfur emission controls. The SAGE record will be continued and improved by SAGE III, currently scheduled for multiple launches beginning in 2000 as part of the Earth Observing System (EOS). The satellite program has been supplemented by in situ measurements aboard the ER-2 (20 km ceiling) since 1974, and from the DC-8 (13 km ceiling) aircraft beginning in 1989. Collection by wire impactors with subsequent electron-microscopic and X-ray energy-dispersive analyses, and optical particle spectrometry, have been the principal techniques. Major findings are: (1) the stratospheric background aerosol consists of dilute sulfuric acid droplets of around 0.1 micrometer modal diameter at concentrations of tens to hundreds of nanograms per cubic meter; (2) soot from aircraft amounts to a fraction of one percent of the background total aerosol; (3) volcanic eruptions perturb the sulfuric acid, but not the soot, aerosol abundance by several orders of magnitude; (4) PSCs contain nitric acid at temperatures below 195 K, supporting chemical hypotheses

  18. GPstuff: Bayesian Modeling with Gaussian Processes

    NARCIS (Netherlands)

    Vanhatalo, J.; Riihimaki, J.; Hartikainen, J.; Jylänki, P.P.; Tolvanen, V.; Vehtari, A.

    2013-01-01

    The GPstuff toolbox is a versatile collection of Gaussian process models and computational tools required for Bayesian inference. The tools include, among others, various inference methods, sparse approximations and model assessment methods.

  19. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
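The BPO's posterior, conditional on model output, can be illustrated in the simplest Gaussian setting: the predictand W carries a prior N(m, s²) and the deterministic model's output X is treated as a noisy linear image of W. All parameter values below are illustrative assumptions, not from the abstract.

```python
def bpo_posterior(x, m=0.0, s=1.0, a=0.0, b=1.0, sigma=0.5):
    """Posterior of predictand W ~ N(m, s^2) given model output
    X = a + b*W + e with e ~ N(0, sigma^2): a minimal Gaussian
    sketch of a Bayesian Processor of Output. Returns the
    posterior mean and standard deviation."""
    prec = 1.0 / s**2 + b**2 / sigma**2           # posterior precision
    mean = (m / s**2 + b * (x - a) / sigma**2) / prec
    return mean, (1.0 / prec) ** 0.5
```

Conditioning on an ensemble of outputs (the BPE) or decomposing input and model uncertainty (the BFS) builds on the same conjugate machinery.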

  20. Picturing classical and quantum Bayesian inference

    CERN Document Server

    Coecke, Bob

    2011-01-01

    We introduce a graphical framework for Bayesian inference that is sufficiently general to accommodate not just the standard case but also recent proposals for a theory of quantum Bayesian inference wherein one considers density operators rather than probability distributions as representative of degrees of belief. The diagrammatic framework is stated in the graphical language of symmetric monoidal categories and of compact structures and Frobenius structures therein, in which Bayesian inversion boils down to transposition with respect to an appropriate compact structure. We characterize classical Bayesian inference in terms of a graphical property and demonstrate that our approach eliminates some purely conventional elements that appear in common representations thereof, such as whether degrees of belief are represented by probabilities or entropic quantities. We also introduce a quantum-like calculus wherein the Frobenius structure is noncommutative and show that it can accommodate Leifer's calculus of `cond...

  1. Learning Bayesian networks for discrete data

    KAUST Repository

    Liang, Faming

    2009-02-01

    Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learning Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. First, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Second, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging of the samples generated in the learning process, and the resulting estimates can have much lower variation than single-model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.

  2. An Intuitive Dashboard for Bayesian Network Inference

    Science.gov (United States)

    Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.

    2014-03-01

    Current Bayesian network software packages provide good graphical interfaces for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard which provides an additional layer of abstraction, enabling end-users to easily perform inferences over Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.

  3. ProFit: Bayesian galaxy fitting tool

    Science.gov (United States)

    Robotham, A. S. G.; Taranu, D.; Tobar, R.

    2016-12-01

    ProFit is a Bayesian galaxy fitting tool that uses the fast C++ image generation library libprofit (ascl:1612.003) and a flexible R interface to a large number of likelihood samplers. It offers a fully featured Bayesian interface to galaxy model fitting (also called profiling), using mostly the same standard inputs as other popular codes (e.g. GALFIT ascl:1104.010), but it is also able to use complex priors and a number of likelihoods.

  4. Bayesian target tracking based on particle filter

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Because they can deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Building on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) move step and resampling are also introduced into Bayesian target tracking, and the simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
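The baseline the abstract improves upon can be sketched as a bootstrap particle filter (prior proposal plus multinomial resampling) for a hypothetical scalar linear-Gaussian model; the EKF proposal and MCMC move step would be layered on top of this skeleton.

```python
import random, math

def bootstrap_pf(observations, n_particles=500, q=0.5, r=0.5):
    """Bootstrap particle filter for the illustrative model
    x_t = 0.9*x_{t-1} + N(0, q^2), y_t = x_t + N(0, r^2).
    Returns the filtered posterior mean at each step."""
    parts = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    means = []
    for y in observations:
        # propagate each particle with the transition prior
        parts = [0.9 * x + random.gauss(0.0, q) for x in parts]
        # weight by the Gaussian likelihood of the observation
        w = [math.exp(-0.5 * ((y - x) / r) ** 2) for x in parts]
        tot = sum(w)
        w = [wi / tot for wi in w]
        means.append(sum(wi * x for wi, x in zip(w, parts)))
        # multinomial resampling to combat weight degeneracy
        parts = random.choices(parts, weights=w, k=n_particles)
    return means
```

With a steady observation stream the filtered mean settles between the prior pull toward zero and the data, just as the steady-state Kalman filter would for this linear-Gaussian case.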

  5. Variational Bayesian Approximation methods for inverse problems

    Science.gov (United States)

    Mohammad-Djafari, Ali

    2012-09-01

    Variational Bayesian Approximation (VBA) methods are recent tools for effective Bayesian computation. In this paper, these tools are used for inverse problems in which the prior models include hidden variables and the estimation of the hyperparameters also has to be addressed. In particular, two specific prior models (Student-t and mixture-of-Gaussians models) are considered and details of the algorithms are given.

  6. Bayesian Modeling of a Human MMORPG Player

    CERN Document Server

    Synnaeve, Gabriel

    2010-01-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  7. Bayesian Modeling of a Human MMORPG Player

    Science.gov (United States)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  8. Fuzzy Functional Dependencies and Bayesian Networks

    Institute of Scientific and Technical Information of China (English)

    LIU WeiYi(刘惟一); SONG Ning(宋宁)

    2003-01-01

    Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependency in relational databases with fuzzy values. The purpose of this paper is to establish a connection between these data dependencies and Bayesian networks. The connection is made through a set of methods that enable one to extract the maximum amount of conditional independence information from fuzzy functional dependencies.

  9. Philosophy and the practice of Bayesian statistics.

    Science.gov (United States)

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework.

  10. Advancing Models and Evaluation of Cumulus, Climate and Aerosol Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Gettelman, Andrew [University Corporation for Atmospheric Research (NCAR), Boulder, CO (United States)

    2015-10-27

    This project met its goals, despite some serious personnel challenges. The project objectives were as follows: 1. Develop a unified representation of stratiform and cumulus cloud microphysics for NCAR/DOE global community models. 2. Examine the effects of aerosols on clouds and their impact on precipitation in stratiform and cumulus clouds, and also explore the effects of clouds and precipitation on aerosols. 3. Test these new formulations using advanced evaluation techniques and observations and release

  11. Bayesian analysis of inflationary features in Planck and SDSS data

    CERN Document Server

    Benetti, Micol

    2016-01-01

    We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyse the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the Cosmic Microwave Background (CMB) and also in the matter power spectrum P (k). We use the most recent CMB data provided by the Planck Collaboration and P (k) measurements from the eleventh data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localised at different intervals of angular scales, corresponding to multipoles in the ranges 10 < l < 60 (Oscill-1) and 150 < l < 300 (Oscill-2). Our results show that one of the step-potentials (Oscill-1) provides a better fit to the CMB data than does the featureless LCDM scenario, with a moderate Bayesian evidence in favor of the former. Adding the P (k) data to the analysis weakens the evidence of the Oscill-1 potential relat...

  12. Aerosol processing of materials: Aerosol dynamics and microstructure evolution

    Science.gov (United States)

    Gurav, Abhijit Shankar

    Spray pyrolysis is an aerosol process commonly used to synthesize a wide variety of materials in powder or film forms, including metals, metal oxides and non-oxide ceramics. It is capable of producing high-purity, unagglomerated, micrometer- to submicron-size powders, and scale-up has been demonstrated. This dissertation deals with the study of aerosol dynamics during spray pyrolysis of multicomponent systems involving volatile phases/components, and aspects involved with using fuel additives during spray processes to break apart droplets and particles in order to produce powders with smaller sizes. The gas-phase aerosol dynamics and composition size distributions were measured during spray pyrolysis of (Bi, Pb)-Sr-Ca-Cu-O, Sr-Ru-O and Bi-Ru-O at different temperatures. A differential mobility analyzer (DMA) was used in conjunction with a condensation particle counter (CPC) to monitor the gas-phase particle size distributions, and a Berner-type low-pressure impactor was used to obtain mass size distributions and size-classified samples for chemical analysis. (Bi, Pb)-Sr-Ca-Cu-O powders made at temperatures up to 700 °C maintained their initial stoichiometry over the whole range of particle sizes monitored; however, those made at 800 °C and above were heavily depleted in lead in the size range 0.5-5.0 μm. When the reactor temperature was raised from 700 and 800 °C to 900 °C, a large number (~10^7 particles/cm^3) of new ultrafine particles were formed from PbO vapor released from the particles and the reactor walls at the beginning of the high-temperature runs (at 900 °C). The metal ruthenate systems also showed generation of ultrafine particles. Measurements were also used to monitor the gas-phase particle size distributions during the generation of fullerene (C60) nanoparticles (30 to 50 nm in size) via vapor condensation at 400-650 °C using N2 carrier gas. In general, during laboratory-scale aerosol processing of materials containing a volatile

  13. Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.

    Science.gov (United States)

    Zanini, Andrea; Woodbury, Allan D

    2016-01-01

    The objective of the paper is to present an empirical Bayesian method, combined with Akaike's Bayesian Information Criterion (ABIC), to estimate the contaminant release history of a groundwater source from a few concentration measurements in space and/or time. From the Bayesian point of view, the ABIC incorporates prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases concern transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The results obtained were discussed and compared to the geostatistical approach of Kitanidis (1995).

  14. Bayesian demography 250 years after Bayes.

    Science.gov (United States)

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.

  15. Bayesian inference for OPC modeling

    Science.gov (United States)

    Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.

    2016-03-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not, and outline continued experiments to vet the method.
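A stripped-down version of the inference loop: a single-chain random-walk Metropolis sampler (standing in for the multi-walker AIES) over one location parameter with a Student's t likelihood and flat prior. The data, step size, and degrees of freedom below are illustrative assumptions, not values from the paper.

```python
import random, math

def log_t_likelihood(data, mu, nu=4.0, scale=1.0):
    """Student's t log-likelihood in the location mu (constants dropped)."""
    return sum(-(nu + 1) / 2 * math.log(1 + ((y - mu) / scale) ** 2 / nu)
               for y in data)

def metropolis(data, n_steps=5000, step=0.3):
    """Random-walk Metropolis with a flat prior on mu -- a minimal
    single-walker stand-in for the affine-invariant ensemble sampler."""
    random.seed(1)                       # fixed seed for reproducibility
    mu, logp = 0.0, log_t_likelihood(data, 0.0)
    samples = []
    for _ in range(n_steps):
        prop = mu + random.gauss(0.0, step)
        lp = log_t_likelihood(data, prop)
        if math.log(random.random()) < lp - logp:   # accept/reject
            mu, logp = prop, lp
        samples.append(mu)
    return samples
```

Discarding an initial burn-in and summarizing the remaining draws (e.g. by a highest-density interval) mirrors, in miniature, how the paper extracts champion parameter values from the converged walkers.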

  16. Bayesian analysis of cosmic structures

    CERN Document Server

    Kitaura, Francisco-Shu

    2011-01-01

    We revise the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular, we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative, whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...

  17. Size measurement of radioactive aerosol particles in intense radiation fields using wire screens and imaging plates

    Energy Technology Data Exchange (ETDEWEB)

    Oki, Yuichi; Tanaka, Toru; Takamiya, Koichi; Ishi, Yoshihiro; Uesugi, Tomonori; Kuriyama, Yasutoshi; Sakamoto, Masaaki; Ohtsuki, Tsutomu [Kyoto University Research Reactor Institute, Osaka (Japan); Nitta, Shinnosuke [Graduate School of Engineering, Kyoto University, Kyoto (Japan); Osada, Naoyuki [Advanced Science Research Center, Okayama University, Okayama (Japan)

    2016-09-15

    Very fine radiation-induced aerosol particles are produced in intense radiation fields, such as high-intensity accelerator rooms and the containment vessels of the Fukushima Daiichi nuclear power plant (FDNPP). Size measurement of these aerosol particles is very important for understanding the behavior of radioactive aerosols released in the FDNPP accident and for radiation safety in high-energy accelerators. A combined technique using wire screens and imaging plates was developed for size measurement of fine radioactive aerosol particles smaller than 100 nm in diameter. This technique was applied to the radiation field of a proton accelerator room, in which radioactive atoms produced in air during machine operation are incorporated into radiation-induced aerosol particles. The size of 11C-bearing aerosol particles was analyzed using the wire screen technique, with 11C distinguished from other positron emitters by means of a radioactive decay analysis. The geometric mean diameter of the 11C-bearing aerosol particles was found to be ca. 70 nm. The size was similar to that of 7Be-bearing particles obtained by a Ge detector measurement, and was slightly larger than the number-based size distribution measured with a scanning mobility particle sizer. The particle size measuring method using wire screens and imaging plates was successfully applied to the fine aerosol particles produced in an intense radiation field of a proton accelerator. This technique is applicable to size measurement of radioactive aerosol particles produced in the intense radiation fields of radiation facilities.

  18. ESBWR passive aerosol removal - maintaining plant simplicity for offsite dose reduction; Elimination passive des aerosols dans l'ESBWR - assurer la simplicite pour reduire les doses hors site

    Energy Technology Data Exchange (ETDEWEB)

    Guntay, S.; Suckow, D. [Paul Scherrer Inst. (PSI), Villigen (Switzerland); Siccama, N.B. [Netherlands Energy Research Foundation (ECN), Petten (Netherlands); Khorana, S.S. [General Electric Nuclear Energy (GENE) (United States)

    1996-12-31

    The European Simplified Boiling Water Reactor (ESBWR), being designed by a team consisting of several European partners and General Electric (GE), incorporates the latest research in its approach towards the removal of aerosols that are generated subsequent to a severe accident. The efforts of this international team have resulted in several new developments. One such innovative development in the ESBWR is the dual-function PCC (Passive Containment Cooling) heat exchanger: in addition to removing decay heat, the PCC heat exchangers also remove in-containment aerosols. However, the decay heat removal capability of these PCC heat exchangers in an aerosol environment needs to be determined. This paper provides a general description of fission product control in the ESBWR. In addition, results of one test that has been performed at the Aerosol Impaction and Deposition Analysis (AIDA) - PCCS facility of the Paul Scherrer Institute (PSI) are also presented. The objective of this test was to determine the PCC heat exchanger performance when high concentrations of aerosols are released in the containment. The test was performed under humidity and aerosol concentrations that were on the upper end of the range expected in the ESBWR. Results indicate a high degree of aerosol retention in the PCCS heat exchangers. Moreover, the effectiveness of the PCC heat exchanger units in continuing to remove decay heat in a high-concentration aerosol environment was demonstrated. (authors). 7 refs.

  19. An introduction to Gaussian Bayesian networks.

    Science.gov (United States)

    Grzegorczyk, Marco

    2010-01-01

    The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway to evaluate the global network reconstruction accuracy of Bayesian network inference and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana to reverse engineer the unknown regulatory network topology for this domain.

  20. Modification of combustion aerosols in the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Weingartner, E. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-07-01

    Combustion aerosol particles are released on a large scale into the atmosphere in the industrialized regions as well as in the tropics (by wood fires). The particles are subjected to various aging processes which depend on the size, morphology, and chemical composition of the particles. The interaction of combustion particles with sunlight and humidity, as well as adsorption and desorption of volatile material to or from the particles, considerably changes their physical and chemical properties and thus their residence time in the atmosphere. This is of importance because combustion particles are known to have a variety of health effects on people. Moreover, atmospheric aerosol particles have an influence on climate, directly through the reflection and absorption of solar radiation and indirectly through modifying the optical properties and lifetime of clouds. In a first step, a field experiment was carried out to study the sources and characteristics of combustion aerosols that are emitted from vehicles in a road tunnel. It was found that most of the fine particles were tail pipe emissions of diesel powered vehicles. The calculation shows that on average these vehicles emit about 300 mg fine particulate matter per driven kilometer. This emission factor is at least 100 times higher than the mean emission factor estimated for gasoline powered vehicles. Furthermore, it is found that during their residence time in the tunnel, the particles undergo significant changes: the particles change towards a more compact structure. The conclusion is reached that this is mainly due to adsorption of volatile material from the gas phase to the particle surface. In the atmosphere, the life cycle as well as the radiative and chemical properties of an aerosol particle are strongly dependent on its response to humidity. Therefore the hygroscopic behavior of combustion particles emitted from single sources (i.e. from a gasoline and a diesel engine) was studied in laboratory experiments.

  1. Evaporation of droplets in a Champagne wine aerosol

    Science.gov (United States)

    Ghabache, Elisabeth; Liger-Belair, Gérard; Antkowiak, Arnaud; Séon, Thomas

    2016-04-01

    In a single glass of champagne about a million bubbles nucleate on the wall and rise towards the surface. When these bubbles reach the surface and rupture, they project a multitude of tiny droplets in the form of a particular aerosol holding a concentrate of wine aromas. Based on the model experiment of a single bubble bursting in idealized champagnes, the key features of the champagne aerosol are identified. In particular, we show that film drops, critical in sea spray for example, are here nonexistent. We then demonstrate that compared to a still wine, champagne fizz drastically enhances the transfer of liquid into the atmosphere. Then, conditions on bubble radius and wine viscosity that optimize aerosol evaporation are provided. These results pave the way towards the fine tuning of flavor release during sparkling wine tasting, a major issue for the sparkling wine industry.

  2. Bayesian Calibration of Microsimulation Models.

    Science.gov (United States)

    Rutter, Carolyn M; Miglioretti, Diana L; Savarino, James E

    2009-12-01

    Microsimulation models that describe disease processes synthesize information from multiple sources and can be used to estimate the effects of screening and treatment on cancer incidence and mortality at a population level. These models are characterized by simulation of individual event histories for an idealized population of interest. Microsimulation models are complex and invariably include parameters that are not well informed by existing data. Therefore, a key component of model development is the choice of parameter values. Microsimulation model parameter values are selected to reproduce expected or known results through the process of model calibration. Calibration may be done by perturbing model parameters one at a time or by using a search algorithm. As an alternative, we propose a Bayesian method to calibrate microsimulation models that uses Markov chain Monte Carlo. We show that this approach converges to the target distribution and use a simulation study to demonstrate its finite-sample performance. Although computationally intensive, this approach has several advantages over previously proposed methods, including the use of statistical criteria to select parameter values, simultaneous calibration of multiple parameters to multiple data sources, incorporation of information via prior distributions, description of parameter identifiability, and the ability to obtain interval estimates of model parameters. We develop a microsimulation model for colorectal cancer and use our proposed method to calibrate model parameters. The microsimulation model provides a good fit to the calibration data. We find evidence that some parameters are identified primarily through prior distributions. Our results underscore the need to incorporate multiple sources of variability (i.e., due to calibration data, unknown parameters, and estimated parameters and predicted values) when calibrating and applying microsimulation models.
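
    The MCMC calibration idea can be reduced to a minimal sketch: a toy one-parameter "microsimulation" (an annual disease risk p), a binomial calibration target, an assumed Beta(2, 20) prior, and a random-walk Metropolis sampler. All numbers are illustrative assumptions, not the colorectal cancer model of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def log_post(p, cases=37, n=500):
    # Binomial likelihood for the calibration target plus a Beta(2, 20) prior.
    if not 0.0 < p < 1.0:
        return -np.inf
    return (cases * np.log(p) + (n - cases) * np.log1p(-p)
            + 1.0 * np.log(p) + 19.0 * np.log1p(-p))

p, lp = 0.5, log_post(0.5)
samples = []
for _ in range(5000):
    prop = p + 0.05 * rng.standard_normal()
    lq = log_post(prop)
    if np.log(rng.random()) < lq - lp:      # Metropolis accept/reject
        p, lp = prop, lq
    samples.append(p)

posterior = np.array(samples[1000:])        # drop burn-in
p_mean = posterior.mean()
p_lo, p_hi = np.quantile(posterior, [0.025, 0.975])
```

    The retained draws give not just a calibrated point value but an interval estimate, which is the advantage the abstract highlights over one-at-a-time parameter perturbation.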

  3. Aerosol Sample Inhomogeneity with Debris from the Fukushima Daiichi Nuclear Accident

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, Reynaldo; Biegalski, Steven R.; Woods, Vincent T.

    2014-09-01

    Radionuclide aerosol sampling is a vital component in the detection of nuclear explosions, nuclear accidents, and other radiation releases. This was proven by the detection and tracking of emissions from the Fukushima Daiichi incident across the globe by IMS stations. Two separate aerosol samplers were operated in Richland, WA following the event and debris from the accident were measured at levels well above detection limits. While the atmospheric activity concentration of radionuclides generally compared well between the two stations, they did not agree within uncertainties. This paper includes a detailed study of the aerosol sample homogeneity of 134Cs and 137Cs, then relates it to the overall uncertainty of the original measurement. Our results show that sample inhomogeneity adds an additional 5–10% uncertainty to each aerosol measurement and that this uncertainty is in the same range as the discrepancies between the two aerosol sample measurements from Richland, WA.

  4. Dimensionality reduction in Bayesian estimation algorithms

    Directory of Open Access Journals (Sweden)

    G. W. Petty

    2013-03-01

    Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M of pseudochannels while also regularizing the background (geophysical plus instrument noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
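
    The two-stage reduction described above can be sketched with synthetic data: first whiten the background (noise) covariance, then take the leading principal components of the whitened dependent dataset as pseudochannels. The 3-channel toy data, channel weights, and noise covariance below are illustrative stand-ins for the paper's passive microwave observations.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N, M = 2000, 3, 1

signal = rng.gamma(2.0, 1.0, size=n)                 # "precipitation" signal
weights = np.array([1.0, 0.6, -0.4])                 # assumed channel responses
noise_cov = np.array([[1.0, 0.7, 0.3],
                      [0.7, 1.0, 0.5],
                      [0.3, 0.5, 1.0]])              # correlated background
noise = rng.multivariate_normal(np.zeros(N), noise_cov, size=n)
obs = np.outer(signal, weights) + noise              # n x N observations

# Stage 1: whitening transform built from the background covariance.
evals, evecs = np.linalg.eigh(noise_cov)
whiten = evecs @ np.diag(evals**-0.5) @ evecs.T
white = (obs - obs.mean(axis=0)) @ whiten.T

# Stage 2: PCA of the whitened data; keep the top M pseudochannels.
_, _, vt = np.linalg.svd(white, full_matrices=False)
pseudo = white @ vt[:M].T                            # n x M pseudochannels

# The single pseudochannel tracks the signal better than any raw channel.
corr_pseudo = abs(np.corrcoef(pseudo[:, 0], signal)[0, 1])
corr_raw = max(abs(np.corrcoef(obs[:, i], signal)[0, 1]) for i in range(N))
```

    In the whitened coordinates the background is diagonal with unit magnitude, which is exactly the property the abstract exploits to make the diagonal-covariance assumption appropriate.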

  5. Bayesian modeling of flexible cognitive control.

    Science.gov (United States)

    Jiang, Jiefeng; Heller, Katherine; Egner, Tobias

    2014-10-01

    "Cognitive control" describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation.

  6. Multi-Fraction Bayesian Sediment Transport Model

    Directory of Open Access Journals (Sweden)

    Mark L. Schmelter

    2015-09-01

    Full Text Available A Bayesian approach to sediment transport modeling can provide a strong basis for evaluating and propagating model uncertainty, which can be useful in transport applications. Previous work in developing and applying Bayesian sediment transport models used a single grain size fraction or characterized the transport of mixed-size sediment with a single characteristic grain size. Although this approach is common in sediment transport modeling, it precludes the possibility of capturing processes that cause mixed-size sediments to sort and, thereby, alter the grain size available for transport and the transport rates themselves. This paper extends development of a Bayesian transport model from one to k fractional dimensions. The model uses an existing transport function as its deterministic core and is applied to the dataset used to originally develop the function. The Bayesian multi-fraction model is able to infer the posterior distributions for essential model parameters and replicates predictive distributions of both bulk and fractional transport. Further, the inferred posterior distributions are used to evaluate parametric and other sources of variability in relations representing mixed-size interactions in the original model. Successful development of the model demonstrates that Bayesian methods can be used to provide a robust and rigorous basis for quantifying uncertainty in mixed-size sediment transport. Such a method has heretofore been unavailable and allows for the propagation of uncertainty in sediment transport applications.

  7. Tactile length contraction as Bayesian inference.

    Science.gov (United States)

    Tong, Jonathan; Ngo, Vy; Goldreich, Daniel

    2016-08-01

    To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process.
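
    A minimal Gaussian reduction of the low-velocity-prior observer makes the two predictions concrete (this is an illustrative simplification, not the authors' full model): the measured separation m between two taps is fused with a zero-mean prior on tap-to-tap speed, so the posterior length estimate shrinks as the inter-tap time t drops or as localization noise sigma_m grows (weaker taps).

```python
def perceived_length(m, t, sigma_m, sigma_v):
    """Posterior mean of length given m ~ N(L, sigma_m^2) and the
    low-velocity prior L = v*t with v ~ N(0, sigma_v^2)."""
    prior_var = (sigma_v * t) ** 2
    return m * prior_var / (prior_var + sigma_m ** 2)

L = 10.0                                              # true separation, cm
fast = perceived_length(L, t=0.1, sigma_m=1.0, sigma_v=20.0)
slow = perceived_length(L, t=1.0, sigma_m=1.0, sigma_v=20.0)
weak = perceived_length(L, t=0.1, sigma_m=2.0, sigma_v=20.0)  # weaker taps
```

    Shorter temporal separation (fast < slow) and noisier localization (weak < fast) both contract the percept, matching the psychophysical findings above.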

  8. Bayesian modeling of flexible cognitive control

    Science.gov (United States)

    Jiang, Jiefeng; Heller, Katherine; Egner, Tobias

    2014-01-01

    “Cognitive control” describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation. PMID:24929218

  9. Computationally efficient Bayesian inference for inverse problems.

    Energy Technology Data Exchange (ETDEWEB)

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
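
    The surrogate idea can be sketched end to end: tabulate an (assumed expensive) forward model G(theta) once, replace it with a cheap interpolated surrogate, and evaluate the Bayesian posterior for a contaminant-source location on a parameter grid. The Gaussian-kernel forward model, sensor layout, and noise level are illustrative assumptions, not the paper's spectral formulation.

```python
import numpy as np

rng = np.random.default_rng(3)
sensors = np.linspace(0.0, 1.0, 5)

def forward(theta):
    # "Expensive" forward model: readings at fixed sensors from a
    # contaminant source located at theta in [0, 1].
    return np.exp(-(sensors - theta) ** 2 / 0.1)

# Build the surrogate from 40 forward evaluations (the only costly part).
train = np.linspace(0.0, 1.0, 40)
table = np.array([forward(t) for t in train])        # 40 x 5

def surrogate(theta):
    # Cheap per-sensor linear interpolation into the precomputed table.
    return np.array([np.interp(theta, train, table[:, i])
                     for i in range(sensors.size)])

truth = 0.62
data = forward(truth) + 0.01 * rng.standard_normal(sensors.size)

# Posterior on a parameter grid under a flat prior, using only the surrogate.
grid = np.linspace(0.0, 1.0, 401)
logpost = np.array([-0.5 * np.sum((data - surrogate(t)) ** 2) / 0.01 ** 2
                    for t in grid])
post = np.exp(logpost - logpost.max())
post /= post.sum()
theta_map = grid[np.argmax(post)]
```

    After the one-time tabulation, every posterior evaluation costs only an interpolation, which is the acceleration the abstract attributes to rapid evaluation of a surrogate posterior.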

  10. Sea Spray Aerosols

    DEFF Research Database (Denmark)

    Butcher, Andrew Charles

    a relationship between plunging jet particle flux, oceanic particle flux, and energy dissipation rate in both systems. Previous sea spray aerosol studies dissipate an order of magnitude more energy for the same particle flux production as the open ocean. A scaling factor related to the energy expended in air...... emissions produced directly from bubble bursting as the result of air entrainment from breaking waves and particles generated from secondary emissions of volatile organic compounds. In the first paper, we study the chemical properties of particles produced from several sea water proxies with the use...... jet in high concentrations of surface active organics and brackish water salinities. The jet produces particles with less cloud condensation activity, implying an increase in organic material in aerosol particles produced by the plunging jet over the frit. In the second paper we determine...

  11. Aerosol Observing System (AOS) Handbook

    Energy Technology Data Exchange (ETDEWEB)

    Jefferson, A

    2011-01-17

    The Aerosol Observing System (AOS) is a suite of in situ surface measurements of aerosol optical and cloud-forming properties. The instruments measure aerosol properties that influence the earth’s radiative balance. The primary optical measurements are those of the aerosol scattering and absorption coefficients as a function of particle size and radiation wavelength and cloud condensation nuclei (CCN) measurements as a function of percent supersaturation. Additional measurements include those of the particle number concentration and scattering hygroscopic growth. Aerosol optical measurements are useful for calculating parameters used in radiative forcing calculations such as the aerosol single-scattering albedo, asymmetry parameter, mass scattering efficiency, and hygroscopic growth. CCN measurements are important in cloud microphysical models to predict droplet formation.

  12. Aerosol characterization during project POLINAT

    Energy Technology Data Exchange (ETDEWEB)

    Hagen, D.E.; Hopkins, A.R.; Paladino, J.D.; Whitefield, P.D. [Missouri Univ., Rolla, MO (United States). Cloud and Aerosol Sciences Lab.; Lilenfeld, H.V. [McDonnell Douglas Aerospace-East, St. Louis, MO (United States)

    1997-12-31

    The objectives of the aerosol/particulate characterization measurements of project POLINAT (POLlution from aircraft emissions In the North ATlantic flight corridor) are: to search for aerosol/particulate signatures of air traffic emissions in the region of the North Atlantic Flight Corridor; to search for the aerosol/particulate component of large scale enhancement ('corridor effects') of air traffic related species in the North Atlantic region; to determine the effective emission indices for the aerosol/particulate component of engine exhaust in both the near and far field of aircraft exhaust plumes; to measure the dispersion and transformation of the aerosol/particulate component of aircraft emissions as a function of ambient condition; to characterize background levels of aerosol/particulate concentrations in the North Atlantic Region; and to determine effective emission indices for engine exhaust particulates for regimes beyond the jet phase of plume expansion. (author) 10 refs.

  13. Acidic aerosol in urban air

    Energy Technology Data Exchange (ETDEWEB)

    Fukuda, M.; Yamaoka, S.; Miyazaki, T.; Oka, M.

    1982-01-01

    The distribution and chemical composition of acidic aerosol in Osaka City were investigated. Samples were collected at five sites in the city from June to September, 1979. Acidic aerosol was determined by the acid-base titration method, sulfate ion by barium chloride turbidimetry, nitrate ion by the xylenol method, and chloride ion by the mercury thiocyanate method. The concentration of acidic aerosol at five sites ranged from 7.7 micrograms per cubic meter to 10.0 micrograms per cubic meter, but mean concentrations in the residential area were slightly higher than those in the industrial area. When acidic aerosol concentrations were compared with concentrations of sulfate, nitrate, and chloride ions, a significant correlation was found between acidic aerosol and sulfate ion. The sum of the ion equivalents of the three types showed good correlation with the acidic aerosol equivalent during the whole period.

  14. Monsoon sensitivity to aerosol direct radiative forcing in the community atmosphere model

    Science.gov (United States)

    Sajani, S.; Krishna Moorthy, K.; Rajendran, K.; Nanjundiah, Ravi S.

    2012-08-01

    Aerosol forcing remains a dominant uncertainty in climate studies. The impact of aerosol direct radiative forcing on Indian monsoon is extremely complex and is strongly dependent on the model, aerosol distribution and characteristics specified in the model, modelling strategy employed as well as on spatial and temporal scales. The present study investigates (i) the aerosol direct radiative forcing impact on mean Indian summer monsoon when a combination of quasi-realistic mean annual cycles of scattering and absorbing aerosols derived from an aerosol transport model constrained with satellite observed Aerosol Optical Depth (AOD) is prescribed, (ii) the dominant feedback mechanism behind the simulated impact of all-aerosol direct radiative forcing on monsoon and (iii) the relative impacts of absorbing and scattering aerosols on mean Indian summer monsoon. We have used CAM3, an atmospheric GCM (AGCM) that has a comprehensive treatment of the aerosol-radiation interaction. This AGCM has been used to perform climate simulations with three different representations of aerosol direct radiative forcing due to the total, scattering aerosols and black carbon aerosols. We have also conducted experiments without any aerosol forcing. Aerosol direct impact due to scattering aerosols causes significant reduction in summer monsoon precipitation over India with a tendency for southward shift of Tropical Convergence Zones (TCZs) over the Indian region. Aerosol forcing reduces surface solar absorption over the primary rainbelt region of India and reduces the surface and lower tropospheric temperatures. Concurrent warming of the lower atmosphere over the warm oceanic region in the south reduces the land-ocean temperature contrast and weakens the monsoon overturning circulation and the advection of moisture into the landmass. This increases atmospheric convective stability, and decreases convection, clouds, precipitation and associated latent heat release. Our analysis reveals a

  15. Resolution and Content Improvements to MISR Aerosol and Land Surface Products

    Science.gov (United States)

    Garay, M. J.; Bull, M. A.; Diner, D. J.; Hansen, E. G.; Kalashnikova, O. V.

    2015-12-01

    Since early 2000, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite has been providing operational Level 2 (swath-based) aerosol optical depth (AOD) and particle property retrievals at 17.6 km spatial resolution and atmospherically corrected land surface products at 1.1 km resolution. The performance of the aerosol product has been validated against ground-based Aerosol Robotic Network (AERONET) observations, model comparisons, and climatological assessments. This product has played a major role in studies of the impacts of aerosols on climate and air quality. The surface product has found a variety of uses, particularly at regional scales for assessing vegetation and land surface change. A major development effort has led to the release of an update to the operational (Version 22) MISR Level 2 aerosol and land surface retrieval products, which has been in production since December 2007. The new release is designated Version 23. The resolution of the aerosol product has been increased to 4.4 km, allowing more detailed characterization of aerosol spatial variability, especially near local sources and in urban areas. The product content has been simplified and updated to include more robust measures of retrieval uncertainty and other fields to benefit users. The land surface product has also been updated to incorporate the Version 23 aerosol product as input and to improve spatial coverage, particularly over mountainous terrain and snow/ice-covered surfaces. We will describe the major upgrades incorporated in Version 23 and present validation of the aerosol product against both the standard AERONET historical database, as well as high spatial density AERONET-DRAGON deployments. Comparisons will also be shown relative to the Version 22 aerosol and land surface products. Applications enabled by these product updates will be discussed.

  16. Bayesian Inference in Polling Technique: 1992 Presidential Polls.

    Science.gov (United States)

    Satake, Eiki

    1994-01-01

    Explores the potential utility of Bayesian statistical methods in determining the predictability of multiple polls. Compares Bayesian techniques to the classical statistical method employed by pollsters. Considers these questions in the context of the 1992 presidential elections. (HB)

  17. Bayesian modeling of unknown diseases for biosurveillance.

    Science.gov (United States)

    Shen, Yanna; Cooper, Gregory F

    2009-11-14

    This paper investigates Bayesian modeling of unknown causes of events in the context of disease-outbreak detection. We introduce a Bayesian approach that models and detects both (1) known diseases (e.g., influenza and anthrax) by using informative prior probabilities and (2) unknown diseases (e.g., a new, highly contagious respiratory virus that has never been seen before) by using relatively non-informative prior probabilities. We report the results of simulation experiments which support that this modeling method can improve the detection of new disease outbreaks in a population. A key contribution of this paper is that it introduces a Bayesian approach for jointly modeling both known and unknown causes of events. Such modeling has broad applicability in medical informatics, where the space of known causes of outcomes of interest is seldom complete.
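
The core idea of the record above, informative priors for known causes versus a deliberately vague catch-all for unknown ones, can be sketched with a toy discrete Bayes update. This is not the authors' biosurveillance model; all probabilities are invented:

```python
# Toy sketch: known diseases get informative priors and likelihoods, while a
# catch-all "unknown" cause gets a vague (near-uniform) likelihood over
# findings. An unusual finding pattern then shifts mass toward "unknown".

priors = {"influenza": 0.60, "anthrax": 0.05, "unknown": 0.35}

# P(observed finding pattern | cause); "unknown" is deliberately vague.
likelihoods = {"influenza": 0.02, "anthrax": 0.001, "unknown": 0.10}

joint = {c: priors[c] * likelihoods[c] for c in priors}
evidence = sum(joint.values())
posterior = {c: joint[c] / evidence for c in joint}

best = max(posterior, key=posterior.get)
print(best, round(posterior[best], 3))
```

Because the observed pattern is a poor fit to every known disease, the noninformative "unknown" hypothesis wins despite its modest prior, which is the detection behavior the abstract describes.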

  18. Learning Bayesian Networks from Correlated Data

    Science.gov (United States)

    Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola

    2016-05-01

    Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.

  19. Bayesian Methods for Radiation Detection and Dosimetry

    CERN Document Server

    Groer, Peter G

    2002-01-01

    We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose-estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...

  20. Event generator tuning using Bayesian optimization

    CERN Document Server

    Ilten, Philip; Yang, Yunjie

    2016-01-01

    Monte Carlo event generators contain a large number of parameters that must be determined by comparing the output of the generator with experimental data. Generating enough events with a fixed set of parameter values to enable making such a comparison is extremely CPU intensive, which prohibits performing a simple brute-force grid-based tuning of the parameters. Bayesian optimization is a powerful method designed for such black-box tuning applications. In this article, we show that Monte Carlo event generator parameters can be accurately obtained using Bayesian optimization and minimal expert-level physics knowledge. A tune of the PYTHIA 8 event generator using $e^+e^-$ events, where 20 parameters are optimized, can be run on a modern laptop in just two days. Combining the Bayesian optimization approach with expert knowledge should enable producing better tunes in the future, by making it faster and easier to study discrepancies between Monte Carlo and experimental data.

  2. Bayesian Inference Methods for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand

    2013-01-01

    This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development...... of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation...... analysis of the complex prior representation, where we show that the ability to induce sparse estimates of a given prior heavily depends on the inference method used and, interestingly, whether real or complex variables are inferred. We also show that the Bayesian estimators derived from the proposed...

  3. Bayesian Fusion of Multi-Band Images

    CERN Document Server

    Wei, Qi; Tourneret, Jean-Yves

    2013-01-01

    In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical considerations is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimensional distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...

  4. Distributed Bayesian Networks for User Modeling

    DEFF Research Database (Denmark)

    Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang

    2006-01-01

    …by such adaptive applications are often partial fragments of an overall user model. The fragments then have to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism efficiently combines distributed learner models without the need to exchange the internal structure of local Bayesian networks, nor local evidence between the involved platforms.

  5. Bayesian Image Reconstruction Based on Voronoi Diagrams

    CERN Document Server

    Cabrera, G F; Hitschfeld, N

    2007-01-01

    We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.

  6. Variational Bayesian Inference of Line Spectra

    DEFF Research Database (Denmark)

    Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri

    2016-01-01

    In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coefficients are governed by a Bernoulli-Gaussian prior model turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs...

  7. Hessian PDF reweighting meets the Bayesian methods

    CERN Document Server

    Paukkunen, Hannu

    2014-01-01

    We discuss the Hessian PDF reweighting – a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering new data in a usual $\chi^2$-fit and it naturally incorporates also non-zero values for the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte-Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta\chi^2$ criterion is properly included in the Bayesian likelihood function, which is a simple exponential.
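
The Bayesian reweighting that the abstract contrasts with can be sketched in a few lines: each PDF replica k receives a weight proportional to a simple exponential of its fit quality. As one way the tolerance can enter, the exponent is rescaled by T (T = 1 recovers the plain exponential); the χ² values below are invented:

```python
import math

# Minimal sketch of Bayesian reweighting of PDF Monte-Carlo replicas:
# weight_k ∝ exp(-chi2_k / (2*T)), with tolerance T. Values are invented;
# this is an illustration, not the exact likelihood used in the paper.

def reweight(chi2_values, tolerance=1.0):
    logw = [-c / (2.0 * tolerance) for c in chi2_values]
    m = max(logw)                      # stabilize the exponentials
    w = [math.exp(l - m) for l in logw]
    total = sum(w)
    return [x / total for x in w]

chi2 = [10.0, 12.0, 30.0]
weights = reweight(chi2, tolerance=1.0)
print([round(w, 3) for w in weights])
```

Replicas that describe the new data poorly (large χ²) are exponentially suppressed; raising the tolerance flattens the weights, mimicking a looser Δχ² criterion.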

  8. Bayesian analysis of MEG visual evoked responses

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, D.M.; George, J.S.; Wood, C.C.

    1999-04-01

    The authors developed a method for analyzing neural electromagnetic data that allows probabilistic inferences to be drawn about regions of activation. The method involves the generation of a large number of possible solutions which both fit the data and prior expectations about the nature of probable solutions, made explicit by a Bayesian formalism. In addition, they have introduced a model for the current distributions that produce MEG and EEG data that allows extended regions of activity, and can easily incorporate prior information such as anatomical constraints from MRI. To evaluate the feasibility and utility of the Bayesian approach with actual data, they analyzed MEG data from a visual evoked response experiment. They compared Bayesian analyses of MEG responses to visual stimuli in the left and right visual fields, in order to examine the sensitivity of the method to detect known features of human visual cortex organization. They also examined the changing pattern of cortical activation as a function of time.

  9. Bayesian Analysis of Perceived Eye Level

    Science.gov (United States)

    Orendorff, Elaine E.; Kalesinskas, Laurynas; Palumbo, Robert T.; Albert, Mark V.

    2016-01-01

    To accurately perceive the world, people must efficiently combine internal beliefs and external sensory cues. We introduce a Bayesian framework that explains the role of internal balance cues and visual stimuli on perceived eye level (PEL)—a self-reported measure of elevation angle. This framework provides a single, coherent model explaining a set of experimentally observed PEL over a range of experimental conditions. Further, it provides a parsimonious explanation for the additive effect of low fidelity cues as well as the averaging effect of high fidelity cues, as also found in other Bayesian cue combination psychophysical studies. Our model accurately estimates the PEL and explains the form of previous equations used in describing PEL behavior. Most importantly, the proposed Bayesian framework for PEL is more powerful than previous behavioral modeling; it permits behavioral estimation in a wider range of cue combination and perceptual studies than models previously reported. PMID:28018204

  10. Dynamic Bayesian Combination of Multiple Imperfect Classifiers

    CERN Document Server

    Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon

    2012-01-01

    Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...
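
The Bayesian combination idea can be sketched with a naive independence model: each classifier contributes the likelihood of its vote given the true class, read from a per-classifier confusion matrix. This is a simplification of the variational IBCC approach the paper develops, and all numbers below are invented:

```python
# Toy sketch of Bayesian combination of imperfect classifiers under a
# conditional-independence assumption. A reliable and an unreliable
# classifier both vote "yes"; the posterior weighs their votes by their
# (invented) confusion matrices rather than counting them equally.

classes = ["supernova", "not_supernova"]
prior = {"supernova": 0.1, "not_supernova": 0.9}

# confusion[c][true_class][vote] = P(vote | true class) for classifier c
confusion = [
    {"supernova": {"yes": 0.9, "no": 0.1},
     "not_supernova": {"yes": 0.2, "no": 0.8}},   # fairly reliable
    {"supernova": {"yes": 0.6, "no": 0.4},
     "not_supernova": {"yes": 0.5, "no": 0.5}},   # close to guessing
]

votes = ["yes", "yes"]
posterior = {}
for t in classes:
    p = prior[t]
    for c, v in zip(confusion, votes):
        p *= c[t][v]
    posterior[t] = p
z = sum(posterior.values())
posterior = {t: p / z for t, p in posterior.items()}
print(max(posterior, key=posterior.get), round(posterior["supernova"], 3))
```

Even with two "yes" votes, the low prior and the second classifier's unreliability keep the posterior in favor of "not_supernova", illustrating why inferring per-annotator reliability matters in citizen-science data.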

  11. Topics in current aerosol research

    CERN Document Server

    Hidy, G M

    1971-01-01

    Topics in Current Aerosol Research deals with the fundamental aspects of aerosol science, with emphasis on experiment and theory describing highly dispersed aerosols (HDAs) as well as the dynamics of charged suspensions. Topics covered range from the basic properties of HDAs to their formation and methods of generation; sources of electric charges; interactions between fluid and aerosol particles; and one-dimensional motion of charged cloud of particles. This volume is comprised of 13 chapters and begins with an introduction to the basic properties of HDAs, followed by a discussion on the form

  12. Renin release

    DEFF Research Database (Denmark)

    Schweda, Frank; Friis, Ulla; Wagner, Charlotte;

    2007-01-01

    The aspartyl-protease renin is the key regulator of the renin-angiotensin-aldosterone system, which is critically involved in salt, volume, and blood pressure homeostasis of the body. Renin is mainly produced and released into circulation by the so-called juxtaglomerular epithelioid cells, located......, salt, and volume overload. In contrast, the events controlling the function of renin-secreting cells at the organ and cellular level are markedly less clear and remain mysterious in certain aspects. The unravelling of these mysteries has led to new and interesting insights into the process of renin...

  13. Structure learning for Bayesian networks as models of biological networks.

    Science.gov (United States)

    Larjo, Antti; Shmulevich, Ilya; Lähdesmäki, Harri

    2013-01-01

    Bayesian networks are probabilistic graphical models suitable for modeling several kinds of biological systems. In many cases, the structure of a Bayesian network represents causal molecular mechanisms or statistical associations of the underlying system. Bayesian networks have been applied, for example, for inferring the structure of many biological networks from experimental data. We present some recent progress in learning the structure of static and dynamic Bayesian networks from data.

  14. A Bayesian Analysis of Spectral ARMA Model

    Directory of Open Access Journals (Sweden)

    Manoel I. Silvestre Bezerra

    2012-01-01

    Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model by using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations are carried out via Markov chain Monte Carlo (MCMC) simulation, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.

  15. Length Scales in Bayesian Automatic Adaptive Quadrature

    Directory of Open Access Journals (Sweden)

    Adam Gh.

    2016-01-01

    Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that the numerical quadrature which avoids overcomputing and minimizes the hidden floating point loss of precision asks for the consideration of three classes of integration domain lengths endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
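
The three length classes described above can be illustrated with a toy integrator that picks a quadrature sum by subrange length. The thresholds are invented and the macroscopic branch simply recurses; the actual Bayesian automatic adaptive quadrature is far more elaborate:

```python
# Toy illustration of matching the quadrature sum to the length class of
# the integration domain: trapezoidal rule on "microscopic" subranges,
# Simpson rule on "mesoscopic" ones, and subdivision (standing in for
# high-degree rules) on "macroscopic" ones. Thresholds are invented.

def integrate(f, a, b, micro=1e-3, meso=1e-1):
    h = b - a
    if h <= micro:                       # microscopic: trapezoidal rule
        return 0.5 * h * (f(a) + f(b))
    if h <= meso:                        # mesoscopic: Simpson rule
        return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
    mid = 0.5 * (a + b)                  # macroscopic: split and recurse
    return (integrate(f, a, mid, micro, meso)
            + integrate(f, mid, b, micro, meso))

approx = integrate(lambda x: x * x, 0.0, 1.0)
print(round(approx, 6))
```

For the smooth test integrand x² on [0, 1] the recursion bottoms out in Simpson subranges, which are exact for polynomials up to degree three, so the result matches 1/3 to machine precision.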

  16. Bayesian estimation and tracking a practical guide

    CERN Document Server

    Haug, Anton J

    2012-01-01

    A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noise. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation

  17. Bayesian long branch attraction bias and corrections.

    Science.gov (United States)

    Susko, Edward

    2015-03-01

    Previous work on the star-tree paradox has shown that Bayesian methods suffer from a long branch attraction bias. That work is extended to settings involving more taxa and partially resolved trees. The long branch attraction bias is confirmed to arise more broadly and an additional source of bias is found. A by-product of the analysis is methods that correct for biases toward particular topologies. The corrections can be easily calculated using existing Bayesian software. Posterior support for a set of two or more trees can thus be supplemented with corrected versions to cross-check or replace results. Simulations show the corrections to be highly effective.

  18. From retrodiction to Bayesian quantum imaging

    Science.gov (United States)

    Speirits, Fiona C.; Sonnleitner, Matthias; Barnett, Stephen M.

    2017-04-01

    We employ quantum retrodiction to develop a robust Bayesian algorithm for reconstructing the intensity values of an image from sparse photocount data, while also accounting for detector noise in the form of dark counts. This method yields not only a reconstructed image but also provides the full probability distribution function for the intensity at each pixel. We use simulated as well as real data to illustrate both the applications of the algorithm and the analysis options that are only available when the full probability distribution functions are known. These include calculating Bayesian credible regions for each pixel intensity, allowing an objective assessment of the reliability of the reconstructed image intensity values.
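
The idea of returning a full posterior per pixel can be illustrated with a much-simplified conjugate model: Poisson photocounts with a Gamma prior give a Gamma posterior, from which both a point estimate and a credible interval follow. This sketch omits dark counts and retrodiction, which are central to the paper, and the prior parameters are invented:

```python
import math

# Simplified per-pixel Bayesian intensity estimation from photocounts:
# Poisson counts with a conjugate Gamma(alpha, rate=beta) prior give a
# Gamma(alpha + k, beta + n) posterior after n exposures totaling k counts.

def gamma_posterior(counts, alpha=1.0, beta=0.1):
    k, n = sum(counts), len(counts)
    return alpha + k, beta + n

def credible_interval(a, b, lo=0.025, hi=0.975, grid=20000):
    """Grid-based quantiles of a Gamma(a, rate=b) density (no SciPy needed)."""
    upper = 5.0 * a / b                  # grid extends well past the mean a/b
    xs = [upper * (i + 0.5) / grid for i in range(grid)]
    logpdf = [a * math.log(b) - math.lgamma(a)
              + (a - 1.0) * math.log(x) - b * x for x in xs]
    pdf = [math.exp(l) for l in logpdf]
    total = sum(pdf)
    cum, bounds = 0.0, []
    for x, p in zip(xs, pdf):
        cum += p / total
        if not bounds and cum >= lo:
            bounds.append(x)             # lower quantile
        if len(bounds) == 1 and cum >= hi:
            bounds.append(x)             # upper quantile
            break
    return tuple(bounds)

a, b = gamma_posterior([3, 5, 4, 6, 2])  # 5 exposures, 20 counts in total
mean = a / b
low, high = credible_interval(a, b)
print(round(mean, 3), low < mean < high)
```

Because the whole posterior density is available, any summary (mean, mode, credible region) can be read off per pixel, which is the analysis option the abstract highlights.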

  19. A Bayesian Concept Learning Approach to Crowdsourcing

    DEFF Research Database (Denmark)

    Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;

    2011-01-01

    We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing that our Bayesian strategies are effective even in large concept spaces with many uninformative experts.

  20. Bayesian Optimisation Algorithm for Nurse Scheduling

    CERN Document Server

    Li, Jingpeng

    2008-01-01

    Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples these networks to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.

  1. Source strength of fungal spore aerosolization from moldy building material

    Energy Technology Data Exchange (ETDEWEB)

    Gorny, Rafa L.; Reponen, Tiina; Grinshpun, Sergey A.; Willeke, Klaus [Cincinnati Univ., Dept. of Environmental Health, Cincinnati, OH (United States)

    2001-07-01

    The release of Aspergillus versicolor, Cladosporium cladosporioides, and Penicillium melinii spores from agar and ceiling tile surfaces was tested under different controlled environmental conditions using a newly designed and constructed aerosolization chamber. This study revealed that all the investigated parameters, such as fungal species, air velocity above the surface, texture of the surface, and vibration of contaminated material, affected the fungal spore release. It was found that typical indoor air currents can release up to 200 spores cm⁻² from a surface covered with fungal spores during 30-min experiments. The release of fungal spores from smooth agar surfaces was found to be inadequate for accurately predicting the emission from rough ceiling tile surfaces because air turbulence increases the spore release from a rough surface. A vibration at a frequency of 1 Hz and a power level of 14 W resulted in a significant increase in the spore release rate. The release appears to depend on the morphology of the fungal colonies grown on ceiling tile surfaces, including the thickness of conidiophores, the length of spore chains, and the shape of spores. The spores were found to be released continuously during each 30-min experiment. However, the release rate was usually highest during the first few minutes of exposure to air currents and mechanical vibration. About 71-88% of the spores released during a 30-min interval became airborne during the first 10 min. (Author)

  2. Bayesian Just-So Stories in Psychology and Neuroscience

    Science.gov (United States)

    Bowers, Jeffrey S.; Davis, Colin J.

    2012-01-01

    According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…

  3. A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation

    DEFF Research Database (Denmark)

    Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri

    2013-01-01

    representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...

  4. Prior approval: the growth of Bayesian methods in psychology.

    Science.gov (United States)

    Andrews, Mark; Baguley, Thom

    2013-02-01

    Within the last few years, Bayesian methods of data analysis in psychology have proliferated. In this paper, we briefly review the history of the Bayesian approach to statistics, and consider the implications that Bayesian methods have for the theory and practice of data analysis in psychology.

  5. A default Bayesian hypothesis test for ANOVA designs

    NARCIS (Netherlands)

    Wetzels, R.; Grasman, R.P.P.P.; Wagenmakers, E.J.

    2012-01-01

    This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA designs…

  6. From arguments to constraints on a Bayesian network

    NARCIS (Netherlands)

    Bex, F.J.; Renooij, S.

    2016-01-01

    In this paper, we propose a way to derive constraints for a Bayesian Network from structured arguments. Argumentation and Bayesian networks can both be considered decision support techniques, but are typically used by experts with different backgrounds. Bayesian network experts have the mathematical

  7. A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research

    NARCIS (Netherlands)

    Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t

  8. A SAS Interface for Bayesian Analysis with WinBUGS

    Science.gov (United States)

    Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki

    2008-01-01

    Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…

  9. Aerosol absorption and radiative forcing

    Directory of Open Access Journals (Sweden)

    P. Stier

    2007-05-01

    Full Text Available We present a comprehensive examination of aerosol absorption with a focus on evaluating the sensitivity of the global distribution of aerosol absorption to key uncertainties in the process representation. For this purpose we extended the comprehensive aerosol-climate model ECHAM5-HAM by effective medium approximations for the calculation of aerosol effective refractive indices, updated black carbon refractive indices, new cloud radiative properties considering the effect of aerosol inclusions, as well as by modules for the calculation of long-wave aerosol radiative properties and instantaneous aerosol forcing. The evaluation of the simulated aerosol absorption optical depth with the AERONET sun-photometer network shows a good agreement in the large-scale global patterns. On a regional basis it becomes evident that the update of the BC refractive indices to Bond and Bergstrom (2006) significantly improves the previous underestimation of the aerosol absorption optical depth. In the global annual-mean, absorption acts to reduce the short-wave anthropogenic aerosol top-of-atmosphere (TOA) radiative forcing clear-sky from –0.79 to –0.53 W m−2 (33%) and all-sky from –0.47 to –0.13 W m−2 (72%). Our results confirm that basic assumptions about the BC refractive index play a key role for aerosol absorption and radiative forcing. The effect of the usage of more accurate effective medium approximations is comparably small. We demonstrate that the diversity in the AeroCom land-surface albedo fields contributes to the uncertainty in the simulated anthropogenic aerosol radiative forcings: the usage of an upper versus lower bound of the AeroCom land albedos introduces a global annual-mean TOA forcing range of 0.19 W m−2 (36%) clear-sky and of 0.12 W m−2 (92%) all-sky. The consideration of black carbon inclusions on cloud radiative properties results in a small global annual-mean all-sky absorption of 0.05 W
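
As a quick arithmetic check, the percentage reductions quoted in the abstract follow directly from the forcing values:

```python
# Verify the quoted percentage changes: absorption reduces the magnitude of
# the TOA forcing from -0.79 to -0.53 W m^-2 (clear-sky) and from -0.47 to
# -0.13 W m^-2 (all-sky).

def relative_reduction(before, after):
    return (abs(before) - abs(after)) / abs(before)

clear_sky = relative_reduction(-0.79, -0.53)
all_sky = relative_reduction(-0.47, -0.13)
print(round(100 * clear_sky), round(100 * all_sky))
```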

  10. Insoluble aerosol behavior inside the PCCS condenser tube under severe accident conditions

    Energy Technology Data Exchange (ETDEWEB)

    Watanabe, A.; Nemoto, K.; Akinaga, M. [Toshiba Corp., Kawasaki (Japan); Oikawa, H. [Toshiba Corp., Yokohama (Japan)

    1996-07-01

    The passive containment cooling system (PCCS), which has been incorporated into the advanced light water reactor (ALWR) design, has the capability of post-accident decay heat removal by means of natural-force-driven condensation heat transfer. Since some uncertainties remain in the PCCS performance during a severe accident, especially in the amount of aerosol deposition, which causes heat transfer degradation, an experiment had been performed previously simulating a single condenser tube and postulated steam and noncondensable gas flow rates using a prototypical soluble aerosol (CsI). The observed aerosol deposition rate onto the condenser tube surface was quite small under steam-rich conditions. However, during a severe accident, insoluble aerosols such as structural material might also be released and flow into the PCCS along with soluble aerosol, and their deposition behavior has not been clarified. Thus, an experiment using polystyrene LATEX was conducted under the same conditions in which the soluble aerosol test was performed. The experimental results showed a trend similar to that of the soluble aerosol case, and especially under steam-rich conditions, the amount of deposition was below the detection limit. The deposition rates in the other cases are consistent with predictions by an existing theoretical correlation. An analytical sensitivity study varying the inlet flow conditions indicated no significant increase of aerosol deposition. These results suggest promising performance of the PCCS under severe accident conditions.

  11. Aerosol dynamics in porous media

    NARCIS (Netherlands)

    Ghazaryan, Lilya

    2014-01-01

    In this thesis, a computational model was developed for the simulation of aerosol formation through nucleation, followed by condensation and evaporation and filtration by porous material. Understanding aerosol dynamics in porous media can help improving engineering models that are used in various in

  12. Bayesian Fundamentalism or Enlightenment? On the explanatory status and theoretical contributions of Bayesian models of cognition.

    Science.gov (United States)

    Jones, Matt; Love, Bradley C

    2011-08-01

    The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls

  13. Bayesian inference-based environmental decision support systems for oil spill response strategy selection.

    Science.gov (United States)

    Davies, Andrew J; Hope, Max J

    2015-07-15

    Contingency plans are essential in guiding the response to marine oil spills. However, they are written before the pollution event occurs so must contain some degree of assumption and prediction and hence may be unsuitable for a real incident when it occurs. The use of Bayesian networks in ecology, environmental management, oil spill contingency planning and post-incident analysis is reviewed and analysed to establish their suitability for use as real-time environmental decision support systems during an oil spill response. It is demonstrated that Bayesian networks are appropriate for facilitating the re-assessment and re-validation of contingency plans following pollutant release, thus helping ensure that the optimum response strategy is adopted. This can minimise the possibility of sub-optimal response strategies causing additional environmental and socioeconomic damage beyond the original pollution event.

  14. Bayesian Vector Autoregressions with Stochastic Volatility

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    1996-01-01

    This paper proposes a Bayesian approach to a vector autoregression with stochastic volatility, where the multiplicative evolution of the precision matrix is driven by a multivariate beta variate.Exact updating formulas are given to the nonlinear filtering of the precision matrix.Estimation of the au

  15. Bayesian Estimation Supersedes the "t" Test

    Science.gov (United States)

    Kruschke, John K.

    2013-01-01

    Bayesian estimation for 2 groups provides complete distributions of credible values for the effect size, group means and their difference, standard deviations and their difference, and the normality of the data. The method handles outliers. The decision rule can accept the null value (unlike traditional "t" tests) when certainty in the estimate is…
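
    The abstract's approach can be approximated in a few lines. The sketch below is not Kruschke's BEST model (which uses a t-distributed likelihood to handle outliers); it assumes a plain normal likelihood with the Jeffreys prior, under which the posterior of each group mean is an analytically known Student-t, so the posterior of the difference of means can be sampled directly. All data values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def posterior_mean_draws(x, n_draws=100_000):
    """Posterior draws for a group mean under a normal likelihood with
    the Jeffreys prior p(mu, sigma^2) ~ 1/sigma^2; then
    (mu - xbar) / (s / sqrt(n)) is Student-t with n-1 degrees of freedom."""
    x = np.asarray(x, dtype=float)
    n, xbar, s = len(x), x.mean(), x.std(ddof=1)
    return xbar + (s / np.sqrt(n)) * rng.standard_t(n - 1, size=n_draws)

# Hypothetical data for two groups.
group1 = rng.normal(101.0, 15.0, size=40)
group2 = rng.normal(100.0, 15.0, size=40)

# Posterior of the difference of means: a full credible distribution
# rather than a single accept/reject decision.
diff = posterior_mean_draws(group1) - posterior_mean_draws(group2)
lo, hi = np.percentile(diff, [2.5, 97.5])  # 95% credible interval
p_gt0 = (diff > 0).mean()                  # posterior P(mu1 > mu2)
print(f"95% CI for the mean difference: [{lo:.2f}, {hi:.2f}], P(diff > 0) = {p_gt0:.2f}")
```

    Unlike BEST, the normal likelihood here offers no robustness to outliers; the sketch only illustrates how a posterior distribution over the difference is obtained and summarized.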

  16. Bayesian Meta-Analysis of Coefficient Alpha

    Science.gov (United States)

    Brannick, Michael T.; Zhang, Nanhua

    2013-01-01

    The current paper describes and illustrates a Bayesian approach to the meta-analysis of coefficient alpha. Alpha is the most commonly used estimate of the reliability or consistency (freedom from measurement error) for educational and psychological measures. The conventional approach to meta-analysis uses inverse variance weights to combine…

  17. Comprehension and computation in Bayesian problem solving

    Directory of Open Access Journals (Sweden)

    Eric D. Johnson

    2015-07-01

    Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g. probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e. transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
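
    The facilitation effect described above is easy to state concretely. The sketch below works a classic textbook-style diagnostic problem (all numbers hypothetical) in both the normalized probability format and the natural-frequency format, showing that the two are arithmetically equivalent even though the latter makes the set-subset relations explicit.

```python
# Classic textbook-style Bayesian word problem (hypothetical numbers):
# base rate 1%, hit rate 80%, false-alarm rate 9.6%.
base_rate, hit_rate, false_alarm = 0.01, 0.80, 0.096

# Probability format: direct application of Bayes' rule.
p_pos = base_rate * hit_rate + (1 - base_rate) * false_alarm
posterior = base_rate * hit_rate / p_pos

# Natural-frequency format: the same computation over 1000 cases,
# which makes the set-subset relations explicit.
n = 1000
sick_and_pos = n * base_rate * hit_rate              # 8 of the 10 sick cases
healthy_and_pos = n * (1 - base_rate) * false_alarm  # ~95 of the 990 healthy cases
posterior_nf = sick_and_pos / (sick_and_pos + healthy_and_pos)

assert abs(posterior - posterior_nf) < 1e-12
print(f"P(sick | positive) = {posterior:.3f}")  # ≈ 0.078
```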

  18. Bayesian Networks: Aspects of Approximate Inference

    NARCIS (Netherlands)

    Bolt, J.H.

    2008-01-01

    A Bayesian network can be used to model concisely the probabilistic knowledge with respect to a given problem domain. Such a network consists of an acyclic directed graph in which the nodes represent stochastic variables, supplemented with probabilities indicating the strength of the influences betw

  19. Bayesian Benefits for the Pragmatic Researcher

    NARCIS (Netherlands)

    Wagenmakers, E.-J.; Morey, R.D.; Lee, M.D.

    2016-01-01

    The practical advantages of Bayesian inference are demonstrated here through two concrete examples. In the first example, we wish to learn about a criminal’s IQ: a problem of parameter estimation. In the second example, we wish to quantify and track support in favor of the null hypothesis that Adam

  20. Communication cost in Distributed Bayesian Belief Networks

    NARCIS (Netherlands)

    Gosliga, S.P. van; Maris, M.G.

    2005-01-01

    In this paper, two different methods for information fusion are compared with respect to communication cost. These are the lambda-pi and the junction tree approaches as the probability computing methods in Bayesian networks. The analysis is done within the scope of large distributed networks of computi

  1. Decision generation tools and Bayesian inference

    Science.gov (United States)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference, related to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.

  2. Bayesian semiparametric dynamic Nelson-Siegel model

    NARCIS (Netherlands)

    C. Cakmakli

    2011-01-01

    This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric

  3. Multisnapshot Sparse Bayesian Learning for DOA

    DEFF Research Database (Denmark)

    Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki

    2016-01-01

    The directions of arrival (DOA) of plane waves are estimated from multisnapshot sensor array data using sparse Bayesian learning (SBL). The prior for the source amplitudes is assumed independent zero-mean complex Gaussian distributed with hyperparameters, the unknown variances (i.e., the source p...

  4. Von Neumann Was Not a Quantum Bayesian

    CERN Document Server

    Stacey, Blake C

    2014-01-01

    Wikipedia has claimed for over two years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.

  5. Bayesian networks in neuroscience: a survey.

    Science.gov (United States)

    Bielza, Concha; Larrañaga, Pedro

    2014-01-01

    Bayesian networks are a type of probabilistic graphical model that lies at the intersection between statistics and machine learning. They have been shown to be powerful tools for encoding dependence relationships among the variables of a domain under uncertainty. Thanks to their generality, Bayesian networks can accommodate continuous and discrete variables, as well as temporal processes. In this paper we review Bayesian networks and how they can be learned automatically from data by means of structure learning algorithms. Also, we examine how a user can take advantage of these networks for reasoning by exact or approximate inference algorithms that propagate the given evidence through the graphical structure. Despite their applicability in many fields, they have been little used in neuroscience, where they have focused on specific problems, like functional connectivity analysis from neuroimaging data. Here we survey key research in neuroscience where Bayesian networks have been used with different aims: discover associations between variables, perform probabilistic reasoning over the model, and classify new observations with and without supervision. The networks are learned from data of any kind (morphological, electrophysiological, -omics and neuroimaging), thereby broadening the scope (molecular, cellular, structural, functional, cognitive and medical) of the brain aspects to be studied.

  6. Bayesian Estimation of Thermonuclear Reaction Rates

    CERN Document Server

    Iliadis, Christian; Coc, Alain; Timmes, Frank; Starrfield, Sumner

    2016-01-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied in the past to this problem, all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extra-solar planets, gravitational waves, and type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present the first astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the d(p,$\\gamma$)$^3$He, $^3$He($^3$He,2p)$^4$He, and $^3$He($\\alpha$,$\\gamma$)$^7$Be reactions,...

  7. Bayesian Estimation of Thermonuclear Reaction Rates

    Science.gov (United States)

    Iliadis, C.; Anderson, K. S.; Coc, A.; Timmes, F. X.; Starrfield, S.

    2016-11-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)3He, 3He(3He,2p)4He, and 3He(α,γ)7Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.

  8. Bayesian regularization of diffusion tensor images

    DEFF Research Database (Denmark)

    Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif;

    2007-01-01

    several directions. The measured diffusion coefficients and thereby the diffusion tensors are subject to noise, leading to possibly flawed representations of the three dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing...

  9. Inverse Problems in a Bayesian Setting

    KAUST Repository

    Matthies, Hermann G.

    2016-02-13

    In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.

  10. Neural network classification - A Bayesian interpretation

    Science.gov (United States)

    Wan, Eric A.

    1990-01-01

    The relationship between minimizing a mean squared error and finding the optimal Bayesian classifier is reviewed. This provides a theoretical interpretation for the process by which neural networks are used in classification. A number of confidence measures are proposed to evaluate the performance of the neural network classifier within a statistical framework.

  11. On local optima in learning bayesian networks

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Kocka, Tomas; Pena, Jose

    2003-01-01

    This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness...

  12. Automatic Thesaurus Construction Using Bayesian Networks.

    Science.gov (United States)

    Park, Young C.; Choi, Key-Sun

    1996-01-01

    Discusses automatic thesaurus construction and characterizes the statistical behavior of terms by using an inference network. Highlights include low-frequency terms and data sparseness, Bayesian networks, collocation maps and term similarity, constructing a thesaurus from a collocation map, and experiments with test collections. (Author/LRW)

  13. Posterior Predictive Model Checking in Bayesian Networks

    Science.gov (United States)

    Crawford, Aaron

    2014-01-01

    This simulation study compared the utility of various discrepancy measures within a posterior predictive model checking (PPMC) framework for detecting different types of data-model misfit in multidimensional Bayesian network (BN) models. The investigated conditions were motivated by an applied research program utilizing an operational complex…

  14. Diagnosis of Subtraction Bugs Using Bayesian Networks

    Science.gov (United States)

    Lee, Jihyun; Corter, James E.

    2011-01-01

    Diagnosis of misconceptions or "bugs" in procedural skills is difficult because of their unstable nature. This study addresses this problem by proposing and evaluating a probability-based approach to the diagnosis of bugs in children's multicolumn subtraction performance using Bayesian networks. This approach assumes a causal network relating…

  15. Basics of Bayesian Learning - Basically Bayes

    DEFF Research Database (Denmark)

    Larsen, Jan

    Tutorial presented at the IEEE Machine Learning for Signal Processing Workshop 2006, Maynooth, Ireland, September 8, 2006. The tutorial focuses on the basic elements of Bayesian learning and its relation to classical learning paradigms. This includes a critical discussion of the pros and cons...

  16. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...

  17. Bayesian Averaging is Well-Temperated

    DEFF Research Database (Denmark)

    Hansen, Lars Kai

    2000-01-01

    Bayesian predictions are stochastic just like predictions of any other inference scheme that generalize from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution the situation...

  18. Bayesian calibration of car-following models

    NARCIS (Netherlands)

    Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.

    2010-01-01

    Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p

  19. Bayesian Analyses of Nonhomogeneous Autoregressive Processes

    Science.gov (United States)

    1986-09-01

    Random coefficient autoregressive processes have a wide applicability in the analysis of economic, sociological, biological and industrial data.

  20. Face detection by aggregated Bayesian network classifiers

    NARCIS (Netherlands)

    Pham, T.V.; Worring, M.; Smeulders, A.W.M.

    2002-01-01

    A face detection system is presented. A new classification method using forest-structured Bayesian networks is used. The method is used in an aggregated classifier to discriminate face from non-face patterns. The process of generating non-face patterns is integrated with the construction of the aggr

  1. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  2. Most frugal explanations in Bayesian networks

    NARCIS (Netherlands)

    Kwisthout, J.H.P.

    2015-01-01

    Inferring the most probable explanation to a set of variables, given a partial observation of the remaining variables, is one of the canonical computational problems in Bayesian networks, with widespread applications in AI and beyond. This problem, known as MAP, is computationally intractable (NP-ha

  3. Modelling crime linkage with Bayesian networks

    NARCIS (Netherlands)

    J. de Zoete; M. Sjerps; D. Lagnado; N. Fenton

    2015-01-01

    When two or more crimes show specific similarities, such as a very distinct modus operandi, the probability that they were committed by the same offender becomes of interest. This probability depends on the degree of similarity and distinctiveness. We show how Bayesian networks can be used to model

  4. Aerosols indirectly warm the Arctic

    Directory of Open Access Journals (Sweden)

    T. Mauritsen

    2010-07-01

    On average, airborne aerosol particles cool the Earth's surface directly by absorbing and scattering sunlight and indirectly by influencing cloud reflectivity, lifetime, thickness or extent. Here we show that over the central Arctic Ocean, where there is frequently a lack of aerosol particles upon which clouds may form, a small increase in aerosol loading may enhance cloudiness, thereby likely causing a climatologically significant warming at the ice-covered Arctic surface. Under these low-concentration conditions cloud droplets grow to drizzle sizes and fall, even in the absence of collisions and coalescence, thereby diminishing cloud water. Evidence from a case study suggests that interactions between aerosol, clouds and precipitation could be responsible for attaining the observed low aerosol concentrations.

  5. An Indigenously Developed Insecticidal Aerosol

    Directory of Open Access Journals (Sweden)

    R. N. Varma

    1969-10-01

    A total of 6 "test" insecticidal aerosols (TA-I to VI), indigenously produced, were tested during the years 1966-67 as suitable replacements for imported aerosols. TA-I produced deep yellow staining and a yellowish spray mist. Its capacity was only 120 ml of fluid. TA-III types II and III, containing a modified aerosol formulation with "Esso solvent 3245", mineral turpentine oil (Burmah Shell) and Freon 12/11 (all indigenous), were comparable to the "SRA" in insecticidal efficacy. The container was also manufactured in the country and compared well with the "SRA" in construction, resistance against rough usage and mechanical function. They were both finally approved for introduction in the services as replacements for imported aerosols. TA-IV performed well in insecticidal assessment, but the aerosol formulation. TA-V and VI were similar to TA-III types II and III respectively.

  6. A merged aerosol dataset based on MODIS and MISR Aerosol Optical Depth products

    Science.gov (United States)

    Singh, Manoj K.; Gautam, Ritesh; Venkatachalam, Parvatham

    2016-05-01

    Aerosol Optical Depth (AOD) products available from MODIS and MISR observations are widely used for aerosol characterization and global/environmental change studies. These products are based on different retrieval algorithms, resolutions, sampling and cloud-screening schemes, which have led to global/regional biases. A merged product is therefore desirable, bridging this gap by utilizing strengths from each of the sensors. In view of this, we have developed a "merged" AOD product based on the MODIS and MISR AOD datasets, using Bayesian principles that take error distributions from ground-based AOD measurements (AERONET) into account. Our methodology and resulting dataset are especially relevant for combining multi-sensor retrievals into satellite-based climate data records, particularly for long-term studies involving AOD. Specifically for the MISR AOD product, we also developed a methodology to produce a gap-filled dataset using geostatistical methods (e.g. Kriging), taking advantage of available MODIS data. The merged and spatially complete AOD datasets are inter-compared with other satellite products and with AERONET data at three stations (Kanpur, Jaipur and Gandhi College) in the Indo-Gangetic Plains. The RMSE of the merged AOD (0.08-0.09) is lower than that of MISR (0.11-0.20) and MODIS (0.15-0.27). The merged AOD also has a higher correlation with AERONET data (r within 0.92-0.95) than MISR (0.74-0.86) and MODIS (0.69-0.84). In terms of Expected Error, the accuracy of the valid merged AOD is superior, as the percentage of merged AOD retrievals within the error envelope is larger (71-92%) than for MISR (43-61%) and MODIS (50-70%).
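
    The abstract does not specify the merging scheme beyond "Bayesian principles"; a minimal stand-in is the precision-weighted combination of two retrievals under independent Gaussian errors, which is the posterior mean under a flat prior. All numbers below are illustrative, and in practice the error standard deviations would be characterized against ground truth such as AERONET.

```python
import numpy as np

def merge_aod(aod_modis, sigma_modis, aod_misr, sigma_misr):
    """Precision-weighted combination of two AOD retrievals assuming
    independent Gaussian errors: the posterior mean and standard
    deviation under a flat prior on the true AOD."""
    w1, w2 = 1.0 / sigma_modis**2, 1.0 / sigma_misr**2
    merged = (w1 * aod_modis + w2 * aod_misr) / (w1 + w2)
    sigma = np.sqrt(1.0 / (w1 + w2))
    return merged, sigma

# Illustrative retrievals: the merged value lies between the inputs,
# pulled toward the more precise one, with reduced uncertainty.
merged, sigma = merge_aod(0.35, 0.05, 0.28, 0.08)
print(f"merged AOD = {merged:.3f} ± {sigma:.3f}")
```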

  7. A tutorial on Bayesian Normal linear regression

    Science.gov (United States)

    Klauenberg, Katy; Wübbeler, Gerd; Mickan, Bodo; Harris, Peter; Elster, Clemens

    2015-12-01

    Regression is a common task in metrology and often applied to calibrate instruments, evaluate inter-laboratory comparisons or determine fundamental constants, for example. Yet, a regression model cannot be uniquely formulated as a measurement function, and consequently the Guide to the Expression of Uncertainty in Measurement (GUM) and its supplements are not applicable directly. Bayesian inference, however, is well suited to regression tasks, and has the advantage of accounting for additional a priori information, which typically robustifies analyses. Furthermore, it is anticipated that future revisions of the GUM shall also embrace the Bayesian view. Guidance on Bayesian inference for regression tasks is largely lacking in metrology. For linear regression models with Gaussian measurement errors this tutorial gives explicit guidance. Divided into three steps, the tutorial first illustrates how a priori knowledge, which is available from previous experiments, can be translated into prior distributions from a specific class. These prior distributions have the advantage of yielding analytical, closed form results, thus avoiding the need to apply numerical methods such as Markov Chain Monte Carlo. Secondly, formulas for the posterior results are given, explained and illustrated, and software implementations are provided. In the third step, Bayesian tools are used to assess the assumptions behind the suggested approach. These three steps (prior elicitation, posterior calculation, and robustness to prior uncertainty and model adequacy) are critical to Bayesian inference. The general guidance given here for Normal linear regression tasks is accompanied by a simple, but real-world, metrological example. The calibration of a flow device serves as a running example and illustrates the three steps. It is shown that prior knowledge from previous calibrations of the same sonic nozzle enables robust predictions even for extrapolations.
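
    As a small illustration of the closed-form results the tutorial advertises, the sketch below computes the exact posterior for the weights of a Normal linear regression with known noise variance and a Gaussian prior; this is a simpler conjugate case than the full treatment in the tutorial, and all data are synthetic.

```python
import numpy as np

def bayes_linreg_posterior(X, y, sigma2, m0, S0):
    """Closed-form posterior N(mN, SN) for the weights of a Normal
    linear regression with known noise variance sigma2 and Gaussian
    prior N(m0, S0) on the weights; no MCMC is needed."""
    S0_inv = np.linalg.inv(S0)
    SN = np.linalg.inv(S0_inv + X.T @ X / sigma2)
    mN = SN @ (S0_inv @ m0 + X.T @ y / sigma2)
    return mN, SN

rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.uniform(0, 10, 50)])  # intercept + slope
true_w = np.array([1.0, 2.0])
y = X @ true_w + rng.normal(0.0, 0.5, 50)

mN, SN = bayes_linreg_posterior(X, y, sigma2=0.25,
                                m0=np.zeros(2), S0=np.eye(2) * 10.0)
print("posterior mean:", mN.round(2))   # close to [1.0, 2.0]
print("posterior sds :", np.sqrt(np.diag(SN)).round(3))
```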

  8. Bayesian structural equation modeling in sport and exercise psychology.

    Science.gov (United States)

    Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus

    2015-08-01

    Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.

  9. Universal Darwinism as a process of Bayesian inference

    CERN Document Server

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment". Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description clo...

  10. Speciation of Radiocesium and Radioiodine in Aerosols from Tsukuba after the Fukushima Nuclear Accident

    DEFF Research Database (Denmark)

    Xu, Sheng; Zhang, Luyuan; Freeman, Stewart P. H. T.

    2015-01-01

    Aerosol samples were collected from Tsukuba, Japan, soon after the 2011 Fukushima nuclear accident and analyzed for speciation of radiocesium and radioiodine to explore their chemical behavior and isotopic ratios after the release. Most Cs-134 and Cs-137 were bound in organic matter (53-91%) and ...

  11. Dust layer profiling using an aerosol dropsonde

    Science.gov (United States)

    Ulanowski, Zbigniew; Kaye, Paul Henry; Hirst, Edwin; Wieser, Andreas; Stanley, Warren

    2015-04-01

    Routine meteorological data is obtained in the atmosphere using disposable radiosondes, giving temperature, pressure, humidity and wind speed. Additional measurements are obtained from dropsondes, released from research aircraft. However, a crucial property not yet measured is the size and concentration of atmospheric particulates, including dust. Instead, indirect measurements are employed, relying on remote sensing, to meet the demands from areas such as climate research, air quality monitoring, civil emergencies etc. In addition, research aircraft can be used in situ, but airborne measurements are expensive, and aircraft use is restricted to near-horizontal profiling, which can be a limitation, as phenomena such as long-range transport depend on the vertical distribution of aerosol. The Centre for Atmospheric and Instrumentation Research at University of Hertfordshire develops light-scattering instruments for the characterization of aerosols and cloud particles. Recently a range of low-cost, miniature particle counters has been created, intended for use with systems such as disposable balloon-borne radiosondes, dropsondes, or in dense ground-based sensor networks. Versions for different particle size ranges exist. They have been used for vertical profiling of aerosols such as mineral dust or volcanic ash. A disadvantage of optical particle counters that sample through a narrow inlet is that they can become blocked, which can happen in cloud, for example. Hence, a different counter version has been developed, which can have open-path geometry, as the sensing zone is defined optically rather than being delimited by the flow system. This counter has been used for ground based air-quality monitoring around Heathrow airport. The counter has also been adapted for use with radiosondes or dropsondes. The dropsonde version has been successfully tested by launching it from research aircraft together with the so-called KITsonde, developed at the Karlsruhe Institute of

  12. Atmospheric and aerosol chemistry

    Energy Technology Data Exchange (ETDEWEB)

    McNeill, V. Faye [Columbia Univ., New York, NY (United States). Dept. of Chemical Engineering; Ariya, Parisa A. (ed.) [McGill Univ. Montreal, QC (Canada). Dept. of Chemistry; McGill Univ. Montreal, QC (Canada). Dept. of Atmospheric and Oceanic Sciences

    2014-09-01

    This series presents critical reviews of the present position and future trends in modern chemical research. Short, concise reports on chemistry, each written by world-renowned experts. Still valid and useful after 5 or 10 years. More information as well as the electronic version of the whole content available at: springerlink.com. Christian George, Barbara D'Anna, Hartmut Herrmann, Christian Weller, Veronica Vaida, D. J. Donaldson, Thorsten Bartels-Rausch, Markus Ammann Emerging Areas in Atmospheric Photochemistry. Lisa Whalley, Daniel Stone, Dwayne Heard New Insights into the Tropospheric Oxidation of Isoprene: Combining Field Measurements, Laboratory Studies, Chemical Modelling and Quantum Theory. Neil M. Donahue, Allen L. Robinson, Erica R. Trump, Ilona Riipinen, Jesse H. Kroll Volatility and Aging of Atmospheric Organic Aerosol. P. A. Ariya, G. Kos, R. Mortazavi, E. D. Hudson, V. Kanthasamy, N. Eltouny, J. Sun, C. Wilde Bio-Organic Materials in the Atmosphere and Snow: Measurement and Characterization. V. Faye McNeill, Neha Sareen, Allison N. Schwier Surface-Active Organics in Atmospheric Aerosols.

  13. TNT Equivalency of Unconfined Aerosols of Propylene Oxide

    Directory of Open Access Journals (Sweden)

    A. Apparao

    2014-09-01

    Full Text Available The unconfined aerosols of propylene oxide (PO) are formed by dispersing the fuel in air. These aerosols undergo detonation by suitable initiation and produce a high-impulse blast. Trinitrotoluene (TNT) equivalence is an important parameter used to represent the power of explosive materials and to compare their relative damage effects with respect to TNT. The parameters commonly used for estimation of TNT equivalency are the total energy of the explosive source and the properties of the resulting blast wave, viz., blast peak overpressure and positive impulse. In the present study, unconfined aerosols of 4.2 kg PO were formed by breaking open a cylindrical canister with the help of an axially positioned central burster charge, and then detonated using a secondary explosive charge after a preset time delay. The resulting blast profiles were recorded and the blast parameters were analysed. Being a non-ideal explosive source, the TNT equivalency depends on the fraction of total energy utilised for blast formation, the rate of energy release, cloud dimensions, and concentration of fuel. Hence, various approaches based on energy release, experimental blast profiles, triangulated blast parameters, and ground-reflected blast parameters were considered to determine the TNT equivalency of unconfined PO aerosols. It was observed that the TNT equivalency is not a single value but varies with distance. The paper provides various options for the weapon designer to choose a suitable approach for considering TNT equivalency. The scaling laws established from the experimental data of unconfined aerosols of PO for blast peak overpressure and scaled impulse help in predicting the performance for different values of fuel weight and distance. Defence Science Journal, Vol. 64, No. 5, September 2014, pp. 431-437, DOI: http://dx.doi.org/10.14429/dsj.64.6851
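    The scaling laws referred to above rest on Hopkinson-Cranz cube-root scaling. A minimal sketch, with masses and distance invented for illustration (not taken from the experiments):

```python
# Hopkinson-Cranz cube-root scaling: blast parameters measured at distance
# R from a charge of mass W collapse onto a single curve of the scaled
# distance Z = R / W**(1/3), in m/kg^(1/3).
def scaled_distance(R_m, W_kg):
    return R_m / W_kg ** (1.0 / 3.0)

# Overpressure-based TNT equivalency: the TNT mass that reproduces the
# observed peak overpressure at the same distance, divided by the fuel mass.
def tnt_equivalency(W_tnt_kg, W_fuel_kg):
    return W_tnt_kg / W_fuel_kg

# Hypothetical example: suppose 4.2 kg of fuel matches 25 kg of TNT at 30 m.
Z = scaled_distance(30.0, 25.0)
E = tnt_equivalency(25.0, 4.2)
```

    Because the matching TNT mass generally differs at each measurement distance, an equivalency computed this way varies with distance, consistent with the paper's observation.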

  14. International Cooperative for Aerosol Prediction Workshop on Aerosol Forecast Verification

    Science.gov (United States)

    Benedetti, Angela; Reid, Jeffrey S.; Colarco, Peter R.

    2011-01-01

    The purpose of this workshop was to reinforce the working partnership between centers that are actively involved in global aerosol forecasting, and to discuss issues related to forecast verification. Participants included representatives from operational centers with global aerosol forecasting requirements, a panel of experts on Numerical Weather Prediction and Air Quality forecast verification, data providers, and several observers from the research community. The presentations centered on a review of current NWP and AQ practices, with subsequent discussion focused on the challenges in defining appropriate verification measures for the next generation of aerosol forecast systems.

  15. The GRAPE aerosol retrieval algorithm

    Directory of Open Access Journals (Sweden)

    G. E. Thomas

    2009-11-01

    Full Text Available The aerosol component of the Oxford-Rutherford Aerosol and Cloud (ORAC combined cloud and aerosol retrieval scheme is described and the theoretical performance of the algorithm is analysed. ORAC is an optimal estimation retrieval scheme for deriving cloud and aerosol properties from measurements made by imaging satellite radiometers and, when applied to cloud free radiances, provides estimates of aerosol optical depth at a wavelength of 550 nm, aerosol effective radius and surface reflectance at 550 nm. The aerosol retrieval component of ORAC has several incarnations – this paper addresses the version which operates in conjunction with the cloud retrieval component of ORAC (described by Watts et al., 1998, as applied in producing the Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE data-set.

    The algorithm is described in detail and its performance examined. This includes a discussion of errors resulting from the formulation of the forward model, sensitivity of the retrieval to the measurements and a priori constraints, and errors resulting from assumptions made about the atmospheric/surface state.

  16. Bayesian network learning for natural hazard assessments

    Science.gov (United States)

    Vogel, Kristin

    2016-04-01

    Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature, as well as lacking knowledge about their driving forces and potential effects, makes their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) proceedings. In the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependences into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow us to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
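    The kind of inference described, reading a conditional of interest out of the joint distribution encoded by the network, can be sketched with a toy discrete network; the structure and all probability tables below are invented for illustration:

```python
# Toy discrete Bayesian network, Precaution -> Damage <- Hazard, with
# made-up probability tables; the query marginalises over the hazard node.
P_precaution = {True: 0.3, False: 0.7}
P_hazard = {True: 0.1, False: 0.9}
P_damage = {  # P(damage=True | precaution, hazard)
    (True, True): 0.2, (True, False): 0.01,
    (False, True): 0.7, (False, False): 0.05,
}

def p_damage_given_precaution(p):
    """P(damage=True | precaution=p); hazard is independent of precaution
    in this graph, so we sum the damage table against the hazard prior."""
    return sum(P_hazard[h] * P_damage[(p, h)] for h in (True, False))

# "Effect of precautionary measures on damage reduction":
effect = {p: p_damage_given_precaution(p) for p in (True, False)}
```

    In a real application the tables would be learned from data or elicited from experts rather than fixed by hand.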

  17. The effects of mineral dust particles, aerosol regeneration and ice nucleation parameterizations on clouds and precipitation

    Directory of Open Access Journals (Sweden)

    A. Teller

    2012-03-01

    Full Text Available This study focuses on the effects of aerosol particles on the formation of convective clouds and precipitation over the Eastern Mediterranean Sea, with special emphasis on the role of mineral dust particles in these processes. We used a new detailed numerical cloud microphysics scheme, implemented in the Weather Research and Forecasting (WRF) model, in order to study aerosol-cloud interaction in a 3-D configuration based on realistic meteorological data. Using a number of case studies, we tested the contribution of mineral dust particles and different ice nucleation parameterizations to precipitation development. In this study we also investigated the importance of recycled (regenerated) aerosols that had been released to the atmosphere following the evaporation of cloud droplets.

    The results showed that increased aerosol concentration due to the presence of mineral dust enhanced the formation of ice crystals. The dynamic evolution of the cloud system sets the time periods and regions in which heavy or light precipitation occurred in the domain. The precipitation rate and the time and duration of precipitation were affected by the aerosol properties only at small area scales (areas of about 20 km²). Changing the ice nucleation scheme from an ice-supersaturation-dependent parameterization to a recent approach depending on aerosol concentration and temperature modified the ice crystal concentrations but did not affect the total precipitation in the domain. Aerosol regeneration modified the concentration of cloud droplets at cloud base by dynamic recirculation of the aerosols, but also had only a minor effect on precipitation.

    The major conclusion from this study is that the effect of mineral dust particles on clouds and total precipitation is limited by the properties of the atmospheric dynamics and the only effect of aerosol on precipitation may come from significant increase in the concentration

  18. OMI/Aura Near UV Aerosol Optical Depth and Single Scattering Albedo 1-orbit L2 Swath 13x24 km V003 NRT

    Data.gov (United States)

    National Aeronautics and Space Administration — The OMI/Aura level-2 near UV Aerosol data product 'OMAERUV', recently re-processed using an enhanced algorithm, is now released (April 2012) to the public. The data...

  19. eDPS Aerosol Collection

    Energy Technology Data Exchange (ETDEWEB)

    Venzie, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-13

    The eDPS Aerosol Collection project studies the fundamental physics of electrostatic aerosol collection for national security applications. The interpretation of aerosol data requires understanding and correcting for biases introduced from particle genesis through collection and analysis. The research and development undertaken in this project provide the basis both for the statistical correction of existing equipment and techniques and for the development of new collectors and analytical techniques designed to minimize unwanted biases while improving the efficiency of locating and measuring individual particles of interest.

  20. Instrumentation for tropospheric aerosol characterization

    Energy Technology Data Exchange (ETDEWEB)

    Shi, Z.; Young, S.E.; Becker, C.H.; Coggiola, M.J. [SRI International, Menlo Park, CA (United States); Wollnik, H. [Giessen Univ. (Germany)

    1997-12-31

    A new instrument has been developed that determines the abundance, size distribution, and chemical composition of tropospheric and lower stratospheric aerosols with diameters down to 0.2 µm. In addition to aerosol characterization, the instrument also monitors the chemical composition of the ambient gas. More than 25,000 aerosol particle mass spectra were recorded during the NASA-sponsored Subsonic Aircraft: Contrail and Cloud Effects Special Study (SUCCESS) field program using NASA's DC-8 research aircraft. (author) 7 refs.

  1. Aerosol measurement program strategy for global aerosol backscatter model development

    Science.gov (United States)

    Bowdle, David A.

    1985-01-01

    The purpose was to propose a balanced program of aerosol backscatter research leading to the development of a global model of aerosol backscatter. Such a model is needed for feasibility studies and systems simulation studies for NASA's prospective satellite-based Doppler lidar wind measurement system. Systems of this kind measure the Doppler shift in the backscatter return from small atmospheric aerosol wind tracers (of order 1 micrometer diameter). The accuracy of the derived local wind estimates and the degree of global wind coverage for such a system are limited by the local availability and by the global scale distribution of natural aerosol particles. The discussions here refer primarily to backscatter model requirements at CO2 wavelengths, which have been selected for most of the Doppler lidar systems studies to date. Model requirements for other potential wavelengths would be similar.

  2. Surrogate/spent fuel sabotage : aerosol ratio test program and Phase 2 test results.

    Energy Technology Data Exchange (ETDEWEB)

    Borek, Theodore Thaddeus III; Thompson, N. Slater (U.S. Department of Energy); Sorenson, Ken Bryce; Hibbs, R.S. (U.S. Department of Energy); Nolte, Oliver (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Molecke, Martin Alan; Autrusson, Bruno (Institut de Radioprotection et de Surete Nucleaire, France); Young, F. I. (U.S. Nuclear Regulatory Commission); Koch, Wolfgang (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Brochard, Didier (Institut de Radioprotection et de Surete Nucleaire, France); Pretzsch, Gunter Guido (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Lange, Florentin (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany)

    2004-05-01

    A multinational test program is in progress to quantify the aerosol particulates produced when a high energy density device, HEDD, impacts surrogate material and actual spent fuel test rodlets. This program provides needed data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments; the program also provides significant political benefits in international cooperation. We are quantifying the spent fuel ratio, SFR, the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions. In addition, we are measuring the amounts, nuclide content, and size distribution of the released aerosol materials, and the enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are crucial for predicting radiological impacts. This document includes a thorough description of the test program, including the current, detailed test plan, concept and design, plus a description of all test components, and requirements for future components and related nuclear facility needs. It also serves as a program status report as of the end of FY 2003. All available test results, observations, and analyses, primarily for surrogate-material Phase 2 tests using cerium oxide sintered ceramic pellets, are included. This spent fuel sabotage and aerosol test program is coordinated with the international Working Group for Sabotage Concerns of Transport and Storage Casks, WGSTSC, and supported by both the U.S. Department of Energy and Nuclear Regulatory Commission.

  3. Bayesian analysis of multi-state data with individual covariates for estimating genetic effects on demography

    Science.gov (United States)

    Converse, Sarah J.; Royle, J. Andrew; Urbanek, Richard P.

    2012-01-01

    Inbreeding depression is frequently a concern of managers interested in restoring endangered species. Decisions to reduce the potential for inbreeding depression by balancing genotypic contributions to reintroduced populations may exact a cost on long-term demographic performance of the population if those decisions result in reduced numbers of animals released and/or restriction of particularly successful genotypes (i.e., heritable traits of particular family lines). As part of an effort to restore a migratory flock of Whooping Cranes (Grus americana) to eastern North America using the offspring of captive breeders, we obtained a unique dataset which includes post-release mark-recapture data, as well as the pedigree of each released individual. We developed a Bayesian formulation of a multi-state model to analyze radio-telemetry, band-resight, and dead recovery data on reintroduced individuals, in order to track survival and breeding state transitions. We used studbook-based individual covariates to examine the comparative evidence for and degree of effects of inbreeding, genotype, and genotype quality on post-release survival of reintroduced individuals. We demonstrate implementation of the Bayesian multi-state model, which allows for the integration of imperfect detection, multiple data types, random effects, and individual- and time-dependent covariates. Our results provide only weak evidence for an effect of the quality of an individual's genotype in captivity on post-release survival as well as for an effect of inbreeding on post-release survival. We plan to integrate our results into a decision-analytic modeling framework that can explicitly examine tradeoffs between the effects of inbreeding and the effects of genotype and demographic stochasticity on population establishment.

  4. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and allows model selection, calculation of model posterior probabilities, and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo (AARJ) method. It uses the Markov chain Monte Carlo (MCMC) technique and its extension, Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems across a wide range of applications since the late 1990s. The novel feature of our algorithm is that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval of GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
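    The model-averaging idea can be illustrated at a much smaller scale than AARJ: for two toy one-parameter models, the posterior model probabilities follow directly from grid-approximated marginal likelihoods rather than reversible-jump sampling. All data and settings below are invented:

```python
import math

# Two candidate models for the same data, M1: y = c (constant) and
# M2: y = a*x, each with a uniform prior on its single parameter.
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [0.1, 0.6, 0.9, 1.6, 2.1]
sigma = 0.2  # assumed measurement noise

def loglik(theta, model):
    ll = 0.0
    for x, y in zip(xs, ys):
        mu = theta if model == 1 else theta * x
        ll += -0.5 * ((y - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2.0 * math.pi))
    return ll

def marginal_likelihood(model, lo=-3.0, hi=3.0, n=600):
    # grid approximation of integral L(theta) p(theta) dtheta,
    # with p(theta) uniform on [lo, hi]
    step = (hi - lo) / n
    prior = 1.0 / (hi - lo)
    return sum(math.exp(loglik(lo + (i + 0.5) * step, model))
               for i in range(n)) * step * prior

m1, m2 = marginal_likelihood(1), marginal_likelihood(2)
p2 = m2 / (m1 + m2)  # posterior probability of M2 under equal prior odds
```

    Model-averaged predictions would then weight each model's prediction by its posterior probability, carrying the model uncertainty into the final error bars.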

  5. A-Train Aerosol Observations: Preliminary Comparisons with AeroCom Models and Pathways to Observationally Based All-Sky Estimates

    Science.gov (United States)

    Redemann, J.; Livingston, J.; Shinozuka, Y.; Kacenelenbogen, M.; Russell, P.; LeBlanc, S.; Vaughan, M.; Ferrare, R.; Hostetler, C.; Rogers, R.; Burton, S.; Torres, O.; Remer, L.; Stier, P.; Schutgens, N.

    2014-01-01

    We have developed a technique for combining CALIOP aerosol backscatter, MODIS spectral AOD (aerosol optical depth), and OMI AAOD (absorption aerosol optical depth) retrievals for the purpose of estimating full spectral sets of aerosol radiative properties, and ultimately for calculating the 3-D distribution of direct aerosol radiative forcing. We present results using one year of data collected in 2007 and show comparisons of the aerosol radiative property estimates to collocated AERONET retrievals. Use of the recently released MODIS Collection 6 data for aerosol optical depths derived with the dark target and deep blue algorithms has extended the coverage of the multi-sensor estimates towards higher latitudes. We compare the spatio-temporal distribution of our multi-sensor aerosol retrievals and calculations of seasonal clear-sky aerosol radiative forcing based on the aerosol retrievals to values derived from four models that participated in the latest AeroCom model intercomparison initiative. We find significant inter-model differences, in particular for the aerosol single scattering albedo, which can be evaluated using the multi-sensor A-Train retrievals. We discuss the major challenges that exist in extending our clear-sky results to all-sky conditions. On the basis of comparisons to suborbital measurements, we present some of the limitations of the MODIS and CALIOP retrievals in the presence of adjacent or underlying clouds. Strategies for meeting these challenges are discussed.

  6. Rigorous bounds on aerosol optical properties from measurement and/or model constraints

    Science.gov (United States)

    McGraw, Robert; Fierce, Laura

    2016-04-01

    Sparse-particle aerosol models are an attractive alternative to sectional and modal methods for representation of complex, generally mixed particle populations. In the quadrature method of moments (QMOM) a small set of abscissas and weights, determined from distributional moments, provides the sparse set. Linear programming (LP) yields a generalization of the QMOM that is especially convenient for sparse particle selection. In this paper we use LP to obtain rigorous, nested upper and lower bounds to aerosol optical properties in terms of a prescribed Bayesian-like sequence of model or simulated measurement constraints. Examples of such constraints include remotely-sensed light extinction at different wavelengths, modeled particulate mass, etc. Successive reduction in bound separation with each added constraint provides a quantitative measure of its contextual information content. The present study is focused on univariate populations as a first step towards development of new simulation algorithms for tracking the physical and optical properties of multivariate particle populations.
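    A minimal sketch of the bounding idea, assuming a toy extinction kernel proportional to d² and two moment constraints (number and mass); because LP optima under two equality constraints sit on distributions with at most two support points, exact bounds can be found by enumeration. The size grid and "true" distribution are invented:

```python
import itertools
import math

# Size grid (um), an invented "true" distribution, and its two moments.
diam = [0.1 * (i + 1) for i in range(30)]
w_true = [math.exp(-((d - 1.5) / 0.5) ** 2) for d in diam]
m0 = sum(w_true)                                     # number (0th moment)
m3 = sum(w * d ** 3 for w, d in zip(w_true, diam))   # proportional to mass
ext_true = sum(w * d ** 2 for w, d in zip(w_true, diam))  # toy "extinction"

# Enumerate all two-point distributions matching both moments; the extreme
# values of the linear property over these vertices are the LP bounds.
lo, hi = float('inf'), float('-inf')
for i, j in itertools.combinations(range(len(diam)), 2):
    a, b = diam[i] ** 3, diam[j] ** 3
    wi = (m3 - m0 * b) / (a - b)  # solve wi + wj = m0, wi*a + wj*b = m3
    wj = m0 - wi
    if wi < 0.0 or wj < 0.0:
        continue                  # infeasible: weights must be nonnegative
    val = wi * diam[i] ** 2 + wj * diam[j] ** 2
    lo, hi = min(lo, val), max(hi, val)
```

    Adding a further constraint (say, extinction at a second wavelength) would shrink the interval [lo, hi], which is the paper's measure of that constraint's information content.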

  7. Bayesian system reliability assessment under fuzzy environments

    Energy Technology Data Exchange (ETDEWEB)

    Wu, H.-C

    2004-03-01

    A Bayesian system reliability assessment under fuzzy environments is proposed in this paper. In order to apply the Bayesian approach, the fuzzy parameters are assumed to be fuzzy random variables with fuzzy prior distributions. The (conventional) Bayes estimation method is used to create the fuzzy Bayes point estimator of system reliability by invoking the well-known theorem in fuzzy set theory called the 'Resolution Identity'. On the other hand, we also provide computational procedures to evaluate the membership degree of any given Bayes point estimate of system reliability. To achieve this purpose, we transform the original problem into a nonlinear programming problem, which is then divided into four subproblems in order to simplify the computation. Finally, the subproblems can be solved using any commercial optimizer, e.g. GAMS or LINGO.
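    A simplified sketch of the alpha-cut computations involved, using a beta-binomial reliability model with triangular fuzzy hyperparameters; all numbers are invented, and the paper's nonlinear-programming formulation is more general:

```python
# Alpha-cut sketch of a fuzzy Bayes point estimate of system reliability.
# Data: s successes in n trials; Beta(a, b) prior whose hyperparameters
# are triangular fuzzy numbers.
def tri_cut(lo, mode, hi, alpha):
    """Alpha-cut interval of a triangular fuzzy number (lo, mode, hi)."""
    return lo + alpha * (mode - lo), hi - alpha * (hi - mode)

def bayes_mean(a, b, s, n):
    """Posterior mean of reliability under a Beta(a, b) prior."""
    return (a + s) / (a + b + n)

def fuzzy_estimate(alpha, s=18, n=20):
    a_lo, a_hi = tri_cut(1.0, 2.0, 3.0, alpha)  # fuzzy a ~ (1, 2, 3)
    b_lo, b_hi = tri_cut(0.5, 1.0, 1.5, alpha)  # fuzzy b ~ (0.5, 1, 1.5)
    # here the posterior mean increases in a and decreases in b, so the
    # interval endpoints come from opposite corners of the parameter box
    return bayes_mean(a_lo, b_hi, s, n), bayes_mean(a_hi, b_lo, s, n)

lo1, hi1 = fuzzy_estimate(1.0)  # alpha = 1: the crisp (conventional) estimate
lo0, hi0 = fuzzy_estimate(0.0)  # alpha = 0: the widest membership interval
```

    The nested intervals over all alpha levels are exactly the membership function of the fuzzy Bayes point estimator; in the general case the endpoint optimisation is not monotone and requires the nonlinear-programming subproblems the abstract describes.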

  8. Bayesian information fusion networks for biosurveillance applications.

    Science.gov (United States)

    Mnatsakanyan, Zaruhi R; Burkom, Howard S; Coberly, Jacqueline S; Lombardo, Joseph S

    2009-01-01

    This study introduces new information fusion algorithms to enhance disease surveillance systems with Bayesian decision support capabilities. A detection system was built and tested using chief complaints from emergency department visits, International Classification of Diseases, Ninth Revision (ICD-9) codes from records of outpatient visits to civilian and military facilities, and influenza surveillance data from health departments in the National Capital Region (NCR). Data anomalies were identified and the distributions of time offsets between events in the multiple data streams were established. A Bayesian network was built to fuse data from multiple sources and identify influenza-like, epidemiologically relevant events. Results showed increased specificity compared with the alerts generated by temporal anomaly detection algorithms currently deployed by NCR health departments. Further research should be done to investigate correlations between data sources for efficient fusion of the collected data.

  9. Bayesian Population Projections for the United Nations.

    Science.gov (United States)

    Raftery, Adrian E; Alkema, Leontine; Gerland, Patrick

    2014-02-01

    The United Nations regularly publishes projections of the populations of all the world's countries broken down by age and sex. These projections are the de facto standard and are widely used by international organizations, governments and researchers. Like almost all other population projections, they are produced using the standard deterministic cohort-component projection method and do not yield statements of uncertainty. We describe a Bayesian method for producing probabilistic population projections for most countries that the United Nations could use. It has at its core Bayesian hierarchical models for the total fertility rate and life expectancy at birth. We illustrate the method and show how it can be extended to address concerns about the UN's current assumptions about the long-term distribution of fertility. The method is implemented in the R packages bayesTFR, bayesLife, bayesPop and bayesDem.
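    The packages named are R packages; as a language-neutral sketch, the contrast between a deterministic projection and a probabilistic one can be shown by propagating sampled growth-rate trajectories. This uses a toy AR(1) growth process, not the UN's cohort-component or hierarchical model, and all numbers are invented:

```python
import random

# Each trajectory samples a persistent, uncertain growth rate and
# propagates the population, so the projection is a distribution of
# outcomes rather than a single deterministic path.
random.seed(1)

def project(pop0=10.0, r0=0.01, years=30, n_traj=2000):
    finals = []
    for _ in range(n_traj):
        pop, r = pop0, r0
        for _ in range(years):
            r = 0.9 * r + random.gauss(0.0, 0.002)  # slowly varying rate
            pop *= 1.0 + r
        finals.append(pop)
    finals.sort()
    n = len(finals)
    # median and an 80% projection interval
    return finals[n // 2], finals[n // 10], finals[(9 * n) // 10]

med, p10, p90 = project()
```

    The probabilistic statement "population in 30 years lies in [p10, p90] with 80% probability" is exactly what the deterministic cohort-component method cannot deliver.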

  10. Bayesian peak picking for NMR spectra.

    Science.gov (United States)

    Cheng, Yichen; Gao, Xin; Liang, Faming

    2014-02-01

    Protein structure determination is a very important topic in structural genomics, which helps people to understand a variety of biological functions such as protein-protein interactions, protein-DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) is often used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.
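    The mixture model of the spectrum can be illustrated with a plain EM fit of a two-component bivariate Gaussian mixture on synthetic peaks; this is a much simpler stand-in for the paper's stochastic approximation Monte Carlo and Bayesian variable selection machinery:

```python
import math
import random

random.seed(2)

# Synthetic 2-D "spectrum": points scattered around two true peak centres.
true_peaks = [(1.0, 2.0), (3.0, 1.0)]
data = [(cx + random.gauss(0.0, 0.1), cy + random.gauss(0.0, 0.1))
        for cx, cy in true_peaks for _ in range(200)]

def em_peaks(data, iters=50, var=0.01):
    """EM for a 2-component bivariate Gaussian mixture with fixed
    isotropic variance; returns the fitted peak centres."""
    mus = [data[0], data[-1]]  # crude start: one point from each cluster
    for _ in range(iters):
        acc = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]  # sums of r*x, r*y, r
        for x, y in data:
            resp = [math.exp(-((x - mx) ** 2 + (y - my) ** 2) / (2.0 * var))
                    for mx, my in mus]
            z = sum(resp) or 1e-300  # guard against total underflow
            for k in (0, 1):
                r = resp[k] / z
                acc[k][0] += r * x
                acc[k][1] += r * y
                acc[k][2] += r
        mus = [(a[0] / a[2], a[1] / a[2]) for a in acc]
    return mus

mus = em_peaks(data)
```

    The Bayesian treatment in the paper goes further by also inferring how many peaks are present, which is what makes it a variable selection problem.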

  11. Neuroadaptive Bayesian Optimization and Hypothesis Testing.

    Science.gov (United States)

    Lorenz, Romy; Hampshire, Adam; Leech, Robert

    2017-03-01

    Cognitive neuroscientists are often interested in broad research questions, yet use overly narrow experimental designs by considering only a small subset of possible experimental conditions. This limits the generalizability and reproducibility of many research findings. Here, we propose an alternative approach that resolves these problems by taking advantage of recent developments in real-time data analysis and machine learning. Neuroadaptive Bayesian optimization is a powerful strategy to efficiently explore more experimental conditions than is currently possible with standard methodology. We argue that such an approach could broaden the hypotheses considered in cognitive science, improving the generalizability of findings. In addition, Bayesian optimization can be combined with preregistration to cover exploration, mitigating researcher bias more broadly and improving reproducibility.

  12. QBism, the Perimeter of Quantum Bayesianism

    CERN Document Server

    Fuchs, Christopher A

    2010-01-01

    This article summarizes the Quantum Bayesian point of view of quantum mechanics, with special emphasis on the view's outer edges, dubbed QBism. QBism has its roots in personalist Bayesian probability theory, is crucially dependent upon the tools of quantum information theory, and, most recently, has set out to investigate whether the physical world might be of a type sketched by some false-started philosophies of 100 years ago (pragmatism, pluralism, nonreductionism, and meliorism). Beyond conceptual issues, work at the Perimeter Institute is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when an agent considers gambling on the consequences of...

  13. Bayesian network modelling of upper gastrointestinal bleeding

    Science.gov (United States)

    Aisha, Nazziwa; Shohaimi, Shamarina; Adam, Mohd Bakri

    2013-09-01

    Bayesian networks are graphical probabilistic models that represent causal and other relationships between domain variables. In the context of medical decision making, these models have been explored to help in medical diagnosis and prognosis. In this paper, we discuss the Bayesian network formalism for building medical support systems, and we learn a tree-augmented naive Bayes network (TAN) from gastrointestinal bleeding (GIB) data. The accuracy of the TAN in classifying the source of gastrointestinal bleeding into upper or lower source is obtained. The TAN achieves a high classification accuracy of 86% and an area under the curve of 92%. A sensitivity analysis of the model shows relatively high levels of entropy reduction for color of the stool, history of gastrointestinal bleeding, consistency, and the ratio of blood urea nitrogen to creatinine. The TAN facilitates the identification of the source of GIB and requires further validation.
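    A TAN extends naive Bayes with a learned tree of feature-feature edges; the underlying naive Bayes step it builds on can be sketched on invented, discretised GIB-style features (not the study's data):

```python
from collections import defaultdict

# Plain naive Bayes on invented binary features; a TAN would additionally
# connect the features themselves by a learned tree of edges.
train = [  # (features, source of bleeding)
    ({'melena': 1, 'high_bun_cr': 1}, 'upper'),
    ({'melena': 1, 'high_bun_cr': 1}, 'upper'),
    ({'melena': 1, 'high_bun_cr': 0}, 'upper'),
    ({'melena': 0, 'high_bun_cr': 0}, 'lower'),
    ({'melena': 0, 'high_bun_cr': 0}, 'lower'),
    ({'melena': 0, 'high_bun_cr': 1}, 'lower'),
]

def fit(train):
    class_count = defaultdict(float)
    feat_count = defaultdict(float)
    for feats, c in train:
        class_count[c] += 1.0
        for f, v in feats.items():
            feat_count[(c, f, v)] += 1.0
    return class_count, feat_count

def predict(model, feats, smooth=1.0):
    class_count, feat_count = model
    total = sum(class_count.values())
    best, best_p = None, -1.0
    for c, n_c in class_count.items():
        p = n_c / total  # class prior
        for f, v in feats.items():
            # Laplace-smoothed P(feature value | class), binary features
            p *= (feat_count[(c, f, v)] + smooth) / (n_c + 2.0 * smooth)
        if p > best_p:
            best, best_p = c, p
    return best

model = fit(train)
pred = predict(model, {'melena': 1, 'high_bun_cr': 1})
```

    The TAN's extra edges relax naive Bayes's assumption that features are independent given the class, which matters when features such as stool colour and consistency co-vary.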

  14. Bayesian parameter estimation for effective field theories

    Science.gov (United States)

    Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.

    2016-07-01

    We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
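    The effect of a naturalness prior can be sketched in the conjugate linear-Gaussian case, where a Gaussian prior of scale tau on the coefficients turns the fit into ridge regression; the data and all settings below are toys, not the paper's EFT examples:

```python
# MAP estimate of coefficients c0, c1 in y = c0 + c1*x under a
# "naturalness" prior c_i ~ N(0, tau^2) and Gaussian noise of width sigma:
# solve (X^T X + (sigma^2 / tau^2) I) c = X^T y.
xs = [0.1 * i for i in range(10)]
ys = [0.5 + 2.0 * x + 0.01 * ((-1) ** i) for i, x in enumerate(xs)]  # toy data

def map_fit(xs, ys, sigma=0.1, tau=5.0):
    s00 = s01 = s11 = b0 = b1 = 0.0
    for x, y in zip(xs, ys):
        s00 += 1.0
        s01 += x
        s11 += x * x
        b0 += y
        b1 += x * y
    lam = sigma ** 2 / tau ** 2      # prior strength relative to the noise
    a00, a01, a11 = s00 + lam, s01, s11 + lam
    det = a00 * a11 - a01 * a01
    c0 = (a11 * b0 - a01 * b1) / det
    c1 = (a00 * b1 - a01 * b0) / det
    return c0, c1

c0, c1 = map_fit(xs, ys)
```

    Shrinking tau pulls the coefficients towards natural size and guards against overfitting, which is the mechanism the abstract describes for the LECs.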

  15. BONNSAI: correlated stellar observables in Bayesian methods

    CERN Document Server

    Schneider, F R N; Fossati, L; Langer, N; de Koter, A

    2016-01-01

    In an era of large spectroscopic surveys of stars and big data, sophisticated statistical methods become more and more important in order to infer fundamental stellar parameters such as mass and age. Bayesian techniques are powerful methods because they can match all available observables simultaneously to stellar models while taking prior knowledge properly into account. However, in most cases it is assumed that observables are uncorrelated, which is generally not the case. Here, we include correlations in the Bayesian code BONNSAI by incorporating the covariance matrix in the likelihood function. We derive a parametrisation of the covariance matrix that, in addition to classical uncertainties, only requires the specification of a correlation parameter that describes how observables co-vary. Our correlation parameter depends purely on the method with which observables have been determined and can be analytically derived in some cases. This approach therefore has the advantage that correlations can be accounte...
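    The role of a single correlation parameter can be sketched for two observables: the covariance matrix is built from the classical uncertainties plus one rho, and enters the Gaussian log-likelihood. The numbers are toys, not BONNSAI's actual parametrisation:

```python
import math

# Gaussian log-likelihood of two observables whose 2x2 covariance is built
# from classical uncertainties s1, s2 and one correlation parameter rho.
def loglike(d1, d2, m1, m2, s1, s2, rho):
    c11, c22, c12 = s1 * s1, s2 * s2, rho * s1 * s2
    det = c11 * c22 - c12 * c12
    r1, r2 = d1 - m1, d2 - m2
    chi2 = (c22 * r1 * r1 - 2.0 * c12 * r1 * r2 + c11 * r2 * r2) / det
    return -0.5 * (chi2 + math.log(4.0 * math.pi ** 2 * det))

# Residuals of the same sign are penalised less when the observables are
# known to co-vary positively than when they are treated as independent:
ll_corr = loglike(1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.8)
ll_ind = loglike(1.0, 1.0, 0.0, 0.0, 1.0, 1.0, 0.0)
```

    Ignoring a genuine correlation therefore distorts the posterior over stellar parameters, which is the bias the covariance-matrix likelihood removes.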

  16. Subgroup finding via Bayesian additive regression trees.

    Science.gov (United States)

    Sivaganesan, Siva; Müller, Peter; Huang, Bin

    2017-03-09

    We provide a Bayesian decision-theoretic approach to finding subgroups that have elevated treatment effects. Our approach separates the modeling of the response variable from the task of subgroup finding and allows flexible modeling of the response variable irrespective of potential subgroups of interest. We use Bayesian additive regression trees to model the response variable and use a utility function defined in terms of a candidate subgroup and the predicted response for that subgroup. Subgroups are identified by maximizing the expected utility, where the expectation is taken with respect to the posterior predictive distribution of the response, and the maximization is carried out over an a priori specified set of candidate subgroups. Our approach allows subgroups based on both quantitative and categorical covariates. We illustrate the approach using a simulated data study and a real data set. Copyright © 2017 John Wiley & Sons, Ltd.

  17. Bayesian modelling of geostatistical malaria risk data

    Directory of Open Access Journals (Sweden)

    L. Gosoniu

    2006-11-01

    Full Text Available Bayesian geostatistical models applied to malaria risk data quantify the environment-disease relations, identify significant environmental predictors of malaria transmission and provide model-based predictions of malaria risk together with their precision. These models are often based on the stationarity assumption which implies that spatial correlation is a function of distance between locations and independent of location. We relax this assumption and analyse malaria survey data in Mali using a Bayesian non-stationary model. Model fit and predictions are based on Markov chain Monte Carlo simulation methods. Model validation compares the predictive ability of the non-stationary model with the stationary analogue. Results indicate that the stationarity assumption is important because it influences the significance of environmental factors and the corresponding malaria risk maps.

  18. Bayesian modelling of geostatistical malaria risk data.

    Science.gov (United States)

    Gosoniu, L; Vounatsou, P; Sogoba, N; Smith, T

    2006-11-01

    Bayesian geostatistical models applied to malaria risk data quantify the environment-disease relations, identify significant environmental predictors of malaria transmission and provide model-based predictions of malaria risk together with their precision. These models are often based on the stationarity assumption which implies that spatial correlation is a function of distance between locations and independent of location. We relax this assumption and analyse malaria survey data in Mali using a Bayesian non-stationary model. Model fit and predictions are based on Markov chain Monte Carlo simulation methods. Model validation compares the predictive ability of the non-stationary model with the stationary analogue. Results indicate that the stationarity assumption is important because it influences the significance of environmental factors and the corresponding malaria risk maps.

  19. Quantum-like Representation of Bayesian Updating

    Science.gov (United States)

    Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Khrennikov, Andrei; Basieva, Irina

    2011-03-01

    Recently, applications of quantum mechanics to cognitive psychology have been discussed, see [1]-[11]. It is known that statistical data obtained in some experiments of cognitive psychology cannot be described by the classical probability model (Kolmogorov's model) [12]-[15]. Quantum probability is one of the most advanced mathematical models for non-classical probability. In [11], we proposed a quantum-like model describing the decision-making process in a two-player game, where we used the generalized quantum formalism based on lifting of density operators [16]. In this paper, we discuss the quantum-like representation of Bayesian inference, which has been used to calculate probabilities for decision making under uncertainty. The uncertainty is described in the form of a quantum superposition, and Bayesian updating is explained as a reduction of the state by quantum measurement.
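
    The classical Bayesian updating that this quantum-like model generalises can be sketched in a few lines (a minimal, generic illustration, not the authors' quantum formalism; the function name is ours):

```python
def bayes_update(prior, likelihood):
    """Classical Bayesian updating over a discrete hypothesis space:
    posterior is proportional to prior times likelihood, renormalised."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)  # marginal likelihood of the observed evidence
    return [u / z for u in unnorm]

# Two hypotheses with equal priors; evidence favouring the first 3:1.
posterior = bayes_update([0.5, 0.5], [0.75, 0.25])
# posterior -> [0.75, 0.25]
```

In the quantum-like representation, this renormalisation step is replaced by the reduction of a superposed state under measurement.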

  20. Bayesian Cosmological inference beyond statistical isotropy

    Science.gov (United States)

    Souradeep, Tarun; Das, Santanu; Wandelt, Benjamin

    2016-10-01

    With the advent of rich data sets, the computational challenge of inference in cosmology has come to rely on stochastic sampling methods. First, I review the widely used MCMC approach to inferring cosmological parameters and present SCoPE, an improved adaptive implementation developed by our group. Next, I present a general method for Bayesian inference of the underlying covariance structure of random fields on a sphere. We employ the Bipolar Spherical Harmonic (BipoSH) representation of general covariance structure on the sphere. We illustrate the efficacy of the method with a principled approach to assessing violation of statistical isotropy (SI) in sky maps of Cosmic Microwave Background (CMB) fluctuations. The general, principled approach to Bayesian inference of the covariance structure in a random field on a sphere presented here has great potential for application to many other aspects of cosmology and astronomy, as well as more distant areas of research such as geosciences and climate modelling.

  1. Machine learning a Bayesian and optimization perspective

    CERN Document Server

    Theodoridis, Sergios

    2015-01-01

    This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...

  2. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  3. Bayesian Peak Picking for NMR Spectra

    KAUST Repository

    Cheng, Yichen

    2014-02-01

    Protein structure determination is a very important topic in structural genomics, which helps people to understand a variety of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) is often used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.

  4. Bayesian Model comparison of Higgs couplings

    CERN Document Server

    Bergstrom, Johannes

    2014-01-01

    We investigate the possibility of contributions from physics beyond the Standard Model (SM) to the Higgs couplings, in the light of the LHC data. The work is performed within an interim framework where the magnitude of the Higgs production and decay rates are rescaled though Higgs coupling scale factors. We perform Bayesian parameter inference on these scale factors, concluding that there is good compatibility with the SM. Furthermore, we carry out Bayesian model comparison on all models where any combination of scale factors can differ from their SM values and find that typically models with fewer free couplings are strongly favoured. We consider the evidence that each coupling individually equals the SM value, making the minimal assumptions on the other couplings. Finally, we make a comparison of the SM against a single "not-SM" model, and find that there is moderate to strong evidence for the SM.

  5. The NIFTY way of Bayesian signal inference

    Energy Technology Data Exchange (ETDEWEB)

    Selig, Marco, E-mail: mselig@mpa-Garching.mpg.de [Max Planck Institut für Astrophysik, Karl-Schwarzschild-Straße 1, D-85748 Garching, Germany, and Ludwig-Maximilians-Universität München, Geschwister-Scholl-Platz 1, D-80539 München (Germany)

    2014-12-05

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D{sup 3}PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.

  6. Software Health Management with Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole; Schumann, JOhann

    2011-01-01

    Most modern aircraft, as well as other complex machinery, are equipped with diagnostic systems for their major subsystems. During operation, sensors provide important information about the subsystem (e.g., the engine), and that information is used to detect and diagnose faults. Most of these systems focus on the monitoring of a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we discuss our approach of using Bayesian networks for Software Health Management (SWHM). We discuss SWHM requirements, which make advanced reasoning capabilities for detection and diagnosis important. We then present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.

  7. Integrative bayesian network analysis of genomic data.

    Science.gov (United States)

    Ni, Yang; Stingo, Francesco C; Baladandayuthapani, Veerabhadran

    2014-01-01

    Rapid development of genome-wide profiling technologies has made it possible to conduct integrative analysis on genomic data from multiple platforms. In this study, we develop a novel integrative Bayesian network approach to investigate the relationships between genetic and epigenetic alterations as well as how these mutations affect a patient's clinical outcome. We take a Bayesian network approach that admits a convenient decomposition of the joint distribution into local distributions. Exploiting the prior biological knowledge about regulatory mechanisms, we model each local distribution as linear regressions. This allows us to analyze multi-platform genome-wide data in a computationally efficient manner. We illustrate the performance of our approach through simulation studies. Our methods are motivated by and applied to a multi-platform glioblastoma dataset, from which we reveal several biologically relevant relationships that have been validated in the literature as well as new genes that could potentially be novel biomarkers for cancer progression.

  8. Learning Bayesian networks using genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    Chen Fei; Wang Xiufeng; Rao Yimei

    2007-01-01

    A new method to evaluate the fitness of Bayesian networks according to the observed data is provided. The main advantage of this criterion is that it is suitable for both the complete and incomplete cases, while the others are not. Moreover, it facilitates the computation greatly. In order to reduce the search space, the notion of equivalence class proposed by David Chickering is adopted. Instead of using that method directly, the novel criterion, variable ordering, and equivalence classes are combined; moreover, the proposed method avoids some problems caused by the previous one. Later, a genetic algorithm, which allows global convergence, lacking in most of the methods searching for Bayesian networks, is applied to search for a good model in this space. To speed up convergence, the genetic algorithm is combined with a greedy algorithm. Finally, simulation shows the validity of the proposed approach.

  9. A Bayesian nonparametric meta-analysis model.

    Science.gov (United States)

    Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G

    2015-03-01

    In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.

  10. Bayesian image reconstruction: Application to emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Nunez, J.; Llacer, J.

    1989-02-01

    In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and the likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
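
    For context, the Maximum Likelihood Estimate baseline referred to above is usually computed with the ML-EM iteration for Poisson data; a minimal sketch on a toy 1D system (our own toy matrix, not the authors' tomography setup):

```python
import numpy as np

def mlem(A, y, n_iter=500):
    """ML-EM iteration for Poisson data y ~ Poisson(A @ x):
    x <- x * A^T(y / (A @ x)) / (A^T 1). Positivity is preserved
    automatically, because every factor in the update is non-negative."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])  # sensitivity image A^T 1
    for _ in range(n_iter):
        x = x * (A.T @ (y / (A @ x))) / sens
    return x

# Toy 1D "tomography": a blurring system matrix and a two-spike source.
A = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])
x_true = np.array([4.0, 0.5, 3.0])
x_hat = mlem(A, A @ x_true)  # noiseless data, so the fit recovers x_true
```

The MAP method of the abstract modifies this likelihood-only objective with an entropy prior, which regularises the slow, noise-amplifying late iterations seen here.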

  11. Bayesian inference for pulsar timing models

    CERN Document Server

    Vigeland, Sarah J

    2013-01-01

    The extremely regular, periodic radio emission from millisecond pulsars makes them useful tools for studying neutron star astrophysics, general relativity, and low-frequency gravitational waves. These studies require that the observed pulse times of arrival be fit to complicated timing models that describe numerous effects such as the astrometry of the source, the evolution of the pulsar's spin, the presence of a binary companion, and the propagation of the pulses through the interstellar medium. In this paper, we discuss the benefits of using Bayesian inference to obtain these timing solutions. These include the validation of linearized least-squares model fits when they are correct, and the proper characterization of parameter uncertainties when they are not; the incorporation of prior parameter information and of models of correlated noise; and the Bayesian comparison of alternative timing models. We describe our computational setup, which combines the timing models of tempo2 with the nested-sampling integ...

  12. Structure Learning in Bayesian Sensorimotor Integration.

    Directory of Open Access Journals (Sweden)

    Tim Genewein

    2015-08-01

    Full Text Available Previous studies have shown that sensorimotor processing can often be described by Bayesian learning, in particular the integration of prior and feedback information depending on its degree of reliability. Here we test the hypothesis that the integration process itself can be tuned to the statistical structure of the environment. We exposed human participants to a reaching task in a three-dimensional virtual reality environment where we could displace the visual feedback of their hand position in a two dimensional plane. When introducing statistical structure between the two dimensions of the displacement, we found that over the course of several days participants adapted their feedback integration process in order to exploit this structure for performance improvement. In control experiments we found that this adaptation process critically depended on performance feedback and could not be induced by verbal instructions. Our results suggest that structural learning is an important meta-learning component of Bayesian sensorimotor integration.

  13. Bayesian parameter estimation for effective field theories

    CERN Document Server

    Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A

    2015-01-01

    We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.

  14. Bayesian data analysis tools for atomic physics

    CERN Document Server

    Trassinelli, Martino

    2016-01-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum modeling uniquely. For these two studies, we implement the program Nested fit to calculate the different probability distrib...
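
    The Bayesian evidence mentioned above is the likelihood marginalised over a model's parameters; a minimal numerical illustration on hypothetical coin-flip data (our own example, not the atomic-physics cases of the paper):

```python
from math import comb

import numpy as np

def evidence_biased(k, n, grid=10001):
    """Evidence of a 'biased coin' model with a flat prior on the bias p:
    Z = integral of C(n,k) p^k (1-p)^(n-k) dp, via the trapezoidal rule."""
    p = np.linspace(0.0, 1.0, grid)
    f = comb(n, k) * p**k * (1.0 - p)**(n - k)
    return float(np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(p)))

def evidence_fair(k, n):
    """Evidence of the parameter-free 'fair coin' model (p fixed at 1/2)."""
    return comb(n, k) * 0.5**n

# 7 heads in 10 flips: the Bayes factor compares the two hypotheses.
bayes_factor = evidence_biased(7, 10) / evidence_fair(7, 10)
```

Analytically the flat-prior evidence equals 1/(n+1); for these data the Bayes factor comes out below one, so the simpler fair-coin model is slightly favoured, the usual Occam penalty for an extra free parameter.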

  15. A Bayesian approach to person perception.

    Science.gov (United States)

    Clifford, C W G; Mareschal, I; Otsuka, Y; Watson, T L

    2015-11-01

    Here we propose a Bayesian approach to person perception, outlining the theoretical position and a methodological framework for testing the predictions experimentally. We use the term person perception to refer not only to the perception of others' personal attributes such as age and sex but also to the perception of social signals such as direction of gaze and emotional expression. The Bayesian approach provides a formal description of the way in which our perception combines current sensory evidence with prior expectations about the structure of the environment. Such expectations can lead to unconscious biases in our perception that are particularly evident when sensory evidence is uncertain. We illustrate the ideas with reference to our recent studies on gaze perception which show that people have a bias to perceive the gaze of others as directed towards themselves. We also describe a potential application to the study of the perception of a person's sex, in which a bias towards perceiving males is typically observed.

  16. Bayesian analysis of multiple direct detection experiments

    CERN Document Server

    Arina, Chiara

    2013-01-01

    Bayesian methods offer a coherent and efficient framework for incorporating uncertainties into induction problems. In this article, we review how this approach applies to the analysis of dark matter direct detection experiments. In particular, we discuss the exclusion limit of XENON100 and the debated hints of detection under the hypothesis of a WIMP signal. Within parameter inference, marginalizing consistently over uncertainties to extract robust posterior probability distributions, we find that the claimed tension between XENON100 and the other experiments can be partially alleviated in an isospin-violating scenario, while the elastic scattering model appears to be compatible with the classical approach. We then move to model comparison, for which Bayesian methods are particularly well suited. First, we investigate the annual modulation seen in CoGeNT data, finding that there is weak evidence for a modulation. Modulation models due to other physics compare unfavorably with the WIMP models, paying the price for th...

  17. Bayesianism and inference to the best explanation

    Directory of Open Access Journals (Sweden)

    Valeriano IRANZO

    2008-01-01

    Full Text Available Bayesianism and Inference to the best explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives to include explanatory considerations in Bayes’s Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.

  18. Narrowband interference parameterization for sparse Bayesian recovery

    KAUST Repository

    Ali, Anum

    2015-09-11

    This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and depict suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.

  19. The Bayesian Who Knew Too Much

    CERN Document Server

    Benétreau-Dupin, Yann

    2014-01-01

    In several papers, John Norton has argued that Bayesianism cannot handle ignorance adequately due to its inability to distinguish between neutral and disconfirming evidence. He argued that this inability sows confusion in, e.g., anthropic reasoning in cosmology or the Doomsday argument, by allowing one to draw unwarranted conclusions from a lack of knowledge. Norton has suggested criteria for a candidate representation of neutral support. Imprecise credences (families of credal probability functions) constitute a Bayesian-friendly framework that allows us to avoid inadequate neutral priors and better handle ignorance. The imprecise model generally agrees with Norton's representation of ignorance but requires that his criterion of self-duality be reformulated or abandoned.

  20. Group sequential control of overall toxicity incidents in clinical trials - non-Bayesian and Bayesian approaches.

    Science.gov (United States)

    Yu, Jihnhee; Hutson, Alan D; Siddiqui, Adnan H; Kedron, Mary A

    2016-02-01

    In some small clinical trials, toxicity is not a primary endpoint; however, it often has dire effects on patients' quality of life and is even life-threatening. For such clinical trials, rigorous control of the overall incidence of adverse events is desirable, while simultaneously collecting safety information. In this article, we propose group sequential toxicity monitoring strategies to control overall toxicity incidents below a certain level as opposed to performing hypothesis testing, which can be incorporated into an existing study design based on the primary endpoint. We consider two sequential methods: a non-Bayesian approach in which stopping rules are obtained based on the 'future' probability of an excessive toxicity rate; and a Bayesian adaptation modifying the proposed non-Bayesian approach, which can use the information obtained at interim analyses. Through an extensive Monte Carlo study, we show that the Bayesian approach often provides better control of the overall toxicity rate than the non-Bayesian approach. We also investigate adequate toxicity estimation after the studies. We demonstrate the applicability of our proposed methods in controlling the symptomatic intracranial hemorrhage rate for treating acute ischemic stroke patients.
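
    A stopping rule in this spirit can be sketched with a conjugate Beta-Binomial posterior (a generic illustration with hypothetical thresholds and our own function names, not the authors' actual design):

```python
import math

def posterior_prob_exceeds(k, n, p0, a=1.0, b=1.0, grid=100001):
    """P(toxicity rate p > p0 | data) after observing k toxicities in n
    patients, under a Beta(a, b) prior: the Beta(a + k, b + n - k)
    posterior density is integrated numerically on (p0, 1)."""
    a_post, b_post = a + k, b + n - k
    log_norm = (math.lgamma(a_post + b_post)
                - math.lgamma(a_post) - math.lgamma(b_post))
    xs = [p0 + (1.0 - p0) * i / (grid - 1) for i in range(grid)]
    pdf = [math.exp(log_norm + (a_post - 1) * math.log(x)
                    + (b_post - 1) * math.log(1.0 - x)) if 0.0 < x < 1.0
           else 0.0
           for x in xs]
    h = (1.0 - p0) / (grid - 1)
    return h * (sum(pdf) - 0.5 * (pdf[0] + pdf[-1]))  # trapezoidal rule

# Interim look: 4 toxicities in 10 patients, target rate 20%.
prob = posterior_prob_exceeds(4, 10, 0.20)
stop = prob > 0.80  # hypothetical stopping threshold
```

Because the posterior probability that the rate exceeds 20% is high here, this hypothetical rule would halt accrual at the interim look; the paper's Bayesian adaptation refines how such interim information enters the boundaries.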

  1. The Size-Weight Illusion is not anti-Bayesian after all: a unifying Bayesian account.

    Science.gov (United States)

    Peters, Megan A K; Ma, Wei Ji; Shams, Ladan

    2016-01-01

    When we lift two differently-sized but equally-weighted objects, we expect the larger to be heavier, but the smaller feels heavier. However, traditional Bayesian approaches with "larger is heavier" priors predict the smaller object should feel lighter; this Size-Weight Illusion (SWI) has thus been labeled "anti-Bayesian" and has stymied psychologists for generations. We propose that previous Bayesian approaches neglect the brain's inference process about density. In our Bayesian model, objects' perceived heaviness relationship is based on both their size and inferred density relationship: observers evaluate competing, categorical hypotheses about objects' relative densities, the inference about which is then used to produce the final estimate of weight. The model can qualitatively and quantitatively reproduce the SWI and explain other researchers' findings, and also makes a novel prediction, which we confirmed. This same computational mechanism accounts for other multisensory phenomena and illusions; that the SWI follows the same process suggests that competitive-prior Bayesian inference can explain human perception across many domains.

  2. Stratospheric aerosol geoengineering

    Science.gov (United States)

    Robock, Alan

    2015-03-01

    The Geoengineering Model Intercomparison Project, conducting climate model experiments with standard stratospheric aerosol injection scenarios, has found that insolation reduction could keep the global average temperature constant, but global average precipitation would reduce, particularly in summer monsoon regions around the world. Temperature changes would also not be uniform; the tropics would cool, but high latitudes would warm, with continuing, but reduced sea ice and ice sheet melting. Temperature extremes would still increase, but not as much as without geoengineering. If geoengineering were halted all at once, there would be rapid temperature and precipitation increases at 5-10 times the rates from gradual global warming. The prospect of geoengineering working may reduce the current drive toward reducing greenhouse gas emissions, and there are concerns about commercial or military control. Because geoengineering cannot safely address climate change, global efforts to reduce greenhouse gas emissions and to adapt are crucial to address anthropogenic global warming.

  3. Aerosol Transmission of Filoviruses

    Directory of Open Access Journals (Sweden)

    Berhanu Mekibib

    2016-05-01

    Full Text Available Filoviruses have become a worldwide public health concern because of their potential for introduction into non-endemic countries through international travel and the international transport of infected animals or animal products. Since the virus was first identified in 1976 in the Democratic Republic of Congo (formerly Zaire) and Sudan, the 2013–2015 western African Ebola virus disease (EVD) outbreak is the largest, both by number of cases and geographical extension, and the deadliest, recorded so far in medical history. The source of ebolaviruses for human index case(s) in most outbreaks is presumptively associated with the handling of bush meat or contact with fruit bats. Transmission among humans occurs easily when a person comes in contact with contaminated body fluids of patients, but our understanding of other transmission routes is still fragmentary. This review deals with the controversial issue of aerosol transmission of filoviruses.

  4. Stratospheric aerosol geoengineering

    Energy Technology Data Exchange (ETDEWEB)

    Robock, Alan [Department of Environmental Sciences, Rutgers University, 14 College Farm Road, New Brunswick, NJ 08901 (United States)

    2015-03-30

    The Geoengineering Model Intercomparison Project, conducting climate model experiments with standard stratospheric aerosol injection scenarios, has found that insolation reduction could keep the global average temperature constant, but global average precipitation would reduce, particularly in summer monsoon regions around the world. Temperature changes would also not be uniform; the tropics would cool, but high latitudes would warm, with continuing, but reduced sea ice and ice sheet melting. Temperature extremes would still increase, but not as much as without geoengineering. If geoengineering were halted all at once, there would be rapid temperature and precipitation increases at 5–10 times the rates from gradual global warming. The prospect of geoengineering working may reduce the current drive toward reducing greenhouse gas emissions, and there are concerns about commercial or military control. Because geoengineering cannot safely address climate change, global efforts to reduce greenhouse gas emissions and to adapt are crucial to address anthropogenic global warming.

  5. Bayesian Particle Tracking of Traffic Flows

    OpenAIRE

    2014-01-01

    We develop a Bayesian particle filter for tracking traffic flows that is capable of capturing non-linearities and discontinuities present in flow dynamics. Our model includes a hidden state variable that captures sudden regime shifts between traffic free flow, breakdown and recovery. We develop an efficient particle learning algorithm for real time on-line inference of states and parameters. This requires a two step approach, first, resampling the current particles, with a mixture predictive ...
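The abstract does not spell out the algorithm; as a rough sketch of the underlying idea only, the following bootstrap particle filter tracks a hidden two-regime state (free flow vs. breakdown) in a hypothetical speed-observation model. All constants (`MEANS`, `STICK`, `OBS_SD`) are invented for illustration, and this uses plain resampling rather than the paper's particle-learning scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-regime speed model: regime 0 = free flow, regime 1 = breakdown.
MEANS = np.array([60.0, 20.0])   # mean observed speed in each regime
STICK = 0.95                     # probability of staying in the current regime
OBS_SD = 5.0                     # observation noise

def simulate(T=100):
    z, zs, ys = 0, [], []
    for _ in range(T):
        if rng.random() > STICK:
            z = 1 - z            # sudden regime switch
        zs.append(z)
        ys.append(rng.normal(MEANS[z], OBS_SD))
    return np.array(zs), np.array(ys)

def bootstrap_filter(ys, n_particles=500):
    # Particles carry the hidden regime; propagate, weight by the likelihood
    # of the new observation, then resample.
    z = rng.integers(0, 2, n_particles)
    est = []
    for y in ys:
        flip = rng.random(n_particles) > STICK
        z = np.where(flip, 1 - z, z)
        w = np.exp(-0.5 * ((y - MEANS[z]) / OBS_SD) ** 2)
        w /= w.sum()
        z = z[rng.choice(n_particles, n_particles, p=w)]
        est.append(z.mean())     # posterior probability of "breakdown"
    return np.array(est)

zs, ys = simulate()
post = bootstrap_filter(ys)
acc = np.mean((post > 0.5) == zs)
```

Because the two regime means are far apart relative to the observation noise, the filtered regime probability tracks the true regime almost everywhere except briefly at switch times.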

  6. Bayesian belief networks in business continuity.

    Science.gov (United States)

    Phillipson, Frank; Matthijssen, Edwin; Attema, Thomas

    2014-01-01

    Business continuity professionals aim to mitigate the various challenges to the continuity of their company. The goal is a coherent system of measures that encompass detection, prevention and recovery. Choices made in one part of the system affect other parts as well as the continuity risks of the company. In complex organisations, however, these relations are far from obvious. This paper proposes the use of Bayesian belief networks to expose these relations, and presents a modelling framework for this approach.
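A toy illustration of the idea, with a three-node network and entirely made-up probabilities (none taken from the paper): exact inference by enumeration shows how a detection measure changes the posterior continuity risk.

```python
from itertools import product

# Toy three-node network (all probabilities invented): Disruption -> Detection,
# and (Disruption, Detection) -> ContinuityLoss.
P_disruption = {True: 0.1, False: 0.9}
P_detect = {True: {True: 0.8, False: 0.2},     # P(detect | disruption)
            False: {True: 0.05, False: 0.95}}  # false-alarm rate
P_loss = {(True, True): 0.2,    # disruption caught early: lower loss risk
          (True, False): 0.7,   # undetected disruption: high loss risk
          (False, True): 0.01,
          (False, False): 0.01}

def joint(d, det, loss):
    p = P_disruption[d] * P_detect[d][det]
    p_loss = P_loss[(d, det)]
    return p * (p_loss if loss else 1 - p_loss)

# Query by full enumeration: P(disruption | continuity loss observed).
num = sum(joint(True, det, True) for det in (True, False))
den = sum(joint(d, det, True) for d, det in product((True, False), repeat=2))
posterior = num / den
```

Observing a continuity loss raises the probability of an underlying disruption from the 10% prior to roughly 77%, exactly the kind of non-obvious cross-dependence the paper argues the networks expose.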

  7. Bayesian Spatial Modelling with R-INLA

    OpenAIRE

    Finn Lindgren; Håvard Rue

    2015-01-01

The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic...
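INLA itself is an R package, but the Laplace-approximation idea at its core can be sketched in a few lines: approximate a posterior by a Gaussian centred at its mode, with variance from the curvature there. The beta-binomial example below is illustrative only and far simpler than the latent Gaussian models INLA actually targets.

```python
import numpy as np

# Illustrative binomial data with a uniform prior (not from the paper).
s, n = 12, 40

def log_post(p):
    # Log posterior up to a constant: uniform prior times binomial likelihood.
    return s * np.log(p) + (n - s) * np.log(1 - p)

# Mode and curvature (negative second derivative at the mode) give the
# Gaussian (Laplace) approximation.
p_hat = s / n
curv = s / p_hat ** 2 + (n - s) / (1 - p_hat) ** 2
approx_sd = curv ** -0.5

# Exact posterior for comparison: Beta(s + 1, n - s + 1).
exact_mean = (s + 1) / (n + 2)
exact_sd = np.sqrt(exact_mean * (1 - exact_mean) / (n + 3))
```

Even in this one-parameter case the Gaussian approximation lands within a few percent of the exact posterior mean and standard deviation, which is why nested Laplace approximations can replace MCMC for well-behaved latent Gaussian posteriors.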

  8. Bayesian Probabilities and the Histories Algebra

    OpenAIRE

    Marlow, Thomas

    2006-01-01

    We attempt a justification of a generalisation of the consistent histories programme using a notion of probability that is valid for all complete sets of history propositions. This consists of introducing Cox's axioms of probability theory and showing that our candidate notion of probability obeys them. We also give a generalisation of Bayes' theorem and comment upon how Bayesianism should be useful for the quantum gravity/cosmology programmes.

  9. Bayesian Estimation and Inference Using Stochastic Electronics.

    Science.gov (United States)

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André

    2016-01-01

In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes on the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.
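The Bayesian recursive equation such a tracker solves is a discrete predict-update loop over an HMM. The sketch below is a plain software analogue, not the stochastic-electronics implementation; the grid size, sensor accuracy, and random-walk transition model are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 20            # discrete target positions
P_CORRECT = 0.8   # sensor reports the true cell w.p. 0.8, else a uniform error

# Random-walk transition matrix: the target stays or moves one step.
T = np.zeros((N, N))
for i in range(N):
    for j in (i - 1, i, i + 1):
        if 0 <= j < N:
            T[i, j] = 1.0
T /= T.sum(axis=1, keepdims=True)

def likelihood(obs):
    lik = np.full(N, (1 - P_CORRECT) / (N - 1))
    lik[obs] = P_CORRECT
    return lik

def track(observations):
    # Bayesian recursion: predict with T, update with the sensor likelihood.
    belief = np.full(N, 1.0 / N)
    for obs in observations:
        belief = T.T @ belief            # predict
        belief *= likelihood(obs)        # update
        belief /= belief.sum()           # normalize
    return belief

# A stationary target observed through the noisy sensor.
true_pos = 7
obs = [true_pos if rng.random() < P_CORRECT else int(rng.integers(0, N))
       for _ in range(30)]
belief = track(obs)
```

The posterior mass concentrates on the true cell despite 20% sensor errors; the BEAST hardware additionally learns `T` and the sensor model online rather than assuming them.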

  10. Bayesian variable selection with spherically symmetric priors

    OpenAIRE

    De Kock, M. B.; Eggers, H. C.

    2014-01-01

    We propose that Bayesian variable selection for linear parametrisations with Gaussian iid likelihoods be based on the spherical symmetry of the diagonalised parameter space. Our r-prior results in closed forms for the evidence for four examples, including the hyper-g prior and the Zellner-Siow prior, which are shown to be special cases. Scenarios of a single variable dispersion parameter and of fixed dispersion are studied, and asymptotic forms comparable to the traditional information criter...

  11. Bayesian nonparametric duration model with censorship

    Directory of Open Access Journals (Sweden)

    Joseph Hakizamungu

    2007-10-01

Full Text Available This paper is concerned with nonparametric i.i.d. duration models with censored observations, and we establish by a simple and unified approach the general structure of a Bayesian nonparametric estimator for a survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.

  12. Variations on Bayesian Prediction and Inference

    Science.gov (United States)

    2016-05-09

“Variations on Bayesian prediction and inference,” Ryan Martin, Department of Mathematics, Statistics, and Computer Science, University of Illinois at Chicago ... using statistical ideas/methods. We recently learned that this new project will be supported, in part, by the National Science Foundation. ... Kalli, M., Griffin, J. E., Walker, S. G. (2011). Slice sampling mixture models. Statistics and Computing 21, 93–105. Koenker, R. (2005). Quantile

  13. Distributed Estimation using Bayesian Consensus Filtering

    Science.gov (United States)

    2014-06-06

Communication of probability distributions and computational methods for implementing the BCF algorithm are discussed along with a numerical example. ... Bayesian filters over Kalman filter–based methods for estimation of nonlinear target dynamic models is that no approximation is needed during the ... states across the network. 2014 American Control Conference (ACC), June 4-6, 2014, Portland, Oregon, USA. Finally, we

  14. Bayesian Variable Selection via Particle Stochastic Search.

    Science.gov (United States)

    Shi, Minghui; Dunson, David B

    2011-02-01

    We focus on Bayesian variable selection in regression models. One challenge is to search the huge model space adequately, while identifying high posterior probability regions. In the past decades, the main focus has been on the use of Markov chain Monte Carlo (MCMC) algorithms for these purposes. In this article, we propose a new computational approach based on sequential Monte Carlo (SMC), which we refer to as particle stochastic search (PSS). We illustrate PSS through applications to linear regression and probit models.

  15. Inhalation toxicity of lithium combustion aerosols in rats

    Energy Technology Data Exchange (ETDEWEB)

    Greenspan, B.J.; Allen, M.D.; Rebar, A.H.

    1986-01-01

Studies of the acute inhalation toxicity of lithium combustion aerosols were undertaken to aid in evaluating the health hazards associated with the proposed use of lithium metal in fusion reactors. Male and female F344/Lov rats, 9-12 wk of age, were exposed once for 4 h to concentrations of 2600, 2300, 1400, or 620 mg/m³ of aerosol (MMAD = 0.69 µm, σg = 1.45) that was approximately 80% lithium carbonate and 20% lithium hydroxide to determine the acute toxic effects. Fourteen-day LC50 values (with 95% confidence limits) of 1700 (1300-2000) mg/m³ for the male rats and 2000 (1700-2400) mg/m³ for the female rats were calculated. Clinical signs of anorexia, dehydration, respiratory difficulty, and perioral and perinasal encrustation were observed. Body weights were decreased the first day after exposure in relation to the exposure concentration. In animals observed for an additional 2 wk, body weights, organ weights, and clinical signs began to return to pre-exposure values. Histopathologic examination of the respiratory tracts from the animals revealed ulcerative or necrotic laryngitis, focal to segmental ulcerative rhinitis often accompanied by areas of squamous metaplasia, and, in some cases, a suppurative bronchopneumonia or aspiration pneumonia, probably secondary to the laryngeal lesions. The results of these studies indicate the moderate acute toxicity of lithium carbonate aerosols and will aid in the risk analysis of accidental releases of lithium combustion aerosols.

  16. Black carbon aerosols and the third polar ice cap

    Energy Technology Data Exchange (ETDEWEB)

    Menon, Surabi; Koch, Dorothy; Beig, Gufran; Sahu, Saroj; Fasullo, John; Orlikowski, Daniel

    2010-04-15

Recent thinning of glaciers over the Himalayas (sometimes referred to as the third polar region) have raised concern on future water supplies since these glaciers supply water to large river systems that support millions of people inhabiting the surrounding areas. Black carbon (BC) aerosols, released from incomplete combustion, have been increasingly implicated as causing large changes in the hydrology and radiative forcing over Asia and its deposition on snow is thought to increase snow melt. In India BC emissions from biofuel combustion is highly prevalent and compared to other regions, BC aerosol amounts are high. Here, we quantify the impact of BC aerosols on snow cover and precipitation from 1990 to 2010 over the Indian subcontinental region using two different BC emission inventories. New estimates indicate that Indian BC emissions from coal and biofuel are large and transport is expected to expand rapidly in coming years. We show that over the Himalayas, from 1990 to 2000, simulated snow/ice cover decreases by ~0.9% due to aerosols. The contribution of the enhanced Indian BC to this decline is ~36%, similar to that simulated for 2000 to 2010. Spatial patterns of modeled changes in snow cover and precipitation are similar to observations (from 1990 to 2000), and are mainly obtained with the newer BC estimates.

  17. Black carbon aerosols and the third polar ice cap

    Directory of Open Access Journals (Sweden)

    S. Menon

    2009-12-01

Full Text Available Recent thinning of glaciers over the Himalayas (sometimes referred to as the third polar region) have raised concern on future water supplies since these glaciers supply water to large river systems that support millions of people inhabiting the surrounding areas. Black carbon (BC) aerosols, released from incomplete combustion, have been increasingly implicated as causing large changes in the hydrology and radiative forcing over Asia and its deposition on snow is thought to increase snow melt. In India BC from biofuel combustion is highly prevalent and compared to other regions, BC aerosol amounts are high. Here, we quantify the impact of BC aerosols on snow cover and precipitation from 1990 to 2010 over the Indian subcontinental region using two different BC emission inventories. New estimates indicate that Indian BC from coal and biofuel are large and transport is expected to expand rapidly in coming years. We show that over the Himalayas, from 1990 to 2000, simulated snow/ice cover decreases by ~0.9% due to aerosols. The contribution of the enhanced Indian BC to this decline is ~30%, similar to that simulated for 2000 to 2010. Spatial patterns of modeled changes in snow cover and precipitation are similar to observations (from 1990 to 2000), and are mainly obtained with the newer BC estimates.

  18. Black carbon aerosols and the third polar ice cap

    Directory of Open Access Journals (Sweden)

    S. Menon

    2010-05-01

Full Text Available Recent thinning of glaciers over the Himalayas (sometimes referred to as the third polar region) have raised concern on future water supplies since these glaciers supply water to large river systems that support millions of people inhabiting the surrounding areas. Black carbon (BC) aerosols, released from incomplete combustion, have been increasingly implicated as causing large changes in the hydrology and radiative forcing over Asia and its deposition on snow is thought to increase snow melt. In India BC emissions from biofuel combustion is highly prevalent and compared to other regions, BC aerosol amounts are high. Here, we quantify the impact of BC aerosols on snow cover and precipitation from 1990 to 2010 over the Indian subcontinental region using two different BC emission inventories. New estimates indicate that Indian BC emissions from coal and biofuel are large and transport is expected to expand rapidly in coming years. We show that over the Himalayas, from 1990 to 2000, simulated snow/ice cover decreases by ~0.9% due to aerosols. The contribution of the enhanced Indian BC to this decline is ~36%, similar to that simulated for 2000 to 2010. Spatial patterns of modeled changes in snow cover and precipitation are similar to observations (from 1990 to 2000), and are mainly obtained with the newer BC estimates.

  19. Biological aerosol detection with combined passive-active infrared measurements

    Science.gov (United States)

    Ifarraguerri, Agustin I.; Vanderbeek, Richard G.; Ben-David, Avishai

    2004-12-01

A data collection experiment was performed in November of 2003 to measure aerosol signatures using multiple sensors, all operating in the long-wave infrared. The purpose of this data collection experiment was to determine whether combining passive hyperspectral and LIDAR measurements can substantially improve biological aerosol detection performance. Controlled releases of dry aerosols, including road dust, egg albumin and two strains of Bacillus subtilis var. niger (BG) spores were performed using the ECBC/ARTEMIS open-path aerosol test chamber located in the Edgewood Area of Aberdeen Proving Ground, MD. The chamber provides a ~ 20' path without optical windows. Ground truth devices included 3 aerodynamic particle sizers, an optical particle size spectrometer, 6 nephelometers and a high-volume particle sampler. Two sensors were used to make measurements during the test: the AIRIS long-wave infrared imaging spectrometer and the FAL CO2 LIDAR. The AIRIS and FAL data sets were analyzed for detection performance relative to the ground truth. In this paper we present experimental results from the individual sensors as well as results from passive-active sensor fusion. The sensor performance is presented in the form of receiver operating characteristic curves.
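Receiver operating characteristic curves such as those reported are built by sweeping a detection threshold over the score populations for the two conditions. The Gaussian scores below are synthetic stand-ins, since the actual fused sensor outputs are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic detector scores standing in for the fused passive-active output.
scores_bg   = rng.normal(0.0, 1.0, 500)   # background, no release
scores_aero = rng.normal(2.0, 1.0, 500)   # biological aerosol present

def roc_curve(neg, pos):
    # Sweep the detection threshold from high to low over all observed scores.
    thresholds = np.sort(np.concatenate([neg, pos]))[::-1]
    tpr = np.array([(pos >= t).mean() for t in thresholds])
    fpr = np.array([(neg >= t).mean() for t in thresholds])
    return fpr, tpr

fpr, tpr = roc_curve(scores_bg, scores_aero)
# Area under the curve via the rank (Mann-Whitney) statistic.
auc = (scores_aero[:, None] > scores_bg[None, :]).mean()
```

Sensor fusion, in this picture, amounts to producing a combined score whose two populations are better separated, which pushes the whole curve toward the top-left corner and the AUC toward 1.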

  20. Fluorescence from atmospheric aerosol detected by a lidar indicates biogenic particles in the lowermost stratosphere

    Directory of Open Access Journals (Sweden)

    F. Immler

    2005-01-01

Full Text Available With a lidar system that was installed in Lindenberg/Germany, we observed in June 2003 an extended aerosol layer at 13 km altitude in the lowermost stratosphere. This layer created an inelastic backscatter signal that we detected with a water vapour Raman channel, but that was not produced by Raman scattering. Also, we find evidence for inelastic scattering from a smoke plume from a forest fire that we observed in the troposphere. We interpret the unexpected properties of these aerosols as fluorescence induced by the laser beam in organic components of the aerosol particles. Fluorescence from ambient aerosol had not yet been considered detectable by lidar systems. However, organic compounds such as polycyclic aromatic hydrocarbons sticking to the aerosol particles, or bioaerosol such as bacteria, spores or pollen fluoresce when excited with UV-radiation in a way that is detectable by our lidar system. Therefore, we conclude that fluorescence from organic material released by biomass burning creates inelastic backscatter signals that we measured with our instrument and thus demonstrate a new and powerful way to characterize aerosols by a remote sensing technique. The stratospheric aerosol layer that we have observed in Lindenberg for three consecutive days is likely to be a remnant from Siberian forest fire plumes lifted across the tropopause and transported around the globe.

  1. In-canopy gradients, composition, and sources of optically active aerosols over the Amazon forest

    Science.gov (United States)

    Guyon, P.; Graham, B.; Roberts, G. C.; Mayol-Bracero, O. L.; Andreae, M. O.; Artaxo, P.; Maenhaut, W.

    2003-04-01

As part of the European contribution to the Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA-EUSTACH), size-fractionated aerosol samples were collected at a primary rainforest site in the Brazilian Amazon during the wet and dry seasons. Daytime-nighttime segregated sampling was carried out at three different heights (above, within and below canopy level) on a 54 m meteorological tower. The samples were analyzed for up to 19 trace elements, equivalent black carbon (BCe) and mass concentrations. Additionally, measurements of scattering and absorption coefficients were performed. Absolute principal component analysis revealed that the wet and dry season aerosols contained the same three main aerosol components, namely a natural biogenic, a pyrogenic, and a soil dust component, but that these were present in different (absolute and relative) amounts. The elements related to biomass burning and soil dust generally exhibited highest concentrations above the canopy and during daytime, whilst forest-derived aerosol was more concentrated underneath the canopy and during nighttime. These variations can be largely attributed to daytime convective mixing and the formation of a shallow nocturnal boundary layer, along with the possibility of enhanced nighttime release of biogenic aerosol particles. All three components contributed significantly to light extinction, suggesting that, in addition to pyrogenic particles, biogenic and soil dust aerosols must be taken into account when modeling the physical and optical properties of aerosols in forested regions such as the Amazon Basin.
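Absolute principal component analysis is a source-apportionment variant of ordinary PCA. As a minimal sketch of the underlying decomposition only (not the APCA normalization itself), the example below recovers a two-source structure from synthetic element concentrations with invented "biogenic" and "dust" profiles.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical two-source mixture: element profiles for a "biogenic" and a
# "dust" source over 5 trace elements (all numbers invented).
biogenic = np.array([5.0, 1.0, 0.2, 0.1, 3.0])
dust     = np.array([0.1, 0.2, 4.0, 6.0, 0.5])
n = 200
strengths = rng.uniform(0, 1, size=(n, 2))          # source strength per sample
X = strengths @ np.vstack([biogenic, dust]) + rng.normal(0, 0.05, size=(n, 5))

# PCA on standardized concentrations: eigendecomposition of the correlations.
Z = (X - X.mean(0)) / X.std(0)
cov = np.cov(Z.T)
evals, evecs = np.linalg.eigh(cov)
explained = evals[::-1] / evals.sum()               # variance explained, sorted
```

With only two real sources, the first two components capture nearly all the variance; in the study, the retained components are then interpreted chemically (biogenic, pyrogenic, soil dust) from their element loadings.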

  2. The effect of volcanic aerosols on the thermal infrared budget of the lower stratosphere

    Science.gov (United States)

    Charlock, T. P.

    1983-01-01

    The thermal IR heating of the stratosphere due to volcanic aerosols such as those released by the eruption of El Chichon is investigated by means of clear-sky model computations using a LOWTRAN5 radiance code (Kneizys et al., 1980) modified by Charlock (1983) to increase its vertical resolution. The results are presented graphically for 4-km-thick aerosol layers at altitudes 18, 22, and 25 km and at latitudes 0 deg and 35 deg N, and the effects of tropospheric cloud height (0-10 km) are taken into account. The aerosol-induced IR divergence is shown to depend on aerosol height and to be highly and nonlinearly sensitive to the location of underlying water clouds.

  3. The kinetics of aerosol particle formation and removal in NPP severe accidents

    Science.gov (United States)

    Zatevakhin, Mikhail A.; Arefiev, Valentin K.; Semashko, Sergey E.; Dolganov, Rostislav A.

    2016-06-01

    Severe Nuclear Power Plant (NPP) accidents are accompanied by release of a massive amount of energy, radioactive products and hydrogen into the atmosphere of the NPP containment. A valid estimation of consequences of such accidents can only be carried out through the use of the integrated codes comprising a description of the basic processes which determine the consequences. A brief description of a coupled aerosol and thermal-hydraulic code to be used for the calculation of the aerosol kinetics within the NPP containment in case of a severe accident is given. The code comprises a KIN aerosol unit integrated into the KUPOL-M thermal-hydraulic code. Some features of aerosol behavior in severe NPP accidents are briefly described.

  4. A Bayesian sequential design with binary outcome.

    Science.gov (United States)

    Zhu, Han; Yu, Qingzhao; Mercante, Donald E

    2017-03-02

Several researchers have proposed solutions to control the type I error rate in sequential designs. The use of Bayesian sequential designs is becoming more common; however, these designs are subject to inflation of the type I error rate. We propose a Bayesian sequential design for binary outcome using an alpha-spending function to control the overall type I error rate. Algorithms are presented for calculating critical values and power for the proposed designs. We also propose a new stopping rule for futility. Sensitivity analysis is implemented for assessing the effects of varying the parameters of the prior distribution and maximum total sample size on critical values. Alpha-spending functions are compared using power and actual sample size through simulations. Further simulations show that, when total sample size is fixed, the proposed design has greater power than the traditional Bayesian sequential design, which sets equal stopping bounds at all interim analyses. We also find that the proposed design with the new futility stopping rule results in greater power and can stop earlier with a smaller actual sample size, compared with the traditional stopping rule for futility when all other conditions are held constant. Finally, we apply the proposed method to a real data set and compare the results with traditional designs.
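A minimal simulation in the spirit of the design, not the authors' algorithm: a binary-outcome trial with interim looks that stops for efficacy on a posterior-probability cutoff, with the overall type I error rate then estimated by Monte Carlo. The looks, cutoff, and null rate are invented for illustration, and the cutoff here is a fixed bound rather than one derived from an alpha-spending function.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(8)

def post_prob_above(successes, n, p0):
    # P(p > p0) under the Beta(1 + s, 1 + n - s) posterior (uniform prior),
    # by trapezoidal integration on a grid.
    a, b = 1 + successes, 1 + n - successes
    x = np.linspace(p0, 1 - 1e-9, 4000)
    logpdf = ((a - 1) * np.log(x) + (b - 1) * np.log(1 - x)
              + lgamma(a + b) - lgamma(a) - lgamma(b))
    y = np.exp(logpdf)
    return float(np.sum((y[1:] + y[:-1]) / 2) * (x[1] - x[0]))

def run_trial(p_true, p0=0.3, looks=(20, 40, 60), cutoff=0.99):
    # Stop early for efficacy once the posterior probability that the
    # response rate exceeds p0 passes the cutoff.
    s = n = 0
    for look in looks:
        s += int(np.sum(rng.random(look - n) < p_true))
        n = look
        if post_prob_above(s, n, p0) > cutoff:
            return True, n
    return False, n

# Monte Carlo estimate of the type I error rate under the null p_true = p0:
# repeated looks inflate it above the single-look level, which is what
# alpha-spending boundaries are designed to control.
type1 = np.mean([run_trial(0.3)[0] for _ in range(400)])
```
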

  5. Sparse Bayesian learning in ISAR tomography imaging

    Institute of Scientific and Technical Information of China (English)

    SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang

    2015-01-01

Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from the problem of high sidelobes and low resolution. This paper addresses ISAR tomography image reconstruction within a sparse Bayesian framework. Firstly, the sparse ISAR tomography imaging model is established in light of CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. Since the performance of existing CS-based ISAR imaging algorithms is sensitive to the user parameter, the existing algorithms are inconvenient to use in practice. It is well known that the Bayesian formalism of the recovery algorithm named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification, which uses an efficient expectation maximization procedure to estimate the necessary parameters, and retains a preferable property of the l0-norm diversity measure. Motivated by that, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over the existing algorithms.
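The SBL machinery is easiest to see on a generic sparse linear model. The sketch below runs MacKay-style evidence-maximization (automatic relevance determination) updates on synthetic data rather than ISAR measurements, with the noise precision assumed known for brevity; the dictionary and sparsity pattern are invented.

```python
import numpy as np

rng = np.random.default_rng(9)

# Sparse linear model y = Phi w + noise with only two active components,
# a synthetic stand-in for the image-formation problem.
n, d = 100, 20
Phi = rng.normal(size=(n, d))
w_true = np.zeros(d)
w_true[[2, 11]] = [3.0, -2.0]
y = Phi @ w_true + rng.normal(scale=0.1, size=n)

alpha = np.ones(d)                        # per-weight prior precisions (ARD)
beta = 100.0                              # noise precision, assumed known here
for _ in range(50):
    Sigma = np.linalg.inv(beta * Phi.T @ Phi + np.diag(alpha))
    mu = beta * Sigma @ (Phi.T @ y)       # posterior mean of the weights
    gamma = 1.0 - alpha * np.diag(Sigma)  # effective degrees of freedom
    alpha = np.minimum(gamma / (mu ** 2 + 1e-12), 1e6)

active = np.where(alpha < 1e3)[0]         # a huge alpha prunes a weight to zero
```

No user-tuned regularization parameter appears: the per-weight precisions are estimated from the evidence, which is the property the paper exploits to make the ISAR algorithm fully automated.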

  6. Bayesian Inference of a Multivariate Regression Model

    Directory of Open Access Journals (Sweden)

    Marick S. Sinay

    2014-01-01

Full Text Available We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior, multivariate normal distribution for the regression coefficients and inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
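The matrix-logarithm idea can be sketched directly: any symmetric matrix maps through the matrix exponential to a symmetric positive-definite matrix, so a normal prior on the unconstrained elements induces a valid prior on covariances with no positivity constraint to enforce. The dimension and prior scale below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(10)

def sym_expm(A):
    # Matrix exponential of a symmetric matrix via its eigendecomposition.
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

p = 3
# One draw from the prior: normal entries, symmetrized -- this is the
# unconstrained parameter A = log(Sigma).
A = rng.normal(scale=0.5, size=(p, p))
A = (A + A.T) / 2.0
Sigma = sym_expm(A)

eigs = np.linalg.eigvalsh(Sigma)          # all eigenvalues of exp(A) are e^w > 0
```

Since the eigenvalues of `Sigma` are exponentials of those of `A`, every draw is automatically positive definite, which is what lets the MCMC sampler move freely in the unconstrained space.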

  7. Bayesian Recurrent Neural Network for Language Modeling.

    Science.gov (United States)

    Chien, Jen-Tzung; Ku, Yuan-Chu

    2016-02-01

A language model (LM) is calculated as the probability of a word sequence that provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful for learning the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and apply it for continuous speech recognition. We aim to penalize the too complicated RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to a Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.

  8. Bayesian posterior distributions without Markov chains.

    Science.gov (United States)

    Cole, Stephen R; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B

    2012-03-01

    Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976-1983) assessing the relation between residential exposure to magnetic fields and the development of childhood cancer. Results from rejection sampling (odds ratio (OR) = 1.69, 95% posterior interval (PI): 0.57, 5.00) were similar to MCMC results (OR = 1.69, 95% PI: 0.58, 4.95) and approximations from data-augmentation priors (OR = 1.74, 95% PI: 0.60, 5.06). In example 2, the authors apply rejection sampling to a cohort study of 315 human immunodeficiency virus seroconverters (1984-1998) to assess the relation between viral load after infection and 5-year incidence of acquired immunodeficiency syndrome, adjusting for (continuous) age at seroconversion and race. In this more complex example, rejection sampling required a notably longer run time than MCMC sampling but remained feasible and again yielded similar results. The transparency of the proposed approach comes at a price of being less broadly applicable than MCMC.
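A minimal version of such a transparent rejection sampler, with made-up binomial data rather than the case-control study's: draw from the uniform prior, accept with probability proportional to the likelihood, and compare against the known Beta posterior.

```python
import numpy as np

rng = np.random.default_rng(6)

# Illustrative data: s successes in n trials, prior p ~ Uniform(0, 1).
s, n = 7, 20

def likelihood(p):
    return p ** s * (1 - p) ** (n - s)

# Envelope constant: the likelihood is maximized at the MLE p = s/n.
L_max = likelihood(s / n)

samples = []
while len(samples) < 5000:
    p = rng.random()                              # proposal from the prior
    if rng.random() < likelihood(p) / L_max:      # accept w.p. L(p)/L_max
        samples.append(p)
samples = np.array(samples)

# Exact posterior for comparison: Beta(s + 1, n - s + 1), mean (s+1)/(n+2).
exact_mean = (s + 1) / (n + 2)
```

Every accepted draw is an exact, independent posterior sample, which is what makes the method transparent; the price, as the abstract notes, is a run time that grows quickly with model complexity because acceptance rates collapse.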

  9. Bayesian and Dempster–Shafer fusion

    Indian Academy of Sciences (India)

    Subhash Challa; Don Koks

    2004-04-01

The Kalman Filter is traditionally viewed as a prediction–correction filtering algorithm. In this work we show that it can be viewed as a Bayesian fusion algorithm and derive it using Bayesian arguments. We begin with an outline of Bayes theory, using it to discuss well-known quantities such as priors, likelihood and posteriors, and we provide the basic Bayesian fusion equation. We derive the Kalman Filter from this equation using a novel method to evaluate the Chapman–Kolmogorov prediction integral. We then use the theory to fuse data from multiple sensors. Vying with this approach is the Dempster–Shafer theory, which deals with measures of “belief”, and is based on the nonclassical idea of “mass” as opposed to probability. Although these two measures look very similar, there are some differences. We point them out through outlining the ideas of the Dempster–Shafer theory and presenting the basic Dempster–Shafer fusion equation. Finally we compare the two methods, and discuss the relative merits and demerits using an illustrative example.
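The equivalence between the Bayesian fusion view and the classical gain form is easy to verify in one dimension: fusing a Gaussian prior with a Gaussian measurement by multiplying densities gives exactly the Kalman measurement update. The numbers below are arbitrary.

```python
# One-dimensional Kalman measurement update as Bayesian fusion:
# posterior ∝ N(x; m, P) * N(y; x, R), which is again Gaussian.

def fuse(m, P, y, R):
    # Product-of-Gaussians (precision-weighted) form of the posterior.
    post_var = 1.0 / (1.0 / P + 1.0 / R)
    post_mean = post_var * (m / P + y / R)
    return post_mean, post_var

def kalman_update(m, P, y, R):
    # Classical gain form; algebraically identical to fuse().
    K = P / (P + R)
    return m + K * (y - m), (1 - K) * P

# Prior N(0, 4) fused with measurement y = 2 of variance 1.
m1, v1 = fuse(0.0, 4.0, 2.0, 1.0)
m2, v2 = kalman_update(0.0, 4.0, 2.0, 1.0)
```

Both routes give mean 1.6 and variance 0.8: the gain `K` is just the relative precision of the measurement, which is the fusion reading of the filter.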

  10. Bayesian Analysis of Individual Level Personality Dynamics

    Directory of Open Access Journals (Sweden)

    Edward Cripps

    2016-07-01

Full Text Available A Bayesian technique for analysing within-person processes at the level of the individual is presented. The approach is used to examine if the patterns of within-person responses on a 12 trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  11. Particle identification in ALICE: a Bayesian approach

    CERN Document Server

    Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; 
Bjelogrlic, Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; 
Dainese, Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; 
Gheata, Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; 
Kebschull, Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; 
Lunardon, Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; 
Nielsen, Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; 
Read, Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof 
Marek; Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Vargas Trevino, Aurora Diozcora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, 
Misha; Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym

    2016-01-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high purity samples of identified particles in the decay channels ${\rm K}_{\rm S}^{0}\rightarrow \pi^+\pi^-$, $\phi\rightarrow {\rm K}^-{\rm K}^+$ and $\Lambda\rightarrow{\rm p}\pi^-$ in p–Pb collisions at $\sqrt{s_{\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
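The core of such a Bayesian PID scheme is a product of per-detector likelihoods weighted by species priors, normalized over the species hypotheses. The Gaussian detector responses, signal values, and priors below are illustrative placeholders, not ALICE's actual parameterizations:

```python
import numpy as np

# Hypothetical, simplified detector responses (arbitrary units):
# (dE/dx mean, dE/dx sigma, TOF mean, TOF sigma) for each species.
SPECIES = ["pion", "kaon", "proton"]
EXPECTED = {
    "pion":   (1.0, 0.1, 10.0, 0.3),
    "kaon":   (1.4, 0.1, 11.5, 0.3),
    "proton": (2.0, 0.1, 13.0, 0.3),
}

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def bayes_pid(dedx, tof, priors):
    """Posterior species probabilities: prior times product of likelihoods."""
    post = {}
    for s in SPECIES:
        m1, s1, m2, s2 = EXPECTED[s]
        post[s] = priors[s] * gauss(dedx, m1, s1) * gauss(tof, m2, s2)
    norm = sum(post.values())
    return {s: p / norm for s, p in post.items()}

priors = {"pion": 0.7, "kaon": 0.2, "proton": 0.1}   # illustrative abundances
probs = bayes_pid(dedx=1.38, tof=11.6, priors=priors)
best = max(probs, key=probs.get)
```

Combining two independent detector signals this way sharpens the posterior considerably compared with either signal alone.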

  12. A Bayesian method for pulsar template generation

    CERN Document Server

    Imgrund, M; Kramer, M; Lesch, H

    2015-01-01

    Extracting Times of Arrival from pulsar radio signals depends on the knowledge of the pulsar's pulse profile and how this template is generated. We examine pulsar template generation with Bayesian methods. We contrast the classical generation mechanism of averaging intensity profiles with a new approach based on Bayesian inference. We introduce the Bayesian measurement model imposed and derive the algorithm to reconstruct a "statistical template" out of noisy data. The properties of these "statistical templates" are analysed with simulated and real measurement data from PSR B1133+16. We explain how to put this new form of template to use in analysing secondary parameters of interest and give various examples: We implement a nonlinear filter for determining ToAs of pulsars. Applying this method to data from PSR J1713+0747 we derive ToAs self-consistently, meaning all epochs were timed and we used the same epochs for template generation. While the average template contains fluctuations and noise as unavoida...

  13. Bayesian inference of the initial conditions from large-scale structure surveys

    Science.gov (United States)

    Leclercq, Florent

    2016-10-01

    Analysis of three-dimensional cosmological surveys has the potential to answer outstanding questions on the initial conditions from which structure appeared, and therefore on the very high energy physics at play in the early Universe. We report on recently proposed statistical data analysis methods designed to study the primordial large-scale structure via physical inference of the initial conditions in a fully Bayesian framework, and applications to the Sloan Digital Sky Survey data release 7. We illustrate how this approach led to a detailed characterization of the dynamic cosmic web underlying the observed galaxy distribution, based on the tidal environment.

  14. Do atmospheric aerosols form glasses?

    OpenAIRE

    Zobrist, B.; Marcolli, C.; Pedernera, D. A.; Koop, T.

    2008-01-01

    A new process is presented by which water soluble organics might influence ice nucleation, ice growth, chemical reactions and water uptake of aerosols in the upper troposphere: the formation of glassy aerosol particles. Glasses are disordered amorphous (non-crystalline) solids that form when a liquid is cooled without crystallization until the viscosity increases exponentially and molecular diffusion practically ceases. The glass transition temperatures, Tg

  15. PAC-Bayesian Policy Evaluation for Reinforcement Learning

    CERN Document Server

    Fard, Mahdi Milani; Szepesvari, Csaba

    2012-01-01

    Bayesian priors offer a compact yet general means of incorporating domain knowledge into many learning tasks. The correctness of the Bayesian analysis and inference, however, largely depends on the accuracy and correctness of these priors. PAC-Bayesian methods overcome this problem by providing bounds that hold regardless of the correctness of the prior distribution. This paper introduces the first PAC-Bayesian bound for the batch reinforcement learning problem with function approximation. We show how this bound can be used to perform model-selection in a transfer learning scenario. Our empirical results confirm that PAC-Bayesian policy evaluation is able to leverage prior distributions when they are informative and, unlike standard Bayesian RL approaches, ignore them when they are misleading.
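For context on the kind of guarantee being referred to, one classical PAC-Bayesian result (McAllester's bound, not the specific bound of this paper) reads as follows: for any fixed prior $\pi$ and any $\delta \in (0,1)$, with probability at least $1-\delta$ over an i.i.d. sample of size $n$, simultaneously for all posteriors $\rho$,

```latex
\mathbb{E}_{h\sim\rho}\!\left[L(h)\right]
\;\le\;
\mathbb{E}_{h\sim\rho}\!\left[\hat{L}(h)\right]
+ \sqrt{\frac{\mathrm{KL}(\rho\,\|\,\pi) + \ln\!\left(2\sqrt{n}/\delta\right)}{2n}}
```

where $L$ and $\hat{L}$ are the true and empirical losses. The bound holds whether or not the prior is "correct"; a misleading prior only inflates the $\mathrm{KL}$ penalty rather than invalidating the guarantee.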

  16. Fuzzy Naive Bayesian for constructing regulated network with weights.

    Science.gov (United States)

    Zhou, Xi Y; Tian, Xue W; Lim, Joon S

    2015-01-01

    In the data mining field, classification is a crucial technology, and the Bayesian classifier has been one of the hotspots of classification research. However, the assumptions that Naive Bayesian and Tree Augmented Naive Bayesian (TAN) classifiers make about relations between attributes are unrealistic. This paper therefore proposes a new algorithm, Fuzzy Naive Bayesian (FNB), which uses a neural network with weighted membership functions (NEWFM) to extract regulated relations and weights. The regulated relations and weights can then be used to construct a regulated network. Finally, we classify the heart and Haberman datasets with the FNB network and compare the results with those of Naive Bayesian and TAN. The experimental results show that FNB achieves a higher classification rate than Naive Bayesian and TAN.
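For reference, the Naive Bayesian baseline rests on exactly the attribute-independence assumption criticized above: each attribute is modelled independently given the class. A minimal Gaussian Naive Bayes sketch on toy data (not the paper's implementation or its datasets):

```python
import math

def fit(X, y):
    """Per-class prior, attribute means and variances (independence assumed)."""
    stats = {}
    for c in set(y):
        rows = [x for x, yi in zip(X, y) if yi == c]
        prior = len(rows) / len(X)
        means = [sum(col) / len(col) for col in zip(*rows)]
        vars_ = [max(sum((v - m) ** 2 for v in col) / len(col), 1e-9)
                 for col, m in zip(zip(*rows), means)]
        stats[c] = (prior, means, vars_)
    return stats

def predict(stats, x):
    def log_gauss(v, m, var):
        return -0.5 * (math.log(2 * math.pi * var) + (v - m) ** 2 / var)
    # Sum of per-attribute log-likelihoods: the naive independence assumption.
    scores = {c: math.log(p) + sum(log_gauss(v, m, var)
                                   for v, m, var in zip(x, means, vars_))
              for c, (p, means, vars_) in stats.items()}
    return max(scores, key=scores.get)

X = [[1.0, 2.0], [1.2, 1.9], [3.0, 4.1], [3.2, 3.9]]
y = ["a", "a", "b", "b"]
model = fit(X, y)
```

Approaches such as TAN and the proposed FNB relax this factorized likelihood by allowing dependencies (or weighted fuzzy relations) between attributes.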

  17. Learning Local Components to Understand Large Bayesian Networks

    DEFF Research Database (Denmark)

    Zeng, Yifeng; Xiang, Yanping; Cordero, Jorge

    2009-01-01

    Bayesian networks are known for providing an intuitive and compact representation of probabilistic information and allowing the creation of models over a large and complex domain. Bayesian learning and reasoning are nontrivial for a large Bayesian network. In parallel, it is a tough job for users (domain experts) to extract accurate information from a large Bayesian network due to dimensional difficulty. We define a formulation of local components and propose a clustering algorithm to learn such local components given complete data. The algorithm groups together most inter-relevant attributes in a domain. We evaluate its performance on three benchmark Bayesian networks and provide results in support. We further show that the learned components may represent local knowledge more precisely in comparison to the full Bayesian networks when working with a small amount of data.

  18. Bayesian astrostatistics: a backward look to the future

    CERN Document Server

    Loredo, Thomas J

    2012-01-01

    This perspective chapter briefly surveys: (1) past growth in the use of Bayesian methods in astrophysics; (2) current misconceptions about both frequentist and Bayesian statistical inference that hinder wider adoption of Bayesian methods by astronomers; and (3) multilevel (hierarchical) Bayesian modeling as a major future direction for research in Bayesian astrostatistics, exemplified in part by presentations at the first ISI invited session on astrostatistics, commemorated in this volume. It closes with an intentionally provocative recommendation for astronomical survey data reporting, motivated by the multilevel Bayesian perspective on modeling cosmic populations: that astronomers cease producing catalogs of estimated fluxes and other source properties from surveys. Instead, summaries of likelihood functions (or marginal likelihood functions) for source properties should be reported (not posterior probability density functions), including nontrivial summaries (not simply upper limits) for candidate objects ...

  19. A Bayesian Approach for Localization of Acoustic Emission Source in Plate-Like Structures

    Directory of Open Access Journals (Sweden)

    Gang Yan

    2015-01-01

    Full Text Available This paper presents a Bayesian approach for localizing acoustic emission (AE) source in plate-like structures with consideration of uncertainties from modeling error and measurement noise. A PZT sensor network is deployed to monitor and acquire AE wave signals released by possible damage. By using continuous wavelet transform (CWT), the time-of-flight (TOF) information of the AE wave signals is extracted and measured. With a theoretical TOF model, a Bayesian parameter identification procedure is developed to obtain the AE source location and the wave velocity at a specific frequency simultaneously and meanwhile quantify their uncertainties. Based on Bayes’ theorem, the posterior distributions of the parameters describing the AE source location and the wave velocity are obtained by combining their priors with the likelihood of the measured time difference data. A Markov chain Monte Carlo (MCMC) algorithm is employed to draw samples to approximate the posteriors. Also, a data fusion scheme is performed to fuse results identified at multiple frequencies to increase accuracy and reduce uncertainty of the final localization results. Experimental studies on a stiffened aluminum panel with simulated AE events by pencil lead breaks (PLBs) are conducted to validate the proposed Bayesian AE source localization approach.
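The MCMC step of such a procedure can be sketched with a random-walk Metropolis sampler over the unknown source coordinates and wave velocity, using a Gaussian likelihood on time-difference-of-arrival data. The sensor layout, noise level, and box priors below are illustrative assumptions, not the paper's experimental setup:

```python
import numpy as np

rng = np.random.default_rng(1)

# Four hypothetical sensors on a 1 m x 1 m plate; true source and speed
# used only to simulate the time-difference data.
sensors = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_src, true_v = np.array([0.3, 0.6]), 5000.0       # m, m/s

def dt_model(src, v):
    t = np.linalg.norm(sensors - src, axis=1) / v
    return t[1:] - t[0]                                # TOF differences vs. sensor 0

sigma = 1e-6                                           # timing noise (s), assumed
data = dt_model(true_src, true_v) + sigma * rng.standard_normal(3)

def log_post(theta):
    x, y, v = theta
    if not (0 <= x <= 1 and 0 <= y <= 1 and 1000 <= v <= 10000):
        return -np.inf                                 # uniform box priors
    r = data - dt_model(np.array([x, y]), v)
    return -0.5 * np.sum((r / sigma) ** 2)             # Gaussian likelihood

# Random-walk Metropolis over (x, y, v).
theta = np.array([0.5, 0.5, 4000.0])
step = np.array([0.01, 0.01, 100.0])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + step * rng.standard_normal(3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:            # accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5000:])                        # discard burn-in
x_est, y_est = post[:, 0].mean(), post[:, 1].mean()
```

The spread of the retained samples directly quantifies the localization uncertainty that the abstract emphasizes.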

  20. Modelling the dynamics of an experimental host-pathogen microcosm within a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    David Lunn

    Full Text Available The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
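Combining posterior parameter samples into a derived quantity such as the basic reproductive ratio is mechanically simple: the transformation is applied sample by sample, so parameter uncertainty propagates automatically. A generic sketch using an SIR-style definition R0 = beta * S0 / gamma with made-up posterior samples (the paper's model and estimates differ):

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-ins for MCMC posterior samples of a transmission rate (beta) and a
# recovery/clearance rate (gamma); the values are illustrative only.
beta = rng.normal(0.8, 0.05, size=4000)
gamma = rng.normal(0.2, 0.02, size=4000)
S0 = 1.0                                   # initial susceptible density (scaled)

# Derived quantity, computed per posterior sample.
R0 = beta * S0 / gamma
r0_mean = R0.mean()
prob_spread = (R0 > 1.0).mean()            # posterior P(parasite can invade)
```

Because the full posterior is carried through, statements like "the probability that R0 exceeds 1" come for free, which is what enables the predictions about resources and host genotype mentioned above.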

  1. CALIPSO Observations of Aerosol Properties Near Clouds

    Science.gov (United States)

    Marshak, Alexander; Varnai, Tamas; Yang, Weidong

    2010-01-01

    Clouds are surrounded by a transition zone of rapidly changing aerosol properties. Characterizing this zone is important for better understanding aerosol-cloud interactions and aerosol radiative effects as well as for improving satellite measurements of aerosol properties. We present a statistical analysis of a global dataset of CALIPSO (Cloud-Aerosol Lidar and infrared Pathfinder Satellite Observation) Lidar observations over oceans. The results show that the transition zone extends as far as 15 km away from clouds and it is ubiquitous over all oceans. The use of only high confidence level cloud-aerosol discrimination (CAD) data confirms the findings. However, the results underline the need for caution to avoid biases in studies of satellite aerosol products, aerosol-cloud interactions, and aerosol direct radiative effects.

  2. Devices and methods for generating an aerosol

    KAUST Repository

    Bisetti, Fabrizio

    2016-03-03

    Aerosol generators and methods of generating aerosols are provided. The aerosol can be generated at a stagnation interface between a hot, wet stream and a cold, dry stream; a benefit of this approach is that the properties of the aerosol can be precisely controlled. The stagnation interface can be generated, for example, by the opposed flow of the hot stream and the cold stream. The aerosol generator and the aerosol generation methods are capable of producing aerosols with precise particle sizes and a narrow size distribution. The properties of the aerosol can be controlled by controlling one or more of the stream temperatures, the saturation level of the hot stream, and the flow times of the streams.

  3. Small sample Bayesian analyses in assessment of weapon performance

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analysis in the small-sample setting should be considered, with supplementary data provided by simulations. Several Bayesian approaches are discussed and some of their limitations are identified. After analyzing the limitations of the available Bayesian approaches, an improvement is put forward, and the improved approach is applied to the assessment of a new weapon's performance.
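The small-sample updating this abstract describes can be illustrated with a conjugate Beta-Binomial model, in which a prior built from simulated trials is combined with a handful of live tests. This is a generic sketch of the idea, not the paper's actual method; all numbers are made up.

```python
# Minimal sketch: Beta-Binomial updating for a success probability when
# live test data are scarce. The prior (alpha0, beta0) encodes simulated
# trials; the posterior folds in the few real tests. Illustrative only.

def posterior_params(alpha0, beta0, successes, trials):
    """Conjugate Beta posterior after observing `successes` in `trials`."""
    return alpha0 + successes, beta0 + (trials - successes)

def posterior_mean(alpha, beta):
    return alpha / (alpha + beta)

# Prior equivalent to 18 successes in 20 simulated trials,
# updated with 4 successes in 5 live tests.
a, b = posterior_params(18.0, 2.0, successes=4, trials=5)
print(posterior_mean(a, b))  # -> 0.88
```

The prior's pseudo-counts control how strongly the simulation evidence dominates the sparse live data, which is the trade-off small-sample Bayesian assessment has to manage.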

  4. The Bayesian bridge between simple and universal kriging

    Energy Technology Data Exchange (ETDEWEB)

    Omre, H.; Halvorsen, K.B. (Norwegian Computing Center, Oslo (Norway))

    1989-10-01

Kriging techniques are well suited for the evaluation of continuous, spatial phenomena. Bayesian statistics are characterized by the use of prior qualified guesses on the model parameters. By merging kriging techniques and Bayesian theory, prior guesses may be used in a spatial setting. Partial knowledge of the model parameters defines a continuum of models between what is termed simple and universal kriging in geostatistical terminology. The Bayesian approach to kriging is developed and discussed, and a case study concerning depth conversion of seismic reflection times is presented.
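The continuum between simple and universal kriging can be made concrete by placing a Gaussian prior N(b, tau2) on an unknown constant trend: as tau2 goes to zero the trend is fixed at b (simple kriging), and as tau2 grows the trend is estimated from the data (universal kriging). The sketch below is an illustrative reconstruction with toy values, not the paper's seismic case study.

```python
import numpy as np

# "Bayesian bridge" sketch: kriging with a Gaussian prior N(b, tau2) on
# the unknown constant trend. tau2 -> 0 recovers simple kriging (trend
# fixed at b); tau2 -> infinity recovers universal kriging (trend
# estimated by generalized least squares). Toy covariance and data.

def kriged_trend(y, K, b, tau2):
    """Posterior mean of the constant trend coefficient."""
    Kinv = np.linalg.inv(K)
    f = np.ones(len(y))
    precision = 1.0 / tau2 + f @ Kinv @ f
    return (b / tau2 + f @ Kinv @ y) / precision

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 6)
K = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.3)   # exponential covariance
y = rng.normal(2.0, 0.5, size=6)

print(kriged_trend(y, K, b=0.0, tau2=1e-8))   # ~ 0.0: simple-kriging limit
print(kriged_trend(y, K, b=0.0, tau2=1e8))    # ~ GLS estimate: universal limit
```

The posterior trend is a precision-weighted blend of the prior guess b and the generalized-least-squares estimate, which is exactly the "bridge" the title refers to.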

  5. Improved Sampling for Diagnostic Reasoning in Bayesian Networks

    OpenAIRE

    Hulme, Mark

    2013-01-01

    Bayesian networks offer great potential for use in automating large scale diagnostic reasoning tasks. Gibbs sampling is the main technique used to perform diagnostic reasoning in large richly interconnected Bayesian networks. Unfortunately Gibbs sampling can take an excessive time to generate a representative sample. In this paper we describe and test a number of heuristic strategies for improving sampling in noisy-or Bayesian networks. The strategies include Monte Carlo Markov chain sampling...
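Gibbs sampling in a noisy-or network, the setting this abstract targets, can be sketched on a minimal two-cause example: each cause is resampled in turn from its conditional distribution given the other cause and the observed effect. The network, priors and link strengths below are hypothetical, and this is plain Gibbs sampling, not the paper's heuristic improvements.

```python
import random

# Toy Gibbs sampler for a two-cause noisy-OR network: causes C1, C2 with
# priors p, effect E observed true, and P(E=1|c1,c2) = 1-(1-l1*c1)(1-l2*c2).
# All parameters are illustrative.

p = (0.2, 0.3)      # prior probabilities of the two causes
lam = (0.9, 0.7)    # noisy-OR link strengths

def p_effect(c):
    return 1.0 - (1.0 - lam[0] * c[0]) * (1.0 - lam[1] * c[1])

def gibbs_posterior(n_iter=50000, seed=1):
    rng = random.Random(seed)
    c = [1, 1]
    counts = [0, 0]
    for _ in range(n_iter):
        for i in (0, 1):
            # weight of each value of C_i given the other cause and E=1
            w = []
            for v in (0, 1):
                c[i] = v
                w.append((p[i] if v else 1 - p[i]) * p_effect(c))
            c[i] = 1 if rng.random() < w[1] / (w[0] + w[1]) else 0
        counts[0] += c[0]
        counts[1] += c[1]
    return counts[0] / n_iter, counts[1] / n_iter

print(gibbs_posterior())  # approx (P(C1=1|E=1), P(C2=1|E=1))
```

Because the observed effect makes the all-causes-false state impossible, the conditional weights never both vanish; in larger, richly interconnected networks this kind of determinism is precisely what slows Gibbs mixing and motivates the heuristics the paper tests.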

  6. Bayesian Network Enhanced with Structural Reliability Methods: Methodology

    OpenAIRE

    Straub, Daniel; Der Kiureghian, Armen

    2012-01-01

    We combine Bayesian networks (BNs) and structural reliability methods (SRMs) to create a new computational framework, termed enhanced Bayesian network (eBN), for reliability and risk analysis of engineering structures and infrastructure. BNs are efficient in representing and evaluating complex probabilistic dependence structures, as present in infrastructure and structural systems, and they facilitate Bayesian updating of the model when new information becomes available. On the other hand, SR...

  7. Sparse Event Modeling with Hierarchical Bayesian Kernel Methods

    Science.gov (United States)

    2016-01-05

The research objective of this proposal was to develop a predictive Bayesian kernel approach to model count data based on...several predictive variables. Such an approach, which we refer to as the Poisson Bayesian kernel model, is able to model the rate of occurrence of...kernel methods made use of: (i) the Bayesian property of improving predictive accuracy as data are dynamically obtained, and (ii) the kernel function

  8. Efficient Algorithms for Bayesian Network Parameter Learning from Incomplete Data

    Science.gov (United States)

    2015-07-01

    Efficient Algorithms for Bayesian Network Parameter Learning from Incomplete Data Guy Van den Broeck∗ and Karthika Mohan∗ and Arthur Choi and Adnan...We propose a family of efficient algorithms for learning the parameters of a Bayesian network from incomplete data. Our approach is based on recent...algorithms like EM (which require inference). 1 INTRODUCTION When learning the parameters of a Bayesian network from data with missing values, the

  9. Bayesian Statistics in Software Engineering: Practical Guide and Case Studies

    OpenAIRE

    Furia, Carlo A.

    2016-01-01

    Statistics comes in two main flavors: frequentist and Bayesian. For historical and technical reasons, frequentist statistics has dominated data analysis in the past; but Bayesian statistics is making a comeback at the forefront of science. In this paper, we give a practical overview of Bayesian statistics and illustrate its main advantages over frequentist statistics for the kinds of analyses that are common in empirical software engineering, where frequentist statistics still is standard. We...

  10. Bayesian non- and semi-parametric methods and applications

    CERN Document Server

    Rossi, Peter

    2014-01-01

    This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number

  11. Bayesian missing data problems EM, data augmentation and noniterative computation

    CERN Document Server

    Tan, Ming T; Ng, Kai Wang

    2009-01-01

Bayesian Missing Data Problems: EM, Data Augmentation and Noniterative Computation presents solutions to missing data problems through explicit or noniterative sampling calculation of Bayesian posteriors. The methods are based on the inverse Bayes formulae discovered by one of the authors in 1995. Applying the Bayesian approach to important real-world problems, the authors focus on exact numerical solutions, a conditional sampling approach via data augmentation, and a noniterative sampling approach via EM-type algorithms. After introducing the missing data problems, Bayesian approach, and poste

  12. Bayesian integer frequency offset estimator for MIMO-OFDM systems

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

Carrier frequency offset (CFO) in MIMO-OFDM systems can be decoupled into two parts: fractional frequency offset (FFO) and integer frequency offset (IFO). The problem of IFO estimation is addressed and a new IFO estimator based on the Bayesian philosophy is proposed. It is also shown that the Bayesian IFO estimator is optimal among all IFO estimators. Furthermore, the Bayesian estimator can take advantage of oversampling so that better performance can be obtained. Finally, numerical results demonstrate the optimality of the Bayesian estimator and validate the theoretical analysis.

  13. Doing bayesian data analysis a tutorial with R and BUGS

    CERN Document Server

    Kruschke, John K

    2011-01-01

There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis accessible to a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all

  14. The bugs book a practical introduction to Bayesian analysis

    CERN Document Server

    Lunn, David; Best, Nicky; Thomas, Andrew; Spiegelhalter, David

    2012-01-01

Introduction: Probability and Parameters; Probability; Probability distributions; Calculating properties of probability distributions; Monte Carlo integration; Monte Carlo Simulations Using BUGS; Introduction to BUGS; DoodleBUGS; Using BUGS to simulate from distributions; Transformations of random variables; Complex calculations using Monte Carlo; Multivariate Monte Carlo analysis; Predictions with unknown parameters; Introduction to Bayesian Inference; Bayesian learning; Posterior predictive distributions; Conjugate Bayesian inference; Inference about a discrete parameter; Combinations of conjugate analyses; Bayesian and classica

  15. Diagnosis of combined faults in Rotary Machinery by Non-Naive Bayesian approach

    Science.gov (United States)

    Asr, Mahsa Yazdanian; Ettefagh, Mir Mohammad; Hassannejad, Reza; Razavi, Seyed Naser

    2017-02-01

When combined faults occur in different parts of a rotating machine, their features are strongly dependent. Experts are thoroughly familiar with the characteristics of individual faults, and sufficient data are available for single faults, but the problem arises when faults are combined and separating their characteristics becomes complex. The experts therefore cannot give exact information about the symptoms of a combined fault and its severity. In this paper, a novel method is proposed to overcome this drawback. The core idea of the method is to detect a combined fault without using combined-fault features as the training data set; only individual-fault features are used in the training step. For this purpose, after data acquisition and resampling of the obtained vibration signals, Empirical Mode Decomposition (EMD) is used to decompose the multi-component signals into Intrinsic Mode Functions (IMFs). Using the correlation coefficient, the IMFs appropriate for feature extraction are selected. In the feature extraction step, the Shannon energy entropy of the IMFs is extracted along with statistical features. Most of the extracted features are strongly dependent. To account for this, a Non-Naive Bayesian Classifier (NNBC) is employed, which relaxes the fundamental assumption of naive Bayes, i.e., independence among features. To demonstrate the superiority of the NNBC, counterpart methods, including the normal naive Bayesian classifier, the kernel naive Bayesian classifier and back-propagation neural networks, were applied and the classification results compared. Experimental vibration signals collected from an automobile gearbox were used to verify the effectiveness of the proposed method. During the classification process, only the features related individually to the healthy state, bearing failure and gear failures were used for training the classifier, while combined-fault features (combined gear and bearing failures) were used as test data. The achieved
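The contrast the abstract draws, modeling dependent features jointly rather than assuming independence, can be sketched with a full-covariance Gaussian Bayes classifier. The classes and data below are synthetic stand-ins, not the paper's gearbox features or its actual NNBC.

```python
import numpy as np

# Sketch of a "non-naive" Bayes classifier: each class is modeled with a
# full-covariance Gaussian, so correlated features (like the dependent
# EMD-energy features in the abstract) are scored jointly instead of
# being assumed independent. Synthetic data, equal class priors.

class GaussianBayes:
    def fit(self, X_by_class):
        self.params = []
        for X in X_by_class:
            mu = X.mean(axis=0)
            cov = np.cov(X, rowvar=False) + 1e-6 * np.eye(X.shape[1])
            self.params.append((mu, np.linalg.inv(cov),
                                np.linalg.slogdet(cov)[1]))
        return self

    def log_scores(self, x):
        # log-likelihood of x under each class Gaussian (up to a constant)
        out = []
        for mu, cov_inv, logdet in self.params:
            d = x - mu
            out.append(-0.5 * (d @ cov_inv @ d + logdet))
        return np.array(out)

rng = np.random.default_rng(0)
cov = np.array([[1.0, 0.8], [0.8, 1.0]])      # strongly correlated features
healthy = rng.multivariate_normal([0.0, 0.0], cov, 200)
bearing = rng.multivariate_normal([3.0, 3.0], cov, 200)
gear    = rng.multivariate_normal([3.0, -3.0], cov, 200)

clf = GaussianBayes().fit([healthy, bearing, gear])
print(int(np.argmax(clf.log_scores(np.array([3.0, 3.0])))))  # -> 1 (bearing)
```

A naive classifier would zero the off-diagonal covariance terms; with strongly correlated features that discards exactly the structure the full-covariance model exploits.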

  16. Impacts of elevated-aerosol-layer and aerosol type on the correlation of AOD and particulate matter with ground-based and satellite measurements in Nanjing, southeast China.

    Science.gov (United States)

    Han, Yong; Wu, Yonghua; Wang, Tijian; Zhuang, Bingliang; Li, Shu; Zhao, Kun

    2015-11-01

Assessment of the correlation between aerosol optical depth (AOD) and particulate matter (PM) is critical to satellite remote sensing of air quality, e.g. ground PM10 and ground PM2.5. This study evaluates the impacts of aloft-aerosol-plume and aerosol-type on the correlation of AOD-PM by using synergistic measurements of a polarization-sensitive Raman-Mie lidar, CIMEL sunphotometer (SP) and TEOM PM samplers, as well as the satellite MODIS and CALIPSO, during April to July 2011 in Nanjing city (32.05°N/118.77°E), southeast China. Aloft-aerosol-layer and aerosol types (e.g. dust and non-dust or urban aerosol) are identified with the range-resolved polarization lidar and SP measurements. The results indicate that the correlations for AOD-PM10 and AOD-PM2.5 can be much improved when screening out the aloft-aerosol-layer. The linear regression slopes show significant differences for the dust and non-dust dominant aerosols in the planetary boundary layer (PBL). In addition, we evaluate the recently released MODIS-AOD product (Collection 6) from the "dark-target" (DT) and "deep-blue" (DB) algorithms and their correlation with the PM in the Nanjing urban area. The results verify that the MODIS-DT AODs show a good correlation (R = 0.89) with the SP-AOD but with a systematic overestimate. In contrast, the MODIS-DB AOD shows a moderate correlation (R = 0.66) with the SP-AOD but with a smaller regression intercept (0.07). Furthermore, moderately high correlations between the MODIS-AOD and PM10 (PM2.5) are indicated, which suggests the feasibility of PM estimation using the MODIS-AOD in Nanjing city.
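The AOD-PM comparison underlying this abstract reduces to a linear fit and a correlation coefficient between column AOD and surface PM. The sketch below uses synthetic stand-in data with a hypothetical slope and intercept, not the Nanjing measurements.

```python
import numpy as np

# Sketch of an AOD-PM regression: fit PM = slope * AOD + intercept and
# report the correlation coefficient R. Data are synthetic stand-ins.

def fit_aod_pm(aod, pm):
    slope, intercept = np.polyfit(aod, pm, 1)
    r = np.corrcoef(aod, pm)[0, 1]
    return slope, intercept, r

rng = np.random.default_rng(0)
aod = rng.uniform(0.2, 1.2, 50)                       # unitless column AOD
pm = 80.0 * aod + 10.0 + rng.normal(0.0, 5.0, 50)     # hypothetical PM, ug/m^3

slope, intercept, r = fit_aod_pm(aod, pm)
print(round(slope, 1), round(intercept, 1), round(r, 2))
```

Screening out aloft aerosol layers, as the study does, amounts to removing samples where the column AOD is inflated by aerosol decoupled from the surface, which tightens exactly this fit.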

  17. Quantitative measurement of aerosol deposition on skin, hair and clothing for dosimetric assessment. Final report

    DEFF Research Database (Denmark)

    Fogh, C.L.; Byrne, M.A.; Andersson, Kasper Grann

    1999-01-01

    the deposition and subsequent fate of contaminant aerosol on skin, hair and clothing. The main technique applied involves the release and subsequent deposition on volunteers in test rooms of particles of differentsizes labelled with neutron activatable rare earth tracers. Experiments indicate that the deposition...

  18. Massive Volcanic SO2 Oxidation and Sulphate Aerosol Deposition in Cenozoic North America

    Science.gov (United States)

    Volcanic eruptions release a large amount of sulphur dioxide (SO2) into the atmosphere. SO2 is oxidized to sulphate and can subsequently form sulphate aerosol, which can affect the Earth's radiation balance, biologic productivity and high-altitude ozone co...

  19. Significant atmospheric aerosol pollution caused by world food cultivation

    Science.gov (United States)

    Bauer, Susanne E.; Tsigaridis, Kostas; Miller, Ron

    2016-05-01

    Particulate matter is a major concern for public health, causing cancer and cardiopulmonary mortality. Therefore, governments in most industrialized countries monitor and set limits for particulate matter. To assist policy makers, it is important to connect the chemical composition and severity of particulate pollution to its sources. Here we show how agricultural practices, livestock production, and the use of nitrogen fertilizers impact near-surface air quality. In many densely populated areas, aerosols formed from gases that are released by fertilizer application and animal husbandry dominate over the combined contributions from all other anthropogenic pollution. Here we test reduction scenarios of combustion-based and agricultural emissions that could lower air pollution. For a future scenario, we find opposite trends, decreasing nitrate aerosol formation near the surface while total tropospheric loads increase. This suggests that food production could be increased to match the growing global population without sacrificing air quality if combustion emission is decreased.

  20. Significant Atmospheric Aerosol Pollution Caused by World Food Cultivation

    Science.gov (United States)

    Bauer, Susanne E.; Tsigaridis, Kostas; Miller, Ron

    2016-01-01

    Particulate matter is a major concern for public health, causing cancer and cardiopulmonary mortality. Therefore, governments in most industrialized countries monitor and set limits for particulate matter. To assist policy makers, it is important to connect the chemical composition and severity of particulate pollution to its sources. Here we show how agricultural practices, livestock production, and the use of nitrogen fertilizers impact near-surface air quality. In many densely populated areas, aerosols formed from gases that are released by fertilizer application and animal husbandry dominate over the combined contributions from all other anthropogenic pollution. Here we test reduction scenarios of combustion-based and agricultural emissions that could lower air pollution. For a future scenario, we find opposite trends, decreasing nitrate aerosol formation near the surface while total tropospheric loads increase. This suggests that food production could be increased to match the growing global population without sacrificing air quality if combustion emission is decreased.

  1. Study of Aerosol Chemical Composition Based on Aerosol Optical Properties

    Science.gov (United States)

    Berry, Austin; Aryal, Rudra

    2015-03-01

We investigated the variation of aerosol absorption optical properties obtained from CIMEL Sun-Photometer measurements over three years (2012-2014) at three AERONET sites: GSFC; MD Science_Center; and Tudor Hill, Bermuda. These sites were chosen based on the availability of data and on locations that receive different types of aerosols from land and ocean. These absorption properties, mainly the aerosol absorption angstrom exponent, were analyzed to examine the corresponding aerosol chemical composition. We observed that the retrieved absorption angstrom exponents over the two sites GSFC and MD Science Center are near 1 (the theoretical value for black carbon), with low single scattering albedo values during summer seasons, indicating the presence of black carbon. Strong variability of aerosol absorption properties was observed over Tudor Hill and will be analyzed based on whether the embedded air mass arrived from the ocean side or the land side. We will also present the seasonal variability of these properties based on long-range air mass sources at these three sites. Brent Holben, NASA GSFC, AERONET, Jon Rodriguez.

  2. News/Press Releases

    Data.gov (United States)

    Office of Personnel Management — A press release, news release, media release, press statement is written communication directed at members of the news media for the purpose of announcing programs...

  3. Global simulations of aerosol processing in clouds

    Directory of Open Access Journals (Sweden)

    C. Hoose

    2008-12-01

    Full Text Available An explicit and detailed representation of in-droplet and in-crystal aerosol particles in stratiform clouds has been introduced in the global aerosol-climate model ECHAM5-HAM. The new scheme allows an evaluation of the cloud cycling of aerosols and an estimation of the relative contributions of nucleation and collision scavenging, as opposed to evaporation of hydrometeors in the global aerosol processing by clouds. On average an aerosol particle is cycled through stratiform clouds 0.5 times. The new scheme leads to important changes in the simulated fraction of aerosol scavenged in clouds, and consequently in the aerosol wet deposition. In general, less aerosol is scavenged into clouds with the new prognostic treatment than what is prescribed in standard ECHAM5-HAM. Aerosol concentrations, size distributions, scavenged fractions and cloud droplet concentrations are evaluated and compared to different observations. While the scavenged fraction and the aerosol number concentrations in the marine boundary layer are well represented in the new model, aerosol optical thickness, cloud droplet number concentrations in the marine boundary layer and the aerosol volume in the accumulation and coarse modes over the oceans are overestimated. Sensitivity studies suggest that a better representation of below-cloud scavenging, higher in-cloud collision coefficients, or a reduced water uptake by seasalt aerosols could reduce these biases.

  4. Global simulations of aerosol processing in clouds

    Directory of Open Access Journals (Sweden)

    C. Hoose

    2008-07-01

    Full Text Available An explicit and detailed representation of in-droplet and in-crystal aerosol particles in stratiform clouds has been introduced in the global aerosol-climate model ECHAM5-HAM. The new scheme allows an evaluation of the cloud cycling of aerosols and an estimation of the relative contributions of nucleation and collision scavenging, as opposed to evaporation of hydrometeors in the global aerosol processing by clouds. On average an aerosol particle is cycled through stratiform clouds 0.5 times. The new scheme leads to important changes in the simulated fraction of aerosol scavenged in clouds, and consequently in the aerosol wet deposition. In general, less aerosol is scavenged into clouds with the new prognostic treatment than what is prescribed in standard ECHAM5-HAM. Aerosol concentrations, size distributions, scavenged fractions and cloud droplet concentrations are evaluated and compared to different observations. While the scavenged fraction and the aerosol number concentrations in the marine boundary layer are well represented in the new model, aerosol optical thickness, cloud droplet number concentrations in the marine boundary layer and the aerosol volume in the accumulation and coarse modes over the oceans are overestimated. Sensitivity studies suggest that a better representation of below-cloud scavenging, higher in-cloud collision coefficients, or a reduced water uptake by seasalt aerosols could reduce these biases.

  5. Optical properties and cross-sections of biological aerosols

    Science.gov (United States)

    Thrush, E.; Brown, D. M.; Salciccioli, N.; Gomes, J.; Brown, A.; Siegrist, K.; Thomas, M. E.; Boggs, N. T.; Carter, C. C.

    2010-04-01

There is an urgent need to develop standoff sensing of biological agents in aerosolized clouds. In support of the Joint Biological Standoff Detection System (JBSDS) program, lidar systems have been a dominant technology and have shown significant capability in field tests conducted in the Joint Ambient Breeze Tunnel (JABT) at Dugway Proving Ground (DPG). The release of biological agents in the open air is forbidden. Therefore, indirect methods must be developed to determine agent cross-sections in order to validate sensors against biological agents. A method has been developed that begins with laboratory measurements of thin films and liquid suspensions of biological material to obtain the complex index of refraction from the ultraviolet (UV) to the long wave infrared (LWIR). Using that result and the aerosols' particle size distribution as inputs to Mie calculations yields the backscatter and extinction cross-sections as a function of wavelength. Recent efforts to model field measurements from the UV to the IR have been successful. Measurements with aerodynamic and geometric particle sizers show evidence of particle clustering. Backscatter simulations of these aerosols show that these clustered particles dominate the aerosol backscatter and depolarization signals. In addition, these large particles create spectral signatures in the backscatter signal due to material absorption. Spectral signatures from the UV to the IR have been observed in simulations of field releases. This method has been demonstrated for a variety of biological simulant materials such as Ovalbumin (OV), Erwinia (EH), Bacillus atrophaeus (BG) and male specific bacteriophage (MS2). These spectral signatures may offer new methods for biological discrimination for both stand-off sensing and point detection systems.

  6. Black carbon aerosol mixing state, organic aerosols and aerosol optical properties over the UK

    Science.gov (United States)

    McMeeking, G. R.; Morgan, W. T.; Flynn, M.; Highwood, E. J.; Turnbull, K.; Haywood, J.; Coe, H.

    2011-05-01

    Black carbon (BC) aerosols absorb sunlight thereby leading to a positive radiative forcing and a warming of climate and can also impact human health through their impact on the respiratory system. The state of mixing of BC with other aerosol species, particularly the degree of internal/external mixing, has been highlighted as a major uncertainty in assessing its radiative forcing and hence its climate impact, but few in situ observations of mixing state exist. We present airborne single particle soot photometer (SP2) measurements of refractory BC (rBC) mass concentrations and mixing state coupled with aerosol composition and optical properties measured in urban plumes and regional pollution over the UK. All data were obtained using instrumentation flown on the UK's BAe-146-301 large Atmospheric Research Aircraft (ARA) operated by the Facility for Airborne Atmospheric Measurements (FAAM). We measured sub-micron aerosol composition using an aerosol mass spectrometer (AMS) and used positive matrix factorization to separate hydrocarbon-like (HOA) and oxygenated organic aerosols (OOA). We found a higher number fraction of thickly coated rBC particles in air masses with large OOA relative to HOA, higher ozone-to-nitrogen oxides (NOx) ratios and large concentrations of total sub-micron aerosol mass relative to rBC mass concentrations. The more ozone- and OOA-rich air masses were associated with transport from continental Europe, while plumes from UK cities had higher HOA and NOx and fewer thickly coated rBC particles. We did not observe any significant change in the rBC mass absorption efficiency calculated from rBC mass and light absorption coefficients measured by a particle soot absorption photometer despite observing significant changes in aerosol composition and rBC mixing state. The contributions of light scattering and absorption to total extinction (quantified by the single scattering albedo; SSA) did change for different air masses, with lower SSA observed in

  7. Black carbon aerosol mixing state, organic aerosols and aerosol optical properties over the United Kingdom

    Science.gov (United States)

    McMeeking, G. R.; Morgan, W. T.; Flynn, M.; Highwood, E. J.; Turnbull, K.; Haywood, J.; Coe, H.

    2011-09-01

    Black carbon (BC) aerosols absorb sunlight thereby leading to a positive radiative forcing and a warming of climate and can also impact human health through their impact on the respiratory system. The state of mixing of BC with other aerosol species, particularly the degree of internal/external mixing, has been highlighted as a major uncertainty in assessing its radiative forcing and hence its climate impact, but few in situ observations of mixing state exist. We present airborne single particle soot photometer (SP2) measurements of refractory BC (rBC) mass concentrations and mixing state coupled with aerosol composition and optical properties measured in urban plumes and regional pollution over the United Kingdom. All data were obtained using instrumentation flown on the UK's BAe-146-301 large Atmospheric Research Aircraft (ARA) operated by the Facility for Airborne Atmospheric Measurements (FAAM). We measured sub-micron aerosol composition using an aerosol mass spectrometer (AMS) and used positive matrix factorization to separate hydrocarbon-like (HOA) and oxygenated organic aerosols (OOA). We found a higher number fraction of thickly coated rBC particles in air masses with large OOA relative to HOA, higher ozone-to-nitrogen oxides (NOx) ratios and large concentrations of total sub-micron aerosol mass relative to rBC mass concentrations. The more ozone- and OOA-rich air masses were associated with transport from continental Europe, while plumes from UK cities had higher HOA and NOx and fewer thickly coated rBC particles. We did not observe any significant change in the rBC mass absorption efficiency calculated from rBC mass and light absorption coefficients measured by a particle soot absorption photometer despite observing significant changes in aerosol composition and rBC mixing state. The contributions of light scattering and absorption to total extinction (quantified by the single scattering albedo; SSA) did change for different air masses, with lower SSA

  8. Black carbon aerosol mixing state, organic aerosols and aerosol optical properties over the United Kingdom

    Directory of Open Access Journals (Sweden)

    G. R. McMeeking

    2011-09-01

    Full Text Available Black carbon (BC aerosols absorb sunlight thereby leading to a positive radiative forcing and a warming of climate and can also impact human health through their impact on the respiratory system. The state of mixing of BC with other aerosol species, particularly the degree of internal/external mixing, has been highlighted as a major uncertainty in assessing its radiative forcing and hence its climate impact, but few in situ observations of mixing state exist. We present airborne single particle soot photometer (SP2 measurements of refractory BC (rBC mass concentrations and mixing state coupled with aerosol composition and optical properties measured in urban plumes and regional pollution over the United Kingdom. All data were obtained using instrumentation flown on the UK's BAe-146-301 large Atmospheric Research Aircraft (ARA operated by the Facility for Airborne Atmospheric Measurements (FAAM. We measured sub-micron aerosol composition using an aerosol mass spectrometer (AMS and used positive matrix factorization to separate hydrocarbon-like (HOA and oxygenated organic aerosols (OOA. We found a higher number fraction of thickly coated rBC particles in air masses with large OOA relative to HOA, higher ozone-to-nitrogen oxides (NOx ratios and large concentrations of total sub-micron aerosol mass relative to rBC mass concentrations. The more ozone- and OOA-rich air masses were associated with transport from continental Europe, while plumes from UK cities had higher HOA and NOx and fewer thickly coated rBC particles. We did not observe any significant change in the rBC mass absorption efficiency calculated from rBC mass and light absorption coefficients measured by a particle soot absorption photometer despite observing significant changes in aerosol composition and rBC mixing state. The contributions of light scattering and absorption to total extinction (quantified by the single scattering albedo; SSA did change for

  9. Computational fluid dynamics analysis of aerosol deposition in pebble beds

    Science.gov (United States)

    Mkhosi, Margaret Msongi

    2007-12-01

The Pebble Bed Modular Reactor is a high temperature gas cooled reactor which uses helium gas as a coolant. The reactor uses spherical graphite pebbles as fuel. The fuel design is inherently resistant to the release of radioactive material up to high temperatures; therefore, the plant can withstand a broad spectrum of accidents with limited release of radionuclides to the environment. Despite the safety features of the concept, these reactors still contain large inventories of radioactive materials. The transport of most of the radioactive materials in an accident occurs in the form of aerosol particles. In this dissertation, the limits of applicability of the existing computational fluid dynamics code FLUENT to the prediction of aerosol transport have been explored. The code was run using the Reynolds Averaged Navier-Stokes turbulence models to determine the effects of different turbulence models on the prediction of aerosol particle deposition. Analyses were performed for up to three unit cells in the orthorhombic configuration. For low flow conditions representing natural circulation driven flow, the laminar flow model was used and the results were compared with existing experimental data for packed beds. The results compare well with experimental data in the low flow regime. For conditions corresponding to normal operation of the reactor, analyses were performed using the standard k-ɛ turbulence model. From the inertial deposition results, a correlation has been developed that can be used to estimate the deposition of aerosol particles within pebble beds given inlet flow conditions. These results were converted into a dimensionless form as a function of a modified Stokes number. Based on results obtained in the laminar regime and for individual pebbles, the correlation developed for the inertial impaction component of deposition is believed to be credible. The form of the correlation developed also allows these results to be applied to pebble beds of different
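The dimensionless group the deposition correlation is cast in can be sketched with the conventional particle Stokes number for flow past a pebble. The dissertation's "modified" definition is not given here, so this uses the standard form with illustrative values for a micron-scale aerosol in hot helium.

```python
# Sketch: conventional particle Stokes number for flow through a pebble
# bed, the kind of dimensionless group the abstract's deposition
# correlation is expressed in. All values are illustrative assumptions.

def stokes_number(rho_p, d_p, U, mu, d_pebble):
    """St = rho_p * d_p^2 * U / (18 * mu * d_pebble)."""
    return rho_p * d_p**2 * U / (18.0 * mu * d_pebble)

# 1-micron aerosol in hot helium flowing through 6 cm pebbles
St = stokes_number(rho_p=2500.0,   # particle density, kg/m^3
                   d_p=1e-6,       # particle diameter, m
                   U=5.0,          # gas velocity, m/s
                   mu=3.8e-5,      # helium viscosity, Pa*s
                   d_pebble=0.06)  # pebble diameter, m
print(St)  # << 1: particles mostly follow the gas streamlines
```

Stokes numbers far below one indicate particles that track the flow, which is why inertial impaction on pebbles becomes significant only for larger or faster particles.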

  10. Factors Affecting Aerosol Radiative Forcing

    Science.gov (United States)

    Wang, Jingxu; Lin, Jintai; Ni, Ruijing

    2016-04-01

Rapid industrial and economic growth has meant a large amount of aerosols in the atmosphere with strong radiative forcing (RF) upon the climate system. Over parts of the globe, the negative forcing of aerosols has overcompensated for the positive forcing of greenhouse gases. Aerosol RF is determined by emissions and various chemical-transport-radiative processes in the atmosphere, a multi-factor problem whose individual contributors have not been well quantified. In this study, we analyze the major factors affecting the RF of secondary inorganic aerosols (SIOAs, including sulfate, nitrate and ammonium), primary organic aerosol (POA), and black carbon (BC). We analyze the RF of aerosols produced by 11 major regions across the globe, including but not limited to East Asia, Southeast Asia, South Asia, North America, and Western Europe. Factors analyzed include population size, per capita gross domestic production (GDP), emission intensity (i.e., emissions per unit GDP), chemical efficiency (i.e., mass per unit emissions) and radiative efficiency (i.e., RF per unit mass). We find that among the 11 regions, East Asia produces the largest emissions and aerosol RF, due to relatively high emission intensity and a tremendous population size. South Asia produces the second largest RF of SIOA and BC and the highest RF of POA, in part due to its chemical efficiency being the highest among all regions. Although Southeast Asia also has large emissions, its aerosol RF is alleviated by its lowest chemical efficiency. The chemical efficiency and radiative efficiency of BC produced by the Middle East-North Africa region are the highest across the regions, whereas its RF is lowered by a small per capita GDP. Both North America and Western Europe have low emission intensity, compensating for the effects on RF of large population sizes and per capita GDP. There has been momentum to transfer industries to Southeast Asia and South Asia, and this transition is expected to continue in the coming years. The
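The factor analysis the abstract describes is a chain identity: regional forcing equals population times per-capita GDP times emission intensity times chemical efficiency times radiative efficiency. The sketch below writes out that identity; the numbers are placeholders, not the paper's values.

```python
# Chain-identity sketch of the abstract's decomposition:
# RF = population x (GDP/person) x (emissions/GDP)
#      x (aerosol mass/emissions) x (RF/mass).
# All values below are illustrative placeholders.

def aerosol_rf(population, gdp_per_capita, emission_intensity,
               chemical_efficiency, radiative_efficiency):
    return (population * gdp_per_capita * emission_intensity
            * chemical_efficiency * radiative_efficiency)

rf = aerosol_rf(population=1.4e9,               # persons
                gdp_per_capita=1.0e4,           # $ per person
                emission_intensity=2.0e-6,      # kg emitted per $
                chemical_efficiency=0.5,        # kg aerosol per kg emitted
                radiative_efficiency=-2.0e-11)  # W m^-2 per kg aerosol
print(rf)  # regional aerosol radiative forcing, W m^-2 (negative = cooling)
```

The decomposition is useful precisely because each factor isolates a different lever: demographics, affluence, technology, atmospheric chemistry, and optics.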

  11. MISR Aerosol Climatology Product V001

    Data.gov (United States)

    National Aeronautics and Space Administration — This product is 1)the microphysical and scattering characteristics of pure aerosol upon which routine retrievals are based;2)mixtures of pure aerosol to be compared...

  12. Miniature Sensor for Aerosol Mass Measurements Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR project seeks to develop a miniature sensor for mass measurement of size-classified aerosols. A cascade impactor will be used to classify aerosol sample...

  13. Bayesian approach to avoiding track seduction

    Science.gov (United States)

    Salmond, David J.; Everett, Nicholas O.

    2002-08-01

    The problem of maintaining track on a primary target in the presence of spurious objects is addressed. Recursive and batch filtering approaches are developed. For the recursive approach, a Bayesian track-splitting filter is derived which spawns candidate tracks if there is a possibility of measurement misassociation. The filter evaluates the probability of each candidate track being associated with the primary target. The batch filter is a Markov-chain Monte Carlo (MCMC) algorithm which fits the observed data sequence to models of target dynamics and measurement-track association. Simulation results are presented.
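
    The track-splitting filter above weighs candidate tracks by their association probabilities. As a minimal, hypothetical sketch of that core step (not the paper's filter), the following computes the posterior probability that a scalar measurement belongs to the primary track rather than a spurious object, assuming Gaussian predicted-measurement densities and invented numbers:

```python
import math

def association_posterior(z, pred_means, pred_vars, priors):
    """Posterior probability that a scalar measurement z originated from each
    candidate track, given Gaussian predicted measurement densities."""
    likes = [math.exp(-(z - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)
             for m, v in zip(pred_means, pred_vars)]
    evidence = sum(l * p for l, p in zip(likes, priors))
    return [l * p / evidence for l, p in zip(likes, priors)]

# Primary track predicts 0.0, a spurious object sits near 2.0; measurement at 0.4.
post = association_posterior(0.4, [0.0, 2.0], [1.0, 1.0], [0.5, 0.5])
```

    In a full track-splitting filter this posterior would decide whether to spawn a new candidate track; here it simply shows the Bayesian weighting of the two association hypotheses.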

  14. An Approximate Bayesian Fundamental Frequency Estimator

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2012-01-01

    Joint fundamental frequency and model order estimation is an important problem in several applications such as speech and music processing. In this paper, we develop an approximate algorithm for estimating these quantities using Bayesian inference. The inference about the fundamental frequency...... and the model order is based on a probability model which corresponds to a minimum of prior information. From this probability model, we give the exact posterior distributions on the fundamental frequency and the model order, and we also present analytical approximations of these distributions which lower...

  15. Bayesian variable selection for latent class models.

    Science.gov (United States)

    Ghosh, Joyee; Herring, Amy H; Siega-Riz, Anna Maria

    2011-09-01

    In this article, we develop a latent class model with class probabilities that depend on subject-specific covariates. One of our major goals is to identify important predictors of latent classes. We consider methodology that allows estimation of latent classes while accounting for variable selection uncertainty. We propose a Bayesian variable selection approach and implement a stochastic search Gibbs sampler for posterior computation to obtain model-averaged estimates of quantities of interest such as marginal inclusion probabilities of predictors. Our methods are illustrated through simulation studies and application to data on weight gain during pregnancy, where it is of interest to identify important predictors of latent weight gain classes.
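
    The stochastic search Gibbs sampler itself is beyond a short sketch, but the quantity it targets — model-averaged marginal inclusion probabilities — can be illustrated by exact enumeration on a tiny linear-regression problem. The BIC approximation to each model's marginal likelihood used below is an assumption of this sketch, not the authors' method:

```python
import itertools
import math

import numpy as np

def inclusion_probabilities(X, y):
    """Model-averaged marginal inclusion probabilities over all predictor
    subsets, weighting each model by the BIC approximation to its marginal
    likelihood (feasible only for small p; samplers replace this loop)."""
    n, p = X.shape
    weights, supports = [], []
    for subset in itertools.product([0, 1], repeat=p):
        cols = [j for j in range(p) if subset[j]]
        Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in cols])
        beta = np.linalg.lstsq(Xs, y, rcond=None)[0]
        resid = y - Xs @ beta
        k = Xs.shape[1]
        bic = n * math.log(resid @ resid / n) + k * math.log(n)
        weights.append(math.exp(-0.5 * bic))
        supports.append(subset)
    weights = np.array(weights) / sum(weights)
    return np.array(supports).T @ weights  # P(include predictor j | data)

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.5, size=80)  # only predictor 0 matters
probs = inclusion_probabilities(X, y)
```

    The true predictor's inclusion probability approaches one, while the noise predictors are penalized; a Gibbs sampler estimates the same quantities when enumerating all 2^p subsets is infeasible.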

  16. Bayesian regression of piecewise homogeneous Poisson processes

    Directory of Open Access Journals (Sweden)

    Diego Sevilla

    2015-12-01

    Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested by analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015, Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015).
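
    A single-breakpoint version of this idea can be sketched directly: with conjugate Gamma priors on the Poisson rate of each segment, the marginal likelihood of every candidate breakpoint is available in closed form. This is a simplified sketch under those assumptions, not the paper's Mathematica code:

```python
import math

import numpy as np

def breakpoint_posterior(counts, a=1.0, b=1.0):
    """Posterior over a single breakpoint in a Poisson count series, with
    conjugate Gamma(a, b) priors on the rate of each segment and a uniform
    prior over breakpoint locations."""
    def log_marginal(seg):
        # Gamma-Poisson marginal likelihood of one segment.
        S, m = int(seg.sum()), len(seg)
        return (a * math.log(b) - math.lgamma(a) + math.lgamma(a + S)
                - (a + S) * math.log(b + m)
                - sum(math.lgamma(int(c) + 1) for c in seg))
    n = len(counts)
    logp = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:])
                     for t in range(1, n)])
    w = np.exp(logp - logp.max())
    return w / w.sum()  # posterior over breakpoints t = 1 .. n-1

rng = np.random.default_rng(1)
counts = np.concatenate([rng.poisson(2.0, 30), rng.poisson(8.0, 30)])
post = breakpoint_posterior(counts)
```

    The posterior concentrates near the true rate change at index 30; extending to several breakpoints is what the paper's piecewise-regression machinery handles.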

  17. Bayesian approach to inverse statistical mechanics.

    Science.gov (United States)

    Habeck, Michael

    2014-05-01

    Inverse statistical mechanics aims to determine particle interactions from ensemble properties. This article looks at this inverse problem from a Bayesian perspective and discusses several statistical estimators to solve it. In addition, a sequential Monte Carlo algorithm is proposed that draws the interaction parameters from their posterior probability distribution. The posterior probability involves an intractable partition function that is estimated along with the interactions. The method is illustrated for inverse problems of varying complexity, including the estimation of a temperature, the inverse Ising problem, maximum entropy fitting, and the reconstruction of molecular interaction potentials.
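
    As an illustrative toy — not the article's sequential Monte Carlo scheme, and deliberately sidestepping the intractable-partition-function difficulty the article addresses — one can compute a grid posterior over an inverse temperature for a two-state system whose partition function is trivial:

```python
import math

import numpy as np

def beta_posterior(energies, samples, beta_grid):
    """Grid posterior over inverse temperature beta for a discrete system with
    known energy levels. The partition function is tractable in this toy case,
    unlike the problems treated in the article."""
    logp = []
    for beta in beta_grid:
        logZ = math.log(sum(math.exp(-beta * E) for E in energies))
        logp.append(sum(-beta * energies[s] - logZ for s in samples))
    logp = np.array(logp)
    w = np.exp(logp - logp.max())
    return w / w.sum()

energies = [0.0, 1.0]                 # two-state system
rng = np.random.default_rng(2)
true_beta = 1.5
pz = np.exp(-true_beta * np.array(energies))
pz /= pz.sum()
samples = rng.choice(2, size=500, p=pz)  # Boltzmann-distributed observations
grid = np.linspace(0.1, 4.0, 200)
post = beta_posterior(energies, samples, grid)
est = grid[np.argmax(post)]
```

    With 500 samples the posterior mode lands close to the true inverse temperature; the article's contribution is making this kind of inference work when log Z cannot be computed exactly.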

  18. Bayesian Prediction for The Winds of Winter

    OpenAIRE

    Vale, Richard

    2014-01-01

    Predictions are made for the number of chapters told from the point of view of each character in the next two novels in George R. R. Martin's A Song of Ice and Fire series by fitting a random effects model to a matrix of point-of-view chapters in the earlier novels using Bayesian methods. SPOILER WARNING: readers who have not read all five existing novels in the series should not read further, as major plot points will be spoiled.

  19. Bayesian feature selection to estimate customer survival

    OpenAIRE

    Figini, Silvia; Giudici, Paolo; Brooks, S P

    2006-01-01

    We consider the problem of estimating the lifetime value of customers, when a large number of features are present in the data. In order to measure lifetime value we use survival analysis models to estimate customer tenure. In such a context, a number of classical modelling challenges arise. We will show how our proposed Bayesian methods perform, and compare it with classical churn models on a real case study. More specifically, based on data from a media service company, our aim will be to p...

  20. Structure-based bayesian sparse reconstruction

    KAUST Repository

    Quadeer, Ahmed Abdul

    2012-12-01

    Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.

  1. Bayesian model selection in Gaussian regression

    CERN Document Server

    Abramovich, Felix

    2009-01-01

    We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors might be much larger than the number of observations. From a frequentist view, the proposed procedure results in the penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting estimator. We establish the oracle inequality and specify conditions on the prior that imply its asymptotic minimaxity within a wide range of sparse and dense settings for "nearly-orthogonal" and "multicollinear" designs.

  2. Bayesian Analysis of Type Ia Supernova Data

    Institute of Scientific and Technical Information of China (English)

    王晓峰; 周旭; 李宗伟; 陈黎

    2003-01-01

    Recently, the distances to type Ia supernovae (SNe Ia) at z ~ 0.5 have been measured with the motivation of estimating cosmological parameters. However, different sleuthing techniques tend to give inconsistent measurements of SN Ia distances (~0.3 mag), which significantly affects the determination of cosmological parameters. A Bayesian "hyper-parameter" procedure is used to analyse jointly the current SN Ia data, which considers the relative weights of different datasets. For a flat Universe, the combined analysis yields ΩM = 0.20 ± 0.07.

  3. Bayesian mixture models for partially verified data

    DEFF Research Database (Denmark)

    Kostoulas, Polychronis; Browne, William J.; Nielsen, Søren Saxmose;

    2013-01-01

    Bayesian mixture models can be used to discriminate between the distributions of continuous test responses for different infection stages. These models are particularly useful in case of chronic infections with a long latent period, like Mycobacterium avium subsp. paratuberculosis (MAP) infection...... for some individuals, in order to minimize this loss in the discriminatory power. The distribution of the continuous antibody response against MAP has been obtained for healthy, MAP-infected and MAP-infectious cows of different age groups. The overall power of the milk-ELISA to discriminate between healthy......

  4. Filtering in hybrid dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Andersen, Morten Nonboe; Andersen, Rasmus Ørum; Wheeler, Kevin

    2004-01-01

    We demonstrate experimentally that inference in a complex hybrid Dynamic Bayesian Network (DBN) is possible using the 2-Time Slice DBN (2T-DBN) from (Koller & Lerner, 2000) to model fault detection in a watertank system. In (Koller & Lerner, 2000) a generic Particle Filter (PF) is used...... that the choice of network structure is very important for the performance of the generic PF and the EKF algorithms, but not for the UKF algorithms. Furthermore, we investigate the influence of data noise in the watertank simulation. Theory and implementation is based on the theory presented in (v.d. Merwe et al...

  5. Filtering in hybrid dynamic Bayesian networks (center)

    DEFF Research Database (Denmark)

    Andersen, Morten Nonboe; Andersen, Rasmus Ørum; Wheeler, Kevin

    We demonstrate experimentally that inference in a complex hybrid Dynamic Bayesian Network (DBN) is possible using the 2-Time Slice DBN (2T-DBN) from (Koller & Lerner, 2000) to model fault detection in a watertank system. In (Koller & Lerner, 2000) a generic Particle Filter (PF) is used...... that the choice of network structure is very important for the performance of the generic PF and the EKF algorithms, but not for the UKF algorithms. Furthermore, we investigate the influence of data noise in the watertank simulation. Theory and implementation is based on the theory presented in (v.d. Merwe et al...

  6. Filtering in hybrid dynamic Bayesian networks (left)

    DEFF Research Database (Denmark)

    Andersen, Morten Nonboe; Andersen, Rasmus Ørum; Wheeler, Kevin

    We demonstrate experimentally that inference in a complex hybrid Dynamic Bayesian Network (DBN) is possible using the 2-Time Slice DBN (2T-DBN) from (Koller & Lerner, 2000) to model fault detection in a watertank system. In (Koller & Lerner, 2000) a generic Particle Filter (PF) is used...... that the choice of network structure is very important for the performance of the generic PF and the EKF algorithms, but not for the UKF algorithms. Furthermore, we investigate the influence of data noise in the watertank simulation. Theory and implementation is based on the theory presented in (v.d. Merwe et al...

  7. Multisnapshot Sparse Bayesian Learning for DOA

    Science.gov (United States)

    Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki; Nannuru, Santosh

    2016-10-01

    The directions of arrival (DOA) of plane waves are estimated from multi-snapshot sensor array data using Sparse Bayesian Learning (SBL). The prior on the source amplitudes is assumed independent zero-mean complex Gaussian, with the unknown variances (i.e., the source powers) as hyperparameters. For a complex Gaussian likelihood with the unknown noise variance as hyperparameter, the corresponding Gaussian posterior distribution is derived. For a given number of DOAs, the hyperparameters are automatically selected by maximizing the evidence, which promotes sparse DOA estimates. The SBL scheme for DOA estimation is discussed and evaluated competitively against LASSO ($\ell_1$-regularization), conventional beamforming, and MUSIC.
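
    A real-valued, single-measurement-vector analogue of SBL can be sketched with the classic EM updates, in which maximizing the evidence over the per-coefficient prior variances drives most of them toward zero. This is a simplified sketch of the general SBL mechanism; the paper's multi-snapshot, complex-valued array formulation differs:

```python
import numpy as np

def sbl(Phi, y, noise_var, iters=200):
    """Sparse Bayesian Learning (EM form) for y = Phi @ x + noise: each x_i has
    a zero-mean Gaussian prior with variance gamma_i, and maximizing the
    evidence over gamma drives most gamma_i toward zero (sparse estimates)."""
    n, m = Phi.shape
    gamma = np.ones(m)
    for _ in range(iters):
        Sigma = np.linalg.inv(Phi.T @ Phi / noise_var + np.diag(1.0 / gamma))
        mu = Sigma @ Phi.T @ y / noise_var          # posterior mean of x
        gamma = mu ** 2 + np.diag(Sigma)            # EM hyperparameter update
    return mu

rng = np.random.default_rng(3)
Phi = rng.normal(size=(40, 80)) / np.sqrt(40)       # underdetermined dictionary
x = np.zeros(80)
x[[5, 17, 60]] = [3.0, -2.0, 4.0]                   # 3-sparse ground truth
y = Phi @ x + 0.01 * rng.normal(size=40)
xhat = sbl(Phi, y, noise_var=1e-4)
```

    The three active coefficients dominate the estimate while the rest shrink toward zero, which is the behavior the DOA application exploits to localize a few sources on a dense angle grid.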

  8. Bayesian parameter estimation by continuous homodyne detection

    Science.gov (United States)

    Kiilerich, Alexander Holm; Mølmer, Klaus

    2016-09-01

    We simulate the process of continuous homodyne detection of the radiative emission from a quantum system, and we investigate how a Bayesian analysis can be employed to determine unknown parameters that govern the system evolution. Measurement backaction quenches the system dynamics at all times and we show that the ensuing transient evolution is more sensitive to system parameters than the steady state of the system. The parameter sensitivity can be quantified by the Fisher information, and we investigate numerically and analytically how the temporal noise correlations in the measurement signal contribute to the ultimate sensitivity limit of homodyne detection.

  9. Radioactive Contraband Detection: A Bayesian Approach

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Sale, K; Chambers, D; Axelrod, M; Meyer, A

    2009-03-16

    Radionuclide emissions from nuclear contraband challenge both detection and measurement technologies to capture and record each event. The development of a sequential Bayesian processor incorporating both the physics of gamma-ray emissions and the measurement of photon energies offers a physics-based approach to attack this challenging problem. It is shown that a 'physics-based' structure can be used to develop an effective detection technique, and it also motivates the implementation of this approach using particle filters to enhance and extract the required information.

  10. Bayesian parameter estimation by continuous homodyne detection

    DEFF Research Database (Denmark)

    Kiilerich, Alexander Holm; Molmer, Klaus

    2016-01-01

    We simulate the process of continuous homodyne detection of the radiative emission from a quantum system, and we investigate how a Bayesian analysis can be employed to determine unknown parameters that govern the system evolution. Measurement backaction quenches the system dynamics at all times...... and we show that the ensuing transient evolution is more sensitive to system parameters than the steady state of the system. The parameter sensitivity can be quantified by the Fisher information, and we investigate numerically and analytically how the temporal noise correlations in the measurement signal......

  11. Bayesian global analysis of neutrino oscillation data

    CERN Document Server

    Bergstrom, Johannes; Maltoni, Michele; Schwetz, Thomas

    2015-01-01

    We perform a Bayesian analysis of current neutrino oscillation data. When estimating the oscillation parameters we find that the results generally agree with those of the $\\chi^2$ method, with some differences involving $s_{23}^2$ and CP-violating effects. We discuss the additional subtleties caused by the circular nature of the CP-violating phase, and how it is possible to obtain correlation coefficients with $s_{23}^2$. When performing model comparison, we find that there is no significant evidence for any mass ordering, any octant of $s_{23}^2$ or a deviation from maximal mixing, nor the presence of CP-violation.

  12. Case studies in Bayesian microbial risk assessments

    Directory of Open Access Journals (Sweden)

    Turner Joanne

    2009-12-01

    Full Text Available Abstract Background The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. Methods We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in 2 stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). Results We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0, 11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0, 11). In the second
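
    The Monte Carlo propagation step of the first case study can be caricatured in a few lines: draw uncertain inputs, push them through a dose-response model, and summarize the induced uncertainty with a percentile interval. All distributions and parameter values below are invented for illustration and are not those of the paper:

```python
import numpy as np

def incidence_interval(n_draws=100_000, seed=0):
    """Toy Monte Carlo propagation of input uncertainty: contamination dose and
    dose-response slope are uncertain, and the induced uncertainty about yearly
    cases is summarized by a 95% interval (illustrative values only)."""
    rng = np.random.default_rng(seed)
    dose = rng.lognormal(mean=0.0, sigma=1.0, size=n_draws)  # organisms/serving
    r = rng.beta(2, 200, size=n_draws)                       # per-organism P(illness)
    servings = 1_000_000                                     # contaminated servings/year
    cases = servings * (1.0 - np.exp(-r * dose))             # exponential dose-response
    lo, hi = np.percentile(cases, [2.5, 97.5])
    return lo, hi

lo, hi = incidence_interval()
```

    Policy options are then compared by rerunning the same propagation under altered input distributions, exactly as the paper does with its hypothetical exposure scenarios.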

  13. A Bayesian Framework for Combining Valuation Estimates

    CERN Document Server

    Yee, Kenton K

    2007-01-01

    Obtaining more accurate equity value estimates is the starting point for stock selection, value-based indexing in a noisy market, and beating benchmark indices through tactical style rotation. Unfortunately, discounted cash flow, method of comparables, and fundamental analysis typically yield discrepant valuation estimates. Moreover, the valuation estimates typically disagree with market price. Can one form a superior valuation estimate by averaging over the individual estimates, including market price? This article suggests a Bayesian framework for combining two or more estimates into a superior valuation estimate. The framework justifies the common practice of averaging over several estimates to arrive at a final point estimate.
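
    The core of such a framework, under the simplifying assumption of independent Gaussian estimates with known variances, is the familiar precision-weighted average. This is a sketch of the idea, not the article's full model:

```python
def combine_estimates(estimates, variances):
    """Precision-weighted (inverse-variance) Bayesian combination of
    independent Gaussian estimates of the same quantity; returns the posterior
    mean and variance."""
    precisions = [1.0 / v for v in variances]
    total = sum(precisions)
    mean = sum(e * p for e, p in zip(estimates, precisions)) / total
    return mean, 1.0 / total

# Hypothetical numbers: DCF says 48, comparables say 55, market price is 52,
# each with a different (assumed known) variance.
mean, var = combine_estimates([48.0, 55.0, 52.0], [16.0, 25.0, 4.0])
```

    The combined variance is smaller than that of any single estimate, which is the formal justification for the averaging practice the article discusses.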

  14. Origins of atmospheric aerosols. Basic concepts on aerosol main physical properties; L'aerosol atmospherique : ses origines, quelques notions sur les principales proprietes physiques des aerosols

    Energy Technology Data Exchange (ETDEWEB)

    Renoux, A. [Paris-12 Univ., 94 - Creteil (France). Laboratoire de Physique des aerosols et de transferts des contaminations

    1996-12-31

    Natural and anthropogenic sources of atmospheric aerosols are reviewed and indications of their concentrations and granulometry are given. Calculation of the lifetime of an atmospheric aerosol of a given size is presented, and the various modes of aerosol granulometry and their relations with photochemical and physico-chemical processes in the atmosphere are discussed. The main physical, electrical and optical properties of aerosols are also presented: diffusion coefficient, dynamic mobility and relaxation time, Stokes number, terminal settling velocity, electrical mobility, optical diffraction

  15. Bayesian Learning and the Psychology of Rule Induction

    Science.gov (United States)

    Endress, Ansgar D.

    2013-01-01

    In recent years, Bayesian learning models have been applied to an increasing variety of domains. While such models have been criticized on theoretical grounds, the underlying assumptions and predictions are rarely made concrete and tested experimentally. Here, I use Frank and Tenenbaum's (2011) Bayesian model of rule-learning as a case study to…

  16. Bayesian Data-Model Fit Assessment for Structural Equation Modeling

    Science.gov (United States)

    Levy, Roy

    2011-01-01

    Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…

  17. Non-homogeneous dynamic Bayesian networks for continuous data

    NARCIS (Netherlands)

    Grzegorczyk, Marco; Husmeier, Dirk

    2011-01-01

    Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with c

  18. Survey of Bayesian Models for Modelling of Stochastic Temporal Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ng, B

    2006-10-12

    This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.

  19. Bayesian Compressed Sensing with Unknown Measurement Noise Level

    DEFF Research Database (Denmark)

    Hansen, Thomas Lundgaard; Jørgensen, Peter Bjørn; Pedersen, Niels Lovmand

    2013-01-01

    In sparse Bayesian learning (SBL) approximate Bayesian inference is applied to find sparse estimates from observations corrupted by additive noise. Current literature only vaguely considers the case where the noise level is unknown a priori. We show that for most state-of-the-art reconstruction a...

  20. Bayesian Student Modeling and the Problem of Parameter Specification.

    Science.gov (United States)

    Millan, Eva; Agosta, John Mark; Perez de la Cruz, Jose Luis

    2001-01-01

    Discusses intelligent tutoring systems and the application of Bayesian networks to student modeling. Considers reasons for not using Bayesian networks, including the computational complexity of the algorithms and the difficulty of knowledge acquisition, and proposes an approach to simplify knowledge acquisition that applies causal independence to…

  1. Towards an inclusion driven learning of Bayesian Networks

    NARCIS (Netherlands)

    Castelo, R.; Kocka, T.

    2002-01-01

    Two or more Bayesian Networks are Markov equivalent when their corresponding acyclic digraphs encode the same set of conditional independence (= CI) restrictions. Therefore, the search space of Bayesian Networks may be organized in classes of equivalence, where each of them consists of a particular

  2. Bayesian Inference Networks and Spreading Activation in Hypertext Systems.

    Science.gov (United States)

    Savoy, Jacques

    1992-01-01

    Describes a method based on Bayesian networks for searching hypertext systems. Discussion covers the use of Bayesian networks for structuring index terms and representing user information needs; use of link semantics based on constrained spreading activation to find starting points for browsing; and evaluation of a prototype system. (64…

  3. Implementing Relevance Feedback in the Bayesian Network Retrieval Model.

    Science.gov (United States)

    de Campos, Luis M.; Fernandez-Luna, Juan M.; Huete, Juan F.

    2003-01-01

    Discussion of relevance feedback in information retrieval focuses on a proposal for the Bayesian Network Retrieval Model. Bases the proposal on the propagation of partial evidences in the Bayesian network, representing new information obtained from the user's relevance judgments to compute the posterior relevance probabilities of the documents…

  4. What Is the Probability You Are a Bayesian?

    Science.gov (United States)

    Wulff, Shaun S.; Robinson, Timothy J.

    2014-01-01

    Bayesian methodology continues to be widely used in statistical applications. As a result, it is increasingly important to introduce students to Bayesian thinking at early stages in their mathematics and statistics education. While many students in upper level probability courses can recite the differences in the Frequentist and Bayesian…

  5. Hopes and Cautions in Implementing Bayesian Structural Equation Modeling

    Science.gov (United States)

    MacCallum, Robert C.; Edwards, Michael C.; Cai, Li

    2012-01-01

    Muthen and Asparouhov (2012) have proposed and demonstrated an approach to model specification and estimation in structural equation modeling (SEM) using Bayesian methods. Their contribution builds on previous work in this area by (a) focusing on the translation of conventional SEM models into a Bayesian framework wherein parameters fixed at zero…

  6. Universal Darwinism As a Process of Bayesian Inference.

    Science.gov (United States)

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
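
    The equivalence the paper builds on is just Bayes' rule read as a reweighting of variant frequencies by fitness: posterior is proportional to prior times likelihood, exactly as post-selection frequencies are proportional to pre-selection frequencies times fitness. A minimal numerical illustration:

```python
def bayes_update(prior, likelihood):
    """One Bayesian update: posterior ∝ prior × likelihood. Read as selection,
    'replicator' frequencies (the prior) are reweighted by fitness (the
    likelihood) and renormalized."""
    posterior = [p * l for p, l in zip(prior, likelihood)]
    z = sum(posterior)  # the evidence / mean fitness
    return [q / z for q in posterior]

# Two variants at equal frequency; variant 0 explains the evidence better
# (equivalently, has higher fitness), so its frequency rises.
post = bayes_update([0.5, 0.5], [0.8, 0.2])
```

    Iterating this update over successive "experiments" is the discrete replicator dynamics that the paper identifies with accumulating evidence-based knowledge.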

  7. Bayesian Item Selection in Constrained Adaptive Testing Using Shadow Tests

    Science.gov (United States)

    Veldkamp, Bernard P.

    2010-01-01

    Application of Bayesian item selection criteria in computerized adaptive testing might result in improvement of bias and MSE of the ability estimates. The question remains how to apply Bayesian item selection criteria in the context of constrained adaptive testing, where large numbers of specifications have to be taken into account in the item…

  8. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    Science.gov (United States)

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  9. Universal Darwinism as a process of Bayesian inference

    Directory of Open Access Journals (Sweden)

    John Oberon Campbell

    2016-06-01

    Full Text Available Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an 'experiment' in the external world environment, and the results of that 'experiment' or the 'surprise' entailed by predicted and actual outcomes of the 'experiment'. Minimization of free energy implies that the implicit measure of 'surprise' experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.

  10. Mechanistic curiosity will not kill the Bayesian cat

    NARCIS (Netherlands)

    Borsboom, Denny; Wagenmakers, Eric-Jan; Romeijn, Jan-Willem

    2011-01-01

    Jones & Love (J&L) suggest that Bayesian approaches to the explanation of human behavior should be constrained by mechanistic theories. We argue that their proposal misconstrues the relation between process models, such as the Bayesian model, and mechanisms. While mechanistic theories can answer spe

  11. ATI TDA 5A aerosol generator evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Gilles, D.A.

    1998-07-27

    Oil-based aerosol "smoke", commonly used for testing the efficiency and penetration of High Efficiency Particulate Air (HEPA) filters and HEPA systems, can produce flammability hazards that may not have been previously considered. A combustion incident involving an aerosol generator has prompted an investigation into the hazards of the aerosol used to test HEPA systems at Hanford.

  12. DARE: a dedicated aerosols retrieval instrument

    NARCIS (Netherlands)

    Court, A.J.; Smorenburg, K.; Courrèges-Lacoste, G.B.; Visser, H.; Leeuw, G. de; Decae, R.

    2004-01-01

    Satellite remote sensing of aerosols is a largely unresolved problem. A dedicated instrument aimed at aerosols would be able to reduce the large uncertainties connected to this kind of remote sensing. TNO is performing a study of a space based instrument for aerosol measurements, together with the s

  13. Highly Resolved Paleoclimatic Aerosol Records

    DEFF Research Database (Denmark)

    Kettner, Ernesto

    In ice cores, a plethora of proxies for paleoclimatic conditions is archived. Air trapped in the ice during firnification allows for direct measurements of the concentrations and isotope ratios of paleoatmospheric gases, while the isotopic composition of the ice matrix itself is related...... to paleotemperatures. Impurities in the matrix are comprised of particulate and soluble aerosols, each carrying information on its source's activity and/or proximity. As opposed to gases and water isotopes, the seasonality of many aerosols is not smoothed out in the firn column, so that large concentration gradients...... with frequently changing signs are preserved. Therefore, these aerosol records can be used for dating by annual layer counting. However, with increasing depth the annual layer thickness decreases due to pressure and ice flow, and accurate dating is possible only as long as the rapid variations can be resolved...

  14. Wind reduction by aerosol particles

    Science.gov (United States)

    Jacobson, Mark Z.; Kaufman, Yoram J.

    2006-12-01

    Aerosol particles are known to affect radiation, temperatures, stability, clouds, and precipitation, but their effects on spatially distributed wind speed have not been examined to date. Here, it is found that aerosol particles, directly and through their enhancement of clouds, may reduce near-surface wind speeds below them by up to 8% locally. This reduction may explain a portion of observed "disappearing winds" in China, and it decreases the energy available for wind-turbine electricity. In California, slower winds reduce emissions of wind-driven soil dust and sea spray. Slower winds and cooler surface temperatures also reduce moisture advection and evaporation. These factors, along with the second indirect aerosol effect, may reduce California precipitation by 2-5%, contributing to a strain on water supply.
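    The wind-energy consequence of the 8% slowdown quoted above can be made concrete with the standard cubic scaling of wind power with wind speed (the cubic law is textbook physics; the 8% figure is the paper's local upper bound, not a mean):

```python
def wind_power_fraction(speed_reduction):
    """Available wind power scales with the cube of wind speed, so even
    a modest slowdown cuts turbine output disproportionately."""
    return (1.0 - speed_reduction) ** 3

# an 8% local wind-speed reduction, as estimated in the paper
power_loss = 1.0 - wind_power_fraction(0.08)   # fraction of power lost
```

    An 8% speed reduction thus corresponds to roughly a 22% loss of available wind power, which is why even single-digit wind changes matter for wind-turbine electricity.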

  15. Aerosol retrieval experiments in the ESA Aerosol_cci project

    Directory of Open Access Journals (Sweden)

    T. Holzer-Popp

    2013-08-01

    Within the ESA Climate Change Initiative (CCI) project Aerosol_cci (2010–2013), algorithms for the production of long-term total column aerosol optical depth (AOD) datasets from European Earth Observation sensors are developed. Starting with eight existing pre-cursor algorithms three analysis steps are conducted to improve and qualify the algorithms: (1) a series of experiments applied to one month of global data to understand several major sensitivities to assumptions needed due to the ill-posed nature of the underlying inversion problem, (2) a round robin exercise of "best" versions of each of these algorithms (defined using the step 1 outcome) applied to four months of global data to identify mature algorithms, and (3) a comprehensive validation exercise applied to one complete year of global data produced by the algorithms selected as mature based on the round robin exercise. The algorithms tested included four using AATSR, three using MERIS and one using PARASOL. This paper summarizes the first step. Three experiments were conducted to assess the potential impact of major assumptions in the various aerosol retrieval algorithms. In the first experiment a common set of four aerosol components was used to provide all algorithms with the same assumptions. The second experiment introduced an aerosol property climatology, derived from a combination of model and sun photometer observations, as a priori information in the retrievals on the occurrence of the common aerosol components. The third experiment assessed the impact of using a common nadir cloud mask for AATSR and MERIS algorithms in order to characterize the sensitivity to remaining cloud contamination in the retrievals against the baseline dataset versions. The impact of the algorithm changes was assessed for one month (September 2008) of data: qualitatively by inspection of monthly mean AOD maps and quantitatively by comparing daily gridded satellite data against daily averaged AERONET sun

  16. Aerosol retrieval experiments in the ESA Aerosol_cci project

    Science.gov (United States)

    Holzer-Popp, T.; de Leeuw, G.; Griesfeller, J.; Martynenko, D.; Klüser, L.; Bevan, S.; Davies, W.; Ducos, F.; Deuzé, J. L.; Graigner, R. G.; Heckel, A.; von Hoyningen-Hüne, W.; Kolmonen, P.; Litvinov, P.; North, P.; Poulsen, C. A.; Ramon, D.; Siddans, R.; Sogacheva, L.; Tanre, D.; Thomas, G. E.; Vountas, M.; Descloitres, J.; Griesfeller, J.; Kinne, S.; Schulz, M.; Pinnock, S.

    2013-08-01

    Within the ESA Climate Change Initiative (CCI) project Aerosol_cci (2010-2013), algorithms for the production of long-term total column aerosol optical depth (AOD) datasets from European Earth Observation sensors are developed. Starting with eight existing pre-cursor algorithms three analysis steps are conducted to improve and qualify the algorithms: (1) a series of experiments applied to one month of global data to understand several major sensitivities to assumptions needed due to the ill-posed nature of the underlying inversion problem, (2) a round robin exercise of "best" versions of each of these algorithms (defined using the step 1 outcome) applied to four months of global data to identify mature algorithms, and (3) a comprehensive validation exercise applied to one complete year of global data produced by the algorithms selected as mature based on the round robin exercise. The algorithms tested included four using AATSR, three using MERIS and one using PARASOL. This paper summarizes the first step. Three experiments were conducted to assess the potential impact of major assumptions in the various aerosol retrieval algorithms. In the first experiment a common set of four aerosol components was used to provide all algorithms with the same assumptions. The second experiment introduced an aerosol property climatology, derived from a combination of model and sun photometer observations, as a priori information in the retrievals on the occurrence of the common aerosol components. The third experiment assessed the impact of using a common nadir cloud mask for AATSR and MERIS algorithms in order to characterize the sensitivity to remaining cloud contamination in the retrievals against the baseline dataset versions. The impact of the algorithm changes was assessed for one month (September 2008) of data: qualitatively by inspection of monthly mean AOD maps and quantitatively by comparing daily gridded satellite data against daily averaged AERONET sun photometer
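    The quantitative assessment described above, comparing daily gridded satellite AOD against daily-averaged AERONET sun photometer values, boils down to matchup statistics such as bias and RMSE. A minimal sketch follows; the function name and the matchup arrays are invented for illustration (NaN marks days without a valid co-located retrieval), not taken from the Aerosol_cci processing chain:

```python
import math

def validate_aod(satellite_daily, aeronet_daily):
    """Score daily gridded satellite AOD against daily-averaged AERONET
    values at co-located site/day pairs (hypothetical matchup arrays)."""
    pairs = [(s, r) for s, r in zip(satellite_daily, aeronet_daily)
             if not (math.isnan(s) or math.isnan(r))]
    diffs = [s - r for s, r in pairs]
    n = len(diffs)
    bias = sum(diffs) / n                       # mean retrieval offset
    rmse = math.sqrt(sum(d * d for d in diffs) / n)  # overall scatter
    return bias, rmse

nan = float("nan")
sat = [0.21, 0.35, nan, 0.18, 0.50]   # satellite daily AOD (toy values)
aer = [0.20, 0.30, 0.25, nan, 0.45]   # AERONET daily means (toy values)
bias, rmse = validate_aod(sat, aer)
```

    Only the three days where both sources report a value enter the statistics, mirroring how satellite-versus-sun-photometer validation discards unmatched days.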

  17. Quantification of the release of inorganic elements from biofuels

    DEFF Research Database (Denmark)

    Frandsen, Flemming; van Lith, Simone Cornelia; Korbee, Rob

    2007-01-01

    -scale and pilot-scale fixed-bed release data. In conclusion, it is recommended to perform the described lab-scale tests in order to obtain reliable quantitative data on the release of inorganic elements under grate-firing or suspension-firing conditions. Advanced fuel characterization by use of chemical...... fractionation and simultaneous thermal analysis, and global equilibrium analysis of thermal fuel conversion systems, provide valuable information on the association of inorganic elements in the fuel, the transformations and release of inorganic species upon heating, and the possible forms in which the inorganic...... elements are thermodynamically stable as a function of temperature. This information is needed for the interpretation of the lab-scale release data. Thus, for the purpose of modeling ash or aerosol formation, fuel characterization methods should be combined with lab-scale release measurements. Pilot...

  18. BAYESIAN DEMONSTRATION TEST METHOD WITH MIXED BETA DISTRIBUTION

    Institute of Scientific and Technical Information of China (English)

    MING Zhimao; TAO Junyong; CHEN Xun; ZHANG Yunan

    2008-01-01

    A Bayesian demonstration test plan for a complex mechatronic system is studied based on the mixed beta distribution. Information gathered during product design and improvement is incorporated by introducing an inheritance factor, which is itself treated as a random variable; the Bayesian decision for the qualification test plan is then obtained, and the correctness of the proposed Bayesian model is verified. The results show that the test quantity required by classical methods is overly conservative for small binomial samples. Although traditional Bayesian analysis can incorporate test information from related or similar products, it ignores the differences between such products; the proposed method resolves this problem. Furthermore, motivated by the requirements of many practical projects, the method is compared with the classical method and with Bayesian analysis using a single beta distribution for the reliability acceptance test plan.
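    The mixed-beta idea can be sketched with a conjugate update: the prior on reliability is a mixture of an "inherited" beta component and a vague one, weighted by the inheritance factor, and binomial test data reweight the mixture. This is a minimal illustration of the mechanism, with invented prior parameters and test counts, not the paper's actual plan:

```python
from math import lgamma, exp

def log_beta(a, b):
    """Log of the Beta function, via log-gamma for numerical stability."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def mixed_beta_posterior(components, successes, failures):
    """Conjugate update of a mixed-beta prior on reliability R given
    binomial test data. components: list of (weight, a, b); the first
    weight plays the role of an (illustrative) inheritance factor."""
    post = []
    for w, a, b in components:
        # marginal likelihood of the data under this mixture component
        log_ml = log_beta(a + successes, b + failures) - log_beta(a, b)
        post.append((w * exp(log_ml), a + successes, b + failures))
    z = sum(w for w, _, _ in post)              # renormalize weights
    return [(w / z, a, b) for w, a, b in post]

def posterior_mean(components):
    return sum(w * a / (a + b) for w, a, b in components)

# inheritance factor 0.7 on prior knowledge Beta(9,1), 0.3 on vague Beta(1,1)
prior = [(0.7, 9.0, 1.0), (0.3, 1.0, 1.0)]
post = mixed_beta_posterior(prior, successes=18, failures=1)
```

    Because the 18-out-of-19 outcome agrees with the inherited component, the update shifts weight toward it, which is how the mixture tempers over-reliance on similar-product data while still exploiting it.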

  19. Bayesian Approach to Neuro-Rough Models for Modelling HIV

    CERN Document Server

    Marwala, Tshilidzi

    2007-01-01

    This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated in a Bayesian framework and trained using the Markov chain Monte Carlo method and the Metropolis criterion. When the model was tested to estimate the risk of HIV infection given the demographic data, it was found to give an accuracy of 62%, as opposed to 58% obtained from a Bayesian-formulated rough set model trained using the Markov chain Monte Carlo method and 62% obtained from a Bayesian-formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model is able to combine the accuracy of the Bayesian MLP model with the transparency of the Bayesian rough set model.
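    The training procedure named in the abstract, Markov chain Monte Carlo with the Metropolis criterion, can be sketched on a deliberately tiny stand-in: a one-parameter Bayesian logistic model with a random-walk proposal. The data, prior, and step size are invented for illustration; the paper's actual model is a neuro-rough network, not this toy.

```python
import math
import random

def log_posterior(w, xs, ys):
    """N(0,1) log prior on the weight plus Bernoulli log likelihood
    of a 1-D logistic model (toy stand-in for the neuro-rough model)."""
    lp = -0.5 * w * w                            # log prior
    for x, y in zip(xs, ys):
        p = 1.0 / (1.0 + math.exp(-w * x))
        p = min(max(p, 1e-12), 1.0 - 1e-12)      # guard the logs
        lp += y * math.log(p) + (1 - y) * math.log(1.0 - p)
    return lp

def metropolis(xs, ys, steps=5000, step_size=0.5, seed=0):
    """Random-walk Metropolis sampler over the single weight w."""
    rng = random.Random(seed)
    w = 0.0
    lp = log_posterior(w, xs, ys)
    samples = []
    for _ in range(steps):
        cand = w + rng.gauss(0.0, step_size)     # random-walk proposal
        lp_cand = log_posterior(cand, xs, ys)
        # Metropolis criterion: accept with probability min(1, ratio)
        if math.log(rng.random()) < lp_cand - lp:
            w, lp = cand, lp_cand
        samples.append(w)
    return samples

xs = [-2.0, -1.0, -0.5, 0.5, 1.0, 2.0]   # toy covariate
ys = [0, 0, 0, 1, 1, 1]                  # toy binary outcome
samples = metropolis(xs, ys)
w_mean = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
```

    The posterior mean over the post-burn-in samples is a Bayesian point estimate; prediction would average the logistic output over the samples rather than plug in a single weight.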

  20. Stochastic margin-based structure learning of Bayesian network classifiers.

    Science.gov (United States)

    Pernkopf, Franz; Wohlmayr, Michael

    2013-02-01

    The margin criterion for parameter learning in graphical models has gained significant traction in recent years. We use the maximum margin score for discriminatively optimizing the structure of Bayesian network classifiers. Furthermore, greedy hill-climbing and simulated annealing search heuristics are applied to determine the classifier structures. In the experiments, we demonstrate the advantages of maximum-margin-optimized Bayesian network structures in terms of classification performance compared to traditionally used discriminative structure learning methods. Stochastic simulated annealing requires fewer score evaluations than greedy heuristics. Additionally, we compare generative and discriminative parameter learning on both generatively and discriminatively structured Bayesian network classifiers. Margin-optimized Bayesian network classifiers achieve classification performance similar to support vector machines. Moreover, missing feature values during classification can be handled by discriminatively optimized Bayesian network classifiers, a case where purely discriminative classifiers usually require mechanisms to complete unknown feature values in the data first.
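    The greedy hill-climbing search mentioned above can be sketched generically: toggle one directed edge at a time, keep the graph acyclic, and accept any move that improves the structure score. The score below is a toy placeholder (rewarding a known target structure) standing in for the maximum-margin score; everything here is illustrative, not the paper's implementation.

```python
import itertools

def is_acyclic(edges):
    """Kahn's algorithm: the edge set is a DAG iff every node can be
    removed in topological order."""
    nodes = {n for e in edges for n in e}
    indeg = {n: 0 for n in nodes}
    for _, v in edges:
        indeg[v] += 1
    stack = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while stack:
        u = stack.pop()
        seen += 1
        for a, b in edges:
            if a == u:
                indeg[b] -= 1
                if indeg[b] == 0:
                    stack.append(b)
    return seen == len(nodes)

def hill_climb(nodes, score, max_sweeps=50):
    """First-improvement greedy hill-climbing over directed edge sets,
    toggling one edge per move and keeping the graph acyclic."""
    edges, best = set(), score(set())
    for _ in range(max_sweeps):
        improved = False
        for u, v in itertools.permutations(nodes, 2):
            move = edges ^ {(u, v)}          # add or remove edge (u, v)
            if (u, v) in edges or is_acyclic(move):  # removals always safe
                s = score(move)
                if s > best:
                    edges, best, improved = move, s, True
        if not improved:
            break
    return edges, best

# toy score standing in for a maximum-margin structure score:
# reward edges of a known target classifier structure, penalize size
target = {("C", "X1"), ("C", "X2"), ("X1", "X2")}
def toy_score(edges):
    return 2 * len(edges & target) - len(edges)

best_edges, best_score = hill_climb(["C", "X1", "X2"], toy_score)
```

    Simulated annealing replaces the greedy acceptance rule with a probabilistic one that occasionally accepts worsening moves, which is how it escapes the local optima that trap this greedy search.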