A Bayesian Analysis of the Radioactive Releases of Fukushima
DEFF Research Database (Denmark)
Tomioka, Ryota; Mørup, Morten
2012-01-01
The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to know...... the types of nuclides and their levels of concentration from the recorded mixture of radiations in order to take the necessary measures. We presently formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. From the sparsely sampled...... detailed account also of the latent structure present in the data of the Fukushima Daiichi disaster....
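The kind of generative model the abstract describes can be sketched in miniature: a measured dose-rate time series is modeled as a sum of decaying nuclide contributions, and a grid posterior recovers the underlying activities. Everything below — the choice of nuclides, the noise level, and the synthetic data — is an illustrative assumption, not the paper's actual model or data.

```python
import math

# Toy Bayesian "unmixing" of a dose-rate time series into two nuclide
# contributions (I-131 and Cs-137).  All numbers are illustrative.

HALF_LIFE_DAYS = {"I-131": 8.02, "Cs-137": 30.1 * 365.25}

def decay_factor(nuclide, t_days):
    lam = math.log(2.0) / HALF_LIFE_DAYS[nuclide]
    return math.exp(-lam * t_days)

def model(a_i131, a_cs137, t_days):
    """Total signal = sum of the two decaying components."""
    return (a_i131 * decay_factor("I-131", t_days)
            + a_cs137 * decay_factor("Cs-137", t_days))

def grid_posterior(times, obs, sigma, grid):
    """Unnormalized Gaussian-likelihood posterior on an (a1, a2) grid,
    with a flat prior over the grid."""
    post = {}
    for a1 in grid:
        for a2 in grid:
            ll = sum(-0.5 * ((model(a1, a2, t) - y) / sigma) ** 2
                     for t, y in zip(times, obs))
            post[(a1, a2)] = math.exp(ll)
    z = sum(post.values())
    return {k: v / z for k, v in post.items()}

# Synthetic "measurements" generated from a known ground truth (10, 5).
times = [0.0, 4.0, 8.0, 16.0, 32.0]
obs = [model(10.0, 5.0, t) for t in times]

grid = [i * 0.5 for i in range(0, 41)]          # activities 0.0 .. 20.0
posterior = grid_posterior(times, obs, sigma=0.2, grid=grid)
map_est = max(posterior, key=posterior.get)     # posterior mode
```

With noiseless synthetic data the posterior mode lands exactly on the ground-truth activities; the real problem in the paper is of course far larger and uses sparsely sampled spatial data.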
Nanostructured Aerosol Particles: Fabrication, Pulmonary Drug Delivery, and Controlled Release
Directory of Open Access Journals (Sweden)
Xingmao Jiang
2011-01-01
Full Text Available Pulmonary drug delivery is the preferred route of administration in the treatment of respiratory diseases and some nonrespiratory diseases. Recent research has focused on developing structurally stable high-dosage drug delivery systems without premature release. To maximize deposition in the desired lung regions, several factors must be considered in the formulation. This special issue includes seven papers dealing with aerosol-assisted fabrication of nanostructured particles, aerosol deposition, pulmonary exposure to nanoparticles, and controlled release.
Aerosols released from solvent fire accidents in reprocessing plants
International Nuclear Information System (INIS)
Thermodynamic, aerosol-characterizing, and radiological data on solvent fires in reprocessing plants have been established in experiments. These are the main results: Depending on the ventilation in the containment, kerosene-TBP mixtures burn at a rate of up to 120 kg/(m2 h). The aqueous phase of inorganic-organic mixtures might be released during the fire. The gaseous reaction products contain unburnable acidic compounds. Solvents with TBP-nitrate complex show higher (up to 25%) burning rates than pure solvents (kerosene-TBP). The nitrate complex decomposes violently at about 130 deg. C with a release of acid and unburnable gases. Up to 20% of the burned kerosene-TBP solvents are released during the fire in the form of soot particles, phosphoric acid and TBP decomposition products. The particles have an aerodynamic mass median diameter of about 0.5 μm, and up to 1.5% of the uranium fixed in the TBP-nitrate complex is released during solvent fires. (orig.)
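The figures quoted in the abstract combine into a simple source-term mass balance for a solvent-fire scenario. The pool area, burn time, and uranium inventory below are made-up inputs for illustration; only the burning rate and release fractions come from the abstract.

```python
# Simple source-term mass balance for a kerosene-TBP solvent fire.
# Rates/fractions are the upper bounds quoted in the abstract; the
# scenario inputs (pool area, duration, inventory) are assumptions.

BURN_RATE = 120.0          # kg/(m^2 h), upper-bound burning rate
AEROSOL_FRACTION = 0.20    # up to 20% of burned solvent goes airborne
URANIUM_FRACTION = 0.015   # up to 1.5% of TBP-complexed uranium released

def aerosol_release_kg(pool_area_m2, duration_h):
    burned = BURN_RATE * pool_area_m2 * duration_h
    return AEROSOL_FRACTION * burned

def uranium_release_kg(uranium_inventory_kg):
    return URANIUM_FRACTION * uranium_inventory_kg

# Example scenario: 2 m^2 pool burning 0.5 h, 10 kg uranium in solvent.
solvent_aerosol = aerosol_release_kg(2.0, 0.5)
uranium_airborne = uranium_release_kg(10.0)
```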
Tang, Qingxin; Bo, Yanchen; Zhu, Yuxin
2016-04-01
Merging multisensor aerosol optical depth (AOD) products is an effective way to produce more spatiotemporally complete and accurate AOD products. A spatiotemporal statistical data fusion framework based on a Bayesian maximum entropy (BME) method was developed for merging satellite AOD products in East Asia. The advantages of the presented merging framework are that it not only utilizes the spatiotemporal autocorrelations but also explicitly incorporates the uncertainties of the AOD products being merged. The satellite AOD products used for merging are the Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 5.1 Level-2 AOD products (MOD04_L2) and the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Deep Blue Level 2 AOD products (SWDB_L2). The results show that the average completeness of the merged AOD data is 95.2%, which is significantly superior to that of MOD04_L2 (22.9%) and SWDB_L2 (20.2%). Comparison of the merged AOD with Aerosol Robotic Network (AERONET) records shows that the correlation coefficient (0.75), root-mean-square error (0.29), and mean bias (0.068) of the merged AOD are close to those of the MODIS AOD (correlation coefficient 0.82, root-mean-square error 0.19, and mean bias 0.059). In the regions where both MODIS and SeaWiFS have valid observations, the accuracy of the merged AOD is higher than those of the MODIS and SeaWiFS AODs. Even in regions where both MODIS and SeaWiFS AODs are missing, the accuracy of the merged AOD remains close to that in the regions where both have valid observations.
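The BME framework above is considerably richer (it exploits spatiotemporal autocorrelation as well), but its key ingredient — weighting each AOD product by its own uncertainty — can be illustrated with a plain inverse-variance merge at a single grid cell. The retrieval values and uncertainties below are invented for illustration.

```python
# Minimal stand-in for uncertainty-aware AOD merging: an
# inverse-variance weighted mean of co-located retrievals.
# (BME additionally uses spatiotemporal covariance; not shown here.)

def merge_aod(values, sigmas):
    """Inverse-variance weighted mean of co-located AOD retrievals.
    Entries with value None (missing retrieval) are skipped."""
    num = den = 0.0
    for v, s in zip(values, sigmas):
        if v is None:
            continue
        w = 1.0 / (s * s)
        num += w * v
        den += w
    if den == 0.0:
        return None, None                    # no product observed this cell
    return num / den, (1.0 / den) ** 0.5     # merged value, merged sigma

# Example: MODIS-like 0.30 +/- 0.05 and SeaWiFS-like 0.40 +/- 0.10
merged, merged_sigma = merge_aod([0.30, 0.40], [0.05, 0.10])
```

Note how the merged uncertainty is smaller than either input's, and how a cell where one sensor is missing simply falls back to the other — the mechanism behind the jump in completeness reported above.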
Experimental program in core melt aerosol release and transport
International Nuclear Information System (INIS)
A survey of the requirements for an experimental demonstration of core-melt aerosol release has indicated that the most practical technique is that referred to as skull melting by rf induction. The implied skull would be a preformed ZrO2 or ThO2 shell composed of presintered powdered oxide. The advantages of this method include freedom from foreign container materials, a cold wall environment that ensures furnace integrity, and an almost unrestricted use of steam or other atmosphere as the cover gas. The major emphases of the project will be first to investigate chemical states and adsorption processes for simulant fission products, particularly iodine and cesium, and second, to measure the coagglomeration and total attenuation rate of all vaporized species with the structural material aerosols. The initial part of the effort has been dedicated to the development of a demonstration scale (1.0-kg), water-cooled, skull container with segmented copper components. A second part of the effort has been concerned with the design of a full 10- to 20-kg scale furnace and the selection of a 250-kW-rf power unit to match the furnace
Large-Scale Spray Releases: Initial Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.
2012-12-01
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and
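Spray-release analyses of the kind described above commonly summarize the droplet population as a lognormal volume distribution and report the respirable fraction (conventionally, droplets of about 10 μm aerodynamic diameter and below). A sketch of that calculation, with a mass median diameter (MMD) and geometric standard deviation (GSD) chosen purely as placeholders, not WTP test data:

```python
import math

# Respirable volume fraction of a lognormal droplet distribution.
# MMD/GSD values below are illustrative placeholders.

def lognormal_cdf(d, mmd, gsd):
    """Cumulative volume fraction below diameter d for a lognormal
    distribution with mass median diameter mmd and geometric std dev gsd."""
    z = math.log(d / mmd) / math.log(gsd)
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def respirable_fraction(mmd, gsd, cutoff=10.0):
    return lognormal_cdf(cutoff, mmd, gsd)

# Example: MMD 40 um, GSD 2.0 -> only a small volume fraction is respirable.
frac = respirable_fraction(40.0, 2.0)
```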
Small-Scale Spray Releases: Initial Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Mahoney, Lenna A.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, Garrett N.; Kurath, Dean E.; Buchmiller, William C.; Smith, Dennese M.; Blanchard, Jeremy; Song, Chen; Daniel, Richard C.; Wells, Beric E.; Tran, Diana N.; Burns, Carolyn A.
2012-11-01
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and
Small-Scale Spray Releases: Initial Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Mahoney, Lenna A.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, Garrett N.; Kurath, Dean E.; Buchmiller, William C.; Smith, Dennese M.; Blanchard, Jeremy; Song, Chen; Daniel, Richard C.; Wells, Beric E.; Tran, Diana N.; Burns, Carolyn A.
2013-05-29
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and net generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of antifoam agents was assessed with most of the simulants. Orifices included round holes and
Energy Technology Data Exchange (ETDEWEB)
Losert, Sabrina; Hess, Adrian [Empa Swiss Federal Laboratories for Materials Science and Technology, Laboratory for Analytical Chemistry (Switzerland); Ilari, Gabriele [Empa Swiss Federal Laboratories for Materials Science and Technology, Electron Microscopy Center (Switzerland); Goetz, Natalie von, E-mail: natalie.von.goetz@chem.ethz.ch; Hungerbuehler, Konrad [ETH Zürich Swiss Federal Institute of Technology Zürich, Institute for Chemical and Bioengineering (Switzerland)
2015-07-15
Nanoparticle-containing sprays are a critical class of consumer products, since human exposure may occur by inhalation of nanoparticles (NP) in the generated aerosols. In this work, the suspension and the released aerosol of six different commercially available consumer spray products were analyzed. In addition to a broad spectrum of analytical methods for characterizing the suspensions, a standardized setup for the analysis of the aerosol was used. Furthermore, a new online coupling technique (SMPS-ICPMS) for the simultaneous analysis of particle size and elemental composition of aerosol particles was applied. Results obtained with this new method were confirmed by other well-established techniques. Comparison of particles in the original suspensions and in the generated aerosol showed that during spraying single particles smaller than 20 nm were formed, even though none of the suspensions contained particles smaller than 280 nm (aerosol size range scanned: 7-300 nm). Both pump sprays and propellant gas sprays were analyzed, and both released particles in the nm size range. Likewise, both water-based and organic solvent-based sprays released NP. However, a trend was observed that spraying an aqueous suspension from a pump spray dispenser resulted, after drying, in bigger agglomerates than spraying organic suspensions from propellant gas dispensers.
Large-Scale Spray Releases: Additional Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.
2013-08-01
One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used
Small-Scale Spray Releases: Additional Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Schonewill, Philip P.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, G. N.; Mahoney, Lenna A.; Tran, Diana N.; Burns, Carolyn A.; Kurath, Dean E.
2013-08-01
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are largely absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale. The small-scale testing and resultant data are described in Mahoney et al. (2012b) and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used to mimic the
Ofner, J.; Zetzsch, C.
2009-04-01
The release of reactive halogen species from sea-salt aerosol provides a class of reactants of utmost importance for heterogeneous reactions. These heterogeneous reactions have been overlooked so far, although they may occur with internal and external mixtures of sea-salt aerosol and organic aerosol or organic matter. Such reactions might constitute sources of gaseous organohalogen compounds or halogenated organic aerosol in the atmospheric boundary layer. Infrared and UV/VIS spectroscopy provide insight into chemical processes at reactive sites of the organic phase on a molecular level. Model studies of heterogeneous reactions of halogens with different kinds of (secondary) organic aerosols and organic matter were performed using a 700 L smog chamber with a solar simulator. The model compounds alpha-pinene, catechol and humic acid were chosen as precursors/material for the condensed organic phase of the aerosol. After formation of the secondary organic aerosol or preparation of the organic material and the sea-salt solution, the reaction was carried out using molecular chlorine and bromine in the presence of simulated sunlight. Chemical transformation of the organic material was studied using attenuated total reflection Fourier transform infrared spectroscopy (ATR-FTIR) on a ZnSe crystal and diffuse-reflectance UV/VIS spectroscopy. An electrostatic precipitator was developed to deposit the aerosol particles on the ATR crystal as a thin film. In addition, long-path FTIR spectroscopy with a 40 m White cell allows us to monitor both the condensed and gas phases of the aerosol in situ in the smog chamber directly. These spectroscopic techniques enable us to characterize different organic aerosol particles and their functional groups at reactive sites on these particles, as well as to study aerosol formation and transformation directly. The heterogeneous reaction of reactive halogen species with organic material at atmospheric conditions leads to small reactive
Pereira, Gabriel; Freitas, Saulo R.; Moraes, Elisabete Caria; Ferreira, Nelson Jesus; Shimabukuro, Yosio Edemir; Rao, Vadlamudi Brahmananda; Longo, Karla M.
2009-12-01
Contemporary human activities such as tropical deforestation, land clearing for agriculture, pest control and grassland management lead to biomass burning, which in turn leads to land-cover changes. However, biomass burning emissions are not measured directly, and methods for assessing these emissions are an active area of research. Traditional methods for estimating the aerosols and trace gases released into the atmosphere generally use emission factors associated with fuel loading, moisture characteristics and other parameters that are hard to estimate in near-real-time applications. In this paper, fire radiative power (FRP) products were extracted from Moderate Resolution Imaging Spectroradiometer (MODIS) and Geostationary Operational Environmental Satellite (GOES) fire products, and new FRE-based smoke aerosol emission coefficients for generic South America biomes were derived and applied to the 2002 South America fire season. The inventory estimated from the MODIS and GOES FRP measurements was included in the Coupled Aerosol-Tracer Transport model coupled to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS) and evaluated against ground truth collected in the Large-Scale Biosphere-Atmosphere smoke, aerosols, clouds, rainfall, and climate (SMOCC) and Radiation, Cloud, and Climate Interactions (RaCCI) campaigns. Although linear regression showed that GOES FRP overestimates MODIS FRP observations, the use of a common external parameter such as the MODIS aerosol optical depth product could minimize the difference between sensors. The modeled relationship between PM2.5 (particulate matter with diameter less than 2.5 μm) and CO (carbon monoxide) shows good agreement with the SMOCC/RaCCI data in the general pattern of temporal evolution. The results showed high correlations, with values between 0.80 and 0.95 (significant at the 0.05 level by Student's t-test), for the CATT-BRAMS simulations of PM2.5 and CO.
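The FRP-based approach above rests on the relation E = alpha * FRE, where FRE is the fire radiative energy (time-integrated FRP) and alpha is a biome-specific smoke emission coefficient in kg/MJ. A sketch of the bookkeeping, with an FRP series and coefficient that are illustrative assumptions rather than the derived South America values:

```python
# FRE-based smoke emission estimate: integrate FRP over time, then
# scale by an emission coefficient.  Inputs below are illustrative.

def fire_radiative_energy(times_s, frp_mw):
    """Trapezoidal time integration of an FRP series (MW) -> energy (MJ)."""
    fre = 0.0
    for (t0, p0), (t1, p1) in zip(zip(times_s, frp_mw),
                                  zip(times_s[1:], frp_mw[1:])):
        fre += 0.5 * (p0 + p1) * (t1 - t0)
    return fre

def smoke_emission_kg(times_s, frp_mw, alpha_kg_per_mj):
    return alpha_kg_per_mj * fire_radiative_energy(times_s, frp_mw)

# One hour of observations at 15-min intervals, constant 100 MW fire,
# assumed coefficient 0.02 kg/MJ.
times = [0.0, 900.0, 1800.0, 2700.0, 3600.0]
frp = [100.0] * 5
emitted = smoke_emission_kg(times, frp, 0.02)
```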
International Nuclear Information System (INIS)
This paper reports on a series of tests conducted to study the mechanical release behavior of sodium aerosols containing nonvolatile fission products during a sodium-concrete reaction, in which release due to hydrodynamic breakup of hydrogen bubbles at the sodium pool surface is the predominant mechanism. In the tests, nonradioactive materials, namely strontium oxide, europium oxide, and ruthenium particles, whose sizes range from a few microns to several tens of microns, are used as nonvolatile fission product simulants. The following results are obtained: The sodium aerosol release rate during the sodium-concrete reaction is larger than that of natural evaporation. The difference, however, becomes smaller with increasing sodium temperature: the release rate is nearly ten times that of natural evaporation at 400 degrees C and about three times at 700 degrees C. The retention factors for the nonvolatile materials in the sodium pool increase to the range of 0.5 to 10^4 with an increase in the sodium temperature from 400 to 700 degrees C
Ofner, J.; Balzer, N.; Buxmann, J.; Grothe, H.; Krüger, H.; Platt, U.; Schmitt-Kopplin, P.; Zetzsch, C.
2011-12-01
Reactive halogen species are released by various sources such as photo-activated sea-salt aerosol or salt pans and salt lakes. These heterogeneous release mechanisms have been overlooked so far, although their potential for interaction with organic aerosols like Secondary Organic Aerosol (SOA), Biomass Burning Organic Aerosol (BBOA) or Atmospheric Humic-Like Substances (HULIS) is completely unknown. Such reactions can constitute sources of gaseous organo-halogen compounds or halogenated organic particles in the atmospheric boundary layer. To study the interaction of organic aerosols with reactive halogen species (RHS), SOA was produced from α-pinene, catechol and guaiacol using an aerosol smog chamber. The model SOAs were characterized in detail using a variety of physico-chemical methods (Ofner et al., 2011). These aerosols were exposed to molecular halogens in the presence of UV/VIS irradiation, and to halogens released from simulated natural halogen sources such as salt pans, in order to study the complex aerosol-halogen interaction. The heterogeneous reaction of RHS with these model aerosols leads to different gaseous species like CO2, CO and small reactive/toxic molecules like phosgene (COCl2). Hydrogen-containing groups on the aerosol particles are destroyed to form HCl or HBr, and a significant formation of C-Br bonds could be verified in the particle phase. Carbonyl-containing functional groups of the aerosol are strongly affected by the halogenation process. While changes of functional groups and gaseous species were visible using FTIR spectroscopy, optical properties were studied using diffuse-reflectance UV/VIS spectroscopy. Overall, the optical properties of the processed organic aerosols are significantly changed. While chlorine causes a "bleaching" of the aerosol particles, bromine shifts the maximum of UV/VIS absorption to the red end of the UV/VIS spectrum. Further physico-chemical changes were recognized in the aerosol size distributions or the
Aerosols generated by releases of pressurized powders and solutions in static air
International Nuclear Information System (INIS)
Safety assessments and environmental impact statements for nuclear fuel cycle facilities require an estimate of potential airborne releases caused by accidents. Aerosols generated by accidents are being investigated by Pacific Northwest Laboratory to develop the source terms for these releases. An upper-bound accidental release event would be a pressurized release of powder or liquid in static air. Experiments were run using various source sizes and pressures, measuring the airborne mass and the particle size distribution of the aerosols produced by these pressurized releases. Two powder and two liquid sources were used: TiO2 and depleted uranium dioxide (DUO) powders, and aqueous uranine (sodium fluorescein) and uranyl nitrate solutions. Results of the experiments showed that pressurization level and source size were significant variables for the airborne powder releases. For this experimental configuration, the liquid releases were a function of pressure, but volume did not appear to be a significant variable. During the experiments, 100 g and 350 g of DUO (1 μm dia) and TiO2 (1.7 μm dia) powders and 100 cm3 and 350 cm3 of uranine and uranyl nitrate solutions were released at pressures ranging from 50 to 500 psig. The average of the largest fractions of powder airborne was about 24%. The maximum amount of liquid source airborne was significantly less, about 0.15%. The median aerodynamic equivalent diameters (AED) for the collected airborne powders ranged from 5 to 19 μm; those for liquids ranged from 2 to 29 μm. All of the releases produced a significant fraction of respirable particles of 10 μm and less. 12 references, 10 figures, 23 tables
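The aerodynamic equivalent diameter (AED) used above relates a particle's geometric diameter to that of a unit-density sphere with the same settling velocity. For a compact sphere, neglecting slip and shape corrections, d_ae = d_g * sqrt(rho_p / rho_0). The densities below are standard handbook values, not figures from the report:

```python
import math

# AED of a compact sphere, neglecting slip and shape corrections:
#   d_ae = d_geom * sqrt(rho_p / rho_0)
# Densities are handbook values, used here for illustration only.

RHO_0 = 1.0  # unit density, g/cm^3

def aerodynamic_diameter(d_geom_um, rho_p):
    return d_geom_um * math.sqrt(rho_p / RHO_0)

# 1.0-um geometric-diameter UO2 particle (rho ~ 10.97 g/cm^3)
d_ae_uo2 = aerodynamic_diameter(1.0, 10.97)
# 1.7-um TiO2 particle (rho ~ 4.23 g/cm^3)
d_ae_tio2 = aerodynamic_diameter(1.7, 4.23)
```

This is why a 1 μm dense-oxide particle behaves aerodynamically like a ~3 μm water droplet, and why the powders and liquids in the study are compared on the AED scale rather than by geometric size.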
Control of releases of radioactive aerosols from object ''Ukryttya'' in 2014
International Nuclear Information System (INIS)
The results of monitoring radioactive particulate releases from the object ''Ukryttya'' in 2014 are presented. The maximal rate of unorganized releases of beta-emitting products of the Chernobyl accident occurred in the winter period and reached 3.6 MBq/day. The concentration of long-lived beta-emitting aerosols released into the atmosphere through the ''Bypass'' system was within the range 0.3 - 5 Bq/m3 (the maximal concentration was 14 Bq/m3). Their carriers were particles with an activity median aerodynamic diameter (AMAD) of 0.6 - 6 μm. The mean concentration ratios were 137Cs/241Am = 97 and 241Am/154Eu = 6.2. The concentration of 212Pb, a daughter product of thoron, was as a rule 0.8 - 4 Bq/m3; the maximal concentration of 212Pb aerosols was 9 Bq/m3. The ratio of the concentrations of radon daughter products to those of thoron (212Pb) was about 4. These have an AMAD of 0.06 - 0.3 μm. The volume activity and dispersity of the radioactive aerosols in releases from the object ''Ukryttya'' have remained constant over the last ten years
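The AMAD reported above is the aerodynamic diameter below which half of the collected activity lies. Given activity measured on cascade-impactor stages, it can be recovered by log-linear interpolation of the cumulative activity distribution. The stage boundaries and activities below are invented for illustration, not the ''Ukryttya'' measurements:

```python
import math

# Activity median aerodynamic diameter (AMAD) from impactor stages,
# by log-diameter interpolation of the cumulative activity.
# Stage data below are illustrative.

def amad(stage_bounds_um, stage_activity):
    """stage_bounds_um: (lower, upper) diameter per stage, ascending.
    stage_activity: activity collected on each stage."""
    total = sum(stage_activity)
    cum = 0.0
    for (lo, hi), a in zip(stage_bounds_um, stage_activity):
        if cum + a >= 0.5 * total:
            frac = (0.5 * total - cum) / a   # fraction into this stage
            # interpolate in log-diameter within the stage
            return math.exp(math.log(lo)
                            + frac * (math.log(hi) - math.log(lo)))
        cum += a
    return stage_bounds_um[-1][1]

stages = [(0.1, 0.3), (0.3, 1.0), (1.0, 3.0), (3.0, 10.0)]
activity = [5.0, 20.0, 50.0, 25.0]           # arbitrary units
d50 = amad(stages, activity)
```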
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef; Fast P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M. (Peterson AFB, CO); Ray, J. P.
2006-01-01
Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error, i.e., situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
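A scaled-down version of this inverse problem can be sketched directly: infer the number infected N and a common infection day t0 from a short series of daily symptomatic counts, via a Poisson likelihood and a grid posterior. The lognormal incubation parameters and the synthetic data below are illustrative assumptions, not the dose-dependent anthrax model used in the study.

```python
import math

# Toy outbreak characterization: grid MAP estimate of (N, t0) from
# daily symptomatic counts.  Incubation parameters are assumed.

MU, SIGMA = math.log(4.0), 0.5        # incubation: median 4 days (assumed)

def incubation_cdf(x_days):
    if x_days <= 0.0:
        return 0.0
    return 0.5 * (1.0 + math.erf((math.log(x_days) - MU)
                                 / (SIGMA * math.sqrt(2.0))))

def expected_cases(n, t0, day):
    """Expected new symptomatic cases on `day` if n people were
    infected on day t0."""
    return n * (incubation_cdf(day - t0) - incubation_cdf(day - 1 - t0))

def log_lik(obs, n, t0):
    """Poisson log-likelihood (dropping y!-terms, constant in n, t0)."""
    ll = 0.0
    for day, y in obs:
        lam = max(expected_cases(n, t0, day), 1e-12)
        ll += y * math.log(lam) - lam
    return ll

def grid_map(obs, n_grid, t0_grid):
    return max(((n, t0) for n in n_grid for t0 in t0_grid),
               key=lambda p: log_lik(obs, *p))

# Synthetic 5-day observation window generated from truth (N=100, t0=0).
obs = [(d, expected_cases(100, 0, d)) for d in range(1, 6)]
n_hat, t0_hat = grid_map(obs, [50, 75, 100, 125, 150], [-2, -1, 0, 1, 2])
```

With an accurate outbreak model and a few days of data the grid search recovers the true characterization, mirroring the consistency result reported in the abstract; the paper's point about model error corresponds to generating `obs` from a different incubation model than the one used in `log_lik`.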
Large methane releases lead to strong aerosol forcing and reduced cloudiness
Directory of Open Access Journals (Sweden)
T. Kurtén
2011-03-01
The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH_{4}) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations is predicted to significantly decrease hydroxyl radical (OH) concentrations, while moderately increasing ozone (O_{3}). These changes lead to a 70% increase in the atmospheric lifetime of methane, and an 18% decrease in global mean cloud droplet number concentrations (CDNC). The CDNC change causes a radiative forcing that is comparable in magnitude to the longwave radiative forcing ("enhanced greenhouse effect") of the added methane. Together, the indirect CH_{4}-O_{3} and CH_{4}-OH-aerosol forcings could more than double the warming effect of large methane increases. Our findings may help explain the anomalously large temperature changes associated with historic methane releases.
Large methane releases lead to strong aerosol forcing and reduced cloudiness
Directory of Open Access Journals (Sweden)
T. Kurtén
2011-07-01
The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH_{4}) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations is predicted to significantly decrease hydroxyl radical (OH) concentrations, while moderately increasing ozone (O_{3}). These changes lead to a 70 % increase in the atmospheric lifetime of methane, and an 18 % decrease in global mean cloud droplet number concentrations (CDNC). The CDNC change causes a radiative forcing that is comparable in magnitude to the longwave radiative forcing ("enhanced greenhouse effect") of the added methane. Together, the indirect CH_{4}-O_{3} and CH_{4}-OH-aerosol forcings could more than double the warming effect of large methane increases. Our findings may help explain the anomalously large temperature changes associated with historic methane releases.
Sparse Bayesian learning machine for real-time management of reservoir releases
Khalil, Abedalrazq; McKee, Mac; Kemblowski, Mariush; Asefa, Tirusew
2005-11-01
Water scarcity and uncertainties in forecasting future water availabilities present serious problems for basin-scale water management. These problems create a need for intelligent prediction models that learn and adapt to their environment in order to provide water managers with decision-relevant information related to the operation of river systems. This manuscript presents examples of state-of-the-art techniques for forecasting that combine excellent generalization properties and sparse representation within a Bayesian paradigm. The techniques are demonstrated as decision tools to enhance real-time water management. A relevance vector machine, which is a probabilistic model, has been used in an online fashion to provide confident forecasts given knowledge of some state and exogenous conditions. In practical applications, online algorithms should recognize changes in the input space and account for drift in system behavior. Support vector machines lend themselves particularly well to the detection of drift and hence to the initiation of adaptation in response to a recognized shift in system structure. The resulting model will normally have a structure and parameterization that suits the information content of the available data. The utility and practicality of this proposed approach have been demonstrated with an application in a real case study involving real-time operation of a reservoir in a river basin in southern Utah.
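As a concrete illustration of online Bayesian learning in this spirit, here is a minimal sequential Bayesian linear regression in numpy. It is a stand-in sketch, not the relevance vector machine used in the study: the prior precision, noise variance, and toy data are all assumptions.

```python
import numpy as np

class OnlineBayesianRegression:
    """Sequential Bayesian linear regression with a Gaussian prior on weights.
    Each update() refines the posterior as a new (x, y) pair arrives, which is
    the basic mechanism behind online probabilistic forecasting."""

    def __init__(self, n_features, alpha=1e-3, noise_var=0.25):
        self.S_inv = alpha * np.eye(n_features)  # posterior precision matrix
        self.b = np.zeros(n_features)            # precision-weighted data term
        self.noise_var = noise_var

    def update(self, x, y):
        x = np.asarray(x, float)
        self.S_inv += np.outer(x, x) / self.noise_var
        self.b += x * y / self.noise_var

    def predict(self, x):
        """Posterior-mean prediction together with its predictive variance."""
        S = np.linalg.inv(self.S_inv)
        mean_w = S @ self.b
        x = np.asarray(x, float)
        return x @ mean_w, self.noise_var + x @ S @ x

# Toy streaming data: y = 1 + 2x observed one point at a time.
m = OnlineBayesianRegression(n_features=2)
for x in range(5):
    m.update([1.0, float(x)], 1.0 + 2.0 * x)
pred, pred_var = m.predict([1.0, 5.0])  # forecast at x = 5, mean near 11
```

The predictive variance gives the "confident forecasts" mentioned in the abstract: it widens when the query point is far from the observed data.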
Large methane releases lead to strong aerosol forcing and reduced cloudiness
DEFF Research Database (Denmark)
Kurten, T.; Zhou, L.; Makkonen, R.;
2011-01-01
The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller...... contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH4) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations is...... forcing that is comparable in magnitude to the long-wave radiative forcing ("enhanced greenhouse effect") of the added methane. Together, the indirect CH4-O3 and CH4-OH-aerosol forcings could more than double the warming effect of large methane increases. Our findings may help explain the anomalously...
Tools to estimate PM2.5 mass have expanded in recent years, and now include: 1) stationary monitor readings, 2) Community Multi-Scale Air Quality (CMAQ) model estimates, 3) Hierarchical Bayesian (HB) estimates from combined stationary monitor readings and CMAQ model output; and, ...
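The simplest Gaussian caricature of a hierarchical Bayesian fusion of monitor readings and model output is a precision-weighted average; the variances below are illustrative, not values from any of the cited tools.

```python
import numpy as np

def fuse_estimates(monitor, monitor_var, model, model_var):
    """Precision-weighted combination of a stationary-monitor PM2.5 reading and
    a CMAQ-style model estimate: the one-site Gaussian core of the hierarchical
    Bayesian (HB) fusion idea. Inputs and variances are illustrative."""
    w_mon = 1.0 / monitor_var
    w_mod = 1.0 / model_var
    mean = (w_mon * monitor + w_mod * model) / (w_mon + w_mod)
    var = 1.0 / (w_mon + w_mod)   # fused variance is smaller than either input's
    return mean, var

# A precise monitor (var 1.0) pulls the fused estimate toward itself,
# while the less precise model (var 4.0) still contributes.
mean, var = fuse_estimates(monitor=12.0, monitor_var=1.0, model=18.0, model_var=4.0)
```

The fused estimate lies between the two inputs, closer to the more precise source, and its variance is smaller than either input variance alone.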
Energy Technology Data Exchange (ETDEWEB)
Kleppe, John; Norris, William; Etezadi, Mehdi
2006-07-19
This contract was awarded in response to a proposal in which a deployable plume and aerosol release prediction and tracking system would be designed, fabricated, and tested. The system would gather real-time atmospheric data and feed it into a real-time atmospheric model that could be used for plume prediction and tracking. The system could be quickly deployed by aircraft to points of interest or positioned for deployment by vehicles. It would provide three-dimensional (u, v, and w) wind vector data, inversion height measurements, surface wind information, classical weather station data, and solar radiation. The on-board real-time computer model would provide the prediction of the behavior of plumes and released aerosols.
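A real-time plume model of the kind described would typically build on something like the classic steady-state Gaussian plume formula. The sketch below uses illustrative numbers; in a real system the dispersion parameters sigma_y and sigma_z would come from stability-class curves (e.g. Pasquill-Gifford) driven by the measured atmospheric data.

```python
import numpy as np

def gaussian_plume(q, u, y, z, stack_height, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration at crosswind offset y and
    height z, for a point source of strength q (g/s) in wind speed u (m/s).
    sigma_y, sigma_z (m) are the dispersion parameters at the downwind
    distance of interest; ground reflection is the image-source term."""
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - stack_height)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + stack_height)**2 / (2.0 * sigma_z**2)))
    return q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Ground-level centerline concentration downwind of a 50 m release
# (all numbers illustrative).
c0 = gaussian_plume(q=10.0, u=5.0, y=0.0, z=0.0,
                    stack_height=50.0, sigma_y=100.0, sigma_z=50.0)
```

The concentration field is symmetric about the plume centerline and falls off with crosswind distance, which is what a tracking system would map against sensor readings.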
Fission product partitioning in aerosol release from simulated spent nuclear fuel
Di Lemma, F.G.; Colle, J.Y.; Rasmussen, G.; Konings, R.J.M.
2015-01-01
Aerosols created by the vaporization of simulated spent nuclear fuel (simfuel) were produced by laser heating techniques and characterised by a wide range of post-analyses. In particular, attention has been focused on determining the fission product behaviour in the aerosols, in order to improve the
International Nuclear Information System (INIS)
A program of laboratory investigations has been undertaken at Argonne National Laboratory, under sponsorship of the Electric Power Research Institute, in which the interaction between molten core materials and concrete is studied, with particular emphasis on measurements of the magnitude and chemical species present in the aerosol releases. The experimental technique used in these investigations is direct electrical heating, in which a high electric current is passed through the core debris to sustain the high-temperature melt condition for potentially long periods of time. In the scoping experiments completed to date, this technique has been successfully used for corium masses of 5 and 20 kg, generating an internal heating rate of 1 kW/kg and achieving melt temperatures of 2000°C. Experiments have been performed both with a concrete base and also with a cooled base with the addition of H2/CO sparging gas to represent chemical processes in a stratified layer. An aerosol and gas sampling system is being used to collect aerosol samples. Test results are now becoming available, including masses of aerosols, X-ray diffraction, and scanning electron microscope analyses
Containment behaviour in the event of core melt with gaseous and aerosol releases (CONGA)
Energy Technology Data Exchange (ETDEWEB)
Friesen, E. E-mail: Eckart.friesen@off1.siemens.de; Meseth, J.; Guentay, S.; Suckow, D.; Lopez Jimenez, J.; Herranz, L.; Peyres, V.; De Santi, G.F.; Krasenbrink, A.; Valisi, M.; Mazzocchi, L
2001-11-01
The CONGA project concentrated on theoretical and experimental studies investigating the behaviour of advanced light water reactor containments containing passive containment heat removal systems and catalytic recombiners expected to be effectively operational during a hypothetical severe accident involving large quantities of aerosol particles and noncondensable gases. The central point of interest was the investigation of the effect of aerosol deposition on the condensation heat transfer of specially designed finned-type heat exchangers (HX) as well as the recombination efficiency of catalytic recombiners. A conceptual double-wall Italian PWR design and a SWR1000 design from Siemens were considered specifically as the reference Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) designs. An assessment of selected accident scenarios was performed in order to define the range of boundary conditions necessary to perform the experimental studies of the other work packages. Experimental investigations indicated that aerosol deposition accounted for up to 37% loss in the heat removal capacity of the two-tube-layer BWR HX units. However, no significant heat transfer degradation could be observed for the PWR HX units. These results can be attributed to the important differences in the designs and operating conditions of the two units. The tests to study the effect of hydrogen (simulated by helium) on the heat transfer rate for heat exchanger units designed for BWR and PWR applications indicated a degradation less than 30% under various conditions. This was found to be acceptable within the over capacity designed for the heat exchangers or containment characteristics. The tests performed to study the long-term aerosol behaviour in the pressure suppression chamber of the current operating BWRs indicated that the water pool scrubs the aerosol particles effectively and reduces the ultimate aerosol load expected on the off-gas system. The efficiency of the
Containment behaviour in the event of core melt with gaseous and aerosol releases (CONGA)
International Nuclear Information System (INIS)
The CONGA project concentrated on theoretical and experimental studies investigating the behaviour of advanced light water reactor containments containing passive containment heat removal systems and catalytic recombiners expected to be effectively operational during a hypothetical severe accident involving large quantities of aerosol particles and noncondensable gases. The central point of interest was the investigation of the effect of aerosol deposition on the condensation heat transfer of specially designed finned-type heat exchangers (HX) as well as the recombination efficiency of catalytic recombiners. A conceptual double-wall Italian PWR design and a SWR1000 design from Siemens were considered specifically as the reference Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) designs. An assessment of selected accident scenarios was performed in order to define the range of boundary conditions necessary to perform the experimental studies of the other work packages. Experimental investigations indicated that aerosol deposition accounted for up to 37% loss in the heat removal capacity of the two-tube-layer BWR HX units. However, no significant heat transfer degradation could be observed for the PWR HX units. These results can be attributed to the important differences in the designs and operating conditions of the two units. The tests to study the effect of hydrogen (simulated by helium) on the heat transfer rate for heat exchanger units designed for BWR and PWR applications indicated a degradation less than 30% under various conditions. This was found to be acceptable within the over capacity designed for the heat exchangers or containment characteristics. The tests performed to study the long-term aerosol behaviour in the pressure suppression chamber of the current operating BWRs indicated that the water pool scrubs the aerosol particles effectively and reduces the ultimate aerosol load expected on the off-gas system. The efficiency of the
Energy Technology Data Exchange (ETDEWEB)
Bourham, Mohamed A.; Gilligan, John G.
1999-08-14
Safety considerations in large future fusion reactors like ITER are important before licensing the reactor. Several scenarios are considered hazardous, including safety of plasma-facing components during hard disruptions, high heat fluxes and thermal stresses during normal operation, accidental energy release, and aerosol formation and transport. Disruption events in large tokamaks like ITER are expected to produce local heat fluxes on plasma-facing components which may exceed 100 GW/m{sup 2} over a period of about 0.1 ms. As a result, the surface temperature dramatically increases, which results in surface melting and vaporization, and produces thermal stresses and surface erosion. Plasma-facing component safety issues extend to cover a wide range of possible scenarios, including disruption severity and the impact of plasma-facing components on disruption parameters, accidental energy release and short/long term LOCAs, and formation of airborne particles by convective current transport during a LOVA (water/air ingress disruption) accident scenario. Study and evaluation of disruption-induced aerosol generation and mobilization are essential to build a database on particulate formation and distribution for large future fusion tokamak reactors like ITER. In order to provide a database relevant to ITER, the SIRENS electrothermal plasma facility at NCSU has been modified to closely simulate heat fluxes expected in ITER.
Directory of Open Access Journals (Sweden)
Hong Lei
2016-05-01
Microencapsulation is highly attractive for oral drug delivery. Microparticles are a common form of drug carrier for this purpose. There is still a high demand on efficient methods to fabricate microparticles with uniform sizes and well-controlled particle properties. In this paper, uniform hydroxypropyl methylcellulose phthalate (HPMCP)-based pharmaceutical microparticles loaded with either hydrophobic or hydrophilic model drugs have been directly formulated by using a unique aerosol technique, i.e., the microfluidic spray drying technology. A series of microparticles of controllable particle sizes, shapes, and structures are fabricated by tuning the solvent composition and drying temperature. It is found that a more volatile solvent and a higher drying temperature can result in fast evaporation rates to form microparticles of larger lateral size, more irregular shape, and denser matrix. The nature of the model drugs also plays an important role in determining particle properties. The drug release behaviors of the pharmaceutical microparticles are dependent on their structural properties and the nature of a specific drug, as well as sensitive to the pH value of the release medium. Most importantly, drugs in the microparticles obtained by using a more volatile solvent or a higher drying temperature can be well protected from degradation in harsh simulated gastric fluids due to the dense structures of the microparticles, while they can be fast-released in simulated intestinal fluids through particle dissolution. These pharmaceutical microparticles are potentially useful for site-specific (enteric) delivery of orally-administered drugs.
Bernhardt, P. A.; Siefring, C. L.; Gatling, G.; Briczinski, S. J., Jr.; Vierinen, J.; Bhatt, A.; Holzworth, R. H., II; McCarthy, M.; Gustavsson, B.; La Hoz, C.; Latteck, R.
2015-12-01
A sounding rocket launched from Andoya, Norway in September 2015 carried 37 rocket motors and a multi-instrument daughter payload into the ionosphere to study the generation of plasma wave electric fields and ionospheric density disturbances by the high-speed injection of dust particles. The primary purpose of the CARE II mission is to validate the dress-particle theory of enhanced incoherent scatter from a dusty plasma and to validate models of plasma instabilities driven by high-speed charged particles. The CARE II chemical payload produces 66 kg of micron-sized dust particles composed of aluminium oxide. In addition to the dust, simple molecular combustion products such as N2, H2, CO2, CO, H2O and NO will be injected into the bottomside of the F-layer. Charging of the dust and ion charge exchange with the molecules yield plasma particles moving at hypersonic velocities. Streaming instabilities and shear electric fields cause plasma turbulence that can be detected using ground radars and in situ plasma instruments. The instrument payload was separated from the chemical release payload soon after launch to measure electric field vectors, electron and ion densities, and integrated electron densities from the rocket to the ground. The chemical release of high-speed dust was directed upward on the downleg of the rocket trajectory to intersect the F-layer. The instrument section was about 600 meters from the dust injection module at the release time. Ground HF and UHF radars were operated to detect scatter and refraction by the modified ionosphere. Optical instruments from airborne and ground observatories were used to map the dispersal of the dust using scattered sunlight. The plasma interactions are being simulated with both fluid and particle-in-cell (PIC) codes. CARE II is a follow-on to the CARE I rocket experiment conducted from Wallops Island, Virginia in September 2009.
Large methane releases lead to strong aerosol forcing and reduced cloudiness
DEFF Research Database (Denmark)
Kurten, T.; Zhou, L.; Makkonen, R.;
2011-01-01
The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller...... is predicted to significantly decrease hydroxyl radical (OH) concentrations, while moderately increasing ozone (O3). These changes lead to a 70% increase in the atmospheric lifetime of methane, and an 18% decrease in global mean cloud droplet number concentrations (CDNC). The CDNC change causes a radiative...... forcing that is comparable in magnitude to the long-wave radiative forcing ("enhanced greenhouse effect") of the added methane. Together, the indirect CH4-O3 and CH4-OH-aerosol forcings could more than double the warming effect of large methane increases. Our findings may help explain the anomalously...
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
Jo, Sang-Hee; Kim, Ki-Hyun
2016-01-15
In this study, an experimental method for the collection and analysis of carbonyl compounds (CCs) released by the use of electronic cigarettes (e-cigarettes or ECs) was developed and validated through a series of laboratory experiments. As part of this work, the conversion of CCs from a refill solution (e-solution) to aerosol was also investigated based on a mass change tracking (MCT) approach. Aerosol samples generated from an e-cigarette were collected manually using 2,4-dinitrophenylhydrazine (DNPH) cartridges at a constant sampling (puffing) velocity of 1 L min(-1) with the following puff conditions: puff duration (2 s), interpuff interval (10 s), and puff number (5, 10, and 15 times). The MCT approach allowed us to improve the sampling of CCs through critical evaluation of the puff conditions in relation to the consumed quantities of refill solution. The emission concentrations of CCs remained constant when e-cigarettes were sampled at or above 10 puffs. Upon aerosolization, the concentrations of formaldehyde and acetaldehyde increased 6.23- and 58.4-fold, respectively, relative to their concentrations in the e-solution. Furthermore, a number of CCs were found to be present in the aerosol samples which were not detected in the initial e-solution (e.g., acetone, butyraldehyde, and o-tolualdehyde).
Draper, D.
2001-01-01
Article Outline: Glossary; Definition of the Subject and Introduction; The Bayesian Statistical Paradigm; Three Examples; Comparison with the Frequentist Statistical Paradigm; Future Directions; Bibliography
International Nuclear Information System (INIS)
Analytical methods are described for (a) sodium; (b) the following anions of sodium aerosols: OH-, CO3(2-) and HCO3-; (c) the fission products Cs and Sr. For sodium, an ion-selective electrode was used. The anions were determined by a titration method using phenolphthalein and methyl orange as indicators. Atomic absorption spectroscopy was used for Cs and Sr. (U.K.)
Effect of aerosolization on subsequent bacterial survival.
Walter, M V; Marthi, B; Fieland, V P; Ganio, L M
1990-01-01
To determine whether aerosolization could impair bacterial survival, Pseudomonas syringae and Erwinia herbicola were aerosolized in a greenhouse, the aerosol was sampled at various distances from the site of release by using all-glass impingers, and bacterial survival was followed in the impingers for 6 h. Bacterial survival subsequent to aerosolization of P. syringae and E. herbicola was not impaired 1 m from the site of release. P. syringae aerosolized at 3 to 15 m from the site of release ...
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic: While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data: Emphasizing probability as an alternative to Boolean
International Nuclear Information System (INIS)
A comprehensive review has been undertaken of appropriate analytical techniques to monitor and measure the chemical effects that occur in large-scale tests designed to study severe reactor accidents. Various methods have been developed to determine the chemical forms of the vapours, aerosols and deposits generated during and after such integral experiments. Other specific techniques have the long-term potential to provide some of the desired data in greater detail, although considerable efforts are still required to apply these techniques to the study of radioactive debris. Such in-situ and post-test methods of analysis have been also assessed in terms of their applicability to the analysis of samples from the Phebus-FP tests. The recommended in-situ methods of analysis are gamma-ray spectroscopy, potentiometry, mass spectrometry, and Raman/UV-visible absorption spectroscopy. Vapour/aerosol and deposition samples should also be obtained at well-defined time intervals during each experiment for subsequent post-test analysis. No single technique can provide all the necessary chemical data from these samples, and the most appropriate method of analysis involves a complementary combination of autoradiography, AES, IR, MRS, SEMS/EDS, SIMS/LMIS, XPS and XRD
Introduction to Bayesian statistics
Bolstad, William M
2016-01-01
There is a strong upsurge in the use of Bayesian methods in applied statistical analysis, yet most introductory statistics texts only present frequentist methods. Bayesian statistics has many important advantages that students should learn about if they are going into fields where statistics will be used. In this Third Edition, four newly-added chapters address topics that reflect the rapid advances in the field of Bayesian statistics. The author continues to provide a Bayesian treatment of introductory statistical topics, such as scientific data gathering, discrete random variables, robust Bayesian methods, and Bayesian approaches to inference for discrete random variables, binomial proportion, Poisson, normal mean, and simple linear regression. In addition, newly-developing topics in the field are presented in four new chapters: Bayesian inference with unknown mean and variance; Bayesian inference for Multivariate Normal mean vector; Bayesian inference for Multiple Linear Regression Model; and Computati...
Bayesian artificial intelligence
Korb, Kevin B
2003-01-01
As the power of Bayesian techniques has become more fully realized, the field of artificial intelligence has embraced Bayesian methodology and integrated it to the point where an introduction to Bayesian techniques is now a core course in many computer science programs. Unlike other books on the subject, Bayesian Artificial Intelligence keeps mathematical detail to a minimum and covers a broad range of topics. The authors integrate Bayesian network technology and Bayesian network learning technology, applying both to knowledge engineering. They emphasize understanding and intuition but also provide the algorithms and technical background needed for applications. Software, exercises, and solutions are available on the authors' website.
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
Bayesian Games with Intentions
Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael
2016-01-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Salama, Rania O; Traini, Daniela; Chan, Hak-Kim; Young, Paul M
2008-09-01
Three in vitro methodologies were evaluated as models for the analysis of drug release from controlled release (CR) microparticulates for inhalation. USP Apparatus 2 (dissolution model), USP Apparatus 4 (flow-through model) and a modified Franz cell (diffusion model) were investigated using identical sink volumes and temperatures (1000 ml and 37 degrees C). Microparticulates containing DSCG and different percentages of PVA (0%, 30%, 50%, 70% and 90%) were used as model CR formulations. Evaluation of the release profiles of DSCG from the modified PVA formulations suggested that all data fitted a Weibull distribution model with R2 ≥ 0.942. Statistical analysis of the t(d) (time for 63.2% drug release) indicated that all methodologies could distinguish between microparticles that did or did not contain PVA (Student's t-test; R2 ≥ 0.862 for the diffusion methodology data set). Due to the relatively low water content in the respiratory tract and the lack of differentiation between formulations for USP Apparatus 2 and 4, it is concluded that the diffusion model is more applicable for the evaluation of CR inhalation medicines. PMID:18534832
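The Weibull release model and the t(d) statistic quoted above are straightforward to reproduce. The sketch below fits synthetic release data (illustrative numbers, not the paper's DSCG measurements) and recovers t(d), the time at which the fitted fraction released equals 63.2% (1 - 1/e).

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull_release(t, td, beta):
    """Cumulative fraction of drug released at time t under a Weibull model.
    By construction, the fraction released at t = td is 1 - 1/e = 63.2%."""
    return 1.0 - np.exp(-(t / td) ** beta)

# Illustrative dissolution data: fraction released at sampling times (minutes).
t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])
frac = np.array([0.21, 0.36, 0.56, 0.78, 0.89, 0.96, 0.98])

# Fit the two Weibull parameters; td_hat is the t(d) statistic of the abstract.
(td_hat, beta_hat), _ = curve_fit(weibull_release, t, frac, p0=(30.0, 1.0))
```

Comparing td_hat across formulations (with vs. without PVA, different PVA loadings) is exactly the comparison the abstract's Student's t-tests perform.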
Energy Technology Data Exchange (ETDEWEB)
Journeau, Ch.; Piluso, P.; Correggio, P.; Godin-Jacqmin, L
2007-07-01
In a hypothetical severe accident in a VVER-440-type PWR, a complex corium pool could form and fission products could be released. In order to study aerosol releases in terms of mechanisms, kinetics, nature and quantity, and to better specify the source term of the VVER-440, a series of experiments has been performed in the Colima facility; the Colima CA-U3 test was successfully performed thanks to technological modifications allowing a prototypical corium to be melted at 2760 °C. Specific instrumentation allowed us to follow the evolution of the corium melt and the release, transport and deposition of the fission products. The main conclusions are: -) there is a large release of Cr, Te, Sr, Pr and Rh (>95%w); -) there is a significant release of Fe (50%w); -) there is a small release of Ba, Ce, La, Nb, Nd and Y (<90%w); -) there is a very small release of U in proportion (<5%w), although it is one of the major released species in mass; and -) there is no release of Zr. The Colima experimental results are consistent with previous experiments on irradiated fuels except for the Ba, Fe and U releases. (A.C.)
International Nuclear Information System (INIS)
The US program LACE (LWR Aerosol Containment Experiments), in which Italy participates together with several European countries, Canada and Japan, aims at evaluating, by means of large-scale experiments at HEDL, the retention in the piping and primary containment of the radioactive aerosols released following severe accidents in light water reactors. At the same time, these experiments will provide data for validating the codes used to analyse aerosol behaviour in the containment, and for verifying whether thermohydraulic computation codes can evaluate with sufficient accuracy the variables influencing aerosol behaviour. This report presents and compares the results obtained by the participants in the LACE program with the aerosol containment codes NAUA 5 and CONTAIN for the pre-test computations of test LA 1, in which an accident called containment bypass is simulated
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical model is the Bayesian network. The structural part of a Bayesian graphical model is a graph consisting of nodes and edges. The nodes represent variables, which may be either discrete or continuous. An edge between two nodes A and B indicates a direct influence between the state of A and the state of B, which in some domains can also be interpreted as a causal relation. The widespread use of Bayesian networks is largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
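The mechanics described above (nodes as variables, an edge as a direct influence, and probabilistic queries answered by inference) can be sketched on a hypothetical two-node network; all conditional probabilities below are invented illustrative values.

```python
# A minimal sketch of exact inference in a hypothetical two-node Bayesian
# network, Rain -> WetGrass: the edge encodes a direct influence, and a
# probabilistic query about Rain is answered by enumeration over the
# joint distribution. All conditional probabilities are invented
# illustrative values.

def posterior_rain_given_wet(p_rain, p_wet_given_rain, p_wet_given_dry):
    """P(Rain=1 | Wet=1) by Bayes' rule over the two-node joint."""
    joint_rain_wet = p_rain * p_wet_given_rain        # P(R=1, W=1)
    joint_dry_wet = (1.0 - p_rain) * p_wet_given_dry  # P(R=0, W=1)
    return joint_rain_wet / (joint_rain_wet + joint_dry_wet)

p = posterior_rain_given_wet(p_rain=0.2, p_wet_given_rain=0.9,
                             p_wet_given_dry=0.1)
print(round(p, 3))  # 0.692
```

For larger networks, enumeration like this becomes intractable, which is exactly why the efficient inference algorithms mentioned in the abstract matter.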
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as well
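The prior-times-likelihood updating that defines this school of thought can be sketched with the standard conjugate beta-binomial pair; the prior parameters and data below are illustrative, not drawn from the book.

```python
# Sketch of prior-to-posterior updating using the standard conjugate
# beta-binomial pair: a Beta(alpha, beta) prior on a success probability
# combined with binomial data yields a Beta posterior. The prior
# parameters and data are illustrative only.

def beta_binomial_update(alpha, beta, successes, failures):
    """Return the Beta posterior parameters after observing the data."""
    return alpha + successes, beta + failures

# Prior Beta(2, 2); observe 7 successes and 3 failures in 10 trials.
a_post, b_post = beta_binomial_update(2, 2, successes=7, failures=3)
posterior_mean = a_post / (a_post + b_post)
print(a_post, b_post, round(posterior_mean, 3))  # 9 5 0.643
```

Conjugate pairs like this are the closed-form special case; the Monte Carlo techniques emphasized in recent editions handle models where no such closed form exists.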
Frühwirth-Schnatter, Sylvia
1990-01-01
In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we will discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes' estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo
2014-06-01
Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.
International Nuclear Information System (INIS)
Organic aerosols scatter solar radiation. They may also either enhance or decrease concentrations of cloud condensation nuclei. This paper summarizes observed concentrations of aerosols in remote continental and marine locations and provides estimates for the sources of organic aerosol matter. The anthropogenic sources of organic aerosols may be as large as the anthropogenic sources of sulfate aerosols, implying a similar magnitude of direct forcing of climate. The source estimates are highly uncertain and subject to revision in the future. A slow secondary source of organic aerosols of unknown origin may contribute to the observed oceanic concentrations. The role of organic aerosols acting as cloud condensation nuclei (CCN) is described and it is concluded that they may either enhance or decrease the ability of anthropogenic sulfate aerosols to act as CCN
Review of models applicable to accident aerosols
International Nuclear Information System (INIS)
Estimations of potential airborne-particle releases are essential in safety assessments of nuclear-fuel facilities. This report is a review of aerosol behavior models that have potential applications for predicting aerosol characteristics in compartments containing accident-generated aerosol sources. Such characterization of the accident-generated aerosols is a necessary step toward estimating their eventual release in any accident scenario. Existing aerosol models can predict the size distribution, concentration, and composition of aerosols as they are acted on by ventilation, diffusion, gravity, coagulation, and other phenomena. Models developed in the fields of fluid mechanics, indoor air pollution, and nuclear-reactor accidents are reviewed with this nuclear fuel facility application in mind. The various capabilities of modeling aerosol behavior are tabulated and discussed, and recommendations are made for applying the models to problems of differing complexity
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution...
Bayesian Lensing Shear Measurement
Bernstein, Gary M
2013-01-01
We derive an estimator of weak gravitational lensing shear from background galaxy images that avoids noise-induced biases through a rigorous Bayesian treatment of the measurement. The Bayesian formalism requires a prior describing the (noiseless) distribution of the target galaxy population over some parameter space; this prior can be constructed from low-noise images of a subsample of the target population, attainable from long integrations of a fraction of the survey field. We find two ways to combine this exact treatment of noise with rigorous treatment of the effects of the instrumental point-spread function and sampling. The Bayesian model fitting (BMF) method assigns a likelihood of the pixel data to galaxy models (e.g. Sersic ellipses), and requires the unlensed distribution of galaxies over the model parameters as a prior. The Bayesian Fourier domain (BFD) method compresses galaxies to a small set of weighted moments calculated after PSF correction in Fourier space. It requires the unlensed distributi...
Malicious Bayesian Congestion Games
Gairing, Martin
2008-01-01
In this paper, we introduce malicious Bayesian congestion games as an extension of congestion games in which players might act in a malicious way. In such a game each player has two types. Either the player is a rational player seeking to minimize her own delay, or - with a certain probability - the player is malicious, in which case her only goal is to disturb the other players as much as possible. We show that such games do not, in general, possess a Bayesian Nash equilibrium in pure strategies (i.e. a pure Bayesian Nash equilibrium). Moreover, given a game, we show that it is NP-complete to decide whether it admits a pure Bayesian Nash equilibrium. This result even holds when resource latency functions are linear, each player is malicious with the same probability, and all strategy sets consist of singleton sets. For a slightly more restricted class of malicious Bayesian congestion games, we provide easily checkable properties that are necessary and sufficient for the existence of a pure Bayesian Nash equilibrium....
Bayesian least squares deconvolution
Ramos, A Asensio
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Hybrid Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2012-01-01
Bayesian Optimization aims at optimizing an unknown non-convex/concave function that is costly to evaluate. We are interested in application scenarios where concurrent function evaluations are possible. Under such a setting, BO could choose to either sequentially evaluate the function, one input at a time and wait for the output of the function before making the next selection, or evaluate the function at a batch of multiple inputs at once. These two different settings are commonly referred to as the sequential and batch settings of Bayesian Optimization. In general, the sequential setting leads to better optimization performance as each function evaluation is selected with more information, whereas the batch setting has an advantage in terms of the total experimental time (the number of iterations). In this work, our goal is to combine the strength of both settings. Specifically, we systematically analyze Bayesian optimization using Gaussian process as the posterior estimator and provide a hybrid algorithm t...
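The sequential setting discussed above can be sketched as a single query step of Gaussian-process-based Bayesian optimization with an expected-improvement criterion. The RBF kernel, toy objective, and all constants below are illustrative assumptions, not the paper's hybrid algorithm.

```python
import math
import numpy as np

# Sketch of one sequential step of GP-based Bayesian optimization:
# fit a GP posterior to the points observed so far, score a grid of
# candidates by expected improvement (EI), and query the best candidate.
# Kernel, objective and constants are illustrative assumptions.

def rbf(a, b, length=0.3):
    """Squared-exponential kernel matrix between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

def gp_posterior(x_obs, y_obs, x_new, noise=1e-6):
    """GP posterior mean and standard deviation at x_new."""
    K = rbf(x_obs, x_obs) + noise * np.eye(len(x_obs))
    Ks = rbf(x_obs, x_new)
    mean = Ks.T @ np.linalg.solve(K, y_obs)
    var = np.diag(rbf(x_new, x_new) - Ks.T @ np.linalg.solve(K, Ks))
    return mean, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mean, std, best):
    """EI for maximization under a Gaussian posterior."""
    z = (mean - best) / std
    cdf = 0.5 * (1 + np.vectorize(math.erf)(z / math.sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2 * math.pi)
    return (mean - best) * cdf + std * pdf

x_obs = np.array([0.1, 0.4, 0.9])
y_obs = np.sin(3 * x_obs)                 # toy objective, to be maximized
x_new = np.linspace(0.0, 1.0, 101)
mean, std = gp_posterior(x_obs, y_obs, x_new)
ei = expected_improvement(mean, std, y_obs.max())
next_x = float(x_new[np.argmax(ei)])
print(round(next_x, 2))
```

A batch variant would instead select several high-EI candidates per iteration before re-fitting the posterior, trading per-evaluation information for wall-clock time, as the abstract describes.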
Loredo, T J
2004-01-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational eff...
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
Bayesian and frequentist inequality tests
Kaplan, David M.; Zhuo, Longhao
2016-01-01
Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size α; if the null hypothesis is any other convex subspace, then the Bayesian test...
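The half-space case above can be made concrete: when the posterior for θ is (asymptotically) normal, the posterior probability of H0: θ ≤ 0 is just a normal CDF, and rejecting when that probability falls below α mirrors a one-sided frequentist test of size α. The posterior moments below are illustrative.

```python
from math import erf, sqrt

# Sketch of a Bayesian test of the half-space null H0: theta <= 0 under
# an (asymptotically) normal posterior. The posterior mean and standard
# deviation below are illustrative values, not from the paper.

def normal_cdf(x, mean, sd):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + erf((x - mean) / (sd * sqrt(2.0))))

post_mean, post_sd = 1.8, 1.0                 # posterior for theta
p_null = normal_cdf(0.0, post_mean, post_sd)  # P(theta <= 0 | data)
reject = p_null < 0.05                        # alpha = 0.05
print(round(p_null, 4), reject)  # 0.0359 True
```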
Korattikara, A.; Rathod, V.; Murphy, K.; Welling, M.
2015-01-01
We consider the problem of Bayesian parameter estimation for deep neural networks, which is important in problem settings where we may have little data, and/or where we need accurate posterior predictive densities p(y|x, D), e.g., for applications involving bandits or active learning. One simple ap
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
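As an illustration alongside the derivation the abstract sketches (not the paper's method), the posterior of a single logistic-regression coefficient can be computed by brute-force grid integration of a normal prior times a Bernoulli likelihood; the tiny data set and prior are invented.

```python
import math

# Sketch: single-coefficient Bayesian logistic regression by grid
# integration. A normal prior on beta is multiplied by the Bernoulli
# likelihood and normalized over a grid. Data and prior are invented.

x = [-2.0, -1.0, 0.0, 1.0, 2.0]
y = [0, 1, 0, 1, 1]

def log_posterior(beta, prior_sd=2.0):
    lp = -0.5 * (beta / prior_sd) ** 2       # N(0, prior_sd^2) prior
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-beta * xi))
        lp += math.log(p if yi else 1.0 - p)
    return lp

grid = [i / 100.0 for i in range(-500, 501)]          # beta in [-5, 5]
w = [math.exp(log_posterior(b)) for b in grid]        # unnormalized
z = sum(w)                                            # normalizer
post_mean = sum(b * wi for b, wi in zip(grid, w)) / z
print(round(post_mean, 2))
```

Grid integration only scales to a handful of parameters; for realistic models one would use MCMC or an analytic route such as the paper's.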
Loredo, Thomas J.
2004-04-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimental...
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...
THE EFFECT OF AEROSOLIZATION ON SUBSEQUENT BACTERIAL SURVIVAL
To determine whether aerosolization could impair bacterial survival, Pseudomonas syringae and Erwinia herbicola were aerosolized in a greenhouse, the aerosol was sampled at various distances from the site of release by using all-glass impingers, and bacterial survival was followed...
Restrepo, Marcos I; Keyt, Holly; Reyes, Luis F
2015-06-01
Administration of medications via aerosolization is potentially an ideal strategy to treat airway diseases. This delivery method ensures high concentrations of the medication in the targeted tissues, the airways, with generally lower systemic absorption and systemic adverse effects. Aerosolized antibiotics have been tested as treatment for bacterial infections in patients with cystic fibrosis (CF), non-CF bronchiectasis (NCFB), and ventilator-associated pneumonia (VAP). The most successful application of this to date is treatment of infections in patients with CF. It has been hypothesized that similar success would be seen in NCFB and in difficult-to-treat hospital-acquired infections such as VAP. This review summarizes the available evidence supporting the use of aerosolized antibiotics and addresses the specific considerations that clinicians should recognize when prescribing an aerosolized antibiotic for patients with CF, NCFB, and VAP.
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics", which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985, the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements and abstract measure-theoretical considerations, to contributions to theoretical statistics an...
DEFF Research Database (Denmark)
Mørup, Morten; Schmidt, Mikkel N
2012-01-01
Many networks of scientific interest naturally decompose into clusters or communities with comparatively fewer external than internal links; however, current Bayesian models of network communities do not exert this intuitive notion of communities. We formulate a nonparametric Bayesian model for community detection consistent with an intuitive definition of communities and present a Markov chain Monte Carlo procedure for inferring the community structure. A Matlab toolbox with the proposed inference procedure is available for download. On synthetic and real networks, our model detects communities consistent with ground truth, and on real networks, it outperforms existing approaches in predicting missing links. This suggests that community structure is an important structural property of networks that should be explicitly modeled.
Brody, Samuel; Lapata, Mirella
2009-01-01
Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...
Bayesian Generalized Rating Curves
Helgi Sigurðarson 1985
2014-01-01
A rating curve is a curve or model that describes the relationship between water elevation, or stage, and discharge at an observation site in a river. The rating curve is fit from paired observations of stage and discharge. The rating curve then predicts discharge given observations of stage; this methodology is used because stage is substantially easier to observe directly than discharge. In this thesis a statistical rating curve model is proposed, working within the framework of Bayesian...
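The idea of fitting a curve to paired stage-discharge observations can be sketched as follows. A common parametric choice (an assumption here; the abstract does not specify the model) is the power-law rating curve Q = a(h - c)^b; with c assumed known, (a, b) can be fit by least squares in log space, whereas a fully Bayesian treatment would place priors on (a, b, c). The paired observations below are synthetic.

```python
import math

# Sketch of a rating-curve fit with the power law Q = a * (h - c) ** b.
# c (the stage of zero discharge) is assumed known here, so (a, b) are
# fit by ordinary least squares in log space. Observations are synthetic.

stage = [0.5, 1.0, 1.5, 2.0, 3.0]          # h, metres
discharge = [1.1, 3.9, 8.2, 14.5, 31.0]    # Q, m^3/s
c = 0.0                                    # assumed known here

xs = [math.log(h - c) for h in stage]
ys = [math.log(q) for q in discharge]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((xi - mx) * (yi - my) for xi, yi in zip(xs, ys)) / \
    sum((xi - mx) ** 2 for xi in xs)
a = math.exp(my - b * mx)
print(round(a, 2), round(b, 2))
```

The fitted curve then predicts discharge from any observed stage via a * (h - c) ** b, which is the practical use the abstract describes.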
Efficient Bayesian Phase Estimation
Wiebe, Nathan; Granade, Chris
2016-07-01
We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
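The rejection-filtering idea can be sketched as follows: maintain a Gaussian summary of the posterior over the phase, accept prior samples with probability equal to the likelihood of each observed outcome, and refit the Gaussian to the accepted samples. The likelihood model, experiment settings, and true phase below are illustrative assumptions, not the paper's exact protocol.

```python
import math
import random

# Sketch of rejection filtering for Bayesian phase estimation: keep a
# Gaussian summary (mu, sigma) of the posterior over the phase; for each
# measurement, accept prior samples with probability equal to the
# likelihood of the observed outcome, then refit the Gaussian.
# Likelihood model and settings are illustrative assumptions.

random.seed(1)
true_phase = 0.7

def likelihood(outcome, phi, m, theta):
    """P(outcome | phi) for a standard phase-estimation experiment."""
    p0 = math.cos(m * (phi - theta) / 2.0) ** 2
    return p0 if outcome == 0 else 1.0 - p0

mu, sigma = 0.0, 1.0                        # Gaussian prior on the phase
for step in range(30):
    m, theta = 1 + step % 5, random.uniform(0.0, 2.0 * math.pi)
    # Simulate a measurement outcome at the (unknown) true phase.
    outcome = 0 if random.random() < likelihood(0, true_phase, m, theta) else 1
    accepted = []
    while len(accepted) < 200:              # rejection sampling step
        phi = random.gauss(mu, sigma)
        if random.random() < likelihood(outcome, phi, m, theta):
            accepted.append(phi)
    mu = sum(accepted) / len(accepted)
    var = sum((p - mu) ** 2 for p in accepted) / len(accepted)
    sigma = max(math.sqrt(var), 1e-3)
print(round(mu, 2), round(sigma, 3))
```

Because only (mu, sigma) and a batch of samples are kept, the update is cheap per measurement, which is in keeping with the classically efficient, hardware-friendly character the abstract claims for the method.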
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory
2016-04-01
Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short term (vector field) and long term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and we define a joint log-likelihood that consists of two terms, one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
Buseck, P. R.; Schwartz, S. E.
2003-12-01
It is widely believed that "On a clear day you can see forever," as proclaimed in the 1965 Broadway musical of the same name. While an admittedly beautiful thought, we all know that this concept is only figurative. Aside from Earth's curvature and Rayleigh scattering by air molecules, aerosols - colloidal suspensions of solid or liquid particles in a gas - limit our vision. Even on the clearest day, there are billions of aerosol particles per cubic meter of air. Atmospheric aerosols are commonly referred to as smoke, dust, haze, and smog, terms that are loosely reflective of their origin and composition. Aerosol particles have arisen naturally for eons from sea spray, volcanic emissions, wind entrainment of mineral dust, wildfires, and gas-to-particle conversion of hydrocarbons from plants and dimethylsulfide from the oceans. However, over the industrial period, the natural background aerosol has been greatly augmented by anthropogenic contributions, i.e., those produced by human activities. One manifestation of this impact is reduced visibility (Figure 1). Thus, perhaps more than in other realms of geochemistry, when considering the composition of the troposphere one must consider the effects of these activities. The atmosphere has become a reservoir for vast quantities of anthropogenic emissions that exert important perturbations on it and on the planetary ecosystem in general. Consequently, much recent research focuses on the effects of human activities on the atmosphere and, through them, on the environment and Earth's climate. For these reasons consideration of the geochemistry of the atmosphere, and of atmospheric aerosols in particular, must include the effects of human activities. Figure 1. Impairment of visibility by aerosols: photographs at Yosemite National Park, California, USA; (a) low aerosol concentration (PM2.5 = 0.3 μg m-3).
MODIS 3 km aerosol product: algorithm and global perspective
Remer, L. A.; Mattoo, S.; Levy, R. C.; Munchak, L. A.
2013-01-01
After more than a decade of producing a nominal 10 km aerosol product based on the dark target method, the MODerate resolution Imaging Spectroradiometer (MODIS) aerosol team will be releasing a nominal 3 km product as part of their Collection 6 release. The new product differs from the original 10 km product only in the manner in which reflectance pixels are ingested, organized and selected by the aerosol algorithm. Overall, the 3 km product closely mirrors the 10 km product. H...
Bayesian optimization for materials design
Frazier, Peter I.; Wang, Jialei
2015-01-01
We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
Bayesian Network Based XP Process Modelling
Directory of Open Access Journals (Sweden)
Mohamed Abouelela
2010-07-01
Full Text Available A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.
Bayesian Posteriors Without Bayes' Theorem
Hill, Theodore P
2012-01-01
The classical Bayesian posterior arises naturally as the unique solution of several different optimization problems, without the necessity of interpreting data as conditional probabilities and then using Bayes' Theorem. For example, the classical Bayesian posterior is the unique posterior that minimizes the loss of Shannon information in combining the prior and the likelihood distributions. These results, direct corollaries of recent results about conflations of probability distributions, reinforce the use of Bayesian posteriors, and may help partially reconcile some of the differences between classical and Bayesian statistics.
Toxicity of atmospheric aerosols on marine phytoplankton
Paytan, A.; Mackey, K.R.M.; Chen, Y.; Lima, I.D.; Doney, S.C.; Mahowald, N.; Labiosa, R.; Post, A.F.
2009-01-01
Atmospheric aerosol deposition is an important source of nutrients and trace metals to the open ocean that can enhance ocean productivity and carbon sequestration and thus influence atmospheric carbon dioxide concentrations and climate. Using aerosol samples from different back trajectories in incubation experiments with natural communities, we demonstrate that the response of phytoplankton growth to aerosol additions depends on specific components in aerosols and differs across phytoplankton species. Aerosol additions enhanced growth by releasing nitrogen and phosphorus, but not all aerosols stimulated growth. Toxic effects were observed with some aerosols, where the toxicity affected picoeukaryotes and Synechococcus but not Prochlorococcus. We suggest that the toxicity could be due to high copper concentrations in these aerosols and support this by laboratory copper toxicity tests performed with Synechococcus cultures. However, it is possible that other elements present in the aerosols or unknown synergistic effects between these elements could have also contributed to the toxic effect. Anthropogenic emissions are increasing atmospheric copper deposition sharply, and based on coupled atmosphere-ocean calculations, we show that this deposition can potentially alter patterns of marine primary production and community structure in high aerosol, low chlorophyll areas, particularly in the Bay of Bengal and downwind of South and East Asia.
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Computationally efficient Bayesian tracking
Aughenbaugh, Jason; La Cour, Brian
2012-06-01
In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
This paper describes the use of model-based geostatistics for choosing the set of sampling locations, collectively called the design, to be used in a geostatistical analysis. Two types of design situation are considered. These are retrospective design, which concerns the addition of sampling locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
Inference in hybrid Bayesian networks
DEFF Research Database (Denmark)
Lanseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael;
2009-01-01
Since the 1980s, Bayesian Networks (BNs) have become increasingly popular for building statistical models of complex systems. This is particularly true for boolean systems, where BNs often prove to be a more efficient modelling framework than traditional reliability techniques (like fault trees...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
Full Text Available The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes' probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration with a couple of simplified examples.
Code Development on Aerosol Behavior under Severe Accident-Aerosol Coagulation
Energy Technology Data Exchange (ETDEWEB)
Ha, Kwang Soon; Kim, Sung Il; Ryu, Eun Hyun [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)
2015-10-15
The behaviors of the larger aerosol particles are described usually by continuum mechanics. The smallest particles have diameters less than the mean free path of gas phase molecules and the behavior of these particles can often be described well by free molecular physics. The vast majority of aerosol particles arising in reactor accident analyses have behaviors in the very complicated regime intermediate between the continuum mechanics and free molecular limit. The MELCOR fission-product package includes initial inventories, release from fuel and debris, aerosol dynamics with vapor condensation and revaporization, deposition on structure surfaces, transport through flow paths, and removal by engineered safety features. Aerosol dynamic processes and the condensation and evaporation of fission product vapors after release from fuel are considered within each MELCOR control volume. The aerosol dynamics models are based on MAEROS, a multi-section, multicomponent aerosol dynamics code, but without calculation of condensation. Aerosols can deposit directly on surfaces such as heat structures and water pools, or can agglomerate and eventually fall out once they exceed the largest size specified by the user for the aerosol size distribution. Aerosols deposited on surfaces cannot currently be resuspended.
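The multi-section MAEROS scheme itself is too involved for a short example, but the agglomeration process it models can be illustrated with a discrete Smoluchowski coagulation step using a constant collision kernel. This is a simplified sketch of the physics, not MELCOR's implementation; the kernel value, bin count, and time step are arbitrary:

```python
import numpy as np

# Discrete Smoluchowski coagulation with a constant kernel K.
# n[k] is the number concentration of clusters containing k+1 monomers.
def coagulation_step(n, K=1.0, dt=0.01):
    dn = np.zeros_like(n)
    m = len(n)
    for i in range(m):
        for j in range(m):
            rate = K * n[i] * n[j]
            dn[i] -= rate                    # i-cluster consumed
            if i + j + 1 < m:
                dn[i + j + 1] += 0.5 * rate  # merged cluster created
    return n + dt * dn

n = np.zeros(50)
n[0] = 1.0                                   # start from monomers only
mass0 = np.sum(np.arange(1, 51) * n)
for _ in range(100):                         # explicit Euler to t = 1
    n = coagulation_step(n)

# Constant-kernel theory: total number N(t) = N0 / (1 + 0.5*K*N0*t),
# while total suspended mass is conserved by coagulation alone.
total = n.sum()
mass = np.sum(np.arange(1, 51) * n)
```

The step conserves mass while the particle number decays, which is the essential behavior a sectional aerosol dynamics code must reproduce before deposition and other removal processes are layered on top.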
Implementing Bayesian Vector Autoregressions
Directory of Open Access Journals (Sweden)
Richard M. Todd
1988-03-01
Full Text Available This paper discusses how the Bayesian approach can be used to construct a type of multivariate forecasting model known as a Bayesian vector autoregression (BVAR). In doing so, we mainly explain Doan, Litterman, and Sims's (1984) propositions on how to estimate a BVAR based on a certain family of prior probability distributions indexed by a fairly small set of hyperparameters. There is also a discussion on how to specify a BVAR and set up a BVAR database. A 4-variable model is used to illustrate the BVAR approach.
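The prior family Doan, Litterman, and Sims proposed shrinks each coefficient toward a random walk. A single-equation caricature (our own hedged sketch, far from a full BVAR: one AR(1) coefficient, known noise variance, conjugate Gaussian prior) shows the mechanics:

```python
import numpy as np

# Minnesota-style shrinkage in miniature: AR(1) coefficient with a
# Gaussian prior centered on a random walk (prior mean b0 = 1),
# known noise variance s2 and prior variance tau2 (conjugate case).
rng = np.random.default_rng(0)
T, phi_true, s2, tau2, b0 = 200, 0.8, 1.0, 0.25, 1.0
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + rng.normal(0.0, np.sqrt(s2))
x, z = y[:-1], y[1:]

ols = (x @ z) / (x @ x)                          # unregularized estimate
prec = (x @ x) / s2 + 1.0 / tau2                 # posterior precision
post_mean = ((x @ z) / s2 + b0 / tau2) / prec    # shrunk toward b0 = 1
```

The posterior mean is a precision-weighted average of the OLS estimate and the random-walk prior mean; with a few hundred observations the data dominate, which is the behavior this class of priors relies on.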
Dynamic Bayesian diffusion estimation
Dedecius, K
2012-01-01
The rapidly increasing complexity of (mainly wireless) ad-hoc networks stresses the need of reliable distributed estimation of several variables of interest. The widely used centralized approach, in which the network nodes communicate their data with a single specialized point, suffers from high communication overheads and represents a potentially dangerous concept with a single point of failure needing special treatment. This paper's aim is to contribute to another quite recent method called diffusion estimation. By decentralizing the operating environment, the network nodes communicate just within a close neighbourhood. We adopt the Bayesian framework to modelling and estimation, which, unlike the traditional approaches, abstracts from a particular model case. This leads to a very scalable and universal method, applicable to a wide class of different models. A particularly interesting case - the Gaussian regressive model - is derived as an example.
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
International Nuclear Information System (INIS)
This report summarizes the work on the development of fibre metallic prefilters to be placed upstream of HEPA filters for the exhaust gases of nuclear process plants. Investigations at ambient and high temperature were carried out. Measurements of the filtration performance of Bekipor porous webs and sintered mats were performed in the AFLT (aerosol filtration at low temperature) unit with a throughput of 15 m3/h. A parametric study on the influence of particle size, fibre diameter, number of layers and superficial velocity led to the optimum choice of the working parameters. Three selected filter types were then tested with polydisperse aerosols using a candle-type filter configuration or a flat-type filter configuration. The small-diameter candle type is not well suited for a spray-nozzle regeneration system, so that only the flat-type filter was retained for high-temperature tests. A high-temperature test unit (AFHT) with a throughput of 8 to 10 m3/h at 400 °C was used to test the three filter types with an aerosol generated by high-temperature calcination of a simulated nitric acid waste solution traced with 134Cs. The regeneration of the filter by spray washing and the effect of the regeneration on the filter performance was studied for the three filter types. The porous mats have a higher dust loading capacity than the sintered web, which means that their regeneration frequency can be kept lower.
Washington University St Louis — TOMS_AI_G is an aerosol related dataset derived from the Total Ozone Monitoring Satellite (TOMS) Sensor. The TOMS aerosol index arises from absorbing aerosols such...
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on
Electrically Driven Technologies for Radioactive Aerosol Abatement
Energy Technology Data Exchange (ETDEWEB)
David W. DePaoli; Ofodike A. Ezekoye; Costas Tsouris; Valmor F. de Almeida
2003-01-28
The purpose of this research project was to develop an improved understanding of how electrically driven processes, including electrocoalescence, acoustic agglomeration, and electric filtration, may be employed to efficiently treat problems caused by the formation of aerosols during DOE waste treatment operations. The production of aerosols during treatment and retrieval operations in radioactive waste tanks and during thermal treatment operations such as calcination presents a significant problem of cost, worker exposure, potential for release, and increased waste volume.
Irregular-Time Bayesian Networks
Ramati, Michael
2012-01-01
In many fields observations are performed irregularly along time, due to either measurement limitations or lack of a constant immanent rate. While discrete-time Markov models (as Dynamic Bayesian Networks) introduce either inefficient computation or an information loss to reasoning about such processes, continuous-time Markov models assume either a discrete state space (as Continuous-Time Bayesian Networks), or a flat continuous state space (as stochastic differential equations). To address these problems, we present a new modeling class called Irregular-Time Bayesian Networks (ITBNs), generalizing Dynamic Bayesian Networks, allowing substantially more compact representations, and increasing the expressivity of the temporal dynamics. In addition, a globally optimal solution is guaranteed when learning temporal systems, provided that they are fully observed at the same irregularly spaced time-points, and a semiparametric subclass of ITBNs is introduced to allow further adaptation to the irregular nature of t...
Oak Ridge National Laboratory — The aerosol observation system (AOS) is the primary Atmospheric Radiation Measurement (ARM) platform for in situ aerosol measurements at the surface. The principal...
Neuronanatomy, neurology and Bayesian networks
Bielza Lozoya, Maria Concepcion
2014-01-01
Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...
Dale Poirier
2008-01-01
This paper provides Bayesian rationalizations for White’s heteroskedastic consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.
Dynamic Batch Bayesian Optimization
Azimi, Javad; Fern, Xiaoli
2011-01-01
Bayesian optimization (BO) algorithms try to optimize an unknown function that is expensive to evaluate using a minimum number of evaluations/experiments. Most of the proposed algorithms in BO are sequential, where only one experiment is selected at each iteration. This method can be time inefficient when each experiment takes a long time and more than one experiment can be run concurrently. On the other hand, requesting a fixed-size batch of experiments at each iteration causes performance inefficiency in BO compared to the sequential policies. In this paper, we present an algorithm that requests a batch of experiments at each time step t where the batch size p_t is dynamically determined in each step. Our algorithm is based on the observation that the sequence of experiments selected by the sequential policy can sometimes be almost independent from each other. Our algorithm identifies such scenarios and requests those experiments at the same time without degrading the performance. We evaluate our proposed method us...
Nonparametric Bayesian Classification
Coram, M A
2002-01-01
A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...
Bayesian Stochastic Search for the Best Predictors: Nowcasting GDP Growth
Nikolaus Hautsch; Fuyu Yang
2014-01-01
We propose a Bayesian framework for nowcasting GDP growth in real time. Using vintage data on macroeconomic announcements we set up a state space system connecting latent GDP growth rates to agencies' releases of GDP and other economic indicators. We propose a Gibbs sampling scheme to filter out daily GDP growth rates using all available macroeconomic information. The sample draws from the resulting posterior distribution, thereby allowing us to simulate backcasting, nowcasting, and forecasti...
Bayesian disclosure risk assessment: predicting small frequencies in contingency tables
Forster, Jonathan J.; Webb, Emily L
2007-01-01
We propose an approach for assessing the risk of individual identification in the release of categorical data. This requires the accurate calculation of predictive probabilities for those cells in a contingency table which have small sample frequencies, making the problem somewhat different from usual contingency table estimation, where interest is generally focussed on regions of high probability. Our approach is Bayesian and provides posterior predictive probabilities of identification risk...
Directory of Open Access Journals (Sweden)
K. Hara
2013-10-01
Full Text Available Unusual aerosol enhancement is often observed at Syowa Station, Antarctica during winter through spring. Simultaneous aerosol measurements near the surface and in the upper atmosphere were conducted twice using a ground-based optical particle counter, a balloon-borne optical particle counter, and micro-pulse LIDAR (MPL) in August and September 2012. During 13–15 August, aerosol enhancement occurred immediately after a storm condition. A high backscatter ratio and high aerosol concentrations were observed from the surface to ca. 2.5 km over Syowa Station. Clouds appeared occasionally at the top of the aerosol-enhanced layer during the episode. Aerosol enhancement was terminated on 15 August by strong winds caused by a cyclone's approach. In the second case on 5–7 September, aerosol number concentrations of particles with Dp > 0.3 μm near the surface reached > 10^4 L^-1 at about 15:00 UT on 5 September in spite of calm wind conditions, whereas MPL measurements showed that aerosols were enhanced at about 04:00 UT at 1000–1500 m above Syowa Station. The aerosol enhancement extended from near the surface to ca. 4 km. In both cases, air masses with high aerosol enhancement below 2.5–3 km were transported mostly from the boundary layer over the sea-ice area. In addition, air masses at 3–4 km in the second case came from the boundary layer over the open-sea area. This air mass history strongly suggests that dispersion of sea-salt particles from the sea-ice surface contributes considerably to the aerosol enhancement in the lower free troposphere (about 3 km) and that the release of sea-salt particles from the ocean surface engenders high aerosol concentrations in the free troposphere (3–4 km).
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
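The explicit posterior expectation and covariance referred to here take the standard linear-Gaussian form. The sketch below is a generic toy (not Buland's AVO implementation; the forward operator G, the prior, and the noise level are invented for illustration):

```python
import numpy as np

# Linear-Gaussian inversion: data d = G m + e with prior m ~ N(mu0, S0)
# and noise e ~ N(0, Se) gives a Gaussian posterior in closed form:
#   S_post  = (G' Se^-1 G + S0^-1)^-1
#   mu_post = S_post (G' Se^-1 d + S0^-1 mu0)
rng = np.random.default_rng(1)
m_true = np.array([2.0, -1.0])               # "elastic parameters" (toy)
G = rng.normal(size=(50, 2))                 # invented forward operator
Se = 0.1 * np.eye(50)                        # noise covariance
mu0, S0 = np.zeros(2), 10.0 * np.eye(2)      # weak Gaussian prior
d = G @ m_true + rng.multivariate_normal(np.zeros(50), Se)

Sei, S0i = np.linalg.inv(Se), np.linalg.inv(S0)
S_post = np.linalg.inv(G.T @ Sei @ G + S0i)
mu_post = S_post @ (G.T @ Sei @ d + S0i @ mu0)
```

Because the model is linear and everything is Gaussian, exact credible intervals come straight from the diagonal of S_post, which is the computational appeal the abstract highlights.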
Institute of Scientific and Technical Information of China (English)
Fengfu Fu; Liangjun Xu; Wei Ye; Yiquan Chen; Mingyu Jiang; Xueqin Xu
2006-01-01
Different-sized aerosols were collected by an Andersen air sampler to observe the detailed morphology of the black carbon (BC) aerosols which were separated chemically from the other accompanying aerosols, using a Scanning Electron Microscope equipped with an Energy Dispersive X-ray Spectrometer (SEM-EDX). The results indicate that most BC aerosols are spherical particles of about 50 nm in diameter and with a homogeneous surface. Results also show that these particles aggregate with other aerosols or with themselves to form larger agglomerates in the micrometer range. The shape of these 50-nm BC spherical particles was found to be very similar to that of BC particles released from petroleum-powered vehicular internal combustion engines. These spherical BC particles were shown to be different from the previously reported fullerenes found using Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight Mass Spectrometry (MALDI-TOF-MS).
Aerosol Climate Time Series in ESA Aerosol_cci
Popp, Thomas; de Leeuw, Gerrit; Pinnock, Simon
2016-04-01
Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. Meanwhile, full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from the star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer, but also from ATSR instruments and the POLDER sensor), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets, comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which were also validated and improved in the reprocessing. For the three ATSR algorithms the use of an ensemble method was tested. The paper will summarize and discuss the status of dataset reprocessing and validation. The focus will be on the ATSR, GOMOS and IASI datasets. Validation of pixel level uncertainties will be summarized and discussed, including unknown components and their potential usefulness and limitations. Opportunities for time series extension
Attention in a bayesian framework
DEFF Research Database (Denmark)
Whiteley, Louise Emma; Sahani, Maneesh
2012-01-01
The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of perception, and use this observation to frame a new computational account of the need for, and action of, attention - unifying diverse attentional phenomena in a way that goes beyond previous inferential, probabilistic and Bayesian models. Attentional effects are most evident in cluttered environments, and include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
Bayesian test and Kuhn's paradigm
Institute of Scientific and Technical Information of China (English)
Chen Xiaoping
2006-01-01
Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific test in his pattern, because he focuses all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.
Perception, illusions and Bayesian inference.
Nour, Matthew M; Nour, Joseph M
2015-01-01
Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.
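The core computation in such accounts is precision-weighted combination of prior expectations with sensory evidence; a toy Gaussian version (ours, not a model from the article) fits in a few lines:

```python
# Toy Gaussian cue combination: the percept (posterior mean) is a
# precision-weighted average of the sensory evidence and the prior.
def combine(mu_like, var_like, mu_prior, var_prior):
    w = (1.0 / var_like) / (1.0 / var_like + 1.0 / var_prior)
    mu_post = w * mu_like + (1.0 - w) * mu_prior
    var_post = 1.0 / (1.0 / var_like + 1.0 / var_prior)
    return mu_post, var_post

# Equally reliable evidence and prior: the percept lands halfway between.
mu, var = combine(10.0, 4.0, 0.0, 4.0)   # -> (5.0, 2.0)
```

Sharpening the prior (smaller var_prior) pulls the percept further from the stimulus, which is one way systematic illusions can arise within this framework.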
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
Bayesian methods for proteomic biomarker development
Directory of Open Access Journals (Sweden)
Belinda Hernández
2015-12-01
In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
Bayesian variable order Markov models: Towards Bayesian predictive state representations
C. Dimitrakakis
2009-01-01
We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st
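The core idea of a Bayesian variable-order model can be sketched as a mixture over Markov orders: each order keeps Dirichlet-smoothed counts, and posterior weights over orders are updated online by each order's predictive likelihood. This is a hypothetical illustration of the idea, not the authors' model.

```python
import math
from collections import defaultdict

class BayesianVOMM:
    """Mixture over Markov orders 0..max_order with Dirichlet smoothing."""

    def __init__(self, alphabet, max_order=2, alpha=1.0):
        self.alphabet = list(alphabet)
        self.max_order = max_order
        self.alpha = alpha                                  # Dirichlet pseudo-count
        self.counts = defaultdict(lambda: defaultdict(float))
        self.log_w = [0.0] * (max_order + 1)                # uniform prior over orders

    def _pred(self, k, history, symbol):
        ctx = tuple(history[-k:]) if k else ()
        c = self.counts[(k, ctx)]
        total = sum(c.values())
        return (c[symbol] + self.alpha) / (total + self.alpha * len(self.alphabet))

    def predict(self, history, symbol):
        ws = [math.exp(w) for w in self.log_w]
        z = sum(ws)
        return sum((w / z) * self._pred(k, history, symbol)
                   for k, w in enumerate(ws))

    def update(self, history, symbol):
        for k in range(self.max_order + 1):
            self.log_w[k] += math.log(self._pred(k, history, symbol))
            ctx = tuple(history[-k:]) if k else ()
            self.counts[(k, ctx)][symbol] += 1.0

m = BayesianVOMM("ab", max_order=2)
hist = ""
for ch in "abababab":
    m.update(hist, ch)
    hist += ch
```

After training on the alternating string, the mixture puts most weight on the non-trivial orders and predicts "a" after "...ab" with high probability.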
Aerosol typing - key information from aerosol studies
Mona, Lucia; Kahn, Ralph; Papagiannopoulos, Nikolaos; Holzer-Popp, Thomas; Pappalardo, Gelsomina
2016-04-01
Aerosol typing is a key source of aerosol information from ground-based and satellite-borne instruments. Depending on the specific measurement technique, aerosol typing can be used as input for retrievals or represents an output for other applications. Typically, aerosol retrievals require some a priori or external aerosol type information. The accuracy of the derived aerosol products strongly depends on the reliability of these assumptions. Different sensors can make use of different aerosol type inputs. A critical review and harmonization of these procedures could significantly reduce related uncertainties. On the other hand, satellite measurements in recent years are providing valuable information about the global distribution of aerosol types, showing for example the main source regions and typical transport paths. Climatological studies of aerosol load at global and regional scales often rely on inferred aerosol type. There is still a high degree of inhomogeneity among satellite aerosol typing schemes, which makes it difficult to use different sensor datasets in a consistent way. Knowledge of the 4-D aerosol type distribution at these scales is essential for understanding the impact of different aerosol sources on climate, precipitation and air quality. All this information is needed for planning upcoming aerosol emissions policies. The exchange of expertise and communication between the satellite and ground-based measurement communities is fundamental for improving long-term dataset consistency and for reducing aerosol type distribution uncertainties. Aerosol typing has been recognized as one of the high-priority activities of the AEROSAT (International Satellite Aerosol Science Network, http://aero-sat.org/) initiative. In the AEROSAT framework, a first critical review of aerosol typing procedures has been carried out. The review underlines the high heterogeneity in many aspects: approach, nomenclature, assumed number of components and parameters used for the
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclical graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup
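A toy directed acyclic graph in the spirit of the food-security examples, with inference by enumeration. The variables and all probabilities below are invented for illustration.

```python
# Toy network: Drought -> PoorHarvest -> FoodShortage.
P_drought = 0.2
P_poor = {True: 0.7, False: 0.1}    # P(PoorHarvest | Drought)
P_short = {True: 0.8, False: 0.05}  # P(FoodShortage | PoorHarvest)

def joint(d, p, s):
    """Joint probability factorised along the DAG."""
    pd = P_drought if d else 1.0 - P_drought
    pp = P_poor[d] if p else 1.0 - P_poor[d]
    ps = P_short[p] if s else 1.0 - P_short[p]
    return pd * pp * ps

# P(Drought | FoodShortage) by enumerating over the hidden variable.
num = sum(joint(True, p, True) for p in (True, False))
den = sum(joint(d, p, True) for d in (True, False) for p in (True, False))
posterior = num / den
```

Observing a shortage raises the probability of drought from the prior 0.2 to roughly 0.53, which is the kind of decision-support query a Bayesian network answers.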
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
A Bayesian Nonparametric Approach to Test Equating
Karabatsos, George; Walker, Stephen G.
2009-01-01
A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…
Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.
Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian
2016-05-01
Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies where there is a considerable amount of data from historical control groups, which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study or using a predictive distribution to replace a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.
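A simplified sketch of a meta-analytic predictive prior under a normal-normal hierarchy with a fixed between-study SD (the full approach would also place a prior on that SD and sample it). All numbers are illustrative.

```python
def map_prior(means, ses, tau):
    """Pool historical control means under a normal-normal hierarchy."""
    prec = [1.0 / (se ** 2 + tau ** 2) for se in ses]
    w_total = sum(prec)
    mu_hat = sum(w * m for w, m in zip(prec, means)) / w_total
    # Predictive distribution for a NEW study's control mean: add tau^2 back.
    return mu_hat, 1.0 / w_total + tau ** 2

prior_mean, prior_var = map_prior(
    means=[10.2, 9.8, 10.5, 10.1],   # invented historical control means
    ses=[0.4, 0.5, 0.4, 0.6],        # their standard errors
    tau=0.3)                         # assumed between-study SD

# Approximate effective number of animals, assuming a per-animal SD of 1.5.
n_eff = 1.5 ** 2 / prior_var
```

Expressing the prior as an effective number of animals, as the authors do, makes the information content of the historical controls tangible to the scientists.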
Bayesian Classification of Image Structures
DEFF Research Database (Denmark)
Goswami, Dibyendu; Kalkan, Sinan; Krüger, Norbert
2009-01-01
In this paper, we describe work on Bayesian classifiers for distinguishing between homogeneous structures, textures, edges and junctions. We build semi-local classifiers from hand-labeled images to distinguish between these four different kinds of structures based on the concept of intrinsic dimensi...
Bayesian Agglomerative Clustering with Coalescents
Teh, Yee Whye; Daumé III, Hal; Roy, Daniel
2009-01-01
We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.
Bayesian NL interpretation and learning
H. Zeevat
2011-01-01
Everyday natural language communication is normally successful, even though contemporary computational linguistics has shown that NL is characterised by a very high degree of ambiguity and the results of stochastic methods are not good enough to explain the high success rate. Bayesian natural language
Differentiated Bayesian Conjoint Choice Designs
Z. Sándor (Zsolt); M. Wedel (Michel)
2003-01-01
Previous conjoint choice design construction procedures have produced a single design that is administered to all subjects. This paper proposes to construct a limited set of different designs. The designs are constructed in a Bayesian fashion, taking into account prior uncertainty about
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
2013-01-01
The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...
3-D contextual Bayesian classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
In this paper we will consider extensions of a series of Bayesian 2-D contextual classification procedures proposed by Owen (1984), Hjort & Mohn (1984), Welch & Salter (1971) and Haslett (1985) to 3 spatial dimensions. It is evident that compared to classical pixelwise classification further...
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...
Bayesian Analysis of Experimental Data
Directory of Open Access Journals (Sweden)
Lalmohan Bhar
2013-10-01
Full Text Available Analysis of experimental data from a Bayesian point of view has been considered. Appropriate methodology has been developed for application to designed experiments. A Normal-Gamma distribution has been considered as the prior distribution. The developed methodology has been applied to real experimental data taken from long-term fertilizer experiments.
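The conjugate Normal-Gamma update has a closed form. The sketch below assumes the standard parameterisation NG(mu0, kappa0, a0, b0) for normal data with unknown mean and precision; the yield data are invented.

```python
def normal_gamma_update(mu0, kappa0, a0, b0, data):
    """Posterior hyperparameters for normal data with unknown mean and precision."""
    n = len(data)
    xbar = sum(data) / n
    ss = sum((x - xbar) ** 2 for x in data)
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n
    a_n = a0 + n / 2.0
    b_n = b0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2.0 * kappa_n)
    return mu_n, kappa_n, a_n, b_n

# Invented plot yields from a fertilizer trial (units arbitrary).
mu_n, kappa_n, a_n, b_n = normal_gamma_update(
    mu0=20.0, kappa0=1.0, a0=1.0, b0=1.0,
    data=[22.1, 24.3, 21.8, 23.5, 22.9])
```

The posterior mean `mu_n` is a pseudo-count-weighted average of the prior mean and the sample mean, which is why the prior acts like `kappa0` extra observations.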
MODIS 3 km aerosol product: algorithm and global perspective
Remer, L. A.; Mattoo, S.; Levy, R. C.; Munchak, L.
2013-01-01
After more than a decade of producing a nominal 10 km aerosol product based on the dark target method, the MODIS aerosol team will be releasing a nominal 3 km product as part of their Collection 6 release. The new product differs from the original 10 km product only in the manner in which reflectance pixels are ingested, organized and selected by the aerosol algorithm. Overall, the 3 km product closely mirrors the 10 km product. However, the finer resolution product is able to retrieve over o...
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
Aerosol mobility size spectrometer
Wang, Jian; Kulkarni, Pramod
2007-11-20
A device for measuring aerosol size distribution within a sample containing aerosol particles. The device generally includes a spectrometer housing defining an interior chamber and a camera for recording aerosol size streams exiting the chamber. The housing includes an inlet for introducing a flow medium into the chamber in a flow direction, an aerosol injection port adjacent the inlet for introducing a charged aerosol sample into the chamber, a separation section for applying an electric field to the aerosol sample across the flow direction and an outlet opposite the inlet. In the separation section, the aerosol sample becomes entrained in the flow medium and the aerosol particles within the aerosol sample are separated by size into a plurality of aerosol flow streams under the influence of the electric field. The camera is disposed adjacent the housing outlet for optically detecting a relative position of at least one aerosol flow stream exiting the outlet and for optically detecting the number of aerosol particles within the at least one aerosol flow stream.
Bayesian analysis of rare events
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
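The rejection-sampling reinterpretation behind BUS can be illustrated with a toy Gaussian model standing in for the engineering model: draw from the prior, accept each sample with probability L(theta)/L_max, and estimate the rare-event probability from the accepted (posterior) samples. The observation, noise level and event threshold are invented.

```python
import math
import random

random.seed(0)

def likelihood(theta, y_obs, sigma=0.5):
    """Unnormalised Gaussian likelihood, scaled so its maximum over theta is 1."""
    return math.exp(-0.5 * ((y_obs - theta) / sigma) ** 2)

y_obs = 1.0
accepted = []
for _ in range(20000):
    theta = random.gauss(0.0, 1.0)                    # sample from the prior
    if random.random() < likelihood(theta, y_obs):    # accept with prob L / L_max
        accepted.append(theta)

# Posterior probability of a (moderately) rare event: theta > 2.
p_event = sum(t > 2.0 for t in accepted) / len(accepted)
```

Plain rejection sampling becomes hopeless for very rare events; the point of BUS is that the same accept/reject structure can be handed to FORM, importance sampling or Subset Simulation instead.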
Protection of air in premises and environment against beryllium aerosols
Energy Technology Data Exchange (ETDEWEB)
Bitkolov, N.Z.; Vishnevsky, E.P.; Krupkin, A.V. [Research Inst. of Industrial and Marine Medicine, St. Petersburg (Russian Federation)
1998-01-01
First and foremost, the danger of beryllium aerosols concerns the possibility of their inhalation. The situation is aggravated by the high biological activity of beryllium in the human lung. The small allowable concentration of beryllium aerosols in air poses a rather complex and expensive problem of pollution prevention and air cleanup. The delivery and transportation of beryllium aerosols from the sites of their formation are defined by the ventilation circuit, which forms the aerodynamics of air flows in premises, and by the aerodynamic links between premises. The causes of aerosol release into the air of premises from hoods and from isolated and hermetically sealed vessels can be vibrations, as well as pulses of temperature and pressure. Furthermore, redispersion of aerosols from dirty surfaces is possible. Effective protection of air against beryllium aerosols at industrial plants is provided by a complex of hygienic measures: from individual means of breath protection up to collective means of the prevention of air pollution. (J.P.N.)
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have...... been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network, and inference is performed...... by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...
Directory of Open Access Journals (Sweden)
A. Määttä
2013-09-01
Full Text Available We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top of the atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI. Focus is on the uncertainty in aerosol model selection of pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT retrieval by propagating model selection and model error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies from the modelled to observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.
Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods
Zhu, Weixuan
2016-01-01
The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet processes. The dependence is induced by applying a Lévy copula to the marginal Lévy intensities. Our attenti...
Aerosol satellite remote sensing
Veefkind, Joris Pepijn
2001-01-01
Aerosols are important for many processes in the atmosphere. Aerosols are a leading uncertainty in predicting global climate change. To a large extent this uncertainty is caused by a lack of knowledge on the occurrence and concentration of aerosols. On a global scale, this information can only be o
Bayesian versus 'plain-vanilla Bayesian' multitarget statistics
Mahler, Ronald P. S.
2004-08-01
Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems. Therefore, FISST is mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. Then I demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian; it denigrates FISST concepts while unwittingly assuming them, and has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."
Bayesian inference on proportional elections.
Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio
2015-01-01
Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
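A schematic version of such a simulation, assuming a D'Hondt highest-averages allocation and a Dirichlet posterior built from invented poll counts (the actual Brazilian seat-distribution rules add further detail not modelled here):

```python
import random

random.seed(1)

def dhondt(votes, seats):
    """Allocate seats by the D'Hondt highest-averages rule."""
    alloc = [0] * len(votes)
    for _ in range(seats):
        quotients = [v / (a + 1) for v, a in zip(votes, alloc)]
        alloc[quotients.index(max(quotients))] += 1
    return alloc

poll = [420, 310, 180, 90]            # invented poll counts per party
alpha = [c + 1 for c in poll]         # Dirichlet posterior (uniform prior)
seats = 10

hits, n_sim = 0, 2000
for _ in range(n_sim):
    draws = [random.gammavariate(a, 1.0) for a in alpha]  # Dirichlet sample
    total = sum(draws)
    shares = [d / total for d in draws]
    if dhondt(shares, seats)[3] >= 1:  # does the smallest party win a seat?
        hits += 1
p_seat = hits / n_sim
```

`p_seat` estimates the probability that the smallest party gains representation, which is exactly the kind of question a pure vote-share poll cannot answer under proportional rules.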
Bayesian approach to rough set
Marwala, Tshilidzi
2007-01-01
This paper proposes an approach to training rough set models using a Bayesian framework trained using the Markov Chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov Chain Monte Carlo sampling is conducted through sampling in the rough set granule space, and the Metropolis algorithm is used as an acceptance criterion. The proposed method is tested on estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58% with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
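The Metropolis acceptance step of the kind used here can be shown on a simple Beta-Bernoulli posterior rather than the rough-set granule space. With a Beta(1,1) prior and 7 successes in 10 trials the posterior is Beta(8,4), so the chain's mean should approach 8/12. The target and tuning values are illustrative.

```python
import math
import random

random.seed(3)

def log_post(theta, k=7, n=10):
    """Log posterior of a Bernoulli rate: Beta(1,1) prior, k of n successes."""
    if not 0.0 < theta < 1.0:
        return -math.inf
    return k * math.log(theta) + (n - k) * math.log(1.0 - theta)

theta, chain = 0.5, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 0.1)               # random-walk proposal
    log_ratio = log_post(prop) - log_post(theta)
    if random.random() < math.exp(min(0.0, log_ratio)):  # Metropolis acceptance
        theta = prop
    chain.append(theta)

posterior_mean = sum(chain[5000:]) / len(chain[5000:])  # discard burn-in
```

Only the (log-)posterior ratio is needed, so the normalising constant never has to be computed; that is what makes Metropolis usable over spaces such as rough set granules.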
Bayesian priors for transiting planets
Kipping, David M
2016-01-01
As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...
Bayesian Source Separation and Localization
Knuth, K H
1998-01-01
The problem of mixed signals occurs in many different contexts; one of the most familiar being acoustics. The forward problem in acoustics consists of finding the sound pressure levels at various detectors resulting from sound signals emanating from the active acoustic sources. The inverse problem consists of using the sound recorded by the detectors to separate the signals and recover the original source waveforms. In general, the inverse problem is unsolvable without additional information. This general problem is called source separation, and several techniques have been developed that utilize maximum entropy, minimum mutual information, and maximum likelihood. In previous work, it has been demonstrated that these techniques can be recast in a Bayesian framework. This paper demonstrates the power of the Bayesian approach, which provides a natural means for incorporating prior information into a source model. An algorithm is developed that utilizes information regarding both the statistics of the amplitudes...
Bayesian Inference for Radio Observations
Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin
2015-01-01
(Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...
Aerosol behavior in a steam-air environment
International Nuclear Information System (INIS)
The behavior of aerosols assumed to be characteristic of those generated during accident sequences and released into containment is being studied in the Nuclear Safety Pilot Plant (NSPP). Observations on the behavior of U3O8 aerosol, Fe2O3 aerosol, concrete aerosol, and various mixtures of these aerosols in a dry air environment and in a steam-air environment within the NSPP vessel are reported. Under dry conditions, the aerosols are agglomerated in the form of branched chains; the aerodynamic mass median diameter (AMMD) of the U3O8, Fe2O3 and mixed U3O8-Fe2O3 aerosols ranged between 1.5 and 3 μm while that of the concrete aerosol was about 1 μm. A steam-air environment, which would be present in LWR containment during and following an accident, causes the U3O8, Fe2O3, and mixed U3O8-Fe2O3 aerosols to behave differently than in a dry atmosphere; the primary effect is an enhanced rate of removal of the aerosol from the vessel atmosphere. Steam does not have a significant effect on the removal rate of a concrete aerosol. Electron microscopy showed the agglomerated U3O8, Fe2O3, and mixed U3O8-Fe2O3 aerosols to be in the form of spherical clumps of particles, differing from the intermingled branched chains observed in the dry air tests; the AMMD was in the range of 1 to 2 μm. Steam had a lesser influence on the physical shape of the concrete aerosol, with the shape being intermediate between branched chains and spherical clumps. 9 figures
A Bayesian Nonparametric IRT Model
Karabatsos, George
2015-01-01
This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...
Elements of Bayesian experimental design
Energy Technology Data Exchange (ETDEWEB)
Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)
1997-09-01
We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.
Bayesian kinematic earthquake source models
Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.
2009-12-01
Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
Bayesian Stable Isotope Mixing Models
Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Jackson, Andrew L.; Inger, Richard
2012-01-01
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...
Bayesian Network--Response Regression
WANG, LU; Durante, Daniele; Dunson, David B.
2016-01-01
There is an increasing interest in learning how human brain networks vary with continuous traits (e.g., personality, cognitive abilities, neurological disorders), but flexible procedures to accomplish this goal are limited. We develop a Bayesian semiparametric model, which combines low-rank factorizations and Gaussian process priors to allow flexible shifts of the conditional expectation for a network-valued random variable across the feature space, while including subject-specific random eff...
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Mohammad-Djafari, Ali
2007-01-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables, which are modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
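The common-label Potts/HMM construction described in this abstract can be illustrated with a single-band toy image and per-class Gaussian likelihoods. This is a minimal Gibbs-sampling sketch, not the authors' hyperspectral implementation; the class means, coupling strength beta, and noise level are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy single-band image: two classes with known means, Gaussian noise.
K, beta, sigma = 2, 1.0, 0.5                 # classes, Potts coupling, noise (assumed)
means = np.array([0.0, 1.0])
truth = np.zeros((16, 16), dtype=int)
truth[:, 8:] = 1                             # left half class 0, right half class 1
img = means[truth] + sigma * rng.normal(size=truth.shape)

z = rng.integers(K, size=truth.shape)        # random initial labels
for _ in range(50):                          # Gibbs sweeps over all pixels
    for i in range(z.shape[0]):
        for j in range(z.shape[1]):
            nbrs = [z[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                    if 0 <= a < z.shape[0] and 0 <= b < z.shape[1]]
            # Potts prior (reward agreeing neighbours) times Gaussian likelihood
            logp = np.array([beta * nbrs.count(k)
                             - 0.5 * ((img[i, j] - means[k]) / sigma) ** 2
                             for k in range(K)])
            p = np.exp(logp - logp.max())
            z[i, j] = rng.choice(K, p=p / p.sum())

accuracy = (z == truth).mean()               # spatial smoothing recovers most labels
```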
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali
2004-11-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables, which are modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Bayesian analysis of contingency tables
Gómez Villegas, Miguel A.; González Pérez, Beatriz
2005-01-01
The display of the data by means of contingency tables is used in different approaches to statistical inference, for example, to broach the test of homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...
Bayesian estimation of turbulent motion
Héas, P.; Herzet, C.; Mémin, E.; Heitz, D.; Mininni, P. D.
2013-01-01
Based on physical laws describing the multi-scale structure of turbulent flows, this article proposes a regularizer for fluid motion estimation from an image sequence. Regularization is achieved by imposing some scale invariance property between histograms of motion increments computed at different scales. By reformulating this problem from a Bayesian perspective, an algorithm is proposed to jointly estimate motion, regularization hyper-parameters, and to select the ...
Bayesian Kernel Mixtures for Counts
Canale, Antonio; Dunson, David B.
2011-01-01
Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviatio...
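The restriction noted in this abstract follows from the law of total variance: for any mixture of Poissons, Var(X) = E[lambda] + Var(lambda) >= E[X], so underdispersed counts (variance below the mean) are out of reach. A quick numerical check, using an arbitrary gamma mixing density chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Mixture of Poissons: draw a rate from a mixing density, then a count.
# Law of total variance: Var(X) = E[lambda] + Var(lambda) >= E[X],
# so no Poisson mixture can have variance below its mean.
rates = rng.gamma(shape=2.0, scale=1.5, size=200_000)  # arbitrary mixing density
x = rng.poisson(rates)

print(x.mean(), x.var())  # mean near 3.0, variance near 7.5: overdispersed
```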
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
Bayesian second law of thermodynamics.
Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples. PMID:27627241
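The cross-entropy term in the bound, H(ρ_m, ρ) = -Σ_x ρ_m(x) ln ρ(x), can be computed directly once a measurement update is specified. A minimal discrete sketch follows; the four-state space and outcome likelihoods are invented for illustration, and no heat term is simulated.

```python
import numpy as np

# Four-state "phase space"; rho is the experimenter's distribution (invented).
rho = np.array([0.4, 0.3, 0.2, 0.1])
like = np.array([0.9, 0.5, 0.2, 0.05])   # likelihood of the observed outcome

# Bayes update on the measurement outcome gives the updated distribution rho_m
rho_m = rho * like / (rho * like).sum()

def cross_entropy(p, q):
    """H(p, q) = -sum_x p(x) ln q(x); reduces to Shannon entropy when p == q."""
    return -(p * np.log(q)).sum()

dH = cross_entropy(rho_m, rho) - cross_entropy(rho, rho)
# The Bayesian second law asserts dH + <Q>_{F|m} >= 0: any apparent entropy
# decrease after conditioning on the measurement is paid for by heat flow.
```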
Bayesian second law of thermodynamics
Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_m, ρ) + 〈Q〉_{F|m} ≥ 0, where ΔH(ρ_m, ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_m, and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...
Tackett, J. L.; Getzewich, B. J.; Winker, D. M.; Vaughan, M. A.
2015-12-01
With nine years of retrievals, the CALIOP level 3 aerosol profile product provides an unprecedented synopsis of aerosol extinction in three dimensions and the potential to quantify changes in aerosol distributions over time. The CALIOP level 3 aerosol profile product, initially released as a beta product in 2011, reports monthly averages of quality-screened aerosol extinction profiles on a uniform latitude/longitude grid for different cloud-cover scenarios, called "sky conditions". This presentation demonstrates improvements to the second version of the product which will be released in September 2015. The largest improvements are the new sky condition definitions which parse the atmosphere into "cloud-free" views accessible to passive remote sensors, "all-sky" views accessible to active remote sensors and "cloudy-sky" views for opaque and transparent clouds which were previously inaccessible to passive remote sensors. Taken together, the new sky conditions comprehensively summarize CALIOP aerosol extinction profiles for a broad range of scientific queries. In addition to dust-only extinction profiles, the new version will include polluted-dust and smoke-only extinction averages. A new method is adopted for averaging dust-only extinction profiles to reduce high biases which exist in the beta version of the level 3 aerosol profile product. This presentation justifies the new averaging methodology and demonstrates vertical profiles of dust and smoke extinction over Africa during the biomass burning season. Another crucial advancement demonstrated in this presentation is a new approach for computing monthly mean aerosol optical depth which removes low biases reported in the beta version - a scenario unique to lidar datasets.
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan
2004-01-01
We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating...... and differentiating these circuits in time linear in their size. We report on experimental results showing the successful compilation, and efficient inference, on relational Bayesian networks whose {\\primula}--generated propositional instances have thousands of variables, and whose jointrees have clusters...
Bayesian Posterior Distributions Without Markov Chains
Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.
2012-01-01
Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
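Rejection sampling of a posterior in the spirit of this abstract can be done by proposing from the prior and accepting with probability proportional to the likelihood. The sketch below uses a uniform prior on a binomial proportion with invented counts, not the study's actual case-control data.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rejection sampling for a binomial proportion with a Uniform(0, 1) prior.
# Counts y and n are invented for illustration.
y, n = 36, 234

draws = rng.uniform(1e-9, 1.0, size=200_000)           # proposals from the prior
loglike = y * np.log(draws) + (n - y) * np.log1p(-draws)
peak = y * np.log(y / n) + (n - y) * np.log1p(-y / n)  # log-likelihood at the MLE
accept = np.log(rng.uniform(size=draws.size)) < loglike - peak
posterior = draws[accept]          # accepted draws follow the exact Beta(y+1, n-y+1)
```

With a flat prior the posterior is Beta(y+1, n-y+1), so the sample mean should sit near (y+1)/(n+2); no Markov chain, tuning, or convergence diagnostics are required.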
Variational bayesian method of estimating variance components.
Arakawa, Aisaku; Taniguchi, Masaaki; Hayashi, Takeshi; Mikawa, Satoshi
2016-07-01
We developed a Bayesian analysis approach by using a variational inference method, a so-called variational Bayesian method, to determine the posterior distributions of variance components. This variational Bayesian method and an alternative Bayesian method using Gibbs sampling were compared in estimating genetic and residual variance components from both simulated data and publicly available real pig data. In the simulated data set, we observed strong bias toward overestimation of genetic variance for the variational Bayesian method in the case of low heritability and low population size, and less bias was detected with larger population sizes in both methods examined. The differences in the estimates of variance components between the variational Bayesian and the Gibbs sampling were not found in the real pig data. However, the posterior distributions of the variance components obtained with the variational Bayesian method had shorter tails than those obtained with the Gibbs sampling. Consequently, the posterior standard deviations of the genetic and residual variances of the variational Bayesian method were lower than those of the method using Gibbs sampling. The computing time required was much shorter with the variational Bayesian method than with the method using Gibbs sampling.
SYNTHESIZED EXPECTED BAYESIAN METHOD OF PARAMETRIC ESTIMATE
Institute of Scientific and Technical Information of China (English)
Ming HAN; Yuanyao DING
2004-01-01
This paper develops a new method of parametric estimation, named the "synthesized expected Bayesian method". When samples of products are tested and no failure events occur, the definition of the expected Bayesian estimate is introduced and estimates of failure probability and failure rate are provided. After some failure information is introduced by making an extra-test, a synthesized expected Bayesian method is defined and used to estimate failure probability, failure rate and some other parameters in exponential and Weibull distributions of populations. Finally, calculations are performed on practical problems, which show that the synthesized expected Bayesian method is feasible and easy to operate.
Bayesian Methods and Universal Darwinism
Campbell, John
2009-12-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed, a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued that these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.
Bayesian phylogeography finds its roots.
Directory of Open Access Journals (Sweden)
Philippe Lemey
2009-09-01
Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will make an important asset in molecular epidemiology that can be easily generalized to infer biogeogeography from genetic data for many organisms.
Kopeika, Norman S.; Zilberman, Arkadi; Yitzhaky, Yitzhak
2014-05-01
Different views of the significance of aerosol MTF have been reported. For example, one recent paper [OE, 52(4)/2013, pp. 046201] claims that the aerosol MTF "contrast reduction is approximately independent of spatial frequency, and image blur is practically negligible". On the other hand, another recent paper [JOSA A, 11/2013, pp. 2244-2252] claims that aerosols "can have a non-negligible effect on the atmospheric point spread function". We present clear experimental evidence of common significant aerosol blur and evidence that aerosol contrast reduction can be extremely significant. In the IR, it is more appropriate to refer to such phenomena as aerosol-absorption MTF. The role of imaging system instrumentation on such MTF is addressed too.
Numeracy, frequency, and Bayesian reasoning
Directory of Open Access Journals (Sweden)
Gretchen B. Chapman
2009-02-01
Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
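The natural-frequency reformulation can be made concrete with the classic screening problem: the two presentation formats are algebraically equivalent, but the count version is the one participants find easier. The base rate, hit rate, and false-positive rate below are the standard illustrative values, not figures from this study.

```python
from fractions import Fraction

# Standard screening illustration: 1% base rate, 80% hit rate, 9.6% false alarms.
base, hit, fa = Fraction(1, 100), Fraction(80, 100), Fraction(96, 1000)

# Single-event probability format: Bayes' rule directly
posterior = base * hit / (base * hit + (1 - base) * fa)

# Natural frequency format: think of 1000 people
sick_pos = 1000 * base * hit          # 8 sick people test positive
healthy_pos = 1000 * (1 - base) * fa  # 95.04 healthy "people" test positive (kept exact)
freq_posterior = sick_pos / (sick_pos + healthy_pos)

print(posterior == freq_posterior)    # True: the formats give the same answer
```

The exact posterior is 25/322, about 7.8%; the counts 8 and roughly 95 make that ratio immediate, while the single-event-probability version demands explicit algebra.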
Bayesian Query-Focused Summarization
Daumé, Hal
2009-01-01
We present BayeSum (for ``Bayesian summarization''), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.
Bayesian Sampling using Condition Indicators
DEFF Research Database (Denmark)
Faber, Michael H.; Sørensen, John Dalsgaard
2002-01-01
The problem of quality control of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept...... of condition indicators introduced by Benjamin and Cornell (1970) a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators......
Sensitivity of aerosol direct radiative forcing to aerosol vertical profile
Chung, Chul E.; Choi, Jung-Ok
2014-01-01
Aerosol vertical profile significantly affects the aerosol direct radiative forcing at the TOA level. The degree to which the aerosol profile impacts the aerosol forcing depends on many factors such as presence of cloud, surface albedo and aerosol single scattering albedo (SSA). Using a radiation model, we show that for absorbing aerosols (with an SSA of 0.7–0.8) whether aerosols are located above cloud or below induces at least one order of magnitude larger changes of the aerosol forcing tha...
Aerosols Science and Technology
Agranovski, Igor
2011-01-01
This self-contained handbook and ready reference examines aerosol science and technology in depth, providing a detailed insight into this progressive field. As such, it covers fundamental concepts, experimental methods, and a wide variety of applications, ranging from aerosol filtration to biological aerosols, and from the synthesis of carbon nanotubes to aerosol reactors.Written by a host of internationally renowned experts in the field, this is an essential resource for chemists and engineers in the chemical and materials disciplines across multiple industries, as well as ideal supplementary
Using Bayesian Networks to Improve Knowledge Assessment
Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra
2013-01-01
In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Launhardt, Ralf; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Bayesian credible interval construction for Poisson statistics
Institute of Scientific and Technical Information of China (English)
ZHU Yong-Sheng
2008-01-01
The construction of the Bayesian credible (confidence) interval for a Poisson observable including both the signal and background with and without systematic uncertainties is presented.Introducing the conditional probability satisfying the requirement of the background not larger than the observed events to construct the Bayesian credible interval is also discussed.A Fortran routine,BPOCI,has been developed to implement the calculation.
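A grid-based version of such a construction, for a flat prior on a non-negative Poisson signal atop a known background, can be sketched as follows. The prior convention, observed counts, and background level are assumptions for illustration; the paper's Fortran routine BPOCI is not reproduced.

```python
import numpy as np

# 95% credible upper limit for a Poisson signal s with known background b,
# flat prior on s >= 0. Counts and background are assumed for illustration.
n, b = 3, 1.2

s = np.linspace(0.0, 30.0, 30_001)
ds = s[1] - s[0]
post = (s + b) ** n * np.exp(-(s + b))   # unnormalised posterior density in s
post /= post.sum() * ds                  # normalise on the grid
cdf = np.cumsum(post) * ds
upper95 = s[np.searchsorted(cdf, 0.95)]  # smallest s with posterior CDF >= 0.95
```

Conditioning the prior so the background does not exceed the observed count, as discussed in the abstract, would simply modify the unnormalised posterior before the same grid normalisation.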
Modeling Diagnostic Assessments with Bayesian Networks
Almond, Russell G.; DiBello, Louis V.; Moulder, Brad; Zapata-Rivera, Juan-Diego
2007-01-01
This paper defines Bayesian network models and examines their applications to IRT-based cognitive diagnostic modeling. These models are especially suited to building inference engines designed to be synchronous with the finer grained student models that arise in skills diagnostic assessment. Aspects of the theory and use of Bayesian network models…
Advances in Bayesian Modeling in Educational Research
Levy, Roy
2016-01-01
In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…
Learning dynamic Bayesian networks with mixed variables
DEFF Research Database (Denmark)
Bøttcher, Susanne Gammelgaard
This paper considers dynamic Bayesian networks for discrete and continuous variables. We only treat the case, where the distribution of the variables is conditional Gaussian. We show how to learn the parameters and structure of a dynamic Bayesian network and also how the Markov order can be learned...
The Bayesian Revolution Approaches Psychological Development
Shultz, Thomas R.
2007-01-01
This commentary reviews five articles that apply Bayesian ideas to psychological development, some with psychology experiments, some with computational modeling, and some with both experiments and modeling. The reviewed work extends the current Bayesian revolution into tasks often studied in children, such as causal learning and word learning, and…
Bayesian Network for multiple hypothesis tracking
W.P. Zajdel; B.J.A. Kröse
2002-01-01
For a flexible camera-to-camera tracking of multiple objects we model the objects' behavior with a Bayesian network and combine it with the multiple hypothesis framework that associates observations with objects. Bayesian networks offer a possibility to factor complex, joint distributions into a produ
Monitoring biological aerosols using UV fluorescence
Eversole, Jay D.; Roselle, Dominick; Seaver, Mark E.
1999-01-01
An apparatus has been designed and constructed to continuously monitor the number density, size, and fluorescent emission of ambient aerosol particles. The application of fluorescence to biological particles suspended in the atmosphere requires laser excitation in the UV spectral region. In this study, a Nd:YAG laser is quadrupled to provide a 266 nm wavelength to excite emission from single micrometer-sized particles in air. Fluorescent emission is used to continuously identify aerosol particles of biological origin. For calibration, biological samples of Bacillus subtilis spores and vegetative cells, Escherichia coli, Bacillus thuringiensis and Erwinia herbicola vegetative cells were prepared as suspensions in water and nebulized to produce aerosols. Detection of single aerosol particles provides an elastic scattering response as well as fluorescent emission in two spectral bands simultaneously. Our efforts have focused on empirical characterization of the emission and scattering characteristics of various bacterial samples to determine the feasibility of optical discrimination between different cell types. Preliminary spectroscopic evidence suggests that different samples can be distinguished as separate bio-aerosol groups. In addition to controlled sample results, we will also discuss the most recent results on the effectiveness of detecting outdoor releases and variations in environmental backgrounds.
Deposition and retention of radioactive aerosols on desert vegetation
International Nuclear Information System (INIS)
Deposition velocities and retention times were obtained for submicron aerosols of 134Cs and 141Ce on a shrub species (Artemisia tridentata) and a grass (Elymus elymoides) in a natural desert environment. Submicron aerosols of these two nuclides were artificially generated and released over a sagebrush community in southeast Idaho during each of three seasons: spring, summer and winter, to determine the effects of weathering and plant development on aerosol deposition and retention. Information on friction velocities, roughness lengths, and particle size was also obtained
2nd Bayesian Young Statisticians Meeting
Bitto, Angela; Kastner, Gregor; Posekany, Alexandra
2015-01-01
The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...
BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.
Khakabimamaghani, Sahand; Ester, Martin
2016-01-01
The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are available in the literature; however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data.
BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.
Khakabimamaghani, Sahand; Ester, Martin
2016-01-01
The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are available in the literature; however, there are still significant open issues. First, it is still unclear if integrating different datatypes will help in detecting disease subtypes more accurately, and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need for investigating the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data. PMID:26776199
Evaluation of a radioactive aerosol surveillance system
Energy Technology Data Exchange (ETDEWEB)
Scripsick, R.C.; Stafford, R.G.; Beckman, R.J.; Tillery, M.I.; Romero, P.O.
1978-06-26
Measurements of the dilution of air contaminants between worker breathing zone and area air samplers were made by releasing a test aerosol in a workroom equipped with an aerosol surveillance system. The data were used to evaluate performance, and suggest improvements in design of the workroom's alarming air monitor system. It was found that a breathing zone concentration of 960 times the maximum permissible concentration in air (MPC/sub a/) for a half-hour was required to trigger alarms of the existing monitoring system under some release conditions. Alternative air monitor placement, suggested from dilution measurements, would reduce this average triggering concentration to 354 MPC/sub a/. Deployment of additional air monitors could further reduce the average triggering concentration to 241 MPC/sub a/. The relation between number of monitors and triggering concentration was studied. No significant decrease in average triggering concentration was noted for arrays containing greater than five monitors.
Energy Technology Data Exchange (ETDEWEB)
Bauer, Susanne E.; Menon, Surabi; Koch, Dorothy; Bond, Tami; Tsigaridis, Kostas
2010-04-09
Recently, attention has been drawn towards black carbon aerosols as a likely short-term climate warming mitigation candidate. However, the global and regional impacts of the direct, cloud-indirect and semi-direct forcing effects are highly uncertain, due to the complex nature of aerosol evolution and its climate interactions. Black carbon is directly released as particles into the atmosphere, but then interacts with other gases and particles through condensation and coagulation processes, leading to further aerosol growth, aging and internal mixing. A detailed aerosol microphysical scheme, MATRIX, embedded within the global GISS modelE includes the above processes that determine the lifecycle and climate impact of aerosols. This study presents a quantitative assessment of the impact of microphysical processes involving black carbon, such as emission size distributions and optical properties, on aerosol cloud activation and radiative forcing. Our best estimate for the net direct and indirect aerosol radiative forcing change between 1750 and 2000 is -0.56 W/m². However, the direct and indirect aerosol effects are very sensitive to the black and organic carbon size distribution and the consequent mixing state. The net radiative forcing change can vary between -0.32 and -0.75 W/m² depending on these carbonaceous particle properties. Assuming that sulfates, nitrates and secondary organics form a coating shell around a black carbon core, rather than forming uniformly mixed particles, changes the overall net radiative forcing from a negative to a positive number. Black carbon mitigation scenarios generally showed a benefit when predominantly black carbon sources, such as diesel emissions, are reduced; reducing sources of both organic and black carbon, such as bio-fuels, does not lead to reduced warming.
The Effect of Water Injection on the Fission Product Aerosol Behavior in Fukushima Unit 1
International Nuclear Information System (INIS)
The factor most important to human health is the fission products released from the plant. Fission products are usually released as aerosols and vapours. The amount of aerosol released from the plant is crucial, because it can be inhaled by people. In this study, the best-estimate scenario of the Fukushima unit 1 accident was modeled with MELCOR, and the amount of released fission product aerosol was estimated as a function of the amount of water added to the reactor pressure vessel (RPV). The analysis of the Fukushima unit 1 accident was conducted from the viewpoint of fission product aerosol release using MELCOR. First, the thermodynamic results for the plant were compared to the measured data; then fission product aerosol (CsOH) behavior was calculated while varying the amount of water injection. Water injection affects the amount of aerosol released into the reactor building, because it decreases the temperature of the deposition surfaces. In this study, only aerosol behavior was considered; a further study will be conducted including a hygroscopic model
Development of Multi-Wavelength Raman Lidar and its Application on Aerosol and Cloud Research
Directory of Open Access Journals (Sweden)
Liu Dong
2016-01-01
Full Text Available A movable multi-wavelength Raman lidar (TMPRL) was built in Hefei, China. Emitting at three wavelengths (1064, 532, and 355 nm) and receiving the three corresponding Mie scattering signals, two nitrogen Raman signals at 386 and 607 nm, and a depolarization signal at 532 nm, TMPRL has the capacity to investigate the height-resolved optical and microphysical properties of aerosols and clouds. Retrieval algorithms for the optical parameters based on the Mie-Raman technique, and for the microphysical parameters based on a Bayesian optimization method, were also developed and applied to the observed lidar data. Designed for unattended operation and 24/7 continuous working, TMPRL has joined several field campaigns studying aerosols, clouds and their interactions. Some observed results on aerosol and cloud optical properties, and a first attempt to validate the vertical aerosol size distribution retrieved by TMPRL against in-situ airplane measurements, are presented and discussed.
Aerosol climate time series from ESA Aerosol_cci (Invited)
Holzer-Popp, T.
2013-12-01
developed further, to evaluate the datasets and their regional and seasonal merits. The validation showed that most datasets have improved significantly and in particular PARASOL (ocean only) provides excellent results. The metrics for AATSR (land and ocean) datasets are similar to those of MODIS and MISR, with AATSR better in some land regions and less good in some others (ocean). However, AATSR coverage is smaller than that of MODIS due to swath width. The MERIS dataset provides better coverage than AATSR but has lower quality (especially over land) than the other datasets. Also the synergetic AATSR/SCIAMACHY dataset has lower quality. The evaluation of the pixel uncertainties shows first good results but also reveals that more work needs to be done to provide comprehensive information for data assimilation. Users (MACC/ECMWF, AEROCOM) confirmed the relevance of this additional information and encouraged Aerosol_cci to release the current uncertainties. The paper will summarize and discuss the results of three year work in Aerosol_cci, extract the lessons learned and conclude with an outlook to the work proposed for the next three years. In this second phase a cyclic effort of algorithm evolution, dataset generation, validation and assessment will be applied to produce and further improve complete time series from all sensors under investigation, new sensors will be added (e.g. IASI), and preparation for the Sentinel missions will be made.
A Bayesian Reflection on Surfaces
Directory of Open Access Journals (Sweden)
David R. Wolf
1999-10-01
Full Text Available Abstract: The topic of this paper is a novel Bayesian continuous-basis field representation and inference framework. Within this paper several problems are solved: the maximally informative inference of continuous-basis fields, that is, where the basis for the field is itself a continuous object and not representable in a finite manner; the tradeoff between accuracy of representation in terms of information learned, and memory or storage capacity in bits; the approximation of probability distributions so that a maximal amount of information about the object being inferred is preserved; and an information-theoretic justification for multigrid methodology. The maximally informative field inference framework is described in full generality and denoted the Generalized Kalman Filter. The Generalized Kalman Filter allows the update of field knowledge from previous knowledge at any scale, and new data, to new knowledge at any other scale. An application example instance, the inference of continuous surfaces from measurements (for example, camera image data), is presented.
Quantum Bayesianism at the Perimeter
Fuchs, Christopher A
2010-01-01
The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.
Hedging Strategies for Bayesian Optimization
Brochu, Eric; de Freitas, Nando
2010-01-01
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
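The portfolio idea in this abstract can be illustrated with a toy loop: several stand-in proposal rules play the role of acquisition functions, and the Hedge bandit strategy decides which nominee to evaluate. This is a sketch only, not the paper's implementation: there is no Gaussian process here, and each rule is rewarded with the objective value at its own nominee (the paper derives rewards from GP posterior quantities), so the extra objective evaluations are purely a toy simplification.

```python
import math
import random

def gp_hedge(objective, proposers, n_iter=30, eta=0.5, seed=0):
    """Toy GP-Hedge loop: a portfolio of proposal rules is managed by
    the Hedge multi-armed bandit strategy.  Each round every rule
    nominates a point; one nominee is sampled with probability
    proportional to exp(eta * cumulative gain); each rule's gain is
    then updated with the objective value at its own nominee."""
    rng = random.Random(seed)
    gains = [0.0] * len(proposers)
    history = []                      # observed (x, f(x)) pairs
    best = (None, -math.inf)
    for _ in range(n_iter):
        nominees = [p(history, rng) for p in proposers]
        m = max(gains)                # stabilised softmax over gains
        w = [math.exp(eta * (g - m)) for g in gains]
        r, acc, idx = rng.random() * sum(w), 0.0, 0
        for i, wi in enumerate(w):
            acc += wi
            if r <= acc:
                idx = i
                break
        x = nominees[idx]
        y = objective(x)
        history.append((x, y))
        if y > best[1]:
            best = (x, y)
        for i, xi in enumerate(nominees):
            gains[i] += objective(xi)  # toy reward (paper: GP posterior)
    return best

def explore(history, rng):            # uniform exploration
    return rng.uniform(-3.0, 3.0)

def exploit(history, rng):            # perturb the incumbent best point
    if not history:
        return rng.uniform(-3.0, 3.0)
    xb = max(history, key=lambda p: p[1])[0]
    return xb + rng.gauss(0.0, 0.3)

def midpoint(history, rng):           # a deliberately rigid rule
    return 0.0

x_best, y_best = gp_hedge(lambda x: -(x - 1.0) ** 2,
                          [explore, exploit, midpoint])
```

Here `explore`, `exploit` and `midpoint` are hypothetical stand-ins for the parameterized acquisition functions (EI, PI, UCB) in the portfolio the abstract describes.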
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
, and exercises are included for the reader to check his/her level of understanding. The techniques and methods presented for knowledge elicitation, model construction and verification, modeling techniques and tricks, learning models from data, and analyses of models have all been developed and refined......, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...... primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods nor does it discuss alternative technologies for reasoning under uncertainty. The theory and methods presented are illustrated through more than 140 examples...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
under uncertainty. The theory and methods presented are illustrated through more than 140 examples, and exercises are included for the reader to check his or her level of understanding. The techniques and methods presented on model construction and verification, modeling techniques and tricks, learning......Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...... sections, in addition to fully-updated examples, tables, figures, and a revised appendix. Intended primarily for practitioners, this book does not require sophisticated mathematical skills or deep understanding of the underlying theory and methods nor does it discuss alternative technologies for reasoning...
State Information in Bayesian Games
Cuff, Paul
2009-01-01
Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.
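The value of a zero-sum matrix game, and the optimal mixed strategies the abstract emphasizes, can be approximated with a short fictitious-play sketch (Robinson's classical result guarantees convergence for zero-sum games). The matching-pennies matrix below is an illustrative choice, not taken from the paper.

```python
def fictitious_play(payoff, n_iter=20000):
    """Approximate the value and the row player's mixed strategy of a
    two-player zero-sum matrix game: each player repeatedly
    best-responds to the opponent's empirical action frequencies."""
    rows, cols = len(payoff), len(payoff[0])
    rc, cc = [0] * rows, [0] * cols   # empirical action counts
    ri, ci = 0, 0
    for _ in range(n_iter):
        rc[ri] += 1
        cc[ci] += 1
        ri = max(range(rows),         # row maximises vs column mixture
                 key=lambda i: sum(cc[j] * payoff[i][j] for j in range(cols)))
        ci = min(range(cols),         # column minimises vs row mixture
                 key=lambda j: sum(rc[i] * payoff[i][j] for i in range(rows)))
    x = [c / n_iter for c in rc]
    value = min(sum(x[i] * payoff[i][j] for i in range(rows))
                for j in range(cols))  # row's guaranteed payoff under x
    return value, x

# matching pennies: value 0, and only the 50/50 mixed strategy is optimal
v, x = fictitious_play([[1, -1], [-1, 1]])
```

No pure strategy guarantees more than -1 in matching pennies, which is why randomization carries value here in a way it does not in ordinary source coding.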
Multiview Bayesian Correlated Component Analysis
DEFF Research Database (Denmark)
Kamronn, Simon Due; Poulsen, Andreas Trier; Hansen, Lars Kai
2015-01-01
Correlated component analysis as proposed by Dmochowski, Sajda, Dias, and Parra (2012) is a tool for investigating brain process similarity in the responses to multiple views of a given stimulus. Correlated components are identified under the assumption that the involved spatial networks...... are identical. Here we propose a hierarchical probabilistic model that can infer the level of universality in such multiview data, from completely unrelated representations, corresponding to canonical correlation analysis, to identical representations as in correlated component analysis. This new model, which...... we denote Bayesian correlated component analysis, evaluates favorably against three relevant algorithms in simulated data. A well-established benchmark EEG data set is used to further validate the new model and infer the variability of spatial representations across multiple subjects....
Elvira, Clément; Dobigeon, Nicolas
2015-01-01
Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\\ell_{\\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...
Nonparametric Bayesian inference in biostatistics
Müller, Peter
2015-01-01
Nonparametric Bayesian approaches (BNP) play an ever-expanding role in biostatistical inference, from use in proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in clinical sciences and inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible and complex probability models beyond the traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like the arrangement of patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed to both review and introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...
Bayesian Kernel Mixtures for Counts.
Canale, Antonio; Dunson, David B
2011-12-01
Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online. PMID:22523437
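The rounded-kernel idea is easy to demonstrate: a count is obtained by thresholding a continuous draw. The sketch below uses floor-with-clamp-at-zero as one simple choice of rounding thresholds (an assumption for illustration; Canale and Dunson treat general thresholds), and shows that a tight Gaussian kernel yields counts with variance below the mean, which no Poisson mixture can produce.

```python
import math
import random

def rounded_gaussian_counts(weights, means, sds, n, seed=1):
    """Sample counts from a mixture of Gaussians whose draws are
    rounded to non-negative integers: pick a mixture component,
    draw x ~ N(mu_k, sd_k), return max(0, floor(x))."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        u, acc, k = rng.random(), 0.0, 0
        for k, w in enumerate(weights):   # choose a component
            acc += w
            if u <= acc:
                break
        x = rng.gauss(means[k], sds[k])
        out.append(max(0, math.floor(x)))
    return out

# a single tight Gaussian kernel: counts concentrate on one value,
# so the sample variance falls well below the sample mean
ys = rounded_gaussian_counts([1.0], [5.5], [0.2], 5000)
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)
```

A Poisson mixture always has variance at least equal to its mean, so this underdispersed case is exactly where the rounded kernels add flexibility.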
Bayesian networks in educational assessment
Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M
2015-01-01
Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...
Aerosol processing in stratiform clouds in ECHAM6-HAM
Neubauer, David; Lohmann, Ulrike; Hoose, Corinna
2013-04-01
Aerosol processing in stratiform clouds by uptake into cloud particles, collision-coalescence, chemical processing inside the cloud particles and release back into the atmosphere has important effects on aerosol concentration, size distribution, chemical composition and mixing state. Aerosol particles can act as cloud condensation nuclei. Cloud droplets can take up further aerosol particles by collisions. Atmospheric gases may also be transferred into the cloud droplets and undergo chemical reactions, e.g. the production of atmospheric sulphate. Aerosol particles are also processed in ice crystals. They may be taken up by homogeneous freezing of cloud droplets below -38° C or by heterogeneous freezing above -38° C. This includes immersion freezing of already immersed aerosol particles in the droplets and contact freezing of particles colliding with a droplet. Many clouds do not form precipitation and also much of the precipitation evaporates before it reaches the ground. The water soluble part of the aerosol particles concentrates in the hydrometeors and together with the insoluble part forms a single, mixed, larger particle, which is released. We have implemented aerosol processing into the current version of the general circulation model ECHAM6 (Stevens et al., 2013) coupled to the aerosol module HAM (Stier et al., 2005). ECHAM6-HAM solves prognostic equations for the cloud droplet number and ice crystal number concentrations. In the standard version of HAM, seven modes are used to describe the total aerosol. The modes are divided into soluble/mixed and insoluble modes and the number concentrations and masses of different chemical components (sulphate, black carbon, organic carbon, sea salt and mineral dust) are prognostic variables. We extended this by an explicit representation of aerosol particles in cloud droplets and ice crystals in stratiform clouds similar to Hoose et al. (2008a,b). Aerosol particles in cloud droplets are represented by 5 tracers for the
Bayesian models a statistical primer for ecologists
Hobbs, N Thompson
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods-in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark
2006-01-01
We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference...... by evaluating and differentiating these circuits in time linear in their size. We report on experimental results showing successful compilation and efficient inference on relational Bayesian networks, whose PRIMULA--generated propositional instances have thousands of variables, and whose jointrees have clusters...
Aqueous aerosol SOA formation: impact on aerosol physical properties.
Woo, Joseph L; Kim, Derek D; Schwier, Allison N; Li, Ruizhi; McNeill, V Faye
2013-01-01
Organic chemistry in aerosol water has recently been recognized as a potentially important source of secondary organic aerosol (SOA) material. This SOA material may be surface-active, therefore potentially affecting aerosol heterogeneous activity, ice nucleation, and CCN activity. Aqueous aerosol chemistry has also been shown to be a potential source of light-absorbing products ("brown carbon"). We present results on the formation of secondary organic aerosol material in aerosol water and the associated changes in aerosol physical properties from GAMMA (Gas-Aerosol Model for Mechanism Analysis), a photochemical box model with coupled gas and detailed aqueous aerosol chemistry. The detailed aerosol composition output from GAMMA was coupled with two recently developed modules for predicting a) aerosol surface tension and b) the UV-Vis absorption spectrum of the aerosol, based on our previous laboratory observations. The simulation results suggest that the formation of oligomers and organic acids in bulk aerosol water is unlikely to perturb aerosol surface tension significantly. Isoprene-derived organosulfates are formed in high concentrations in acidic aerosols under low-NO(x) conditions, but more experimental data are needed before the potential impact of these species on aerosol surface tension may be evaluated. Adsorption of surfactants from the gas phase may further suppress aerosol surface tension. Light absorption by aqueous aerosol SOA material is driven by dark glyoxal chemistry and is highest under high-NO(x) conditions, at high relative humidity, in the early morning hours. The wavelength dependence of the predicted absorption spectra is comparable to field observations and the predicted mass absorption efficiencies suggest that aqueous aerosol chemistry can be a significant source of aerosol brown carbon under urban conditions. PMID:24601011
DARE : Dedicated Aerosols Retrieval Experiment
Smorenburg, K.; Courrèges-Lacoste, G.B.; Decae, R.; Court, A.J.; Leeuw, G. de; Visser, H.
2004-01-01
At present there is an increasing interest in remote sensing of aerosols from space because of the large impact of aerosols on climate, earth observation and health. TNO has performed a study aimed at improving aerosol characterisation using a space based instrument and state-of-the-art aerosol retr
Interactions of fission product vapours with aerosols
Energy Technology Data Exchange (ETDEWEB)
Benson, C.G.; Newland, M.S. [AEA Technology, Winfrith (United Kingdom)
1996-12-01
Reactions between structural and reactor materials aerosols and fission product vapours released during a severe accident in a light water reactor (LWR) will influence the magnitude of the radiological source term ultimately released to the environment. The interaction of cadmium aerosol with iodine vapour at different temperatures has been examined in a programme of experiments designed to characterise the kinetics of the system. Laser induced fluorescence (LIF) is a technique that is particularly amenable to the study of systems involving elemental iodine because of the high intensity of the fluorescence lines. Therefore this technique was used in the experiments to measure the decrease in the concentration of iodine vapour as the reaction with cadmium proceeded. Experiments were conducted over the range of temperatures (20-350 °C), using calibrated iodine vapour and cadmium aerosol generators that gave well-quantified sources. The LIF results provided information on the kinetics of the process, whilst examination of filter samples gave data on the composition and morphology of the aerosol particles that were formed. The results showed that the reaction of cadmium with iodine was relatively fast, giving reaction half-lives of approximately 0.3 s. This suggests that the assumption used by primary circuit codes such as VICTORIA that reaction rates are mass-transfer limited, is justified for the cadmium-iodine reaction. The reaction was first order with respect to both cadmium and iodine, and was assigned as pseudo second order overall. However, there appeared to be a dependence of aerosol surface area on the overall rate constant, making the precise order of the reaction difficult to assign. The relatively high volatility of the cadmium iodide formed in the reaction played an important role in determining the composition of the particles. (author) 23 figs., 7 tabs., 22 refs.
Barbaro, Elena; Kirchgeorg, Torben; Zangrando, Roberta; Vecchiato, Marco; Piazza, Rossano; Barbante, Carlo; Gambaro, Andrea
2015-10-01
The processes and transformations occurring in the Antarctic aerosol during atmospheric transport were described using selected sugars as source tracers. Monosaccharides (arabinose, fructose, galactose, glucose, mannose, ribose, xylose), disaccharides (sucrose, lactose, maltose, lactulose), alcohol-sugars (erythritol, mannitol, ribitol, sorbitol, xylitol, maltitol, galactitol) and anhydrosugars (levoglucosan, mannosan and galactosan) were measured in the Antarctic aerosol collected during four different sampling campaigns. For quantification, a sensitive high-pressure anion exchange chromatography was coupled with a single quadrupole mass spectrometer. The method was validated, showing good accuracy and low method quantification limits. This study describes the first determination of sugars in the Antarctic aerosol. The total mean concentration of sugars in the aerosol collected at the "Mario Zucchelli" coastal station was 140 pg m-3; as for the aerosol collected over the Antarctic plateau during two consecutive sampling campaigns, the concentration amounted to 440 and 438 pg m-3. The study of particle-size distribution allowed us to identify the natural emission from spores or from sea-spray as the main sources of sugars in the coastal area. The enrichment of sugars in the fine fraction of the aerosol collected on the Antarctic plateau is due to the degradation of particles during long-range atmospheric transport. The composition of sugars in the coarse fraction was also investigated in the aerosol collected during the oceanographic cruise.
Institute of Scientific and Technical Information of China (English)
YUAN Hui; WANG Ying; ZHUANG Guoshun
2004-01-01
Methane sulphonate (MSA) and sulfate (SO42-), the main oxidation products of dimethyl sulfide (DMS), are targets of atmospheric chemistry studies, as sulfate aerosol has an important impact on global climate change. It is widely believed that DMS is mainly emitted from phytoplankton production in the marine boundary layer (MBL), and MSA is usually used as the tracer of non-sea-salt sulfate (nss-SO42-) in marine and coastal areas (MSA/SO42- = 1/18). Most observations of MSA have been made in marine and coastal aerosols. To our surprise, MSA was frequently (>60% of samples) detected in Beijing TSP, PM10, and PM2.5 aerosols, even in samples collected during the dust storm period. The concentrations of MSA were higher than those measured in marine aerosols. Factor analysis, correlation analysis and meteorological analysis indicated that there was no obvious marine influence on Beijing aerosols. DMS from terrestrial emissions and dimethyl sulphoxide (DMSO) from industrial wastes could be the two possible precursors of MSA. Warm, low-pressure air masses and long periods of solar radiation were beneficial to the formation of MSA. Anthropogenic pollution from regional and local sources might be the dominant contributor to MSA in Beijing aerosol. This was the first report of MSA in aerosols collected at an inland site in China. This new finding should lead to further study of the sulfur balance in inland cities and its global biogeochemical cycle.
The Diagnosis of Reciprocating Machinery by Bayesian Networks
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2003-01-01
A Bayesian network is a reasoning tool based on probability theory and has many advantages that other reasoning tools do not have. This paper discusses the basic theory of Bayesian networks and studies the problems in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.
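A minimal instance of such a diagnosis network can be computed exactly by enumeration. The fault prior and symptom probabilities below are hypothetical numbers chosen for illustration, not taken from the paper.

```python
def posterior_fault(p_fault, p_vib, p_temp, vib, temp):
    """Exact enumeration in a two-symptom diagnosis network
    Fault -> Vibration, Fault -> HighTemp: the symptoms are
    conditionally independent given the fault state.  p_vib and p_temp
    map the fault state (True/False) to the symptom probability."""
    post = {}
    for f in (True, False):
        prior = p_fault if f else 1.0 - p_fault
        like_v = p_vib[f] if vib else 1.0 - p_vib[f]
        like_t = p_temp[f] if temp else 1.0 - p_temp[f]
        post[f] = prior * like_v * like_t
    return post[True] / (post[True] + post[False])

# hypothetical numbers: a rare valve fault and two noisy symptoms
p = posterior_fault(p_fault=0.05,
                    p_vib={True: 0.90, False: 0.10},
                    p_temp={True: 0.80, False: 0.05},
                    vib=True, temp=True)
# observing both symptoms lifts the fault probability from 5% to ~88%
```

Larger diagnosis networks replace this brute-force enumeration with junction-tree or sampling algorithms, but the conditional-independence structure exploited is the same.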
International Nuclear Information System (INIS)
Submicron aerosols, ranging in particle diameter from 0.1 μm to 0.001 μm, and in number concentration from 10,000 to 100,000 per cm3, are more or less continuously suspended in the atmosphere we breathe. They usually require in situ measurement of concentration and size distribution with instruments such as diffusion batteries and condensation nucleus counters. Laboratory measurements require the development of submicron aerosol generators. The development of several of these devices and their use in the laboratory and field to measure radioactive as well as inactive aerosols is described
Bayesian Uncertainty Analyses Via Deterministic Model
Krzysztofowicz, R.
2001-05-01
Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
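The BPO idea, conditioning a posterior distribution for the predictand on the deterministic model's output, reduces in the linear-Gaussian case to textbook conjugate updating. A minimal sketch under that assumption (the Gaussian forms and all numbers are ours for illustration, not Krzysztofowicz's operational equations):

```python
def bpo_gaussian(model_output, prior_mean, prior_var, bias, noise_var):
    """Gaussian illustration of a Bayesian Processor of Output: a priori
    the predictand W ~ N(prior_mean, prior_var); the deterministic model's
    output is treated as X = W + bias + noise with noise ~ N(0, noise_var).
    Returns the posterior mean and variance of W given X = model_output."""
    post_var = 1.0 / (1.0 / prior_var + 1.0 / noise_var)
    post_mean = post_var * (prior_mean / prior_var
                            + (model_output - bias) / noise_var)
    return post_mean, post_var

# hypothetical numbers: climatological prior N(10, 4), unbiased model,
# model-error variance 1
mean, var = bpo_gaussian(model_output=12.0, prior_mean=10.0,
                         prior_var=4.0, bias=0.0, noise_var=1.0)
print(f"posterior: N({mean:.2f}, {var:.2f})")  # estimate pulled toward prior
```

The posterior mean lands between the model output and the climatological prior, weighted by their precisions, which is exactly the "total uncertainty" quantification the abstract describes.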
Learning Bayesian networks for discrete data
Liang, Faming
2009-02-01
Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learning Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. First, it possesses a self-adjusting mechanism and thus essentially avoids the local-trap problem suffered by conventional MCMC simulation-based approaches to learning Bayesian networks. Second, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging over the samples generated in the learning process, and the resulting estimates can have much lower variation than single-model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.
A Bayesian approach to model uncertainty
International Nuclear Information System (INIS)
A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given.
Bayesian Control for Concentrating Mixed Nuclear Waste
Welch, Robert L.; Smith, Clayton
2013-01-01
A control algorithm for batch processing of mixed waste is proposed based on conditional Gaussian Bayesian networks. The network is compiled during batch staging for real-time response to sensor input.
An Intuitive Dashboard for Bayesian Network Inference
International Nuclear Information System (INIS)
Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inference, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.
An Intuitive Dashboard for Bayesian Network Inference
Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.
2014-03-01
Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user interaction more intuitive and friendly. In addition to performing various types of inference, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the Qt and SMILE libraries in C++.
Nomograms for Visualization of Naive Bayesian Classifier
Možina, Martin; Demšar, Janez; Michael W Kattan; Zupan, Blaz
2004-01-01
Besides good predictive performance, the naive Bayesian classifier can also offer a valuable insight into the structure of the training data and effects of the attributes on the class probabilities. This structure may be effectively revealed through visualization of the classifier. We propose a new way to visualize the naive Bayesian model in the form of a nomogram. The advantages of the proposed method are simplicity of presentation, clear display of the effects of individual attribute value...
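A nomogram for a naive Bayesian classifier plots, for each attribute value, its additive contribution to the log odds of the class. A sketch of that underlying arithmetic (the probabilities are hypothetical and the authors' exact point-scaling may differ):

```python
import math

def nomogram_points(p_value_given_pos, p_value_given_neg):
    """Log likelihood-ratio of one attribute value: the additive score a
    naive Bayes nomogram displays for that value."""
    return math.log(p_value_given_pos / p_value_given_neg)

def class_probability(prior_odds, contributions):
    """Combine the prior odds with per-attribute contributions under the
    naive (conditional independence) assumption."""
    log_odds = math.log(prior_odds) + sum(contributions)
    return 1.0 / (1.0 + math.exp(-log_odds))

# two hypothetical attribute values carrying opposite, equal-strength evidence
c = [nomogram_points(0.8, 0.4), nomogram_points(0.3, 0.6)]
prob = class_probability(1.0, c)  # prior odds 1:1
print([round(x, 3) for x in c], round(prob, 3))  # contributions cancel -> 0.5
```

Because the contributions are additive on the log-odds scale, a reader can sum a patient's points along the nomogram axes and read off the class probability, which is what makes the visualization effective.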
Subjective Bayesian Analysis: Principles and Practice
Goldstein, Michael
2006-01-01
We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.
Bayesian Analysis of Multivariate Probit Models
Siddhartha Chib; Edward Greenberg
1996-01-01
This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...
Fitness inheritance in the Bayesian optimization algorithm
Pelikan, Martin; Sastry, Kumara
2004-01-01
This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions...
Kernel Bayesian Inference with Posterior Regularization
Song, Yang; Jun ZHU; Ren, Yong
2016-01-01
We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster and has comparable performance to the squared regularization in kernel Bayes' rule. This regularization coincides with a former th...
Bayesian Classification in Medicine: The Transferability Question *
Zagoria, Ronald J.; Reggia, James A.; Price, Thomas R.; Banko, Maryann
1981-01-01
Using probabilities derived from a geographically distant patient population, we applied Bayesian classification to categorize stroke patients by etiology. Performance was assessed both by error rate and with a new linear accuracy coefficient. This approach to patient classification was found to be surprisingly accurate when compared to classification by two neurologists and to classification by the Bayesian method using “low cost” local and subjective probabilities. We conclude that for some...
Bayesian target tracking based on particle filter
Institute of Scientific and Technical Information of China (English)
无
2005-01-01
To deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Building on the particle filter, an extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) move step and the resampling step are also introduced into Bayesian target tracking, and the simulation results confirm that the particle filter improved with these techniques outperforms the basic one.
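For readers unfamiliar with the baseline the paper improves upon, a bootstrap particle filter (using the simple transition-prior proposal rather than the EKF proposal discussed in the abstract) for a 1-D random-walk target can be sketched as:

```python
import math
import random

def particle_filter_1d(observations, n_particles=500,
                       process_std=1.0, obs_std=1.0, seed=0):
    """Bootstrap particle filter for a 1-D random-walk target.
    Returns the posterior-mean position after each observation."""
    rng = random.Random(seed)
    particles = [rng.gauss(0.0, 1.0) for _ in range(n_particles)]
    track = []
    for z in observations:
        # predict: propagate each particle through the motion model
        particles = [p + rng.gauss(0.0, process_std) for p in particles]
        # update: weight by the Gaussian observation likelihood
        weights = [math.exp(-0.5 * ((z - p) / obs_std) ** 2) for p in particles]
        total = sum(weights) or 1.0
        weights = [w / total for w in weights]
        track.append(sum(w * p for w, p in zip(weights, particles)))
        # resample (multinomial) to fight weight degeneracy
        particles = rng.choices(particles, weights=weights, k=n_particles)
    return track

obs = [0.2, 1.1, 1.9, 3.2, 3.8]  # hypothetical noisy position measurements
est = particle_filter_1d(obs)
print([round(x, 2) for x in est])
```

The EKF-proposal and MCMC-move variants in the paper replace the "predict" step above with a proposal that also looks at the current observation, which is what reduces weight degeneracy.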
Bayesian Variable Selection in Spatial Autoregressive Models
Jesus Crespo Cuaresma; Philipp Piribauer
2015-01-01
This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...
Fuzzy Functional Dependencies and Bayesian Networks
Institute of Scientific and Technical Information of China (English)
LIU WeiYi(刘惟一); SONG Ning(宋宁)
2003-01-01
Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependency in relational databases with fuzzy values. The purpose of this paper is to set up a connection between these data dependencies and Bayesian networks. The connection is made through a set of methods that enable people to obtain the maximum information about independence conditions from fuzzy functional dependencies.
Bayesian Models of Brain and Behaviour
Penny, William
2012-01-01
This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...
Bayesian Modeling of a Human MMORPG Player
Synnaeve, Gabriel
2010-01-01
This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and selecting a target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
Bayesian Modeling of a Human MMORPG Player
Synnaeve, Gabriel; Bessière, Pierre
2011-03-01
This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and selecting a target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.
Methods for Bayesian Power Spectrum Inference with Galaxy Surveys
Jasche, Jens; Wandelt, Benjamin D.
2013-12-01
We derive and implement a full Bayesian large-scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves upon previous Bayesian methods by performing a joint inference of the three-dimensional density field, the cosmological power spectrum, luminosity-dependent galaxy biases, and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate subsamples. This method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal-to-noise regimes by using a deterministic reversible jump algorithm. This approach reduces the correlation length of the sampler by several orders of magnitude, turning the otherwise numerically unfeasible problem of joint parameter exploration into a numerically manageable task. We test our method on an artificial mock galaxy survey, emulating characteristic features of the Sloan Digital Sky Survey data release 7, such as its survey geometry and luminosity-dependent biases. These tests demonstrate the numerical feasibility of our large-scale Bayesian inference framework when the parameter space has millions of dimensions. This method reveals and correctly treats the anti-correlation between bias amplitudes and the power spectrum, which is not taken into account in current approaches to power spectrum estimation, a 20% effect across large ranges in k space. In addition, this method yields constrained realizations of density fields obtained without assuming the power spectrum or bias parameters.
Preliminary results of the aerosol optical depth retrieval in Johor, Malaysia
International Nuclear Information System (INIS)
Monitoring of atmospheric aerosols over urban areas is important because tremendous amounts of pollutants are released by industrial activities and heavy traffic flow. Air quality monitoring by satellite observation provides better spatial coverage; however, detailed retrieval of aerosol properties remains a challenge. This is due to the limitations of aerosol retrieval algorithms over high-reflectance (bright surface) areas. The aim of this study is to retrieve aerosol optical depth over urban areas of Iskandar Malaysia, the main southern development zone in Johor state, using Moderate Resolution Imaging Spectroradiometer (MODIS) 500 m resolution data. One of the important steps in the aerosol optical depth retrieval is to characterise the different types of aerosols in the study area. This information will be used to construct a Look Up Table containing the simulated aerosol reflectance and the corresponding aerosol optical depth. Thus, in this study we have characterised the different aerosol types in the study area using Aerosol Robotic Network (AERONET) data. These data were processed using cluster analysis, and the preliminary results show that the aerosols in the area consist of coastal urban (65%), polluted urban (27.5%), dust particle (6%) and heavy pollution (1.5%) aerosols.
Preliminary results of the aerosol optical depth retrieval in Johor, Malaysia
Lim, H. Q.; Kanniah, K. D.; Lau, A. M. S.
2014-02-01
Monitoring of atmospheric aerosols over urban areas is important because tremendous amounts of pollutants are released by industrial activities and heavy traffic flow. Air quality monitoring by satellite observation provides better spatial coverage; however, detailed retrieval of aerosol properties remains a challenge. This is due to the limitations of aerosol retrieval algorithms over high-reflectance (bright surface) areas. The aim of this study is to retrieve aerosol optical depth over urban areas of Iskandar Malaysia, the main southern development zone in Johor state, using Moderate Resolution Imaging Spectroradiometer (MODIS) 500 m resolution data. One of the important steps in the aerosol optical depth retrieval is to characterise the different types of aerosols in the study area. This information will be used to construct a Look Up Table containing the simulated aerosol reflectance and the corresponding aerosol optical depth. Thus, in this study we have characterised the different aerosol types in the study area using Aerosol Robotic Network (AERONET) data. These data were processed using cluster analysis, and the preliminary results show that the aerosols in the area consist of coastal urban (65%), polluted urban (27.5%), dust particle (6%) and heavy pollution (1.5%) aerosols.
Aerosols from biomass combustion
Energy Technology Data Exchange (ETDEWEB)
Nussbaumer, T.
2001-07-01
This report contains the proceedings of a seminar on biomass combustion and aerosol production organised jointly by the International Energy Agency's (IEA) Task 32 on bioenergy and the Swiss Federal Office of Energy (SFOE). This collection of 16 papers discusses the production of aerosols and fine particles by the burning of biomass and their effects. Expert knowledge on the environmental impact of aerosols, formation mechanisms, measurement technologies, methods of analysis and measures to be taken to reduce such emissions is presented. The seminar, attended by 50 participants from 11 countries, shows, according to the authors, that the reduction of aerosol emissions resulting from biomass combustion will remain a challenge for the future.
Emergency Protection from Aerosols
Energy Technology Data Exchange (ETDEWEB)
Cristy, G.A.
2001-11-13
Expedient methods were developed that could be used by an average person, using only materials readily available, to protect himself and his family from injury by toxic (e.g., radioactive) aerosols. The most effective means of protection was the use of a household vacuum cleaner to maintain a small positive pressure on a closed house during passage of the aerosol cloud. Protection factors of 800 and above were achieved.
Kahn, Ralph A.
2014-01-01
AeroCom is an open international initiative of scientists interested in the advancement of the understanding of global aerosol properties and aerosol impacts on climate. A central goal is to more strongly tie and constrain modeling efforts to observational data. A major element of the exchanges between data and modeling groups is the annual meeting. The meeting was held September 20 through October 2, 2014, and the organizers would like to post the presentations.
Aerosol removal by emergency spray in PWR containment: synthesis of the TOSQAN aerosol tests
International Nuclear Information System (INIS)
During the course of a severe accident in a nuclear Pressurized Water Reactor (PWR), the reactor containment is pressurized by steam and hydrogen released from a primary circuit breach and distributed through the containment by convective flows and steam wall condensation. In addition, core degradation leads to fission product release into the containment. Water spraying is used in the containment as a mitigation means in order to reduce pressure, remove fission products, and enhance gas mixing in case of the presence of hydrogen. This paper presents a synthesis of the results of the TOSQAN aerosol program undertaken by the Institut de Radioprotection et de Surete Nucleaire (IRSN), devoted to studying aerosol removal by a spray under thermal-hydraulic conditions typical of accidents in a PWR containment. (author)
Bayesian inference for OPC modeling
Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.
2016-03-01
The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semi-independently explore the space. The convergence of these walkers to global maxima of the likelihood volume determines the parameter values' highest density intervals (HDI), revealing champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
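The stretch move that underlies an affine invariant ensemble sampler can be sketched for a one-dimensional toy target (this is the generic Goodman-Weare move, not the authors' lithography-specific setup):

```python
import math
import random

def aies_stretch_sweep(walkers, log_prob, a=2.0, rng=random):
    """One sweep of the Goodman-Weare 'stretch move' behind an affine
    invariant ensemble sampler, written for a 1-D parameter (d = 1);
    for a d-dimensional parameter the acceptance factor is z**(d-1)."""
    d = 1
    out = list(walkers)
    for i in range(len(out)):
        j = rng.randrange(len(out) - 1)
        if j >= i:
            j += 1  # pick a complementary walker, j != i
        # draw z from g(z) proportional to 1/sqrt(z) on [1/a, a]
        z = ((a - 1.0) * rng.random() + 1.0) ** 2 / a
        proposal = out[j] + z * (out[i] - out[j])
        log_accept = (d - 1) * math.log(z) + log_prob(proposal) - log_prob(out[i])
        if rng.random() < math.exp(min(0.0, log_accept)):
            out[i] = proposal
    return out

# toy target: a standard normal log-density
rng = random.Random(0)
walkers = [rng.uniform(-3.0, 3.0) for _ in range(50)]
for _ in range(500):
    walkers = aies_stretch_sweep(walkers, lambda x: -0.5 * x * x, rng=rng)
mean = sum(walkers) / len(walkers)
var = sum((w - mean) ** 2 for w in walkers) / len(walkers)
print(round(mean, 2), round(var, 2))
```

Because each proposal lies on the line through two walkers, the move is invariant under affine rescalings of the parameter space, which is what makes the sampler effective on strongly correlated posteriors.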
Bayesian analysis of cosmic structures
Kitaura, Francisco-Shu
2011-01-01
We review the Bayesian inference steps required to analyse the cosmological large-scale structure, with special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique on scales of about 4 h^{-1} Mpc, we find that the over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in the under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative, whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...
Bayesian analysis of volcanic eruptions
Ho, Chih-Hsiang
1990-10-01
The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
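The Poisson-gamma mixture described above can be checked numerically: mixing a Poisson count over a gamma-distributed rate reproduces the negative binomial pmf. A sketch with hypothetical parameters (stdlib only; the shape/rate values are ours, not fitted to any volcano):

```python
import math
import random

def nbd_pmf(k, alpha, beta):
    """Negative binomial pmf implied by a Poisson(lambda) count with a
    Gamma(shape=alpha, rate=beta) prior on lambda:
    P(N=k) = Gamma(alpha+k)/(Gamma(alpha) k!) * (beta/(beta+1))**alpha
             * (1/(beta+1))**k."""
    log_p = (math.lgamma(alpha + k) - math.lgamma(alpha) - math.lgamma(k + 1)
             + alpha * math.log(beta / (beta + 1.0))
             - k * math.log(beta + 1.0))
    return math.exp(log_p)

def poisson_sample(lam, rng):
    """Knuth's multiplicative method (adequate for modest lambda)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while p > limit:
        k += 1
        p *= rng.random()
    return k - 1

# hypothetical eruptive-rate prior: Gamma(shape=2, rate=0.5), mean 4 per period
alpha, beta, n = 2.0, 0.5, 20000
rng = random.Random(1)
sim = [poisson_sample(rng.gammavariate(alpha, 1.0 / beta), rng)
       for _ in range(n)]  # gammavariate takes (shape, scale=1/rate)
for k in range(3):
    print(f"P(N={k}): simulated {sim.count(k) / n:.3f}, "
          f"NBD {nbd_pmf(k, alpha, beta):.3f}")
```

The simulated frequencies should match the analytic NBD probabilities to within Monte Carlo noise, mirroring the paper's argument that the gamma-mixed Poisson is over-dispersed relative to a constant-rate Poisson.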
BAYESIAN APPROACH OF DECISION PROBLEMS
Directory of Open Access Journals (Sweden)
DRAGOŞ STUPARU
2010-01-01
Full Text Available Management is nowadays a basic vector of economic development, a concept frequently used in our country as well as all over the world. Regardless of the hierarchical level at which the managerial process takes place, decision represents its essential moment, the supreme act of managerial activity. Decisions are met in all fields of activity, with a practically unlimited degree of coverage, and in all the functions of management. It is common knowledge that the activity of any type of manager, no matter the hierarchical level he occupies, represents a chain of interdependent decisions, their aim being to eliminate or limit the influence of disturbing factors that may endanger the achievement of predetermined objectives; the quality of managerial decisions conditions the progress and viability of any enterprise. Therefore, one of the principal characteristics of a successful manager is the ability to adopt high-quality decisions. The quality of managerial decisions is conditioned by the manager's general level of education and specialization, the degree to which he keeps up with the latest information and innovations in the theory and practice of management, and the application of modern managerial methods and techniques. We present below an analysis of decision problems under hazardous conditions in terms of Bayesian theory, a theory that uses the probabilistic calculus.
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms.
An introduction to Gaussian Bayesian networks.
Grzegorczyk, Marco
2010-01-01
The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway to evaluate the global network reconstruction accuracy of Bayesian network inference and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana to reverse engineer the unknown regulatory network topology for this domain. PMID:20824469
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion.
Zanini, Andrea; Woodbury, Allan D
2016-01-01
The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater, starting from few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases regard transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment consisting of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).
Characterisation of Aerosols from Simulated Radiological Dispersion Events
Di Lemma, F.G.
2015-01-01
The research described in this thesis aims at improving the evaluation of the radioactive aerosol release from different Radiological Dispersion Events (RDEs), such as accidents and sabotage involving radioactive and nuclear materials. These studies help in a better assessment of the source term as
Calculations of sodium aerosol concentrations at breeder reactor air intake ports
International Nuclear Information System (INIS)
This report describes the methodology used and results obtained in efforts to estimate the sodium aerosol concentrations at air intake ports of a liquid-metal cooled, fast-breeder nuclear reactor. A range of wind speeds from 2 to 10 m/s is assumed, and an effort is made to include building wake effects, which in many cases dominate the dispersal of aerosols near buildings. For relatively small release rates on the order of 1 to 10 kg/s, it is suggested that the plume rise will be small and that estimates of aerosol concentrations may be derived using the methodology of Wilson and Britter (1982), which describes releases from surface vents. For more acute releases with release rates on the order of 100 kg/s, much higher release velocities are expected, and plume rise must be considered. Both momentum-driven and density-driven plume rise are considered. An effective increase in release height is computed using the Split-H methodology with a parameterization suggested by Ramsdell (1983), and the release source strength was transformed to rooftop level. Evaluation of the acute release aerosol concentration was then based on the methodology for releases from a surface release of this transformed source strength.
Bayesian tomographic reconstruction of microsystems
Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali
2007-11-01
Microtomography by X-ray transmission plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem, which is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions with a hidden Potts-Markov field for the material classes in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. We then focus on one of the main steps in any iterative reconstruction method, the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary simulation results.
Physical metrology of aerosols; Metrologie physique des aerosols
Energy Technology Data Exchange (ETDEWEB)
Boulaud, D.; Vendel, J. [CEA Saclay, 91 - Gif-sur-Yvette (France). Inst. de Protection et de Surete Nucleaire
1996-12-31
The various detection and measuring methods for aerosols are presented, and their selection is related to aerosol characteristics (size range, concentration or mass range), thermo-hydraulic conditions (carrier fluid temperature, pressure and flow rate) and to the measuring system conditions (measuring frequency, data collection speed, cost...). Methods based on aerosol dynamic properties (inertial, diffusional and electrical methods) and aerosol optical properties (localized and integral methods) are described and their performances and applications are compared
Computationally efficient Bayesian inference for inverse problems.
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
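The surrogate-posterior idea, evaluating a cheap approximation of the forward model inside the sampler instead of the expensive model itself, can be sketched in a few lines. This is our own toy illustration (a 1-D polynomial surrogate standing in for the paper's stochastic spectral expansion; the model, prior, and noise level are invented):

```python
import numpy as np

rng = np.random.default_rng(5)

def forward(theta):
    # Stand-in for an expensive forward model (monotone, so identifiable).
    return np.exp(theta)

theta_true = 1.0
y_obs = forward(theta_true)       # noise-free observation for simplicity
noise = 0.01                      # assumed observational noise level

# Build a cheap polynomial surrogate from a handful of forward-model runs.
train_t = np.linspace(0.0, 2.0, 7)
surrogate = np.poly1d(np.polyfit(train_t, forward(train_t), deg=4))

def log_post(theta):
    # Uniform prior on [0, 2]; Gaussian likelihood on the surrogate output.
    if not 0.0 <= theta <= 2.0:
        return -np.inf
    return -0.5 * ((y_obs - surrogate(theta)) / noise) ** 2

# Metropolis sampling touches only the surrogate, never the expensive model.
theta, lp, chain = 0.5, log_post(0.5), []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)

posterior_mean = np.mean(chain[5000:])    # discard burn-in
print(round(posterior_mean, 2))           # close to theta_true = 1.0
```

The accuracy of the recovered posterior is limited by the surrogate's approximation error, which is why the paper's spectral constructions control that error carefully.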
Tactile length contraction as Bayesian inference.
Tong, Jonathan; Ngo, Vy; Goldreich, Daniel
2016-08-01
To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
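The low-velocity-prior account lends itself to a small Gaussian sketch (our illustration, not the authors' model; the noise and prior scales are invented). With a zero-mean prior on speed, the prior variance of the traversed length shrinks as the inter-tap interval t shrinks, so the posterior length estimate contracts more, and noisier (weaker) taps contract it further:

```python
import numpy as np

def perceived_length(measured_length, t, sigma_m=10.0, sigma_v=20.0):
    """Posterior-mean length under a zero-mean Gaussian prior on speed.

    Prior: length = speed * t with speed ~ N(0, sigma_v^2),
    so length ~ N(0, (sigma_v * t)^2).
    Likelihood: measured_length ~ N(true_length, sigma_m^2).
    The conjugate Gaussian posterior shrinks the measurement toward 0.
    """
    prior_var = (sigma_v * t) ** 2
    shrink = prior_var / (prior_var + sigma_m ** 2)
    return shrink * measured_length

# Shorter inter-tap interval t -> tighter prior -> stronger contraction.
print(round(perceived_length(30.0, t=1.0), 3))   # 24.0
print(round(perceived_length(30.0, t=0.2), 3))   # 4.138
# Weaker (noisier) taps, i.e. larger sigma_m, also increase contraction,
# matching the study's second finding.
```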
Dimensionality reduction in Bayesian estimation algorithms
Directory of Open Access Journals (Sweden)
G. W. Petty
2013-03-01
Full Text Available An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database versus the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals – whether Bayesian or not – lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and unit magnitude.
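The two-stage reduction to pseudochannels can be illustrated with synthetic data (a hedged sketch, not the paper's algorithm: one latent signal, three channels, equal assumed noise scales):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: 3 correlated "channels" driven by one physical signal.
n = 500
signal = rng.gamma(2.0, 1.0, n)
weights = np.array([1.0, 0.8, -0.5])
X = np.outer(signal, weights) + rng.normal(0.0, 0.3, (n, 3))

# Stage 1 (assumed here): normalize each channel by its noise scale,
# so the residual noise component has unit variance per channel.
noise_scale = np.array([0.3, 0.3, 0.3])
Xw = X / noise_scale

# Stage 2: principal component analysis; keep the leading component as a
# single pseudochannel, i.e. one linear combination of the N channels.
Xc = Xw - Xw.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
pseudochannel = Xc @ Vt[0]

# The 1-D pseudochannel retains most of the signal's information content:
r = np.corrcoef(pseudochannel, signal)[0, 1]
print(round(abs(r), 2))  # close to 1
```

A Bayesian retrieval can then match candidates in the 1-D pseudochannel space, where the noise covariance is diagonal by construction.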
Biological aerosol background characterization
Blatny, Janet; Fountain, Augustus W., III
2011-05-01
To provide useful information during military operations, or as part of other security situations, a biological aerosol detector has to respond within seconds or minutes to an attack by virulent biological agents, and with low false alarm rates. Within this time frame, measuring the virulence of a known microorganism is extremely difficult, especially if the microorganism has unknown antigenic or nucleic acid properties. Measuring the "live" characteristics of an organism directly is not generally an option, yet only viable organisms are potentially infectious. Fluorescence-based instruments have been designed to optically determine whether aerosol particles have viability characteristics. Still, such commercially available biological aerosol detection equipment needs to be improved for use in military and civil applications. Air has an endogenous population of microorganisms that may interfere with alarm software technologies. To design robust algorithms, comprehensive knowledge of the airborne biological background content is essential. For this reason, there is a need to study ambient live bacterial populations in as many locations as possible. Doing so will permit the collection of data to define diverse biological characteristics that in turn can be used to fine-tune alarm algorithms. To avoid false alarms, improving the software technologies of biological detectors is a crucial task requiring consideration of the various parameters that can be applied to suppress alarm triggers. This NATO Task Group will aim to develop reference methods for monitoring biological aerosol characteristics to improve alarm algorithms for biological detection. Additionally, it will focus on developing reference standard methodology for monitoring biological aerosol characteristics to reduce false alarm rates.
Lidar observations of Nabro volcano aerosol layers in the stratosphere over Gwangju, Korea
Directory of Open Access Journals (Sweden)
D. Shin
2015-01-01
Full Text Available We report on the first Raman lidar measurements of stratospheric aerosol layers in the upper troposphere and lower stratosphere over Korea. The data were taken with the multiwavelength aerosol Raman lidar at Gwangju (35.10° N, 126.53° E), Korea. The volcanic ash particles and gases were released around 12 June 2011 during the eruption of the Nabro volcano (13.37° N, 41.7° E) in Eritrea, east Africa. Forward trajectory computations show that the volcanic aerosols were advected from North Africa to East Asia. The first observation of the stratospheric aerosol layers over Korea was on 19 June 2011. The stratospheric aerosol layers appeared between 15 and 17 km height a.s.l. The layers' maximum backscatter coefficient and linear particle depolarization ratio at 532 nm were 1.5 ± 0.3 Mm−1 sr−1 and 2.2%, respectively; we found these values at 16.4 km height a.s.l. Forty-four days after this first observation, we observed the stratospheric aerosol layer again. We continuously probed the upper troposphere and lower stratosphere for this aerosol layer during the following 5 months, until December 2011. The aerosol layers typically occurred between 10 and 20 km height a.s.l. The stratospheric aerosol optical depth and the maximum backscatter coefficient at 532 nm decreased during these 5 months.
Combustion aerosols from potassium-containing fuels
Energy Technology Data Exchange (ETDEWEB)
Balzer Nielsen, Lars
1998-12-31
The scope of the work presented in this thesis is the formation and evolution of aerosol particles in the submicron range during combustion processes, in particular where biomass is used alone or co-fired with coal. An introduction to the formation processes of fly ash in general and submicron aerosol in particular during combustion is presented, along with some known problems related to combustion of biomass for power generation. The work falls into two parts. The first is the design of a laboratory setup for investigation of homogeneous nucleation and particle dynamics at high temperature. The central unit of the setup is a laminar flow aerosol condenser (LFAC), which essentially is a 173 cm long tubular furnace with an externally cooled wall. A mathematical model is presented which describes the formation and evolution of the aerosol in the LFAC, where the rate of formation of new nuclei is calculated using the so-called classical theory. The model includes mass and energy conservation equations and an expression for the description of particle growth by diffusion. The resulting set of nonlinear second-order partial differential equations is solved numerically using the method of orthogonal collocation. The model is implemented in the FORTRAN code MONAERO. The second part of this thesis describes a comprehensive investigation of submicron aerosol formation during co-firing of coal and straw carried out at a 380 MWth pulverized-coal unit at Studstrup Power Plant, Aarhus. Three types of coal are used, and the total boiler load and straw input are varied systematically. Straw contains large amounts of potassium, which is released during combustion. Submicron aerosol is sampled between the two banks of the economizer at a flue gas temperature of 350 deg. C using a novel ejector probe. The aerosol is characterized using the SMPS system and a Berner-type low pressure impactor. The chemical composition of the particles collected in the impactor is determined using
Bayesian Methods for Radiation Detection and Dosimetry
Groer, Peter G
2002-01-01
We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high- and low-activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form; Figure 1 below demonstrates this point. We applied stochastic processes to obtain Bayesian estimates of 222Rn daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual from radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
Adaptive approximate Bayesian computation for complex models
Lenormand, Maxime; Deffuant, Guillaume
2011-01-01
Approximate Bayesian computation (ABC) is a family of computational techniques in Bayesian statistics. These techniques make it possible to fit a model to data without relying on the computation of the model likelihood; instead, they require simulating the model to be fitted a large number of times. A number of refinements to the original rejection-based ABC scheme have been proposed, including the sequential improvement of posterior distributions. This technique decreases the number of model simulations required, but it still presents several shortcomings which are particularly problematic for complex models that are costly to simulate. We here provide a new algorithm to perform adaptive approximate Bayesian computation, which is shown to perform better on both a toy example and a complex social model.
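The rejection-based ABC scheme that these refinements build on fits in a dozen lines (a toy sketch with an invented Gaussian model; the paper's adaptive algorithm is more elaborate):

```python
import numpy as np

rng = np.random.default_rng(42)

# "Observed" data from a model with unknown mean theta (true value 3.0).
observed = rng.normal(3.0, 1.0, 100)
obs_stat = observed.mean()                 # summary statistic

def simulate(theta, rng):
    # One run of the model: no likelihood is ever evaluated.
    return rng.normal(theta, 1.0, 100).mean()

# Rejection ABC: keep prior draws whose simulated summary lands
# within eps of the observed summary.
eps = 0.05
accepted = []
for _ in range(20000):
    theta = rng.uniform(0.0, 6.0)          # draw from the prior
    if abs(simulate(theta, rng) - obs_stat) < eps:
        accepted.append(theta)

posterior = np.array(accepted)
print(round(posterior.mean(), 1))          # near the true value 3.0
```

The cost driver is the 20,000 model simulations; sequential and adaptive schemes such as the one proposed here aim to spend far fewer of them.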
Learning Bayesian Networks from Correlated Data
Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola
2016-05-01
Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.
Bayesian Fusion of Multi-Band Images
Wei, Qi; Tourneret, Jean-Yves
2013-01-01
In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical consideration is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimension distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...
Bayesian Image Reconstruction Based on Voronoi Diagrams
Cabrera, G F; Hitschfeld, N
2007-01-01
We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a-posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a-priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.
Dynamic Bayesian Combination of Multiple Imperfect Classifiers
Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon
2012-01-01
Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...
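The core of Bayesian classifier combination, weighting each base classifier by its confusion matrix and combining outputs via Bayes' rule, can be sketched as follows. In the paper the confusion matrices are inferred variationally; here they are fixed, invented values for illustration:

```python
import numpy as np

# Per-classifier confusion matrices: confusion[k][i][j] is the assumed
# probability that classifier k outputs label j when the true class is i.
confusion = np.array([
    [[0.90, 0.10], [0.20, 0.80]],   # a reliable classifier
    [[0.60, 0.40], [0.50, 0.50]],   # a weak classifier
    [[0.55, 0.45], [0.45, 0.55]],   # a near-random classifier
])
prior = np.array([0.5, 0.5])

def combine(outputs):
    """Posterior over the true class given each base classifier's label,
    assuming the classifiers err independently given the true class."""
    post = prior.copy()
    for k, out in enumerate(outputs):
        post = post * confusion[k][:, out]
    return post / post.sum()

# The reliable classifier outvotes the two unreliable ones:
print(combine([1, 0, 0]))  # posterior favors class 1
```

Inferring the confusion matrices from data, as the paper does, lets the combination automatically discount near-random decision makers.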
Bayesian inference of the metazoan phylogeny
DEFF Research Database (Denmark)
Glenner, Henrik; Hansen, Anders J; Sørensen, Martin V;
2004-01-01
been the only feasible combined approach but is highly sensitive to long-branch attraction. Recent development of stochastic models for discrete morphological characters and computationally efficient methods for Bayesian inference has enabled combined molecular and morphological data analysis...... with rigorous statistical approaches less prone to such inconsistencies. We present the first statistically founded analysis of a metazoan data set based on a combination of morphological and molecular data and compare the results with a traditional parsimony analysis. Interestingly, the Bayesian analyses...... such as the ecdysozoans and lophotrochozoans. Parsimony, on the contrary, shows conflicting results, with morphology being congruent to the Bayesian results and the molecular data set producing peculiarities that are largely reflected in the combined analysis....
Variational Bayesian Inference of Line Spectra
DEFF Research Database (Denmark)
Badiu, Mihai Alin; Hansen, Thomas Lundgaard; Fleury, Bernard Henri
2016-01-01
In this paper, we address the fundamental problem of line spectral estimation in a Bayesian framework. We target model order and parameter estimation via variational inference in a probabilistic model in which the frequencies are continuous-valued, i.e., not restricted to a grid; and the coefficients are governed by a Bernoulli-Gaussian prior model turning model order selection into binary sequence detection. Unlike earlier works which retain only point estimates of the frequencies, we undertake a more complete Bayesian treatment by estimating the posterior probability density functions (pdfs...
Event generator tuning using Bayesian optimization
Ilten, Philip; Yang, Yunjie
2016-01-01
Monte Carlo event generators contain a large number of parameters that must be determined by comparing the output of the generator with experimental data. Generating enough events with a fixed set of parameter values to enable making such a comparison is extremely CPU intensive, which prohibits performing a simple brute-force grid-based tuning of the parameters. Bayesian optimization is a powerful method designed for such black-box tuning applications. In this article, we show that Monte Carlo event generator parameters can be accurately obtained using Bayesian optimization and minimal expert-level physics knowledge. A tune of the PYTHIA 8 event generator using $e^+e^-$ events, where 20 parameters are optimized, can be run on a modern laptop in just two days. Combining the Bayesian optimization approach with expert knowledge should enable producing better tunes in the future, by making it faster and easier to study discrepancies between Monte Carlo and experimental data.
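A minimal Bayesian optimization loop, a Gaussian-process surrogate plus an expected-improvement acquisition on a cheap stand-in objective (not a real generator tune; the kernel, length scale, and budget are invented), might look like:

```python
import numpy as np
from math import erf

rng = np.random.default_rng(1)

def objective(x):
    # Stand-in for an expensive generator-vs-data comparison.
    return (x - 2.0) ** 2 + 0.1 * np.sin(5.0 * x)

def rbf(a, b, ls=0.5):
    # Squared-exponential kernel between two 1-D point sets.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    # Gaussian-process posterior mean and std dev on the candidate grid Xs.
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, Xs)
    v = np.linalg.solve(K, Ks)
    mu = v.T @ y
    var = np.clip(1.0 - np.sum(Ks * v, axis=0), 1e-12, None)  # kernel diag is 1
    return mu, np.sqrt(var)

def expected_improvement(mu, sd, best):
    z = (best - mu) / sd
    pdf = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    cdf = 0.5 * (1.0 + np.array([erf(t / np.sqrt(2.0)) for t in z]))
    return (best - mu) * cdf + sd * pdf

grid = np.linspace(0.0, 4.0, 200)
X = rng.uniform(0.0, 4.0, 3)              # a few random initial evaluations
y = objective(X)
for _ in range(15):                        # each step costs one objective call
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
    X, y = np.append(X, x_next), np.append(y, objective(x_next))

best_x = X[np.argmin(y)]
print(round(best_x, 1))  # close to the true minimizer, near 2.1
```

In a real tune the objective is the goodness-of-fit of generated events to data, and the search runs over 20 parameters rather than one, but the evaluate/refit/acquire loop is the same.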
Hessian PDF reweighting meets the Bayesian methods
Paukkunen, Hannu
2014-01-01
We discuss Hessian PDF reweighting - a technique intended to estimate the effects that new measurements have on a set of PDFs. The method stems straightforwardly from considering the new data in a usual $\chi^2$-fit and naturally incorporates non-zero values for the tolerance, $\Delta\chi^2>1$. In comparison to the contemporary Bayesian reweighting techniques, there is no need to generate large ensembles of PDF Monte-Carlo replicas, and the observables need to be evaluated only with the central and the error sets of the original PDFs. In spite of the apparently rather different methodologies, we find that the Hessian and the Bayesian techniques are actually equivalent if the $\Delta\chi^2$ criterion is properly included in the Bayesian likelihood function, which is a simple exponential.
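The Bayesian reweighting baseline referred to here, with the simple exponential likelihood mentioned in the abstract, reduces to a few lines (a toy sketch with an invented one-parameter "replica" ensemble and a single measurement):

```python
import numpy as np

rng = np.random.default_rng(0)

# An invented ensemble of parameter replicas (prior: standard normal).
theta = rng.normal(0.0, 1.0, 5000)

# A new measurement of the observable (here the parameter itself).
data, sigma = 1.0, 0.5
chi2 = ((theta - data) / sigma) ** 2       # chi^2 of each replica

# Bayesian reweighting with a simple exponential likelihood:
w = np.exp(-0.5 * chi2)
w /= w.sum()

reweighted_mean = np.sum(w * theta)
print(round(reweighted_mean, 2))  # pulled from the prior mean 0 toward the data
```

The Hessian method replaces the replica ensemble with the central and error sets of the original fit, which is the computational saving the abstract emphasizes.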
A Large Sample Study of the Bayesian Bootstrap
Lo, Albert Y.
1987-01-01
An asymptotic justification of the Bayesian bootstrap is given. Large-sample Bayesian bootstrap probability intervals for the mean, the variance and bands for the distribution, the smoothed density and smoothed rate function are also provided.
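The Bayesian bootstrap itself is short to state in code: replace multinomial resampling with Dirichlet(1, ..., 1) weights over the observations (a sketch with invented data; the interval level is illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
data = rng.exponential(2.0, 200)             # invented sample, mean about 2

# Bayesian bootstrap: draw Dirichlet(1, ..., 1) weights over the n
# observations and recompute the statistic under each weight vector.
B, n = 4000, len(data)
weights = rng.dirichlet(np.ones(n), size=B)  # shape (B, n)
means = weights @ data                       # B posterior draws of the mean

lo, hi = np.quantile(means, [0.025, 0.975])  # 95% probability interval
print(round(lo, 2), round(hi, 2))            # brackets the sample mean
```

The large-sample result above says that intervals of this kind are asymptotically valid in the frequentist sense as well.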
Length Scales in Bayesian Automatic Adaptive Quadrature
Directory of Open Access Journals (Sweden)
Adam Gh.
2016-01-01
Full Text Available Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1–16 (2012)] are reported. First, it is shown that a numerical quadrature which avoids overcomputing and minimizes the hidden floating-point loss of precision asks for the consideration of three classes of integration domain lengths, each endowed with specific quadrature sums: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
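The three length classes and their quadrature sums can be sketched as a dispatch on interval length (the thresholds and the particular macroscopic rule are our illustrative choices, not the paper's):

```python
import numpy as np

def quad_by_length(f, a, b, micro=1e-3, meso=1e-1):
    """Pick a quadrature sum from the interval length, echoing the three
    length classes described above. f must accept NumPy arrays.
    The thresholds micro/meso are invented for illustration."""
    h = b - a
    if h <= micro:                       # microscopic: trapezoidal rule
        return 0.5 * h * (f(a) + f(b))
    if h <= meso:                        # mesoscopic: Simpson rule
        return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
    # macroscopic: a rule of high algebraic degree of precision
    # (here 5-point Gauss-Legendre, exact for polynomials up to degree 9)
    x, w = np.polynomial.legendre.leggauss(5)
    return 0.5 * h * np.sum(w * f(0.5 * h * x + 0.5 * (a + b)))

print(round(quad_by_length(np.sin, 0.0, np.pi), 4))  # close to 2.0
```

The rationale is that on very short intervals a low-order sum already attains full floating-point accuracy, so higher-order sums would only add rounding error.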
Bayesian estimation and tracking a practical guide
Haug, Anton J
2012-01-01
A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noises. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation
Bayesian Optimisation Algorithm for Nurse Scheduling
Li, Jingpeng
2008-01-01
Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples these networks to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.
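The build-a-model/sample-it loop can be illustrated with the simplest estimation-of-distribution algorithm, independent bit marginals on a toy OneMax problem (a hedged stand-in: the chapter's BOA learns a full Bayesian network over scheduling rules, not independent marginals):

```python
import numpy as np

rng = np.random.default_rng(3)

# Each bit stands in for a binary "rule choice"; fitness counts satisfied
# choices (OneMax), a stand-in for schedule quality.
n_bits, pop_size, n_best = 30, 100, 25

def fitness(pop):
    return pop.sum(axis=1)

p = np.full(n_bits, 0.5)           # probabilistic model of promising rules
for _ in range(40):
    # Sample a population from the current model...
    pop = (rng.random((pop_size, n_bits)) < p).astype(int)
    # ...select the promising solutions...
    elite = pop[np.argsort(fitness(pop))[-n_best:]]
    # ...and re-estimate the model from them.
    p = 0.7 * p + 0.3 * elite.mean(axis=0)

best = (p > 0.5).astype(int)
print(fitness(best[None, :])[0])   # approaches n_bits = 30
```

Replacing the independent marginals with a learned Bayesian network, as the BOA does, lets the model capture dependencies between rule choices.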
A Bayesian Analysis of Spectral ARMA Model
Directory of Open Access Journals (Sweden)
Manoel I. Silvestre Bezerra
2012-01-01
Full Text Available Bezerra et al. (2008) proposed a new method, based on Yule-Walker equations, to estimate the ARMA spectral model. In this paper, a Bayesian approach is developed for this model using the noninformative prior proposed by Jeffreys (1967). The Bayesian computations are carried out via Markov chain Monte Carlo (MCMC) simulation, and characteristics of the marginal posterior distributions, such as the Bayes estimator and confidence intervals for the parameters of the ARMA model, are derived. Both methods are also compared with the traditional least squares and maximum likelihood approaches, and a numerical illustration with two examples of the ARMA model is presented to evaluate the performance of the procedures.
A Bayesian Concept Learning Approach to Crowdsourcing
DEFF Research Database (Denmark)
Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;
2011-01-01
We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation...... techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing...... that our Bayesian strategies are effective even in large concept spaces with many uninformative experts....
Comparison of the Bayesian and Frequentist Approach to the Statistics
Hakala, Michal
2015-01-01
The thesis provides an introduction to Bayesian statistics and compares the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides an alternative, comprehensive theory to the frequentist approach. Bayesian concepts provide solutions to problems that are not solvable by frequentist theory. The thesis compares definitions, concepts, and the quality of statistical inference. The main interest is focused on point estimation, an in...
Formation of halogen-induced secondary organic aerosol (XOA)
Kamilli, Katharina; Ofner, Johannes; Zetzsch, Cornelius; Held, Andreas
2013-04-01
bromine with α-pinene. This work was funded by German Research Foundation (DFG) under grants HE 5214/5-1 and ZE792/5-2. References: Cai, X., and Griffin, R. J.: Secondary aerosol formation from the oxidation of biogenic hydrocarbons by chlorine atoms, J. Geophys. Res., 111, D14206/14201-D14206/14214, 2006. Ofner, J. Balzer, N., Buxmann, J., Grothe, H., Schmitt-Kopplin, Ph., Platt, U., and Zetzsch, C., Halogenation processes of secondary organic aerosol and implications on halogen release mechanisms, Atmos. Chem. Phys. Discuss. 12, 2975-3017, 2012.
Aerosol sample inhomogeneity with debris from the Fukushima Daiichi accident
International Nuclear Information System (INIS)
Radionuclide aerosol sampling is a vital component in the detection of nuclear explosions, nuclear accidents, and other radiation releases. This was proven by the detection and tracking of emissions from the Fukushima Daiichi incident across the globe by IMS stations. Two separate aerosol samplers were operated in Richland, WA following the event, and debris from the accident was measured at levels well above detection limits. While the atmospheric activity concentrations of radionuclides generally compared well between the two stations, they did not agree within uncertainties. This paper includes a detailed study of the aerosol sample homogeneity of 134Cs and 137Cs, then relates it to the overall uncertainty of the original measurement. Our results show that sample inhomogeneity adds an additional 5−10% uncertainty to each aerosol measurement and that this uncertainty is in the same range as the discrepancies between the two aerosol sample measurements from Richland, WA. - Highlights: • Statistical discrepancies arise when comparing HVAS and RASA measurements. • A beta statistic was employed to quantify the statistical discrepancies. • Aerosol sample inhomogeneity was determined to be 5–10%. • Statistical discrepancies are attributed to sample inhomogeneity
A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri
2013-01-01
representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...
A default Bayesian hypothesis test for ANOVA designs
R. Wetzels; R.P.P.P. Grasman; E.J. Wagenmakers
2012-01-01
This article presents a Bayesian hypothesis test for analysis of variance (ANOVA) designs. The test is an application of standard Bayesian methods for variable selection in regression models. We illustrate the effect of various g-priors on the ANOVA hypothesis test. The Bayesian test for ANOVA desig
A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research
Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t
Bayesian Just-So Stories in Psychology and Neuroscience
Bowers, Jeffrey S.; Davis, Colin J.
2012-01-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…
Gao, R. S.; Elkins, J. W.; Frost, G. J.; McComiskey, A. C.; Murphy, D. M.; Ogren, J. A.; Petropavlovskikh, I. V.; Rosenlof, K. H.
2014-12-01
Inverse modeling using measurements of ozone (O3) and aerosol is a powerful tool for deriving pollutant emissions. Because they have relatively long lifetimes, O3 and aerosol are transported over large distances. Frequent and globally spaced vertical profiles rather than ground-based measurements alone are therefore highly desired. Three requirements necessary for a successful global monitoring program are: Low equipment cost, low operation cost, and reliable measurements of known uncertainty. Conventional profiling using aircraft provides excellent data, but is cost prohibitive on a large scale. Here we describe a new platform and instruments meeting all three global monitoring requirements. The platform consists of a small balloon and an auto-homing glider. The glider is released from the balloon at about 5 km altitude, returning the light instrument package to the launch location, and allowing for consistent recovery of the payload. Atmospheric profiling can be performed either during ascent or descent (or both) depending on measurement requirements. We will present the specifications for two instrument packages currently under development. The first measures O3, RH, p, T, dry aerosol particle number and size distribution, and aerosol optical depth. The second measures dry aerosol particle number and size distribution, and aerosol absorption coefficient. Other potential instrument packages and the desired spatial/temporal resolution for the GOA2HEAD monitoring initiative will also be discussed.
DEFF Research Database (Denmark)
Butcher, Andrew Charles
a relationship between plunging jet particle flux, oceanic particle flux, and energy dissipation rate in both systems. Previous sea spray aerosol studies dissipate an order of magnitude more energy for the same particle flux production as the open ocean. A scaling factor related to the energy expended in air...
Advancing Models and Evaluation of Cumulus, Climate and Aerosol Interactions
Energy Technology Data Exchange (ETDEWEB)
Gettelman, Andrew [University Corporation for Atmospheric Research (NCAR), Boulder, CO (United States)
2015-10-27
This project was successfully able to meet its goals, but faced some serious challenges due to personnel issues. Nonetheless, it was largely successful. The Project Objectives were as follows: 1. Develop a unified representation of stratiform and cumulus cloud microphysics for NCAR/DOE global community models. 2. Examine the effects of aerosols on clouds and their impact on precipitation in stratiform and cumulus clouds. We will also explore the effects of clouds and precipitation on aerosols. 3. Test these new formulations using advanced evaluation techniques and observations and release
Jones, Matt; Love, Bradley C
2011-08-01
The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls
Most frugal explanations in Bayesian networks
Kwisthout, J.H.P.
2015-01-01
Inferring the most probable explanation to a set of variables, given a partial observation of the remaining variables, is one of the canonical computational problems in Bayesian networks, with widespread applications in AI and beyond. This problem, known as MAP, is computationally intractable (NP-ha
Bayesian semiparametric dynamic Nelson-Siegel model
C. Cakmakli
2011-01-01
This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric
Von Neumann was not a Quantum Bayesian.
Stacey, Blake C
2016-05-28
Wikipedia has claimed for over 3 years now that John von Neumann was the 'first quantum Bayesian'. In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported. PMID:27091166
Von Neumann Was Not a Quantum Bayesian
Blake C. Stacey
2014-01-01
Wikipedia has claimed for over three years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.
A Bayesian Approach to Interactive Retrieval
Tague, Jean M.
1973-01-01
A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…
Bayesian regularization of diffusion tensor images
DEFF Research Database (Denmark)
Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif
2007-01-01
several directions. The measured diffusion coefficients and thereby the diffusion tensors are subject to noise, leading to possibly flawed representations of the three dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing...
Inverse Problems in a Bayesian Setting
Matthies, Hermann G.
2016-02-13
In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. At last, we compare the linear and nonlinear Bayesian update in form of a filter on some examples.
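For intuition, in the linear-Gaussian scalar case the conditional-expectation update described above reduces to the familiar Kalman formulas, with no sampling required. A minimal sketch under that simplifying assumption (all numbers illustrative, not from the paper):

```python
# Gaussian prior on a scalar parameter x, linear observation y = H*x + noise
mu_prior, var_prior = 0.0, 4.0   # prior mean and variance
H, var_noise = 1.0, 1.0          # observation operator and noise variance
y = 2.5                          # observed datum

# Conditional expectation E[x | y] via the Kalman gain
K = var_prior * H / (H * var_prior * H + var_noise)
mu_post = mu_prior + K * (y - H * mu_prior)
var_post = (1.0 - K * H) * var_prior
```

This is the sampling-free flavor of update the abstract refers to; the paper's contribution lies in the nonlinear generalization via polynomial (spectral) approximation of the conditional expectation.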
Comprehension and computation in Bayesian problem solving
Directory of Open Access Journals (Sweden)
Eric D. Johnson
2015-07-01
Full Text Available Humans have long been characterized as poor probabilistic reasoners when presented with explicit numerical information. Bayesian word problems provide a well-known example of this, where even highly educated and cognitively skilled individuals fail to adhere to mathematical norms. It is widely agreed that natural frequencies can facilitate Bayesian reasoning relative to normalized formats (e.g., probabilities, percentages), both by clarifying logical set-subset relations and by simplifying numerical calculations. Nevertheless, between-study performance on transparent Bayesian problems varies widely, and generally remains rather unimpressive. We suggest there has been an over-focus on this representational facilitator (i.e., transparent problem structures) at the expense of the specific logical and numerical processing requirements and the corresponding individual abilities and skills necessary for providing Bayesian-like output given specific verbal and numerical input. We further suggest that understanding this task-individual pair could benefit from considerations from the literature on mathematical cognition, which emphasizes text comprehension and problem solving, along with contributions of online executive working memory, metacognitive regulation, and relevant stored knowledge and skills. We conclude by offering avenues for future research aimed at identifying the stages in problem solving at which correct versus incorrect reasoners depart, and how individual differences might influence this time point.
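The facilitation effect the authors discuss can be made concrete by computing the same posterior in both formats; the screening numbers below are the classic textbook example, not data from the paper:

```python
# Probability format: base rate 1%, hit rate 80%, false-positive rate 9.6%
p_h, p_d_h, p_d_not_h = 0.01, 0.80, 0.096
posterior = (p_h * p_d_h) / (p_h * p_d_h + (1 - p_h) * p_d_not_h)

# Natural-frequency format: of 1000 people, 10 have the condition and
# 8 of them test positive; of the 990 without it, about 95 test positive.
posterior_nf = 8 / (8 + 95)
```

Both routes give roughly 8%; the natural-frequency version replaces the normalization step with a simple count of cases, which is the numerical simplification the abstract refers to.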
Bayesian Vector Autoregressions with Stochastic Volatility
Uhlig, H.F.H.V.S.
1996-01-01
This paper proposes a Bayesian approach to a vector autoregression with stochastic volatility, where the multiplicative evolution of the precision matrix is driven by a multivariate beta variate.Exact updating formulas are given to the nonlinear filtering of the precision matrix.Estimation of the au
Scaling Bayesian network discovery through incremental recovery
Castelo, J.R.; Siebes, A.P.J.M.
1999-01-01
Bayesian networks are a type of graphical models that, e.g., allow one to analyze the interaction among the variables in a database. A well-known problem with the discovery of such models from a database is the ``problem of high-dimensionality''. That is, the discovery of a network from a database w
A Bayesian Bootstrap for a Finite Population
Lo, Albert Y.
1988-01-01
A Bayesian bootstrap for a finite population is introduced; its small-sample distributional properties are discussed and compared with those of the frequentist bootstrap for a finite population. It is also shown that the two are first-order asymptotically equivalent.
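The contrast between the two bootstraps can be sketched numerically: the frequentist bootstrap draws integer (multinomial) resample counts, while the Bayesian bootstrap draws continuous Dirichlet(1, ..., 1) weights over the observed values. A hedged sketch, with illustrative data and seed:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.array([2.0, 4.0, 4.0, 5.0, 7.0, 9.0])

# Frequentist bootstrap: resample with replacement, take the mean each time
freq_means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                       for _ in range(5000)])

# Bayesian bootstrap: Dirichlet(1,...,1) weights on the observed values
weights = rng.dirichlet(np.ones(data.size), size=5000)
bayes_means = weights @ data
```

Both distributions of the mean center on the sample mean; the first-order asymptotic equivalence noted in the abstract shows up as near-identical spreads once the sample is large.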
Bayesian calibration for forensic age estimation.
Ferrante, Luigi; Skrami, Edlira; Gesuita, Rosaria; Cameriere, Roberto
2015-05-10
Forensic medicine is increasingly called upon to assess the age of individuals. Forensic age estimation is mostly required in relation to illegal immigration and identification of bodies or skeletal remains. A variety of age estimation methods are based on dental samples and use of regression models, where the age of an individual is predicted by morphological tooth changes that take place over time. From the medico-legal point of view, regression models, with age as the dependent random variable entail that age tends to be overestimated in the young and underestimated in the old. To overcome this bias, we describe a new full Bayesian calibration method (asymmetric Laplace Bayesian calibration) for forensic age estimation that uses asymmetric Laplace distribution as the probability model. The method was compared with three existing approaches (two Bayesian and a classical method) using simulated data. Although its accuracy was comparable with that of the other methods, the asymmetric Laplace Bayesian calibration appears to be significantly more reliable and robust in case of misspecification of the probability model. The proposed method was also applied to a real dataset of values of the pulp chamber of the right lower premolar measured on x-ray scans of individuals of known age. PMID:25645903
Exploiting structure in cooperative Bayesian games
F.A. Oliehoek; S. Whiteson; M.T.J. Spaan
2012-01-01
Cooperative Bayesian games (BGs) can model decision-making problems for teams of agents under imperfect information, but require space and computation time that is exponential in the number of agents. While agent independence has been used to mitigate these problems in perfect information settings,
Perfect Bayesian equilibrium. Part II: epistemic foundations
Bonanno, Giacomo
2011-01-01
In a companion paper we introduced a general notion of perfect Bayesian equilibrium which can be applied to arbitrary extensive-form games. The essential ingredient of the proposed definition is the qualitative notion of AGM-consistency. In this paper we provide an epistemic foundation for AGM-consistency based on the AGM theory of belief revision.
Decision generation tools and Bayesian inference
Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas
2014-05-01
Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference, applied to adverse (hostile) networks, including such important applications as terrorism-related and organized crime networks.
Von Neumann Was Not a Quantum Bayesian
Stacey, Blake C
2014-01-01
Wikipedia has claimed for over two years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.
Bayesian calibration of car-following models
Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.
2010-01-01
Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p
Basics of Bayesian Learning - Basically Bayes
DEFF Research Database (Denmark)
Larsen, Jan
Tutorial presented at the IEEE Machine Learning for Signal Processing Workshop 2006, Maynooth, Ireland, September 8, 2006. The tutorial focuses on the basic elements of Bayesian learning and its relation to classical learning paradigms. This includes a critical discussion of the pros and cons...
On local optima in learning bayesian networks
DEFF Research Database (Denmark)
Dalgaard, Jens; Kocka, Tomas; Pena, Jose
2003-01-01
This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness...
Bayesian Estimation Supersedes the "t" Test
Kruschke, John K.
2013-01-01
Bayesian estimation for 2 groups provides complete distributions of credible values for the effect size, group means and their difference, standard deviations and their difference, and the normality of the data. The method handles outliers. The decision rule can accept the null value (unlike traditional "t" tests) when certainty in the estimate is…
Bayesian Estimation of Thermonuclear Reaction Rates
Iliadis, Christian; Coc, Alain; Timmes, Frank; Starrfield, Sumner
2016-01-01
The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied in the past to this problem, all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extra-solar planets, gravitational waves, and type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present the first astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the d(p,$\\gamma$)$^3$He, $^3$He($^3$He,2p)$^4$He, and $^3$He($\\alpha$,$\\gamma$)$^7$Be reactions,...
Bayesian analysis of Markov point processes
DEFF Research Database (Denmark)
Berthelsen, Kasper Klitgaard; Møller, Jesper
2006-01-01
Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...
Bayesian Averaging is Well-Temperated
DEFF Research Database (Denmark)
Hansen, Lars Kai
2000-01-01
Bayesian predictions are stochastic just like predictions of any other inference scheme that generalize from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution the situation...
Modelling crime linkage with Bayesian networks
J. de Zoete; M. Sjerps; D. Lagnado; N. Fenton
2015-01-01
When two or more crimes show specific similarities, such as a very distinct modus operandi, the probability that they were committed by the same offender becomes of interest. This probability depends on the degree of similarity and distinctiveness. We show how Bayesian networks can be used to model
Computational statistics using the Bayesian Inference Engine
Weinberg, Martin D.
2013-09-01
This paper introduces the Bayesian Inference Engine (BIE), a general parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the full byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible object-oriented and easily extended framework that implements every aspect of the Bayesian inference. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and download details are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.
Universal Darwinism as a process of Bayesian inference
Campbell, John O
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment". Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description clo...
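The equivalence the author describes can be stated concretely: with genotype frequencies as the "prior" and relative fitness as the "likelihood", one generation of selection (the discrete replicator update) is exactly a Bayesian update. A minimal sketch with made-up numbers:

```python
# Genotype frequencies act as the prior; relative fitness as the likelihood.
prior = {"A": 0.5, "B": 0.3, "C": 0.2}
fitness = {"A": 1.0, "B": 2.0, "C": 4.0}

# Mean fitness plays the role of the Bayesian normalizing constant
norm = sum(prior[g] * fitness[g] for g in prior)
posterior = {g: prior[g] * fitness[g] / norm for g in prior}
```

After the update the fittest genotype has gained frequency at the expense of the others, exactly as a hypothesis gains posterior mass from a favorable likelihood.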
Modification of combustion aerosols in the atmosphere
Energy Technology Data Exchange (ETDEWEB)
Weingartner, E. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1996-07-01
Combustion aerosol particles are released on a large scale into the atmosphere in industrialized regions as well as in the tropics (by wood fires). The particles are subjected to various aging processes which depend on the size, morphology, and chemical composition of the particles. The interaction of combustion particles with sunlight and humidity, as well as adsorption and desorption of volatile material to or from the particles, considerably changes their physical and chemical properties and thus their residence time in the atmosphere. This is of importance because combustion particles are known to have a variety of health effects on people. Moreover, atmospheric aerosol particles have an influence on climate, directly through the reflection and absorption of solar radiation and indirectly through modifying the optical properties and lifetime of clouds. In a first step, a field experiment was carried out to study the sources and characteristics of combustion aerosols that are emitted from vehicles in a road tunnel. It was found that most of the fine particles were tail pipe emissions of diesel powered vehicles. The calculation shows that on average these vehicles emit about 300 mg fine particulate matter per driven kilometer. This emission factor is at least 100 times higher than the mean emission factor estimated for gasoline powered vehicles. Furthermore, it is found that during their residence time in the tunnel, the particles undergo significant changes: the particles change towards a more compact structure. The conclusion is reached that this is mainly due to adsorption of volatile material from the gas phase to the particle surface. In the atmosphere, the life cycle as well as the radiative and chemical properties of an aerosol particle are strongly dependent on its response to humidity. Therefore the hygroscopic behavior of combustion particles emitted from single sources (i.e. from a gasoline and a diesel engine) was studied in laboratory experiments.
Evaporation of droplets in a Champagne wine aerosol
Ghabache, Elisabeth; Liger-Belair, Gérard; Antkowiak, Arnaud; Séon, Thomas
2016-04-01
In a single glass of champagne about a million bubbles nucleate on the wall and rise towards the surface. When these bubbles reach the surface and rupture, they project a multitude of tiny droplets in the form of a particular aerosol holding a concentrate of wine aromas. Based on the model experiment of a single bubble bursting in idealized champagnes, the key features of the champagne aerosol are identified. In particular, we show that film drops, critical in sea spray for example, are here nonexistent. We then demonstrate that compared to a still wine, champagne fizz drastically enhances the transfer of liquid into the atmosphere. There, conditions on bubble radius and wine viscosity that optimize aerosol evaporation are provided. These results pave the way towards the fine tuning of flavor release during sparkling wine tasting, a major issue for the sparkling wine industry.
Evaporation of droplets in a Champagne wine aerosol
Ghabache, Elisabeth; Liger-Belair, Gérard; Antkowiak, Arnaud; Séon, Thomas
2016-01-01
In a single glass of champagne about a million bubbles nucleate on the wall and rise towards the surface. When these bubbles reach the surface and rupture, they project a multitude of tiny droplets in the form of a particular aerosol holding a concentrate of wine aromas. Based on the model experiment of a single bubble bursting in idealized champagnes, the key features of the champagne aerosol are identified. In particular, we show that film drops, critical in sea spray for example, are here nonexistent. We then demonstrate that compared to a still wine, champagne fizz drastically enhances the transfer of liquid into the atmosphere. There, conditions on bubble radius and wine viscosity that optimize aerosol evaporation are provided. These results pave the way towards the fine tuning of flavor release during sparkling wine tasting, a major issue for the sparkling wine industry. PMID:27125240
Bayesian network learning for natural hazard assessments
Vogel, Kristin
2016-04-01
Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require a careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify involved uncertainties, but also to express and communicate uncertainties in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise return to familiar (mostly deterministic) proceedings. In the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks for diverse natural hazard and vulnerability studies. The great potential of Bayesian networks was already shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, each conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data driven or be given by experts. Even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow to learn about the underlying processes. Besides numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
Graphical aerosol classification method using aerosol relative optical depth
Chen, Qi-Xiang; Yuan, Yuan; Shuai, Yong; Tan, He-Ping
2016-06-01
A simple graphical method is presented to classify aerosol types based on a combination of aerosol optical thickness (AOT) and aerosol relative optical thickness (AROT). Six aerosol types, including maritime (MA), desert dust (DD), continental (CO), sub-continental (SC), urban industry (UI) and biomass burning (BB), are discriminated in a two-dimensional space of AOT440 and AROT1020/440. Numerical calculations are performed using Mie theory based on a multi log-normal particle size distribution, and the AROT ranges for each aerosol type are determined. More than 5 years of daily observations from 8 representative aerosol sites are applied to the method to confirm spatial applicability. Finally, 3 individual cases are analyzed according to their specific aerosol status. The outcomes indicate that the new graphical method coordinates well with regional characteristics and is also able to distinguish aerosol variations in individual situations. This technique demonstrates a novel way to estimate different aerosol types and provide information on radiative forcing calculations and satellite data corrections.
Aerosol samplers innovation possibilities
International Nuclear Information System (INIS)
The growing demand for early detection of increased levels of artificial radionuclides in the atmosphere resulted in the design and fabrication of an aerosol sampler with an automated spectrometric unit providing online gamma spectrometry above the aerosol filter. A study was performed with two types of high-volume samplers: SENYA JL-900 SnowWhite (900 m3/h) and SENYA JL-150 Hunter (150 m3/h). This work gives results of the design optimization with respect to the detector type, geometry of measurement, remote control and spectrometric evaluation. 222Rn and 220Rn concentration fluctuations in the outdoor air are discussed with regard to the detection limits of the radionuclides expected after an NPP accident. (authors)
Fission product vapour - aerosol interactions in the containment: simulant fuel studies
International Nuclear Information System (INIS)
Experiments have been conducted in the Falcon facility to study the interaction of fission product vapours released from simulant fuel samples with control rod aerosols. The aerosols generated from both the control rod and fuel sample were chemically distinct and had different deposition characteristics. Extensive interaction was observed between the fission product vapours and the control rod aerosol. The two dominant mechanisms were condensation of the vapours onto the aerosol, and chemical reactions between the two components; sorption phenomena were believed to be only of secondary importance. The interaction of fission product vapours and reactor materials aerosols could have a major impact on the transport characteristics of the radioactive emission from a degrading core. (author)
Aerosol characterization during project POLINAT
Energy Technology Data Exchange (ETDEWEB)
Hagen, D.E.; Hopkins, A.R.; Paladino, J.D.; Whitefield, P.D. [Missouri Univ., Rolla, MO (United States). Cloud and Aerosol Sciences Lab.; Lilenfeld, H.V. [McDonnell Douglas Aerospace-East, St. Louis, MO (United States)
1997-12-31
The objectives of the aerosol/particulate characterization measurements of project POLINAT (POLlution from aircraft emissions In the North ATlantic flight corridor) are: to search for aerosol/particulate signatures of air traffic emissions in the region of the North Atlantic Flight Corridor; to search for the aerosol/particulate component of large-scale enhancement ('corridor effects') of air traffic related species in the North Atlantic region; to determine the effective emission indices for the aerosol/particulate component of engine exhaust in both the near and far field of aircraft exhaust plumes; to measure the dispersion and transformation of the aerosol/particulate component of aircraft emissions as a function of ambient conditions; to characterize background levels of aerosol/particulate concentrations in the North Atlantic region; and to determine effective emission indices for engine exhaust particulates for regimes beyond the jet phase of plume expansion. (author) 10 refs.
Aerosol Observing System (AOS) Handbook
Energy Technology Data Exchange (ETDEWEB)
Jefferson, A
2011-01-17
The Aerosol Observing System (AOS) is a suite of in situ surface measurements of aerosol optical and cloud-forming properties. The instruments measure aerosol properties that influence the earth’s radiative balance. The primary optical measurements are those of the aerosol scattering and absorption coefficients as a function of particle size and radiation wavelength and cloud condensation nuclei (CCN) measurements as a function of percent supersaturation. Additional measurements include those of the particle number concentration and scattering hygroscopic growth. Aerosol optical measurements are useful for calculating parameters used in radiative forcing calculations such as the aerosol single-scattering albedo, asymmetry parameter, mass scattering efficiency, and hygroscopic growth. CCN measurements are important in cloud microphysical models to predict droplet formation.
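The single-scattering albedo mentioned above follows directly from the measured scattering and absorption coefficients; a minimal sketch:

```python
def single_scattering_albedo(scattering_coeff, absorption_coeff):
    """Fraction of total extinction due to scattering (dimensionless).

    Both coefficients must be at the same wavelength and in the same
    units (e.g. Mm^-1); the example values below are illustrative.
    """
    extinction = scattering_coeff + absorption_coeff
    return scattering_coeff / extinction

# A purely scattering aerosol has albedo 1; strong absorbers fall well below.
print(single_scattering_albedo(90.0, 10.0))  # -> 0.9
```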
Aerosol influence on radiative cooling
Grassl, Hartmut
2011-01-01
Aerosol particles have a complex index of refraction and therefore contribute to atmospheric emission and radiative cooling rates. In this paper calculations of the longwave flux divergence within the atmosphere at different heights are presented including water vapour and aerosol particles as emitters and absorbers. The spectral region covered is 5 to 100 microns divided into 23 spectral intervals. The relevant properties of the aerosol particles, the single scattering albedo and the extinct...
Bayesian parameter estimation for effective field theories
Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A
2015-01-01
We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools are developed that analyze the fit and ensure that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
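A minimal sketch of how a naturalness prior regularizes such a fit, using a toy polynomial expansion with a Gaussian prior on the coefficients. The data, expansion order, and widths are illustrative, not taken from the paper:

```python
import numpy as np

# Toy "EFT" fit: y = sum_k a_k x^k with a Gaussian prior a_k ~ N(0, abar^2)
# encoding natural-sized coefficients. All numbers are illustrative.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 0.5, 10)
y = 1.0 + 0.5 * x - 0.3 * x**2 + rng.normal(0.0, 0.01, x.size)

order, abar, sigma = 5, 1.0, 0.01
X = np.vander(x, order + 1, increasing=True)

# Gaussian posterior mean: (X^T X / sigma^2 + I / abar^2)^{-1} X^T y / sigma^2.
# The prior term penalizes unnaturally large coefficients, curbing overfitting.
A = X.T @ X / sigma**2 + np.eye(order + 1) / abar**2
a_map = np.linalg.solve(A, X.T @ y / sigma**2)
print(np.round(a_map[:3], 2))
```

Without the `I / abar**2` term this reduces to ordinary least squares, where high-order coefficients can blow up to fit the noise.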
Applications of Bayesian spectrum representation in acoustics
Botts, Jonathan M.
This dissertation utilizes a Bayesian inference framework to enhance the solution of inverse problems in which the forward model maps to acoustic spectra. A Bayesian solution to filter design inverts acoustic spectra to the pole-zero locations of a discrete-time filter model. Spatial sound field analysis with a spherical microphone array is a data analysis problem that requires inversion of spatio-temporal spectra to directions of arrival. As with many inverse problems, a probabilistic analysis results in richer solutions than can be achieved with ad hoc methods. In the filter design problem, the Bayesian inversion yields globally optimal coefficient estimates as well as an estimate of the most concise filter capable of representing the given spectrum, within a single framework. This approach is demonstrated on synthetic spectra, head-related transfer function spectra, and measured acoustic reflection spectra. The Bayesian model-based analysis of spatial room impulse responses is presented as an analogous problem with an equally rich solution. The model selection mechanism provides an estimate of the number of arrivals, which is necessary to properly infer the directions of simultaneous arrivals. Although spectrum inversion problems are fairly ubiquitous, the scope of this dissertation has been limited to these two and derivative problems. The Bayesian approach to filter design is demonstrated on an artificial spectrum to illustrate the model comparison mechanism and then on measured head-related transfer functions to show the potential range of application. Coupled with sampling methods, the Bayesian approach is shown to outperform least-squares filter design methods commonly used in commercial software, confirming the need for a global search of the parameter space. The resulting designs are shown to be comparable to those that result from global optimization methods, but the Bayesian approach has the added advantage of a filter length estimate within the same unified
Narrowband interference parameterization for sparse Bayesian recovery
Ali, Anum
2015-09-11
This paper addresses the problem of narrowband interference (NBI) in SC-FDMA systems by using tools from compressed sensing and stochastic geometry. The proposed NBI cancellation scheme exploits the frequency domain sparsity of the unknown signal and adopts a Bayesian sparse recovery procedure. This is done by keeping a few randomly chosen sub-carriers data free to sense the NBI signal at the receiver. As Bayesian recovery requires knowledge of some NBI parameters (i.e., mean, variance and sparsity rate), we use tools from stochastic geometry to obtain analytical expressions for the required parameters. Our simulation results validate the analysis and depict suitability of the proposed recovery method for NBI mitigation. © 2015 IEEE.
Bayesian networks for enterprise risk assessment
Bonafede, C E
2006-01-01
According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. In general, risk is measured as a combination of the probability of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative data) are used. Moreover, qualitative data must be converted into numerical values to be used in the model. In the case of enterprise risk assessment the considered risks are, for instance, strategic, operational, legal and reputational, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. The Bayesian network is a useful tool for integrating different information and in particular for studying the risks' joint distribution by using data collected from experts. In this paper we want to show a possible approach for building a Bayesian network in the parti...
Bayesianism and inference to the best explanation
Directory of Open Access Journals (Sweden)
Valeriano IRANZO
2008-01-01
Bayesianism and Inference to the Best Explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of “bayesianizing” IBE. Firstly I explore several alternatives for including explanatory considerations in Bayes’s Theorem. Then I distinguish two different interpretations of prior probabilities: “IBE-Bayesianism” (IBE-Bay) and “frequentist-Bayesianism” (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.
QBism, the Perimeter of Quantum Bayesianism
Fuchs, Christopher A
2010-01-01
This article summarizes the Quantum Bayesian point of view of quantum mechanics, with special emphasis on the view's outer edges---dubbed QBism. QBism has its roots in personalist Bayesian probability theory, is crucially dependent upon the tools of quantum information theory, and most recently, has set out to investigate whether the physical world might be of a type sketched by some false-started philosophies of 100 years ago (pragmatism, pluralism, nonreductionism, and meliorism). Beyond conceptual issues, work at Perimeter Institute is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when an agent considers gambling on the consequences of...
Distributed Bayesian Networks for User Modeling
DEFF Research Database (Denmark)
Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang;
2006-01-01
The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used...... by such adaptive applications are often partial fragments of an overall user model. The fragments have then to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian Networks – in the context...... of Web-based eLearning platforms. The scenario we are tackling assumes learners who use several systems over time, which are able to create partial Bayesian Networks for user models based on the local system context. In particular, we focus on how to merge these partial user models. Our merge mechanism...
Machine learning a Bayesian and optimization perspective
Theodoridis, Sergios
2015-01-01
This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...
Bayesian image reconstruction: Application to emission tomography
Energy Technology Data Exchange (ETDEWEB)
Nunez, J.; Llacer, J.
1989-02-01
In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the 'grey' reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
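For context, the baseline Maximum Likelihood (ML-EM) iteration for the Poisson model can be sketched as follows. The system matrix and counts are toy values, and this is the reference method the paper improves on, not its entropy-prior MAP algorithm:

```python
import numpy as np

# ML-EM for a Poisson emission model d ~ Poisson(P @ f):
# a multiplicative update that stays positive and raises the likelihood.
P = np.array([[0.8, 0.2],
              [0.3, 0.7]])          # toy system (projection) matrix
d = np.array([10.0, 20.0])          # measured counts
f = np.ones(2)                      # initial image, strictly positive

for _ in range(500):
    ratio = d / (P @ f)                    # data / current forward projection
    f *= (P.T @ ratio) / P.sum(axis=0)     # multiplicative EM update, f >= 0

print(np.round(f, 2))               # forward projection of f approaches d
```

A MAP variant would add a prior term (e.g. entropy) to each update; the plain ML iteration above is the one known to converge slowly and produce noisy images at high iteration counts.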
The Bayesian Who Knew Too Much
Benétreau-Dupin, Yann
2014-01-01
In several papers, John Norton has argued that Bayesianism cannot handle ignorance adequately due to its inability to distinguish between neutral and disconfirming evidence. He argued that this inability sows confusion in, e.g., anthropic reasoning in cosmology or the Doomsday argument, by allowing one to draw unwarranted conclusions from a lack of knowledge. Norton has suggested criteria for a candidate representation of neutral support. Imprecise credences (families of credal probability functions) constitute a Bayesian-friendly framework that allows us to avoid inadequate neutral priors and better handle ignorance. The imprecise model generally agrees with Norton's representation of ignorance but requires that his criterion of self-duality be reformulated or abandoned.
Software Health Management with Bayesian Networks
Mengshoel, Ole; Schumann, Johann
2011-01-01
Most modern aircraft, as well as other complex machinery, are equipped with diagnostics systems for their major subsystems. During operation, sensors provide important information about a subsystem (e.g., the engine), and that information is used to detect and diagnose faults. Most of these systems focus on monitoring a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we discuss our approach of using Bayesian networks for Software Health Management (SWHM). We discuss SWHM requirements, which make advanced reasoning capabilities important for detection and diagnosis. Then we present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
Learning Bayesian networks using genetic algorithm
Institute of Scientific and Technical Information of China (English)
Chen Fei; Wang Xiufeng; Rao Yimei
2007-01-01
A new method to evaluate the fitness of a Bayesian network against observed data is provided. The main advantage of this criterion is that it is suitable for both complete and incomplete cases, whereas the others are not; moreover, it greatly facilitates the computation. In order to reduce the search space, the notion of equivalence classes proposed by David Chickering is adopted. Instead of using the method directly, the novel criterion, variable ordering, and equivalence classes are combined; moreover, the proposed method avoids some problems caused by the previous one. Later, a genetic algorithm, which allows global convergence and which is lacking in most methods that search for Bayesian networks, is applied to search for a good model in this space. To speed up convergence, the genetic algorithm is combined with a greedy algorithm. Finally, simulation shows the validity of the proposed approach.
Bayesian Population Projections for the United Nations.
Raftery, Adrian E; Alkema, Leontine; Gerland, Patrick
2014-02-01
The United Nations regularly publishes projections of the populations of all the world's countries broken down by age and sex. These projections are the de facto standard and are widely used by international organizations, governments and researchers. Like almost all other population projections, they are produced using the standard deterministic cohort-component projection method and do not yield statements of uncertainty. We describe a Bayesian method for producing probabilistic population projections for most countries that the United Nations could use. It has at its core Bayesian hierarchical models for the total fertility rate and life expectancy at birth. We illustrate the method and show how it can be extended to address concerns about the UN's current assumptions about the long-term distribution of fertility. The method is implemented in the R packages bayesTFR, bayesLife, bayesPop and bayesDem.
Approximate Bayesian Computation: a nonparametric perspective
Blum, Michael
2010-01-01
Approximate Bayesian Computation is a family of likelihood-free inference techniques that are well-suited to models defined in terms of a stochastic generating mechanism. In a nutshell, Approximate Bayesian Computation proceeds by computing summary statistics s_obs from the data and simulating summary statistics for different values of the parameter theta. The posterior distribution is then approximated by an estimator of the conditional density g(theta|s_obs). In this paper, we derive the asymptotic bias and variance of the standard estimators of the posterior distribution which are based on rejection sampling and linear adjustment. Additionally, we introduce an original estimator of the posterior distribution based on quadratic adjustment and we show that its bias contains a fewer number of terms than the estimator with linear adjustment. Although we find that the estimators with adjustment are not universally superior to the estimator based on rejection sampling, we find that they can achieve better perfor...
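The rejection-sampling variant described above can be sketched in a few lines. The Gaussian model, summary statistic, and tolerance are illustrative:

```python
import numpy as np

# ABC rejection sketch: infer the mean of a Gaussian without evaluating a
# likelihood, by simulating data and comparing summary statistics.
rng = np.random.default_rng(1)
data = rng.normal(3.0, 1.0, 100)
s_obs = data.mean()                               # observed summary statistic

accepted = []
for _ in range(20000):
    theta = rng.uniform(-10, 10)                  # draw from the prior
    s_sim = rng.normal(theta, 1.0, 100).mean()    # simulate, then summarize
    if abs(s_sim - s_obs) < 0.1:                  # tolerance epsilon
        accepted.append(theta)

posterior_mean = np.mean(accepted)                # estimate from kept draws
```

The linear and quadratic adjustments studied in the paper post-process the accepted `(theta, s_sim)` pairs by regression to reduce the bias introduced by a nonzero tolerance.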
Bayesian information fusion networks for biosurveillance applications.
Mnatsakanyan, Zaruhi R; Burkom, Howard S; Coberly, Jacqueline S; Lombardo, Joseph S
2009-01-01
This study introduces new information fusion algorithms to enhance disease surveillance systems with Bayesian decision support capabilities. A detection system was built and tested using chief complaints from emergency department visits, International Classification of Diseases Revision 9 (ICD-9) codes from records of outpatient visits to civilian and military facilities, and influenza surveillance data from health departments in the National Capital Region (NCR). Data anomalies were identified and distribution of time offsets between events in the multiple data streams were established. The Bayesian Network was built to fuse data from multiple sources and identify influenza-like epidemiologically relevant events. Results showed increased specificity compared with the alerts generated by temporal anomaly detection algorithms currently deployed by NCR health departments. Further research should be done to investigate correlations between data sources for efficient fusion of the collected data.
Bayesian Magnetohydrodynamic Seismology of Coronal Loops
Arregui, Inigo
2011-01-01
We perform a Bayesian parameter inference in the context of resonantly damped transverse coronal loop oscillations. The forward problem is solved in terms of parametric results for kink waves in one-dimensional flux tubes in the thin tube and thin boundary approximations. For the inverse problem, we adopt a Bayesian approach to infer the most probable values of the relevant parameters, for given observed periods and damping times, and to extract their confidence levels. The posterior probability distribution functions are obtained by means of Markov Chain Monte Carlo simulations, incorporating observed uncertainties in a consistent manner. We find well localized solutions in the posterior probability distribution functions for two of the three parameters of interest, namely the Alfven travel time and the transverse inhomogeneity length-scale. The obtained estimates for the Alfven travel time are consistent with previous inversion results, but the method enables us to additionally constrain the transverse inho...
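A generic Metropolis sampler illustrates the Markov Chain Monte Carlo step; here a toy Gaussian likelihood stands in for the kink-wave forward model, and all numbers are illustrative:

```python
import numpy as np

# Metropolis sketch of MCMC posterior sampling with a flat prior on [0, 10]
# and a toy forward model obs(theta) = theta. Illustrative only.
rng = np.random.default_rng(2)
obs, sigma = 5.0, 0.5

def log_post(theta):
    if not 0.0 <= theta <= 10.0:          # flat prior support
        return -np.inf
    return -0.5 * ((obs - theta) / sigma) ** 2

theta, chain = 1.0, []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.5)   # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                       # accept; otherwise keep current
    chain.append(theta)

samples = np.array(chain[5000:])          # discard burn-in
```

The histogram of `samples` approximates the posterior; confidence levels come from its quantiles, as in the paper's inversions.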
A Bayesian nonparametric meta-analysis model.
Karabatsos, George; Talbott, Elizabeth; Walker, Stephen G
2015-03-01
In a meta-analysis, it is important to specify a model that adequately describes the effect-size distribution of the underlying population of studies. The conventional normal fixed-effect and normal random-effects models assume a normal effect-size population distribution, conditionally on parameters and covariates. For estimating the mean overall effect size, such models may be adequate, but for prediction, they surely are not if the effect-size distribution exhibits non-normal behavior. To address this issue, we propose a Bayesian nonparametric meta-analysis model, which can describe a wider range of effect-size distributions, including unimodal symmetric distributions, as well as skewed and more multimodal distributions. We demonstrate our model through the analysis of real meta-analytic data arising from behavioral-genetic research. We compare the predictive performance of the Bayesian nonparametric model against various conventional and more modern normal fixed-effects and random-effects models.
Distributed Detection via Bayesian Updates and Consensus
Liu, Qipeng; Wang, Xiaofan
2014-01-01
In this paper, we discuss a class of distributed detection algorithms which can be viewed as implementations of Bayes' law in distributed settings. Some of the algorithms are proposed in the literature most recently, and others are first developed in this paper. The common feature of these algorithms is that they all combine (i) certain kinds of consensus protocols with (ii) Bayesian updates. They differ mainly in the type of consensus protocol and the order of the two operations. After discussing their similarities and differences, we compare these distributed algorithms by numerical examples. We focus on the rate at which these algorithms detect the underlying true state of an object. We find that (a) the algorithms with consensus via geometric averaging are more efficient than those via arithmetic averaging; (b) the order of consensus aggregation and Bayesian update does not apparently influence the performance of the algorithms; (c) the existence of communication delay dramatically slows do...
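The geometric-versus-arithmetic distinction can be sketched as log-linear versus linear opinion pooling over two hypotheses (the beliefs below are toy values):

```python
import numpy as np

# Two agents' beliefs over two hypotheses; toy values for illustration.
p1 = np.array([0.9, 0.1])
p2 = np.array([0.6, 0.4])

arith = (p1 + p2) / 2            # linear pooling (arithmetic average)
geom = np.sqrt(p1 * p2)          # log-linear pooling (geometric average)
geom /= geom.sum()               # renormalize to a probability vector

print(np.round(arith, 3), np.round(geom, 3))
```

Geometric pooling concentrates more mass on the jointly favored hypothesis than arithmetic pooling does, which is consistent with the faster detection reported for the geometric-average variants.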
Probabilistic forecasting and Bayesian data assimilation
Reich, Sebastian
2015-01-01
In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
Bayesian Peak Picking for NMR Spectra
Cheng, Yichen
2014-02-01
Protein structure determination is a very important topic in structural genomics, which helps people to understand a variety of biological functions such as protein-protein interactions, protein–DNA interactions and so on. Nowadays, nuclear magnetic resonance (NMR) has often been used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.
A Bayesian approach to person perception.
Clifford, C W G; Mareschal, I; Otsuka, Y; Watson, T L
2015-11-01
Here we propose a Bayesian approach to person perception, outlining the theoretical position and a methodological framework for testing the predictions experimentally. We use the term person perception to refer not only to the perception of others' personal attributes such as age and sex but also to the perception of social signals such as direction of gaze and emotional expression. The Bayesian approach provides a formal description of the way in which our perception combines current sensory evidence with prior expectations about the structure of the environment. Such expectations can lead to unconscious biases in our perception that are particularly evident when sensory evidence is uncertain. We illustrate the ideas with reference to our recent studies on gaze perception which show that people have a bias to perceive the gaze of others as directed towards themselves. We also describe a potential application to the study of the perception of a person's sex, in which a bias towards perceiving males is typically observed.
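The prior/likelihood combination described here can be sketched for Gaussian distributions, where the posterior mean is a precision-weighted average of the prior mean and the sensory estimate. The prior width and angles below are illustrative, not fitted values from the studies:

```python
# Sketch: a Gaussian prior that gaze is directed at the observer (0 deg)
# combined with a noisy sensory estimate of gaze direction. The bias toward
# the prior grows as sensory noise grows. Numbers are illustrative.
def posterior_gaze(sensory_deg, sigma_sensory, sigma_prior=10.0):
    """Posterior mean gaze direction under a prior centered at 0 deg."""
    w = sigma_prior**2 / (sigma_prior**2 + sigma_sensory**2)
    return w * sensory_deg     # weighted toward 0 as sigma_sensory rises

print(posterior_gaze(20.0, 2.0))    # reliable evidence: little bias
print(posterior_gaze(20.0, 20.0))   # uncertain evidence: strong bias
```

This is the standard mechanism by which a prior toward self-directed gaze produces the reported bias precisely when the sensory evidence is uncertain.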
Bayesian parameter estimation for effective field theories
Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.
2016-07-01
We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
BONNSAI: correlated stellar observables in Bayesian methods
Schneider, F R N; Fossati, L; Langer, N; de Koter, A
2016-01-01
In an era of large spectroscopic surveys of stars and big data, sophisticated statistical methods become more and more important in order to infer fundamental stellar parameters such as mass and age. Bayesian techniques are powerful methods because they can match all available observables simultaneously to stellar models while taking prior knowledge properly into account. However, in most cases it is assumed that observables are uncorrelated which is generally not the case. Here, we include correlations in the Bayesian code BONNSAI by incorporating the covariance matrix in the likelihood function. We derive a parametrisation of the covariance matrix that, in addition to classical uncertainties, only requires the specification of a correlation parameter that describes how observables co-vary. Our correlation parameter depends purely on the method with which observables have been determined and can be analytically derived in some cases. This approach therefore has the advantage that correlations can be accounte...
International Nuclear Information System (INIS)
The Swiss Gas Industry has carried out a systematic, technical estimate of methane release from the complete supply chain from production to consumption for the years 1992/1993. The result of this survey provided a conservative value, amounting to 0.9% of the Swiss domestic output. A continuation of the study taking into account new findings with regard to emission factors and the effect of the climate is now available, which provides a value of 0.8% for the target year of 1996. These results show that the renovation of the network has brought about lower losses in the local gas supplies, particularly for the grey cast iron pipelines. (author)
Yu, Jihnhee; Hutson, Alan D; Siddiqui, Adnan H; Kedron, Mary A
2016-02-01
In some small clinical trials, toxicity is not a primary endpoint; however, it often has dire effects on patients' quality of life and is even life-threatening. For such clinical trials, rigorous control of the overall incidence of adverse events is desirable, while simultaneously collecting safety information. In this article, we propose group sequential toxicity monitoring strategies to control overall toxicity incidents below a certain level as opposed to performing hypothesis testing, which can be incorporated into an existing study design based on the primary endpoint. We consider two sequential methods: a non-Bayesian approach in which stopping rules are obtained based on the 'future' probability of an excessive toxicity rate; and a Bayesian adaptation modifying the proposed non-Bayesian approach, which can use the information obtained at interim analyses. Through an extensive Monte Carlo study, we show that the Bayesian approach often provides better control of the overall toxicity rate than the non-Bayesian approach. We also investigate adequate toxicity estimation after the studies. We demonstrate the applicability of our proposed methods in controlling the symptomatic intracranial hemorrhage rate for treating acute ischemic stroke patients.
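A sketch of the kind of Bayesian interim check involved, using a Beta-binomial posterior for the toxicity rate. The uniform prior, rate limit, and stopping cutoff are illustrative choices, not the paper's design:

```python
import random

# Interim monitoring sketch: with a Beta(1, 1) prior on the toxicity rate
# and x toxicity events among n patients, consider stopping when the
# posterior probability that the rate exceeds a limit is high.
def prob_rate_exceeds(x, n, limit, draws=200000, seed=0):
    """Monte Carlo estimate of Pr(rate > limit | x events in n patients)."""
    rng = random.Random(seed)
    post = (rng.betavariate(1 + x, 1 + n - x) for _ in range(draws))
    return sum(r > limit for r in post) / draws

p = prob_rate_exceeds(x=5, n=12, limit=0.2)   # 5 events in 12 patients
stop_enrollment = p > 0.9                      # illustrative cutoff
```

A Bayesian adaptation along the lines described would refine such checks at each interim analysis; the non-Bayesian alternative bases the rule on the predicted future toxicity rate instead of the posterior.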
The Size-Weight Illusion is not anti-Bayesian after all: a unifying Bayesian account.
Peters, Megan A K; Ma, Wei Ji; Shams, Ladan
2016-01-01
When we lift two differently-sized but equally-weighted objects, we expect the larger to be heavier, but the smaller feels heavier. However, traditional Bayesian approaches with "larger is heavier" priors predict the smaller object should feel lighter; this Size-Weight Illusion (SWI) has thus been labeled "anti-Bayesian" and has stymied psychologists for generations. We propose that previous Bayesian approaches neglect the brain's inference process about density. In our Bayesian model, objects' perceived heaviness relationship is based on both their size and inferred density relationship: observers evaluate competing, categorical hypotheses about objects' relative densities, the inference about which is then used to produce the final estimate of weight. The model can qualitatively and quantitatively reproduce the SWI and explain other researchers' findings, and also makes a novel prediction, which we confirmed. This same computational mechanism accounts for other multisensory phenomena and illusions; that the SWI follows the same process suggests that competitive-prior Bayesian inference can explain human perception across many domains.
Topics in current aerosol research
Hidy, G M
1971-01-01
Topics in Current Aerosol Research deals with the fundamental aspects of aerosol science, with emphasis on experiment and theory describing highly dispersed aerosols (HDAs) as well as the dynamics of charged suspensions. Topics covered range from the basic properties of HDAs to their formation and methods of generation; sources of electric charges; interactions between fluid and aerosol particles; and one-dimensional motion of charged cloud of particles. This volume comprises 13 chapters and begins with an introduction to the basic properties of HDAs, followed by a discussion on the form
Resolution and Content Improvements to MISR Aerosol and Land Surface Products
Garay, M. J.; Bull, M. A.; Diner, D. J.; Hansen, E. G.; Kalashnikova, O. V.
2015-12-01
Since early 2000, the Multi-angle Imaging SpectroRadiometer (MISR) instrument on NASA's Terra satellite has been providing operational Level 2 (swath-based) aerosol optical depth (AOD) and particle property retrievals at 17.6 km spatial resolution, and atmospherically corrected land surface products at 1.1 km resolution. The performance of the aerosol product has been validated against ground-based Aerosol Robotic Network (AERONET) observations, model comparisons, and climatological assessments. This product has played a major role in studies of the impacts of aerosols on climate and air quality. The surface product has found a variety of uses, particularly at regional scales for assessing vegetation and land surface change. A major development effort has led to the release of an update to the operational MISR Level 2 aerosol and land surface retrieval products (Version 22, in production since December 2007). The new release is designated Version 23. The resolution of the aerosol product has been increased to 4.4 km, allowing more detailed characterization of aerosol spatial variability, especially near local sources and in urban areas. The product content has been simplified and updated to include more robust measures of retrieval uncertainty and other fields to benefit users. The land surface product has also been updated to incorporate the Version 23 aerosol product as input and to improve spatial coverage, particularly over mountainous terrain and snow/ice-covered surfaces. We will describe the major upgrades incorporated in Version 23 and present validation of the aerosol product against both the standard AERONET historical database and high-spatial-density AERONET-DRAGON deployments. Comparisons will also be shown relative to the Version 22 aerosol and land surface products. Applications enabled by these product updates will be discussed.
An Emerging Global Aerosol Climatology from the MODIS Satellite Sensors
Remer, Lorraine A.; Kleidman, Richard G.; Levy, Robert C.; Kaufman, Yoram J.; Tanre, Didier; Mattoo, Shana; Martins, J. Vandelei; Ichoku, Charles; Koren, Ilan; Hongbin, Yu; Holben, Brent N.
2008-01-01
The recently released Collection 5 MODIS aerosol products provide a consistent record of the Earth's aerosol system. Comparing with ground-based AERONET observations of aerosol optical depth (AOD), we find that Collection 5 MODIS aerosol products estimate AOD to within expected accuracy more than 60% of the time over ocean and more than 72% of the time over land. This is similar to previous results for ocean, and better than previous results for land. However, the new Collection introduces a 0.015 offset between the Terra and Aqua global mean AOD over ocean, where none existed previously. Aqua conforms to previous values and expectations, while Terra is high. The cause of the offset is unknown, but changes to calibration are a possible explanation. We focus the climatological analysis on the better understood Aqua retrievals. We find that global mean AOD at 550 nm is 0.13 over ocean and 0.19 over land. AOD in situations with 80% cloud fraction is twice the global mean value, although such situations occur only 2% of the time over ocean and less than 1% of the time over land. There is no drastic change in aerosol particle size associated with these very cloudy situations. Regionally, aerosol amounts vary from polluted areas such as East Asia and India to the cleanest regions such as Australia and the northern continents. In almost all oceans fine mode aerosol dominates over dust, except in the tropical Atlantic downwind of the Sahara and, in some months, the Arabian Sea.
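The phrase "to within expected accuracy" refers to an envelope around the AERONET value inside which a MODIS retrieval is counted as correct. As a sketch, assuming the commonly quoted envelopes of +/-(0.05 + 0.15*tau) over land and +/-(0.03 + 0.05*tau) over ocean (tau being the AERONET AOD), the fraction of matched retrievals falling inside the envelope can be computed as follows; the collocation pairs are made up for illustration.

```python
def within_expected_error(modis_aod, aeronet_aod, over_land):
    """True if a MODIS AOD retrieval lies inside the expected-error
    envelope around the collocated AERONET value (assumed envelopes:
    +/-(0.05 + 0.15*tau) over land, +/-(0.03 + 0.05*tau) over ocean)."""
    tau = aeronet_aod
    margin = 0.05 + 0.15 * tau if over_land else 0.03 + 0.05 * tau
    return abs(modis_aod - aeronet_aod) <= margin

# Illustrative (made-up) collocations: (MODIS AOD, AERONET AOD, over_land)
pairs = [(0.21, 0.19, True), (0.40, 0.30, True), (0.15, 0.13, False)]
frac_within = sum(within_expected_error(*p) for p in pairs) / len(pairs)
```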
International Nuclear Information System (INIS)
In order to study the natural release of aerosol particles by the Amazon Basin tropical rain forest, the composition and size distribution of biogenic aerosol particles were analyzed. The role of the atmospheric emissions from the Amazon Basin rain forest in the global atmosphere will be investigated. The atmosphere was studied at long-term sampling stations in three different locations. The elemental composition of aerosol particles released during biomass burning was also measured in several different ecosystems, from primary forest to savannah. One of the main focuses was to identify and quantify important physical and chemical processes in the generation, transformation and deposition of aerosol particles. Also important was to obtain a better understanding of natural aerosol sources, concerning their identification, characteristics and strength, in order to understand the natural chemistry of the atmosphere on a global scale. 36 refs, 3 figs, 3 tabs
Bayesian nonparametric regression with varying residual density
Pati, Debdeep; Dunson, David B.
2013-01-01
We consider the problem of robust Bayesian inference on the mean regression function allowing the residual density to change flexibly with predictors. The proposed class of models is based on a Gaussian process prior for the mean regression function and mixtures of Gaussians for the collection of residual densities indexed by predictors. Initially considering the homoscedastic case, we propose priors for the residual density based on probit stick-breaking (PSB) scale mixtures and symmetrized ...
Towards Bayesian Deep Learning: A Survey
Wang, Hao; Yeung, Dit-Yan
2016-01-01
While perception tasks such as visual object recognition and text understanding play an important role in human intelligence, the subsequent tasks that involve inference, reasoning and planning require an even higher level of intelligence. The past few years have seen major advances in many perception tasks using deep learning models. For higher-level inference, however, probabilistic graphical models with their Bayesian nature are still more powerful and flexible. To achieve integrated intel...
Improving Environmental Scanning Systems Using Bayesian Networks
Simon Welter; Jörg H. Mayer; Reiner Quick
2013-01-01
As companies’ environments are becoming increasingly volatile, scanning systems gain in importance. We propose a hybrid process model for such systems' information gathering and interpretation tasks that combines quantitative information derived from regression analyses with qualitative knowledge from expert interviews. For the latter, we apply Bayesian networks. We derive the need for such a hybrid process model from a literature review. We lay out our model to find a suitable set of business e...
Approximate Bayesian inference for complex ecosystems
Michael P H Stumpf
2014-01-01
Mathematical models have been central to ecology for nearly a century. Simple models of population dynamics have allowed us to understand fundamental aspects underlying the dynamics and stability of ecological systems. What has remained a challenge, however, is to meaningfully interpret experimental or observational data in light of mathematical models. Here, we review recent developments, notably in the growing field of approximate Bayesian computation (ABC), that allow us to calibrate mathe...
Forming Object Concept Using Bayesian Network
Nakamura, Tomoaki; Nagai, Takayuki
2010-01-01
This chapter has discussed a novel framework for object understanding, and an implementation of the proposed framework using a Bayesian network has been presented. Although the result given in this paper is a preliminary one, we have shown that the system can form object concepts by observing performances by human hands. Online learning is left for future work. Moreover, the model should be extended so that it can represent object usage and work objects.
Bayesian belief networks in business continuity.
Phillipson, Frank; Matthijssen, Edwin; Attema, Thomas
2014-01-01
Business continuity professionals aim to mitigate the various challenges to the continuity of their company. The goal is a coherent system of measures that encompass detection, prevention and recovery. Choices made in one part of the system affect other parts as well as the continuity risks of the company. In complex organisations, however, these relations are far from obvious. This paper proposes the use of Bayesian belief networks to expose these relations, and presents a modelling framework for this approach. PMID:25193453
Informed Source Separation: A Bayesian Tutorial
Knuth, Kevin
2013-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea...
Market Segmentation Using Bayesian Model Based Clustering
Van Hattum, P.
2009-01-01
This dissertation deals with two basic problems in marketing: market segmentation, the grouping of persons who share common aspects, and market targeting, the focusing of marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects, a Bayesian model-based clustering approach is proposed that can be applied to data sets specifically used for market segmentation. The cluster algorithm can handle very l...
Approximate Bayesian computation in population genetics.
Beaumont, Mark A; Zhang, Wenyang; Balding, David J.
2002-01-01
We propose a new method for approximate Bayesian statistical inference on the basis of summary statistics. The method is suited to complex problems that arise in population genetics, extending ideas developed in this setting by earlier authors. Properties of the posterior distribution of a parameter, such as its mean or density curve, are approximated without explicit likelihood calculations. This is achieved by fitting a local-linear regression of simulated parameter values on simulated summ...
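The rejection variant of the method (simulate from the prior, keep parameter values whose simulated summary statistic lands close to the observed one) can be sketched as follows. The local-linear regression adjustment the authors add is omitted here, and the binomial toy model and tolerance are assumptions for illustration.

```python
import random

def abc_rejection(observed_summary, prior_sampler, simulate, summarize,
                  n_draws=20000, tolerance=0.02):
    """Rejection ABC: keep prior draws whose simulated summary statistic
    lies within `tolerance` of the observed one (no likelihood needed)."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        if abs(summarize(simulate(theta)) - observed_summary) <= tolerance:
            accepted.append(theta)
    return accepted

# Toy example: infer a binomial success probability from the sample mean.
random.seed(1)
n_trials = 100
true_p = 0.3
observed = sum(random.random() < true_p for _ in range(n_trials)) / n_trials

post = abc_rejection(
    observed_summary=observed,
    prior_sampler=lambda: random.random(),                # Uniform(0, 1) prior
    simulate=lambda p: [random.random() < p for _ in range(n_trials)],
    summarize=lambda data: sum(data) / len(data),
)
post_mean = sum(post) / len(post)   # should sit near the observed frequency
```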
Bayesian nonparametric duration model with censorship
Directory of Open Access Journals (Sweden)
Joseph Hakizamungu
2007-10-01
Full Text Available This paper is concerned with nonparametric i.i.d. duration models with censored observations, and we establish by a simple and unified approach the general structure of a Bayesian nonparametric estimator for a survival function S. For Dirichlet prior distributions, we describe completely the structure of the posterior distribution of the survival function. These results are essentially supported by prior and posterior independence properties.
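For uncensored durations, the structure described above reduces to Ferguson's classical result: under a Dirichlet process prior with base survival function S0 and precision alpha, the posterior-mean survival curve is a weighted average of S0 and the empirical survival function. A minimal sketch of that uncensored case only; the exponential prior guess and alpha = 1 are assumptions.

```python
import math

def dp_posterior_survival(data, prior_survival, alpha):
    """Posterior-mean survival curve under a Dirichlet process prior
    (uncensored case): a weighted average of the prior guess S0 and the
    empirical survival function, with weights alpha and n."""
    n = len(data)

    def s_hat(t):
        empirical = sum(1 for x in data if x > t) / n
        return (alpha * prior_survival(t) + n * empirical) / (alpha + n)

    return s_hat

# Prior guess: unit-rate exponential; small alpha means a weak prior.
durations = [0.5, 1.2, 2.0, 3.1, 4.4]
s_post = dp_posterior_survival(durations, lambda t: math.exp(-t), alpha=1.0)
```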
Bayesian modeling and classification of neural signals
Lewicki, Michael S.
1994-01-01
Signal processing and classification algorithms often have limited applicability resulting from an inaccurate model of the signal's underlying structure. We present here an efficient, Bayesian algorithm for modeling a signal composed of the superposition of brief, Poisson-distributed functions. This methodology is applied to the specific problem of modeling and classifying extracellular neural waveforms, which are composed of a superposition of an unknown number of action potentials (APs). ...
Bayesian Estimation and Inference Using Stochastic Electronics.
Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André
2016-01-01
In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks, with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the probability of an external distractor (noise) interfering with the observations. In the second implementation, referred to as Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes on the order of tens of nanometers, owing to its low noise margin, the effect of high-energy cosmic rays, and the low supply voltage. In our framework, the flipping of random individual bits does not affect system performance because information is encoded in a bit stream. PMID:27047326
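The robustness claim rests on the encoding used in stochastic computing: a probability is carried as the ones-density of a Bernoulli bit stream, so flipping a few bits barely shifts the encoded value, and arithmetic reduces to simple gates. Below is a minimal software sketch of one such building block, an AND gate multiplying two probabilities; the stream length and seed are arbitrary choices, and this is an illustration of the general principle rather than the paper's circuits.

```python
import random

def bitstream(p, length, rng):
    """Encode probability p as a Bernoulli bit stream of the given length."""
    return [rng.random() < p for _ in range(length)]

def stochastic_multiply(p, q, length=200000, seed=0):
    """Multiply two probabilities with a single AND gate: the AND of two
    independent streams encoding p and q has ones-density close to p*q."""
    rng = random.Random(seed)
    a = bitstream(p, length, rng)
    b = bitstream(q, length, rng)
    return sum(x and y for x, y in zip(a, b)) / length

est = stochastic_multiply(0.6, 0.5)   # ones-density near 0.6 * 0.5 = 0.3
```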
Bayesian biclustering of gene expression data
Liu Jun S; Gu Jiajun
2008-01-01
Abstract Background Biclustering of gene expression data searches for local patterns of gene expression. A bicluster (or a two-way cluster) is defined as a set of genes whose expression profiles are mutually similar within a subset of experimental conditions/samples. Although several biclustering algorithms have been studied, few are based on rigorous statistical models. Results We developed a Bayesian biclustering model (BBC), and implemented a Gibbs sampling procedure for its statistical in...
Nonparametric Bayesian Storyline Detection from Microtexts
Krishnan, Vinodh; Eisenstein, Jacob
2016-01-01
News events and social media are composed of evolving storylines, which capture public attention for a limited period of time. Identifying these storylines would enable many high-impact applications, such as tracking public interest and opinion in ongoing crisis events. However, this requires integrating temporal and linguistic information, and prior work takes a largely heuristic approach. We present a novel online non-parametric Bayesian framework for storyline detection, using the distance...
Dual Control for Approximate Bayesian Reinforcement Learning
Klenske, Edgar D.; Hennig, Philipp
2015-01-01
Control of non-episodic, finite-horizon dynamical systems with uncertain dynamics poses a tough and elementary case of the exploration-exploitation trade-off. Bayesian reinforcement learning, reasoning about the effect of actions and future observations, offers a principled solution, but is intractable. We review, then extend an old approximate approach from control theory---where the problem is known as dual control---in the context of modern regression methods, specifically generalized line...
A Bayesian framework for robotic programming
Lebeltel, Olivier; Diard, Julien; Bessiere, Pierre; Mazer, Emmanuel
2000-01-01
We propose an original method for programming robots based on Bayesian inference and learning. This method formally deals with problems of uncertainty and incomplete information that are inherent to the field. Indeed, the principal difficulties of robot programming come from the unavoidable incompleteness of the models used. We present the formalism for describing a robotic task as well as the resolution methods. This formalism is inspired by the theory of probability, suggested by the physi...
Constrained bayesian inference of project performance models
Sunmola, Funlade
2013-01-01
Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability, such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information, including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...
Bayesian Variable Selection via Particle Stochastic Search.
Shi, Minghui; Dunson, David B
2011-02-01
We focus on Bayesian variable selection in regression models. One challenge is to search the huge model space adequately, while identifying high posterior probability regions. In the past decades, the main focus has been on the use of Markov chain Monte Carlo (MCMC) algorithms for these purposes. In this article, we propose a new computational approach based on sequential Monte Carlo (SMC), which we refer to as particle stochastic search (PSS). We illustrate PSS through applications to linear regression and probit models.
Bayesian Spatial Modelling with R-INLA
Finn Lindgren; Håvard Rue
2015-01-01
The principles behind the interface to continuous domain spatial models in the R-INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic...
Sparse Bayesian learning in ISAR tomography imaging
Institute of Scientific and Technical Information of China (English)
SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang
2015-01-01
Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from high sidelobes and low resolution. This paper concerns ISAR tomography image reconstruction within a sparse Bayesian framework. Firstly, the sparse ISAR tomography imaging model is established in light of CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. Since the performance of existing CS-based ISAR imaging algorithms is sensitive to the user parameter, the existing algorithms are inconvenient to use in practice. It is well known that the Bayesian formalism of recovery algorithms named sparse Bayesian learning (SBL) acts as an effective tool in regression and classification; it uses an efficient expectation maximization procedure to estimate the necessary parameters and retains a preferable property of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and the superiority of the proposed algorithm over the existing algorithms.
Particle identification in ALICE: a Bayesian approach
Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; et al. (ALICE Collaboration)
Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Vargas Trevino, Aurora Diozcora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, Misha; 
Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym
2016-01-01
We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high-purity samples of identified particles in the decay channels ${\\rm K}_{\\rm S}^{\\rm 0}\\rightarrow \\pi^+\\pi^-$, $\\phi\\rightarrow {\\rm K}^-{\\rm K}^+$ and $\\Lambda\\rightarrow{\\rm p}\\pi^-$ in p–Pb collisions at $\\sqrt{s_{\\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
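The combination step underlying this approach is plain Bayes' rule: per-detector likelihoods for each species hypothesis are multiplied together and weighted by species priors. A minimal sketch, with entirely hypothetical likelihood and prior values (ALICE's actual detector-response parameterizations are not reproduced here):

```python
import math

def bayes_pid(likelihoods_per_detector, priors):
    """Combine per-detector likelihoods P(signal | species) with species
    priors into posterior species probabilities (illustrative sketch)."""
    species = priors.keys()
    # Product of the detector likelihoods, weighted by the prior
    unnorm = {s: priors[s] * math.prod(L[s] for L in likelihoods_per_detector)
              for s in species}
    total = sum(unnorm.values())
    return {s: v / total for s, v in unnorm.items()}

# Hypothetical dE/dx and time-of-flight likelihoods for one track
dedx = {"pion": 0.60, "kaon": 0.30, "proton": 0.10}
tof = {"pion": 0.70, "kaon": 0.20, "proton": 0.10}
priors = {"pion": 0.80, "kaon": 0.12, "proton": 0.08}

post = bayes_pid([dedx, tof], priors)
```

Multiplying likelihoods assumes the detector responses are conditionally independent given the species, which is the usual simplification in such combinations.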
Bayesian Analysis of Individual Level Personality Dynamics
Directory of Open Access Journals (Sweden)
Edward Cripps
2016-07-01
Full Text Available A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analyses of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.
Bayesian Recurrent Neural Network for Language Modeling.
Chien, Jen-Tzung; Ku, Yuan-Chu
2016-02-01
A language model (LM) calculates the probability of a word sequence and provides the solution to word prediction for a variety of information systems. A recurrent neural network (RNN) is powerful for learning the large-span dynamics of a word sequence in the continuous space. However, the training of the RNN-LM is an ill-posed problem because of too many parameters from a large dictionary size and a high-dimensional hidden layer. This paper presents a Bayesian approach to regularize the RNN-LM and apply it to continuous speech recognition. We aim to penalize an overly complex RNN-LM by compensating for the uncertainty of the estimated model parameters, which is represented by a Gaussian prior. The objective function in a Bayesian classification network is formed as the regularized cross-entropy error function. The regularized model is constructed not only by calculating the regularized parameters according to the maximum a posteriori criterion but also by estimating the Gaussian hyperparameter by maximizing the marginal likelihood. A rapid approximation to the Hessian matrix is developed to implement the Bayesian RNN-LM (BRNN-LM) by selecting a small set of salient outer-products. The proposed BRNN-LM achieves a sparser model than the RNN-LM. Experiments on different corpora show the robustness of system performance by applying the rapid BRNN-LM under different conditions.
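The core of the regularization is that a zero-mean Gaussian prior on the weights turns the cross-entropy objective into cross-entropy plus an L2 penalty scaled by the prior variance. A sketch of that regularized objective with toy probabilities and parameters (not the paper's RNN or Hessian approximation):

```python
import numpy as np

rng = np.random.default_rng(0)

def regularized_cross_entropy(probs, targets, params, sigma2):
    """Cross-entropy error plus a Gaussian-prior (MAP) penalty on the
    parameters; the hyperparameter sigma2 is the prior variance."""
    ce = -np.sum(np.log(probs[np.arange(len(targets)), targets]))
    penalty = np.sum(params ** 2) / (2.0 * sigma2)
    return ce + penalty

# Toy example: 3 predictions over a 4-word vocabulary
probs = np.array([[0.7, 0.1, 0.1, 0.1],
                  [0.2, 0.5, 0.2, 0.1],
                  [0.1, 0.1, 0.1, 0.7]])
targets = np.array([0, 1, 3])
params = rng.normal(size=10)   # stand-in for RNN weights

loss = regularized_cross_entropy(probs, targets, params, sigma2=1.0)
```

A larger prior variance weakens the penalty, so the objective approaches the plain cross-entropy; in the paper this hyperparameter is instead tuned by maximizing the marginal likelihood.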
Bayesian Analysis of Individual Level Personality Dynamics
Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann
2016-01-01
A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415
Bayesian Inference of a Multivariate Regression Model
Directory of Open Access Journals (Sweden)
Marick S. Sinay
2014-01-01
Full Text Available We explore Bayesian inference of a multivariate linear regression model using a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the elements of the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to account for the construction of a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
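The appeal of the matrix-logarithm parameterization is that any real symmetric matrix maps back, via the matrix exponential, to a valid positive-definite covariance, so an unconstrained normal prior on its unique elements is legitimate. A small numerical illustration for the 2x2 case (the prior scale is invented, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def expm_sym(A):
    """Matrix exponential of a real symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(w)) @ V.T

# Draw the 3 unique elements of a symmetric 2x2 log-covariance from a
# normal prior (the 0.5 scale is illustrative).
a11, a12, a22 = rng.normal(scale=0.5, size=3)
A = np.array([[a11, a12], [a12, a22]])
Sigma = expm_sym(A)          # always a symmetric positive-definite covariance
eigvals = np.linalg.eigvalsh(Sigma)
```

Since the eigenvalues of Sigma are exp of the eigenvalues of A, they are strictly positive for any draw, which is exactly the constraint an inverse-Wishart prior enforces by other means.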
Bayesian Methods for Radiation Detection and Dosimetry
International Nuclear Information System (INIS)
We performed work in three areas: radiation detection, external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high and low activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes for a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed compartmental activities. From the estimated probability densities of the model parameters we were able to derive the densities for compartmental activities for a two-compartment catenary model at different times. We also calculated the average activities and their standard deviation for a simple two-compartment model.
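The kind of net-activity posterior density described can be illustrated with a simple grid evaluation, assuming Poisson gross counts, a known background rate, and a flat prior on the non-negative net rate (all numbers hypothetical, not the paper's data):

```python
import numpy as np

def net_rate_posterior(gross_counts, t, background_rate, s_grid):
    """Posterior density (on a grid) for a non-negative net count rate s,
    with Poisson gross counts of mean (s + b)*t and a flat prior on s >= 0."""
    mean = (s_grid + background_rate) * t
    log_post = gross_counts * np.log(mean) - mean  # Poisson log-likelihood (up to a constant)
    post = np.exp(log_post - log_post.max())       # stabilize before exponentiating
    ds = s_grid[1] - s_grid[0]
    return post / (post.sum() * ds)                # normalize on the grid

s = np.linspace(0.0, 5.0, 2001)
post = net_rate_posterior(gross_counts=30, t=60.0, background_rate=0.3, s_grid=s)
mode = s[np.argmax(post)]   # near gross_counts/t - background = 0.2
```

Plotting `post` against `s` gives the pictorial uncertainty the abstract refers to: the whole density, not just a point estimate and error bar.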
Bayesian and Dempster–Shafer fusion
Indian Academy of Sciences (India)
Subhash Challa; Don Koks
2004-04-01
The Kalman Filter is traditionally viewed as a prediction–correction filtering algorithm. In this work we show that it can be viewed as a Bayesian fusion algorithm and derive it using Bayesian arguments. We begin with an outline of Bayes theory, using it to discuss well-known quantities such as priors, likelihoods and posteriors, and we provide the basic Bayesian fusion equation. We derive the Kalman Filter from this equation using a novel method to evaluate the Chapman–Kolmogorov prediction integral. We then use the theory to fuse data from multiple sensors. Vying with this approach is the Dempster–Shafer theory, which deals with measures of “belief”, and is based on the nonclassical idea of “mass” as opposed to probability. Although these two measures look very similar, there are some differences. We point them out through outlining the ideas of the Dempster–Shafer theory and presenting the basic Dempster–Shafer fusion equation. Finally we compare the two methods, and discuss the relative merits and demerits using an illustrative example.
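The "Kalman Filter as Bayesian fusion" view is easiest to see in one dimension, where fusing a Gaussian prediction with a Gaussian measurement is just precision-weighted averaging. A minimal sketch with hypothetical numbers:

```python
def bayes_fuse(mean1, var1, mean2, var2):
    """Fuse two Gaussian estimates of the same quantity: the posterior is
    Gaussian with precision-weighted mean. This is the Bayesian core of
    the Kalman measurement update, reduced to one dimension."""
    w1, w2 = 1.0 / var1, 1.0 / var2    # precisions
    var = 1.0 / (w1 + w2)
    mean = var * (w1 * mean1 + w2 * mean2)
    return mean, var

# Prediction (prior) fused with a sensor measurement (likelihood)
prior_mean, prior_var = 2.0, 4.0
meas_mean, meas_var = 3.0, 1.0
post_mean, post_var = bayes_fuse(prior_mean, prior_var, meas_mean, meas_var)
```

The fused variance is always smaller than either input variance, and the fused mean leans toward the more precise source, which here is the measurement.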
Converse, Sarah J.; Royle, J. Andrew; Urbanek, Richard P.
2012-01-01
Inbreeding depression is frequently a concern of managers interested in restoring endangered species. Decisions to reduce the potential for inbreeding depression by balancing genotypic contributions to reintroduced populations may exact a cost on long-term demographic performance of the population if those decisions result in reduced numbers of animals released and/or restriction of particularly successful genotypes (i.e., heritable traits of particular family lines). As part of an effort to restore a migratory flock of Whooping Cranes (Grus americana) to eastern North America using the offspring of captive breeders, we obtained a unique dataset which includes post-release mark-recapture data, as well as the pedigree of each released individual. We developed a Bayesian formulation of a multi-state model to analyze radio-telemetry, band-resight, and dead recovery data on reintroduced individuals, in order to track survival and breeding state transitions. We used studbook-based individual covariates to examine the comparative evidence for and degree of effects of inbreeding, genotype, and genotype quality on post-release survival of reintroduced individuals. We demonstrate implementation of the Bayesian multi-state model, which allows for the integration of imperfect detection, multiple data types, random effects, and individual- and time-dependent covariates. Our results provide only weak evidence for an effect of the quality of an individual's genotype in captivity on post-release survival as well as for an effect of inbreeding on post-release survival. We plan to integrate our results into a decision-analytic modeling framework that can explicitly examine tradeoffs between the effects of inbreeding and the effects of genotype and demographic stochasticity on population establishment.
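The multi-state structure in the crane analysis can be pictured with a toy simulator: a latent state chain (e.g. non-breeder, breeder, dead) observed through state-dependent detection probabilities. All rates below are invented for illustration, not the paper's estimates:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative states for a released bird: 0 = non-breeder, 1 = breeder, 2 = dead.
# Rows of the transition matrix give annual state-transition probabilities.
transition = np.array([[0.70, 0.15, 0.15],
                       [0.05, 0.80, 0.15],
                       [0.00, 0.00, 1.00]])   # death is absorbing
detection = np.array([0.6, 0.9, 0.0])          # resight probability by state

def simulate_history(n_years):
    """Simulate one capture history: latent states plus imperfect detection,
    the data structure a multi-state mark-recapture model is fit to."""
    state, history = 0, []
    for _ in range(n_years):
        state = rng.choice(3, p=transition[state])
        seen = rng.random() < detection[state]
        history.append((state, seen))
    return history

hist = simulate_history(5)
```

A Bayesian fit inverts this simulation: given only the `seen` column (and recoveries), it infers the transition and detection parameters, optionally with individual covariates such as the studbook-based genotype scores.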
Directory of Open Access Journals (Sweden)
Yufei Huang
2007-06-01
Full Text Available Reverse engineering of genetic regulatory networks from time series microarray data is investigated. We propose a dynamic Bayesian network (DBN) model and a full Bayesian learning scheme. The proposed DBN directly models the continuous expression levels and is also associated with parameters that indicate the degree as well as the type of regulation. To learn the network from data, we propose a reversible jump Markov chain Monte Carlo (RJMCMC) algorithm. The RJMCMC algorithm can provide not only more accurate inference results than the deterministic alternative algorithms but also an estimate of the a posteriori probabilities (APPs) of the network topology. The estimated APPs provide useful information on the confidence of the inferred results and can also be used for efficient Bayesian data integration. The proposed approach is tested on yeast cell cycle microarray data and the results are compared with the KEGG pathway map.
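As a concrete picture of "parameters that indicate the degree as well as the type of regulation", here is a toy linear-Gaussian DBN rolled forward in time; the signed weights are invented for illustration, and the paper's RJMCMC learning step is not shown:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 3-gene network: sign of a weight gives the regulation type
# (activation/repression), its magnitude gives the degree.
weights = np.array([[0.0, 0.8, 0.0],    # gene 0 activated by gene 1
                    [0.0, 0.0, -0.5],   # gene 1 repressed by gene 2
                    [0.3, 0.0, 0.0]])   # gene 2 activated by gene 0
noise_sd = 0.1

def simulate(x0, n_steps):
    """Roll the DBN forward: continuous expression levels at time t+1 are a
    noisy linear function of the regulators' levels at time t."""
    xs = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        xs.append(weights @ xs[-1] + rng.normal(scale=noise_sd, size=3))
    return np.array(xs)

traj = simulate([1.0, 0.5, -0.2], n_steps=10)
```

Reverse engineering is the inverse problem: given trajectories like `traj`, infer which entries of `weights` are nonzero (the topology) and their values, which is where the RJMCMC sampler over network structures comes in.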
Source strength of fungal spore aerosolization from moldy building material
Energy Technology Data Exchange (ETDEWEB)
Górny, Rafał L.; Reponen, Tiina; Grinshpun, Sergey A.; Willeke, Klaus [Cincinnati Univ., Dept. of Environmental Health, Cincinnati, OH (United States)
2001-07-01
The release of Aspergillus versicolor, Cladosporium cladosporioides, and Penicillium melinii spores from agar and ceiling tile surfaces was tested under different controlled environmental conditions using a newly designed and constructed aerosolization chamber. This study revealed that all the investigated parameters, such as fungal species, air velocity above the surface, texture of the surface, and vibration of contaminated material, affected the fungal spore release. It was found that typical indoor air currents can release up to 200 spores cm⁻² from surfaces with fungal spores during 30-min experiments. The release of fungal spores from smooth agar surfaces was found to be inadequate for accurately predicting the emission from rough ceiling tile surfaces because the air turbulence increases the spore release from a rough surface. A vibration at a frequency of 1 Hz at a power level of 14 W resulted in a significant increase in the spore release rate. The release appears to depend on the morphology of the fungal colonies grown on ceiling tile surfaces, including the thickness of conidiophores, the length of spore chains, and the shape of spores. The spores were found to be released continuously during each 30-min experiment. However, the release rate was usually highest during the first few minutes of exposure to air currents and mechanical vibration. About 71-88% of the spores released during a 30-min interval became airborne during the first 10 min. (Author)
Learning Local Components to Understand Large Bayesian Networks
DEFF Research Database (Denmark)
Zeng, Yifeng; Xiang, Yanping; Cordero, Jorge;
2009-01-01
Bayesian networks are known for providing an intuitive and compact representation of probabilistic information and allowing the creation of models over a large and complex domain. Bayesian learning and reasoning are nontrivial for a large Bayesian network. In parallel, it is a tough job for users...... (domain experts) to extract accurate information from a large Bayesian network due to dimensional difficulty. We define a formulation of local components and propose a clustering algorithm to learn such local components given complete data. The algorithm groups together most inter-relevant attributes...... in a domain. We evaluate its performance on three benchmark Bayesian networks and provide results in support. We further show that the learned components may represent local knowledge more precisely in comparison to the full Bayesian networks when working with a small amount of data....
Aerosol absorption and radiative forcing
Directory of Open Access Journals (Sweden)
P. Stier
2007-05-01
Full Text Available We present a comprehensive examination of aerosol absorption with a focus on evaluating the sensitivity of the global distribution of aerosol absorption to key uncertainties in the process representation. For this purpose we extended the comprehensive aerosol-climate model ECHAM5-HAM by effective medium approximations for the calculation of aerosol effective refractive indices, updated black carbon refractive indices, new cloud radiative properties considering the effect of aerosol inclusions, as well as by modules for the calculation of long-wave aerosol radiative properties and instantaneous aerosol forcing. The evaluation of the simulated aerosol absorption optical depth with the AERONET sun-photometer network shows a good agreement in the large scale global patterns. On a regional basis it becomes evident that the update of the BC refractive indices to Bond and Bergstrom (2006) significantly improves the previous underestimation of the aerosol absorption optical depth. In the global annual-mean, absorption acts to reduce the short-wave anthropogenic aerosol top-of-atmosphere (TOA) radiative forcing clear-sky from –0.79 to –0.53 W m^{−2} (33%) and all-sky from –0.47 to –0.13 W m^{−2} (72%). Our results confirm that basic assumptions about the BC refractive index play a key role for aerosol absorption and radiative forcing. The effect of the usage of more accurate effective medium approximations is comparably small. We demonstrate that the diversity in the AeroCom land-surface albedo fields contributes to the uncertainty in the simulated anthropogenic aerosol radiative forcings: the usage of an upper versus lower bound of the AeroCom land albedos introduces a global annual-mean TOA forcing range of 0.19 W m^{−2} (36%) clear-sky and of 0.12 W m^{−2} (92%) all-sky. The consideration of black carbon inclusions on cloud radiative properties results in a small global annual-mean all-sky absorption of 0.05 W
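The quoted percentage reductions follow directly from the forcing values; a quick arithmetic check (forcing values copied from the abstract):

```python
def forcing_reduction(without_absorption, with_absorption):
    """Fractional reduction in the magnitude of a (negative) TOA forcing,
    as quoted in the abstract."""
    return (abs(without_absorption) - abs(with_absorption)) / abs(without_absorption)

clear_sky = forcing_reduction(-0.79, -0.53)   # clear-sky reduction, ~33%
all_sky = forcing_reduction(-0.47, -0.13)     # all-sky reduction, ~72%
```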
Bayesian networks as a tool for epidemiological systems analysis
Lewis, F.I.
2012-01-01
Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter ...
Small sample Bayesian analyses in assessment of weapon performance
Institute of Scientific and Technical Information of China (English)
无
2007-01-01
Abundant test data are required in the assessment of weapon performance. When weapon test data are insufficient, Bayesian analyses in small-sample circumstances should be considered, and the test data should be supplemented by simulations. Several Bayesian approaches are discussed and some limitations are found. An improvement is put forward after the limitations of the available Bayesian approaches are analyzed, and the improved approach is applied to the assessment of the performance of a new weapon.
BAYESIAN ESTIMATION OF RELIABILITY IN TWO-PARAMETER GEOMETRIC DISTRIBUTION
Directory of Open Access Journals (Sweden)
Sudhansu S. Maiti
2015-12-01
Full Text Available Bayesian estimation of the reliability of a component, R(t) = P(X ≥ t), when X follows a two-parameter geometric distribution, has been considered. The Maximum Likelihood Estimator (MLE), an Unbiased Estimator and a Bayesian Estimator have been compared. Bayesian estimation of the component reliability R = P(X ≤ Y), arising under the stress-strength setup, when Y is assumed to follow an independent two-parameter geometric distribution, has also been discussed, assuming independent priors for the parameters under different loss functions.
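For intuition, here is a sketch of the Bayes estimate of R(t) = P(X ≥ t) under squared-error loss for the simpler one-parameter geometric model on {0, 1, ...} with a Beta prior on p; the paper's two-parameter and stress-strength cases are not reproduced, and the data are hypothetical:

```python
def bayes_reliability(t, data, a=1.0, b=1.0):
    """Bayes estimate of R(t) = P(X >= t) for X ~ Geometric(p) on {0, 1, ...},
    with p ~ Beta(a, b). The posterior is Beta(a + n, b + sum(data)), and the
    squared-error-loss estimate is E[(1 - p)^t] under that posterior, which
    reduces to a finite product of ratios of Beta-function terms."""
    a_post = a + len(data)
    b_post = b + sum(data)
    r = 1.0
    for i in range(t):
        r *= (b_post + i) / (a_post + b_post + i)
    return r

data = [0, 2, 1, 3, 0, 1]                              # hypothetical failure counts
r = [bayes_reliability(t, data) for t in range(4)]     # decreasing in t
```

Since each factor in the product is below one, the estimate is automatically a valid survival function: it starts at 1 at t = 0 and decreases with t.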
Chain ladder method: Bayesian bootstrap versus classical bootstrap
Peters, Gareth W.; Wüthrich, Mario V.; Shevchenko, Pavel V.
2010-01-01
The intention of this paper is to estimate a Bayesian distribution-free chain ladder (DFCL) model using approximate Bayesian computation (ABC) methodology. We demonstrate how to estimate quantities of interest in claims reserving and compare the estimates to those obtained from classical and credibility approaches. In this context, a novel numerical procedure utilising Markov chain Monte Carlo (MCMC), ABC and a Bayesian bootstrap procedure was developed in a truly distribution-free setting. T...
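The ABC idea used here can be sketched in its simplest rejection form: draw parameters from the prior, simulate data, and keep draws whose summary statistic lands close to the observed one. This is a generic illustration with an exponential toy model, not the paper's chain ladder setup or its MCMC-embedded ABC:

```python
import random

random.seed(0)

def abc_rejection(observed_mean, n_obs, prior_draw, n_sims, tol):
    """Approximate Bayesian computation by rejection: accept a parameter
    draw when the simulated summary statistic (here the sample mean) lies
    within tol of the observed one."""
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw()
        sim = sum(random.expovariate(1.0 / theta) for _ in range(n_obs)) / n_obs
        if abs(sim - observed_mean) < tol:
            accepted.append(theta)
    return accepted

post = abc_rejection(observed_mean=2.0, n_obs=30,
                     prior_draw=lambda: random.uniform(0.1, 10.0),
                     n_sims=2000, tol=0.3)
```

The accepted draws approximate the posterior without ever evaluating a likelihood, which is what makes ABC attractive in distribution-free settings; the tolerance trades approximation quality against acceptance rate.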
A tutorial introduction to Bayesian models of cognitive development
Perfors, Amy; Tenenbaum, Joshua B.; Griffiths, Thomas L.; Xu, Fei
2010-01-01
We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in...
Bayesian just-so stories in psychology and neuroscience
Bowers, J.S.; Davis, Colin J
2012-01-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make three main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak at best. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account fo...
The Bayesian Modelling Of Inflation Rate In Romania
Mihaela Simionescu
2014-01-01
Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences, including specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach applied in economics, starting with Bayes' theorem. For the Bayesian linear regression models the methodology of estim...
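For the Bayesian linear regression case mentioned above, a conjugate Normal prior yields a closed-form posterior. A minimal single-predictor sketch with known noise variance (all names and values are illustrative, not from the paper):

```python
import random

def bayes_slope_posterior(x, y, sigma2=1.0, tau2=10.0):
    """Conjugate posterior for the slope in y = beta * x + eps,
    eps ~ N(0, sigma2), with prior beta ~ N(0, tau2).
    Returns (posterior mean, posterior variance)."""
    precision = sum(xi * xi for xi in x) / sigma2 + 1.0 / tau2
    mean = sum(xi * yi for xi, yi in zip(x, y)) / sigma2 / precision
    return mean, 1.0 / precision

# Synthetic data with true slope 2.5; the posterior concentrates around it.
random.seed(42)
beta_true = 2.5
xs = [random.uniform(-1.0, 1.0) for _ in range(100)]
ys = [beta_true * xi + random.gauss(0.0, 1.0) for xi in xs]
m, v = bayes_slope_posterior(xs, ys)
```

As the data grow, the prior term `1/tau2` is swamped by the likelihood term and the posterior mean approaches the least-squares estimate.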
Bayesian non- and semi-parametric methods and applications
Rossi, Peter
2014-01-01
This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number
Kahn, R. A.
2009-12-01
As expected, the aerosol data products from the NASA Earth Observing System’s MISR and MODIS instruments provide significant advances in regional and global aerosol optical depth (AOD) mapping, aerosol type measurement, and source plume characterization from space. Although these products have been and are being used for many applications, ranging from regional air quality assessment, to aerosol air mass type evolution, to aerosol injection height and aerosol transport model validation, uncertainties still limit the quantitative constraints these satellite data place on global-scale direct aerosol radiative forcing. Some further refinement of the current aerosol products is possible, but a major advance in this area seems to require a different paradigm, involving the integration of satellite and suborbital data with models. This presentation will briefly summarize where we stand, and what incremental advances we can expect, with the current aerosol products, and will then elaborate on some initial steps aimed at the necessary integration. Many other AGU presentations, covering parts of the community’s emerging efforts in this direction, will be referenced, and key points from the recently released CCSP-SAP (US Climate Change Science Program - Synthesis and Assessment Product) 2.3 - Atmospheric aerosols: Properties and Climate Impacts, will be included in the discussion.
MODIS aerosol product at 3 km spatial resolution for urban and air quality studies
Mattoo, S.; Remer, L. A.; Levy, R. C.; Holben, B. N.; Smirnov, A.
2008-12-01
The MODerate resolution Imaging Spectroradiometer (MODIS) aboard the Terra and Aqua satellites has been producing an aerosol product since early 2000. The original product reports aerosol optical depth and a variety of other aerosol parameters at a spatial resolution of 10 km over both land and ocean. The 10 km product is actually constructed from 500 m pixels, which permits a strict selection process to choose the "best" or "cleanest" pixels in each 10 km square for use in the aerosol retrieval. Thus, the original 10 km product is accurate and useful in many applications. However, the 10 km product can miss narrow aerosol plumes and the spatial variability associated with urban air pollution. The MODIS aerosol team will be introducing a finer resolution aerosol product over land regions in the next release of the product (Collection 6). The new product will be produced at 3 km resolution. It is based on the same procedures as the original product and benefits from the same spatial variability criteria for finding and masking cloudy pixels. The 3 km product does capture the higher spatial variability associated with individual aerosol plumes. However, it is noisier than the 10 km product. Both products will be available operationally in Collection 6. The new 3 km product offers new synergistic possibilities with PM2.5 monitoring networks, AERONET and various air quality models such as CMAQ.
Directory of Open Access Journals (Sweden)
David Lunn
Full Text Available The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
A Bayesian Approach for Localization of Acoustic Emission Source in Plate-Like Structures
Directory of Open Access Journals (Sweden)
Gang Yan
2015-01-01
Full Text Available This paper presents a Bayesian approach for localizing an acoustic emission (AE) source in plate-like structures with consideration of uncertainties from modeling error and measurement noise. A PZT sensor network is deployed to monitor and acquire AE wave signals released by possible damage. By using the continuous wavelet transform (CWT), the time-of-flight (TOF) information of the AE wave signals is extracted and measured. With a theoretical TOF model, a Bayesian parameter identification procedure is developed to obtain the AE source location and the wave velocity at a specific frequency simultaneously, and meanwhile quantify their uncertainties. It is based on Bayes’ theorem that the posterior distributions of the parameters about the AE source location and the wave velocity are obtained by relating their priors and the likelihood of the measured time difference data. A Markov chain Monte Carlo (MCMC) algorithm is employed to draw samples to approximate the posteriors. Also, a data fusion scheme is performed to fuse results identified at multiple frequencies to increase accuracy and reduce uncertainty of the final localization results. Experimental studies on a stiffened aluminum panel with simulated AE events by pencil lead breaks (PLBs) are conducted to validate the proposed Bayesian AE source localization approach.
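The core of such a procedure, a random-walk Metropolis sampler over source coordinates and wave speed given time-difference data, can be sketched as follows. This is a simplified synthetic setup, not the paper's stiffened-panel experiment; the sensor layout, noise level and step sizes are our assumptions:

```python
import math
import random

def tdoa_log_likelihood(theta, sensors, dt_obs, noise_sd=1e-6):
    """Gaussian log-likelihood of time differences (relative to sensor 0)
    for a source at (x, y) and wave speed v."""
    x, y, v = theta
    t = [math.hypot(x - sx, y - sy) / v for sx, sy in sensors]
    return sum(-0.5 * (((t[i] - t[0]) - dt) / noise_sd) ** 2
               for i, dt in dt_obs.items())

def metropolis(sensors, dt_obs, n_iter=20000, step=(0.005, 0.005, 50.0)):
    """Random-walk Metropolis over (x, y, v) with flat priors."""
    theta = [0.25, 0.25, 4000.0]   # initial guess: plate centre, guessed speed
    ll = tdoa_log_likelihood(theta, sensors, dt_obs)
    samples = []
    for _ in range(n_iter):
        prop = [theta[k] + random.gauss(0.0, step[k]) for k in range(3)]
        if prop[2] > 0:
            ll_prop = tdoa_log_likelihood(prop, sensors, dt_obs)
            if math.log(random.random()) < ll_prop - ll:
                theta, ll = prop, ll_prop
        samples.append(list(theta))
    return samples

# Synthetic test: four sensors on a 0.5 m x 0.5 m plate, 5000 m/s wave speed.
random.seed(3)
sensors = [(0.0, 0.0), (0.5, 0.0), (0.0, 0.5), (0.5, 0.5)]
src, v_true = (0.3, 0.2), 5000.0
toa = [math.hypot(src[0] - sx, src[1] - sy) / v_true for sx, sy in sensors]
dt_obs = {i: toa[i] - toa[0] + random.gauss(0.0, 1e-6) for i in (1, 2, 3)}
burn = metropolis(sensors, dt_obs)[10000:]
x_hat = sum(s[0] for s in burn) / len(burn)
y_hat = sum(s[1] for s in burn) / len(burn)
```

The posterior spread of the retained samples quantifies the localization uncertainty, which is the point of the Bayesian formulation over a purely deterministic triangulation.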
Bayesian inference of the initial conditions from large-scale structure surveys
Leclercq, Florent
2016-10-01
Analysis of three-dimensional cosmological surveys has the potential to answer outstanding questions on the initial conditions from which structure appeared, and therefore on the very high energy physics at play in the early Universe. We report on recently proposed statistical data analysis methods designed to study the primordial large-scale structure via physical inference of the initial conditions in a fully Bayesian framework, and applications to the Sloan Digital Sky Survey data release 7. We illustrate how this approach led to a detailed characterization of the dynamic cosmic web underlying the observed galaxy distribution, based on the tidal environment.
Doing bayesian data analysis a tutorial with R and BUGS
Kruschke, John K
2011-01-01
There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis tractable for a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all
Bayesian missing data problems EM, data augmentation and noniterative computation
Tan, Ming T; Ng, Kai Wang
2009-01-01
Bayesian Missing Data Problems: EM, Data Augmentation and Noniterative Computation presents solutions to missing data problems through explicit or noniterative sampling calculation of Bayesian posteriors. The methods are based on the inverse Bayes formulae discovered by one of the authors in 1995. Applying the Bayesian approach to important real-world problems, the authors focus on exact numerical solutions, a conditional sampling approach via data augmentation, and a noniterative sampling approach via EM-type algorithms. After introducing the missing data problems, Bayesian approach, and poste
Bayesian integer frequency offset estimator for MIMO-OFDM systems
Institute of Scientific and Technical Information of China (English)
[No author listed]
2008-01-01
Carrier frequency offset (CFO) in MIMO-OFDM systems can be decoupled into two parts: the fractional frequency offset (FFO) and the integer frequency offset (IFO). The problem of IFO estimation is addressed and a new IFO estimator based on the Bayesian philosophy is proposed. It is also shown that the Bayesian IFO estimator is optimal among all IFO estimators. Furthermore, the Bayesian estimator can take advantage of oversampling, so that better performance can be obtained. Finally, numerical results show the optimality of the Bayesian estimator and validate the theoretical analysis.
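The IFO/FFO decoupling can be demonstrated numerically: the integer part of the offset, in units of the subcarrier spacing, cyclically shifts the subcarrier indices, while the fractional remainder stays within half a bin. A small sketch (the numeric values are illustrative, not from the paper):

```python
import cmath

def split_cfo(cfo_hz, spacing_hz):
    """Split a CFO into integer (IFO) and fractional (FFO) parts,
    measured in units of the subcarrier spacing."""
    ratio = cfo_hz / spacing_hz
    ifo = round(ratio)
    return ifo, ratio - ifo

def dft(x):
    """Naive DFT, adequate for a tiny demonstration."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

N = 8
tone = [cmath.exp(2j * cmath.pi * 1 * n / N) for n in range(N)]  # subcarrier 1
ifo = 3
shifted = [s * cmath.exp(2j * cmath.pi * ifo * n / N)
           for n, s in enumerate(tone)]
spectrum = dft(shifted)
peak = max(range(N), key=lambda k: abs(spectrum[k]))
# An integer offset of 3 bins moves the peak from bin 1 to bin (1 + 3) % 8 = 4.
```

This is why the two parts are estimated separately: the FFO must be corrected first to restore orthogonality, after which the IFO is a pure index shift.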
The bugs book a practical introduction to Bayesian analysis
Lunn, David; Best, Nicky; Thomas, Andrew; Spiegelhalter, David
2012-01-01
Introduction: Probability and Parameters; Probability; Probability distributions; Calculating properties of probability distributions; Monte Carlo integration; Monte Carlo Simulations Using BUGS; Introduction to BUGS; DoodleBUGS; Using BUGS to simulate from distributions; Transformations of random variables; Complex calculations using Monte Carlo; Multivariate Monte Carlo analysis; Predictions with unknown parameters; Introduction to Bayesian Inference; Bayesian learning; Posterior predictive distributions; Conjugate Bayesian inference; Inference about a discrete parameter; Combinations of conjugate analyses; Bayesian and classica
A Bayesian Justification for Random Sampling in Sample Survey
Directory of Open Access Journals (Sweden)
Glen Meeden
2012-07-01
Full Text Available In the usual Bayesian approach to survey sampling, the sampling design plays a minimal role at best. Although a close relationship between exchangeable prior distributions and simple random sampling has been noted, how to formally integrate simple random sampling into the Bayesian paradigm is not clear. Recently it has been argued that the sampling design can be thought of as part of a Bayesian's prior distribution. We will show here that under this scenario a simple random sample can be given a Bayesian justification in survey sampling.
Aerosol extinction in coastal zone
Piazzola, J.; Kaloshin, G.; Leeuw, G. de; Eijk, A.M.J. van
2004-01-01
The performance of electro-optical systems can be substantially affected by aerosol particles that scatter and absorb electromagnetic radiation. A few years ago, an empirical model was developed describing the aerosol size distributions in the Mediterranean coastal atmosphere near Toulon (France). T
Aerosol therapy in young children
H.M. Janssens (Hettie)
2001-01-01
Inhalation of aerosolized drugs has become an established means for treatment of pulmonary diseases in the last fifty years. The majority of aerosol therapy in childhood concerns inhaled corticosteroids and bronchodilators in the management of asthma. Administration of drugs via the inha
Aerosols indirectly warm the Arctic
Directory of Open Access Journals (Sweden)
T. Mauritsen
2010-07-01
Full Text Available On average, airborne aerosol particles cool the Earth's surface directly by absorbing and scattering sunlight and indirectly by influencing cloud reflectivity, lifetime, thickness or extent. Here we show that over the central Arctic Ocean, where there is frequently a lack of aerosol particles upon which clouds may form, a small increase in aerosol loading may enhance cloudiness, thereby likely causing a climatologically significant warming at the ice-covered Arctic surface. Under these low-concentration conditions, cloud droplets grow to drizzle sizes and fall, even in the absence of collisions and coalescence, thereby diminishing cloud water. Evidence from a case study suggests that interactions between aerosol, clouds and precipitation could be responsible for attaining the observed low aerosol concentrations.
Investigation of activity release during light water reactor core meltdown
International Nuclear Information System (INIS)
A test facility was developed for the determination of activity release and of aerosol characteristics under realistic light water reactor core melting conditions. It is composed of a high-frequency induction furnace, a ThO2 crucible system, and a collection apparatus consisting of membrane and particulate filters. Thirty-gram samples of a representative core material mixture (corium) were melted under air, argon, or steam at 0.8 to 2.2 bar. In air at 2700 °C, for example, the relative release was 0.4 to 0.7% for iron, chromium, and cobalt and 4 to 11% for tin, antimony, and manganese. Higher release values of 20 to 40% at lower temperatures (2150 °C, air) were found for selenium, cadmium, tellurium, and cesium. The size distribution of the aerosol particles was trimodal with maxima at diameters of 0.17, 0.30, and 0.73 μm. The result of a qualitative x-ray microanalysis was that the main elements of the melt were contained in each aerosol particle. Further investigations will include larger melt masses and the additional influence of concrete on the release and aerosol behavior
Assessment of the basis for modeling releases from plutonium oxidation
International Nuclear Information System (INIS)
Ideally, a model of the release of plutonium aerosols from plutonium during oxidation or combustion should begin from a description of the plutonium material and its surroundings and proceed unequivocally to a situation-dependent estimate of the amount of oxide released and its size distribution. Such a model would need to provide a description of the heat- and mass-transfer processes involved and link them directly to the rate of aerosol production. The first step, the description of heat and mass transfer, is more easily achieved from current information than the second, the aerosol release. The sections of this report titled "Physical Fundamentals" and "Available Theoretical Information" describe the approach that would be required for theoretical modeling. The "Experimental Results" section describes the information on aerosol releases, size distributions, peak temperatures, oxidation rates, and experimental conditions that we have gleaned from the existing experimental literature. The data is summarized and the bibliography lists the relevant literature that has and has not been reviewed. 42 refs., 10 figs., 6 tabs
Lidar data assimilation for improved analyses of volcanic aerosol events
Lange, Anne Caroline; Elbern, Hendrik
2014-05-01
Observations of hazardous events with release of aerosols are hardly analyzable with today's data assimilation algorithms without producing an attenuating bias. Skillful forecasts of unexpected aerosol events are essential for human health and to prevent exposure of infirm persons and aircraft, with possibly catastrophic outcomes. Typical cases include mineral dust outbreaks, mostly from large desert regions, wild fires, and sea salt uplifts, while the focus here is on volcanic eruptions. In general, numerical chemistry and aerosol transport models cannot simulate such events without manual adjustments. The concept of data assimilation is able to correct the analysis, as long as it is operationally implemented in the model system. However, the tangent-linear approximation, a substantial precondition for today's cutting-edge data assimilation algorithms, is not valid during unexpected aerosol events. As part of the European COPERNICUS (earth observation) project MACC II and the national ESKP (Earth System Knowledge Platform) initiative, we developed a module that enables the assimilation of aerosol lidar observations, even during unforeseeable incidences of extreme emissions of particulate matter. Thereby, the influence of the background information has to be reduced adequately. Advanced lidar instruments address on the one hand the aspect of radiative transfer within the atmosphere, and on the other hand they can deliver a detailed quantification of the detected aerosols. For the assimilation of maximally exploited lidar data, an appropriate lidar observation operator is constructed, compatible with the EURAD-IM (European Air Pollution and Dispersion - Inverse Model) system. The observation operator is able to map the modeled chemical and physical state on lidar attenuated backscatter, transmission, aerosol optical depth, as well as on the extinction and backscatter coefficients. Further, it has the ability to process the observed discrepancies with lidar
Shah, Abhik; Woolf, Peter
2009-06-01
In this paper, we introduce pebl, a Python library and application for learning Bayesian network structure from data and prior knowledge that provides features unmatched by alternative software packages: the ability to use interventional data, flexible specification of structural priors, modeling with hidden variables and exploitation of parallel processing. PMID:20161541
Bayesian inference tools for inverse problems
Mohammad-Djafari, Ali
2013-08-01
In this paper, first the basics of Bayesian inference with a parametric model of the data are presented. Then, the extensions needed when dealing with inverse problems are given, in particular for linear models such as deconvolution or image reconstruction in Computed Tomography (CT). The main point to discuss then is the prior modeling of signals and images. A classification of these priors is presented: first separable and Markovian models, and then simple or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly. Indeed, we can rarely obtain analytical solutions for point estimators such as the Maximum A Posteriori (MAP) or the Posterior Mean (PM). Three main tools can then be used: Laplace approximation (LAP), Markov Chain Monte Carlo (MCMC) and Bayesian Variational Approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where we know that the input signal is sparse and propose to use a Student-t prior for it. Then, to handle the Bayesian computations with this model, we use the property that the Student-t can be modelled via an infinite mixture of Gaussians, introducing hidden variables which are the variances. Then, the expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples) and the hyperparameters of the problem (for example the variance of the noise) is given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and
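The scale-mixture representation of the Student-t used here is easy to verify by simulation: drawing a Gamma-distributed precision and then a conditional Gaussian reproduces Student-t samples. A quick sketch (the function name is ours):

```python
import random

def student_t_via_mixture(nu, n, seed=0):
    """Draw Student-t(nu) samples via the scale-mixture representation:
    precision lam ~ Gamma(shape=nu/2, rate=nu/2), then x | lam ~ N(0, 1/lam).
    Marginally, x is Student-t with nu degrees of freedom."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        lam = rng.gammavariate(nu / 2.0, 2.0 / nu)   # scale = 1 / rate
        out.append(rng.gauss(0.0, (1.0 / lam) ** 0.5))
    return out

xs = student_t_via_mixture(5.0, 20000)
mean = sum(xs) / len(xs)
var = sum(v * v for v in xs) / len(xs) - mean ** 2
# For nu = 5 the theoretical variance is nu / (nu - 2) = 5/3.
```

The introduced precisions `lam` are exactly the hidden variables the abstract refers to: conditioning on them makes the model Gaussian and the posterior computations tractable.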
Dust layer profiling using an aerosol dropsonde
Ulanowski, Zbigniew; Kaye, Paul Henry; Hirst, Edwin; Wieser, Andreas; Stanley, Warren
2015-04-01
Routine meteorological data is obtained in the atmosphere using disposable radiosondes, giving temperature, pressure, humidity and wind speed. Additional measurements are obtained from dropsondes, released from research aircraft. However, a crucial property not yet measured is the size and concentration of atmospheric particulates, including dust. Instead, indirect measurements are employed, relying on remote sensing, to meet the demands from areas such as climate research, air quality monitoring, civil emergencies etc. In addition, research aircraft can be used in situ, but airborne measurements are expensive, and aircraft use is restricted to near-horizontal profiling, which can be a limitation, as phenomena such as long-range transport depend on the vertical distribution of aerosol. The Centre for Atmospheric and Instrumentation Research at University of Hertfordshire develops light-scattering instruments for the characterization of aerosols and cloud particles. Recently a range of low-cost, miniature particle counters has been created, intended for use with systems such as disposable balloon-borne radiosondes, dropsondes, or in dense ground-based sensor networks. Versions for different particle size ranges exist. They have been used for vertical profiling of aerosols such as mineral dust or volcanic ash. A disadvantage of optical particle counters that sample through a narrow inlet is that they can become blocked, which can happen in cloud, for example. Hence, a different counter version has been developed, which can have open-path geometry, as the sensing zone is defined optically rather than being delimited by the flow system. This counter has been used for ground based air-quality monitoring around Heathrow airport. The counter has also been adapted for use with radiosondes or dropsondes. The dropsonde version has been successfully tested by launching it from research aircraft together with the so-called KITsonde, developed at the Karlsruhe Institute of
Meteorological Data Assimilation by Adaptive Bayesian Optimization.
Purser, Robert James
1992-01-01
The principal aim of this research is the elucidation of the Bayesian statistical principles that underlie the theory of objective meteorological analysis. In particular, emphasis is given to aspects of data assimilation that can benefit from an iterative numerical strategy. Two such aspects that are given special consideration are statistical validation of the covariance profiles and nonlinear initialization. A new economical algorithm is presented, based on the imposition of a sparse matrix structure for all covariances and precisions held during the computations. It is shown that very large datasets may be accommodated using this structure and a good linear approximation to the analysis equations established without the need to unnaturally fragment the problem. Since the integrity of the system of analysis equations is preserved, it is a relatively straightforward matter to extend the basic analysis algorithm to one that incorporates a check on the plausibility of the statistical model assumed for background errors--the so-called "validation" problem. Two methods of validation are described within the sparse matrix framework: the first is essentially a direct extension of the Bayesian principles to embrace, not only the regular analysis variables, but also the parameters that determine the precise form of the covariance functions; the second technique is the non-Bayesian method of generalized cross validation adapted for use within the sparse matrix framework. The latter part of this study is concerned with the establishment of a consistent dynamical balance within a forecast model--the initialization problem. The formal principles of the modern theory of initialization are reviewed and a critical examination is made of the concept of the "slow manifold". It is demonstrated, in accordance with more complete nonlinear models, that even within a simple three-mode linearized system, the notion that a universal slow manifold exists is untenable. It is therefore argued
Personalized Audio Systems - a Bayesian Approach
DEFF Research Database (Denmark)
Nielsen, Jens Brehm; Jensen, Bjørn Sand; Hansen, Toke Jansen;
2013-01-01
Modern audio systems are typically equipped with several user-adjustable parameters unfamiliar to most users listening to the system. To obtain the best possible setting, the user is forced into multi-parameter optimization with respect to the user's own objective and preference. To address this......, the present paper presents a general interactive framework for personalization of such audio systems. The framework builds on Bayesian Gaussian process regression in which a model of the user's objective function is updated sequentially. The parameter setting to be evaluated in a given trial is...
Recovery of shapes: hypermodels and Bayesian learning
International Nuclear Information System (INIS)
We discuss the problem of recovering an image from its blurred and noisy copy with the additional information that the image consists of simple shapes with sharp edges. An iterative algorithm is given, based on the idea of updating the Tikhonov type smoothness penalty on the basis of the previous estimate. This algorithm is discussed in the framework of Bayesian hypermodels and it is shown that the approach can be justified as a sequential iterative scheme for finding the mode of the posterior density. An effective numerical algorithm based on preconditioned Krylov subspace iterations is suggested and demonstrated with a computed example
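The idea of updating the Tikhonov-type penalty on the basis of the previous estimate can be sketched in one dimension. This is our own minimal illustration of the principle, not the paper's algorithm; the parameter values are arbitrary:

```python
import random

def edge_preserving_smooth(y, lam=2.0, eps=1e-3, outer=10, inner=100):
    """Tikhonov-type smoothing whose penalty weights are re-estimated from the
    previous iterate: small gradients get strong smoothing, large gradients
    (edges) get weak smoothing, so sharp edges survive the regularization."""
    n = len(y)
    x = list(y)
    for _ in range(outer):
        # update the smoothness penalty from the previous estimate
        w = [1.0 / (eps + (x[i + 1] - x[i]) ** 2) for i in range(n - 1)]
        for _ in range(inner):   # Gauss-Seidel sweeps on the quadratic subproblem
            for i in range(n):
                wl = w[i - 1] if i > 0 else 0.0
                wr = w[i] if i < n - 1 else 0.0
                xl = x[i - 1] if i > 0 else 0.0
                xr = x[i + 1] if i < n - 1 else 0.0
                x[i] = (y[i] + lam * (wl * xl + wr * xr)) / (1.0 + lam * (wl + wr))
    return x

# Noisy step signal: the edge between samples 19 and 20 should survive.
random.seed(7)
y = [v + random.gauss(0.0, 0.1) for v in [0.0] * 20 + [1.0] * 20]
x = edge_preserving_smooth(y)
```

Each outer pass is a quadratic (Tikhonov) problem, so the whole scheme matches the sequential-iteration view described in the abstract: the hyperparameters (here the weights `w`) and the image estimate are updated in alternation.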
Bayesian model selection in Gaussian regression
Abramovich, Felix
2009-01-01
We consider a Bayesian approach to model selection in Gaussian linear regression, where the number of predictors might be much larger than the number of observations. From a frequentist view, the proposed procedure results in the penalized least squares estimation with a complexity penalty associated with a prior on the model size. We investigate the optimality properties of the resulting estimator. We establish the oracle inequality and specify conditions on the prior that imply its asymptotic minimaxity within a wide range of sparse and dense settings for "nearly-orthogonal" and "multicollinear" designs.
Structure-based bayesian sparse reconstruction
Quadeer, Ahmed Abdul
2012-12-01
Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.
Radioactive Contraband Detection: A Bayesian Approach
Energy Technology Data Exchange (ETDEWEB)
Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Sale, K; Chambers, D; Axelrod, M; Meyer, A
2009-03-16
Radionuclide emissions from nuclear contraband challenge both detection and measurement technologies to capture and record each event. The development of a sequential Bayesian processor incorporating both the physics of gamma-ray emissions and the measurement of photon energies offers a physics-based approach to attack this challenging problem. It is shown that a 'physics-based' structure can be used to develop an effective detection technique, and it also motivates the implementation of this approach using particle filters to enhance and extract the required information.
Bayesian Analysis of Type Ia Supernova Data
Institute of Scientific and Technical Information of China (English)
王晓峰; 周旭; 李宗伟; 陈黎
2003-01-01
Recently, the distances to type Ia supernovae (SNe Ia) at z ~ 0.5 have been measured with the motivation of estimating cosmological parameters. However, different sleuthing techniques tend to give inconsistent measurements of SN Ia distances (~0.3 mag), which significantly affects the determination of cosmological parameters. A Bayesian "hyper-parameter" procedure is used to analyse jointly the current SN Ia data, which considers the relative weights of different datasets. For a flat Universe, the combined analysis yields ΩM = 0.20 ± 0.07.
Confidence Biases and Learning among Intuitive Bayesians
Lévy-Garboua, Louis; Askari, Muniza; Gazel, Marco
2015-01-01
URL des Documents de travail : http://ces.univ-paris1.fr/cesdp/cesdp2015.html Documents de travail du Centre d'Economie de la Sorbonne 2015.80 - ISSN : 1955-611X We design a double-or-quits game to compare the speed of learning one's specific ability with the speed of rising confidence as the task gets increasingly difficult. We find that people on average learn to be overconfident faster than they learn their true ability and we present a simple Bayesian model of confidence which integ...
Bayesian parameter estimation by continuous homodyne detection
DEFF Research Database (Denmark)
Kiilerich, Alexander Holm; Molmer, Klaus
2016-01-01
We simulate the process of continuous homodyne detection of the radiative emission from a quantum system, and we investigate how a Bayesian analysis can be employed to determine unknown parameters that govern the system evolution. Measurement backaction quenches the system dynamics at all times... and we show that the ensuing transient evolution is more sensitive to system parameters than the steady state of the system. The parameter sensitivity can be quantified by the Fisher information, and we investigate numerically and analytically how the temporal noise correlations in the measurement signal...
Bayesian approach to avoiding track seduction
Salmond, David J.; Everett, Nicholas O.
2002-08-01
The problem of maintaining track on a primary target in the presence of spurious objects is addressed. Recursive and batch filtering approaches are developed. For the recursive approach, a Bayesian track-splitting filter is derived which spawns candidate tracks if there is a possibility of measurement misassociation. The filter evaluates the probability of each candidate track being associated with the primary target. The batch filter is a Markov chain Monte Carlo (MCMC) algorithm which fits the observed data sequence to models of target dynamics and measurement-track association. Simulation results are presented.
Low Complexity Bayesian Single Channel Source Separation
DEFF Research Database (Denmark)
Beierholm, Thomas; Pedersen, Brian Dam; Winther, Ole
2004-01-01
We propose a simple Bayesian model for performing single channel speech separation using factorized source priors in a sliding window linearly transformed domain. Using a one dimensional mixture of Gaussians to model each band source leads to fast tractable inference for the source signals...... can be estimated quite precisely using ML-II, but the estimation is quite sensitive to the accuracy of the priors as opposed to the source separation quality for known mixing coefficients, which is quite insensitive to the accuracy of the priors. Finally, we discuss how to improve our approach while...
Bayesian regression of piecewise homogeneous Poisson processes
Directory of Open Access Journals (Sweden)
Diego Sevilla
2015-12-01
Full Text Available In this paper, a Bayesian method for piecewise regression is adapted to handle counting-process data distributed as Poisson. A numerical code in Mathematica is developed and tested by analyzing simulated data. The resulting method is valuable for detecting breaking points in the count rate of time series for Poisson processes. Received: 2 November 2015; Accepted: 27 November 2015; Edited by: R. Dickman; Reviewed by: M. Hutter, Australian National University, Canberra, Australia; DOI: http://dx.doi.org/10.4279/PIP.070018 Cite as: D J R Sevilla, Papers in Physics 7, 070018 (2015).
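The paper's code is in Mathematica; the core idea of Bayesian breakpoint detection in Poisson counts can be sketched in a few lines of Python. This is an illustrative single-breakpoint version using Gamma-Poisson conjugacy, not the authors' method: the simulated rates, break location, and Gamma(1, 1) prior are assumptions.

```python
import numpy as np
from math import lgamma, log

rng = np.random.default_rng(1)
# Simulated counts with a rate change at t = 60 (illustrative data).
counts = np.concatenate([rng.poisson(2.0, 60), rng.poisson(6.0, 40)])

a, b = 1.0, 1.0  # Gamma(a, b) prior on each segment's Poisson rate

def log_marginal(c):
    # log p(c) with the segment rate integrated out (Gamma-Poisson conjugacy),
    # dropping the c!-terms, which are constant across breakpoints.
    s, n = int(c.sum()), len(c)
    return lgamma(a + s) - lgamma(a) + a * log(b) - (a + s) * log(b + n)

T = len(counts)
logpost = np.array([log_marginal(counts[:t]) + log_marginal(counts[t:])
                    for t in range(1, T)])
post = np.exp(logpost - logpost.max())
post /= post.sum()
tau_hat = 1 + int(np.argmax(post))   # MAP breakpoint under a uniform prior on tau
print(tau_hat)
```

Because the rate is integrated out analytically, the posterior over the breakpoint is exact and needs only one pass over candidate split points.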
A Bayesian approach to earthquake source studies
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. The full posterior distribution can be used not only to calculate source parameters but also
Bayesian logistic betting strategy against probability forecasting
Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei
2012-01-01
We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy for assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency of avoiding clear-cut forecasts.
Email Spam Filter using Bayesian Neural Networks
Directory of Open Access Journals (Sweden)
Nibedita Chakraborty
2012-03-01
Full Text Available Nowadays, e-mail is becoming one of the fastest and most economical forms of communication, but it is prone to misuse. One such misuse is the posting of unsolicited, unwanted e-mails known as spam or junk e-mails. This paper presents and discusses an implementation of a spam filtering system. The idea is to use a neural network trained to recognize different forms of often-used words in spam mails. The Bayesian ANN is trained with finite sample sizes to approximate the ideal observer. This strategy can provide improved spam filtering compared with existing static spam filters.
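The paper's classifier is a Bayesian neural network; the word-frequency idea it builds on can be sketched with a plain naive Bayes filter as a baseline. The tiny corpus and equal class priors below are made up for illustration, not taken from the paper.

```python
from collections import Counter
import math

spam = ["win money now", "free money offer", "claim free prize now"]
ham  = ["meeting agenda attached", "lunch tomorrow", "project status update"]

def train(docs):
    c = Counter()
    for d in docs:
        c.update(d.split())
    return c

spam_counts, ham_counts = train(spam), train(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_prob(msg, counts, prior):
    total = sum(counts.values())
    lp = math.log(prior)
    for w in msg.split():
        lp += math.log((counts[w] + 1) / (total + len(vocab)))  # Laplace smoothing
    return lp

def classify(msg):
    return "spam" if log_prob(msg, spam_counts, 0.5) > log_prob(msg, ham_counts, 0.5) else "ham"

print(classify("free money"))     # -> spam
print(classify("project lunch"))  # -> ham
```

A neural-network filter replaces the per-word independence assumption with a learned decision boundary, but both rank messages by an estimate of P(spam | words).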
Reasons for (prior) belief in bayesian epistemology
Dietrich, Franz; List, Christian
2012-01-01
Bayesian epistemology tells us with great precision how we should move from prior to posterior beliefs in light of new evidence or information, but says little about where our prior beliefs come from. It offers few resources to describe some prior beliefs as rational or well-justified, and others as irrational or unreasonable. A different strand of epistemology takes the central epistemological question to be not how to change one's beliefs in light of new evidence, but what reasons justify a gi...
Bayesian Estimation of a Mixture Model
Ilhem Merah; Assia Chadli
2015-01-01
We present the properties of a bathtub-curve reliability model introduced by Idée and Pierrat (2010), which has both sufficient adaptability and a minimal number of parameters. It is a mixture of a Gamma distribution G(2, (1/θ)) and a new distribution L(θ). We are interested in Bayesian estimation of the parameters and survival function of this model under a squared-error loss function and a non-informative prior, using the approximations of Lindley (1980) and Tierney and Kadane (1986). Usin...
Case studies in Bayesian microbial risk assessments
Directory of Open Access Journals (Sweden)
Turner Joanne
2009-12-01
Full Text Available Abstract Background The quantification of uncertainty and variability is a key component of quantitative risk analysis. Recent advances in Bayesian statistics make it ideal for integrating multiple sources of information, of different types and quality, and providing a realistic estimate of the combined uncertainty in the final risk estimates. Methods We present two case studies related to foodborne microbial risks. In the first, we combine models to describe the sequence of events resulting in illness from consumption of milk contaminated with VTEC O157. We used Monte Carlo simulation to propagate uncertainty in some of the inputs to computer models describing the farm and pasteurisation process. Resulting simulated contamination levels were then assigned to consumption events from a dietary survey. Finally we accounted for uncertainty in the dose-response relationship and uncertainty due to limited incidence data to derive uncertainty about yearly incidences of illness in young children. Options for altering the risk were considered by running the model with different hypothetical policy-driven exposure scenarios. In the second case study we illustrate an efficient Bayesian sensitivity analysis for identifying the most important parameters of a complex computer code that simulated VTEC O157 prevalence within a managed dairy herd. This was carried out in two stages, first to screen out the unimportant inputs, then to perform a more detailed analysis on the remaining inputs. The method works by building a Bayesian statistical approximation to the computer code using a number of known code input/output pairs (training runs). Results We estimated that the expected total number of children aged 1.5-4.5 who become ill due to VTEC O157 in milk is 8.6 per year, with 95% uncertainty interval (0, 11.5). The most extreme policy we considered was banning on-farm pasteurisation of milk, which reduced the estimate to 6.4 with 95% interval (0, 11). In the second
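The Monte Carlo propagation step described above can be sketched generically. The input distributions, the exponential dose-response form, and all numbers below are hypothetical stand-ins, not the study's values; the point is only how input uncertainty flows through to an uncertainty interval on risk.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Hypothetical input distributions illustrating how uncertainty in
# contamination, consumption and dose-response propagates to risk.
conc   = rng.lognormal(mean=-2.0, sigma=1.0, size=n)   # organisms per ml
volume = rng.uniform(100.0, 300.0, size=n)             # ml consumed
r      = rng.uniform(1e-4, 5e-4, size=n)               # dose-response slope
p_ill  = 1.0 - np.exp(-r * conc * volume)              # exponential dose-response

mean_risk = p_ill.mean()
ci = np.percentile(p_ill, [2.5, 97.5])                 # 95% uncertainty interval
print(mean_risk, ci)
```

Policy scenarios correspond to rerunning the same simulation with one of the input distributions altered, exactly as in the exposure scenarios the authors consider.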
Multisnapshot Sparse Bayesian Learning for DOA
Gerstoft, Peter; Mecklenbrauker, Christoph F.; Xenaki, Angeliki; Nannuru, Santosh
2016-10-01
The directions of arrival (DOA) of plane waves are estimated from multi-snapshot sensor array data using Sparse Bayesian Learning (SBL). The prior on the source amplitudes is assumed independent zero-mean complex Gaussian, with the unknown variances (i.e., the source powers) as hyperparameters. For a complex Gaussian likelihood whose hyperparameter is the unknown noise variance, the corresponding Gaussian posterior distribution is derived. For a given number of DOAs, the hyperparameters are automatically selected by maximizing the evidence, which promotes sparse DOA estimates. The SBL scheme for DOA estimation is discussed and evaluated competitively against LASSO ($\ell_1$-regularization), conventional beamforming, and MUSIC.
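The evidence-maximisation loop at the heart of SBL can be sketched in a minimal real-valued, single-snapshot form. This is not the paper's algorithm: a random Gaussian dictionary stands in for the array steering matrix, the noise variance is assumed known (the paper treats it as a hyperparameter), and the sizes, seed, and amplitudes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
N, M, K = 20, 40, 3                            # toy sizes, not the paper's setup
A = rng.standard_normal((N, M)) / np.sqrt(N)   # stand-in for the steering dictionary
x = np.zeros(M)
x[[5, 17, 30]] = [3.0, -2.5, 2.0]              # sparse "source" amplitudes
sigma2 = 0.01                                  # noise variance, assumed known here
y = A @ x + np.sqrt(sigma2) * rng.standard_normal(N)

gamma = np.ones(M)                             # hyperparameters: prior variances
for _ in range(100):                           # EM evidence-maximisation updates
    Sigma = np.linalg.inv(A.T @ A / sigma2 + np.diag(1.0 / gamma))
    mu = Sigma @ A.T @ y / sigma2              # posterior mean of the amplitudes
    gamma = mu**2 + np.diag(Sigma)             # EM update of the prior variances

support = np.argsort(gamma)[-K:]               # largest variances = active columns
print(sorted(support.tolist()))
```

Columns whose hyperparameter γ_i is driven toward zero are pruned from the model, which is how the evidence maximisation produces sparse DOA estimates.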
Bayesian parameter estimation by continuous homodyne detection
Kiilerich, Alexander Holm; Mølmer, Klaus
2016-09-01
We simulate the process of continuous homodyne detection of the radiative emission from a quantum system, and we investigate how a Bayesian analysis can be employed to determine unknown parameters that govern the system evolution. Measurement backaction quenches the system dynamics at all times and we show that the ensuing transient evolution is more sensitive to system parameters than the steady state of the system. The parameter sensitivity can be quantified by the Fisher information, and we investigate numerically and analytically how the temporal noise correlations in the measurement signal contribute to the ultimate sensitivity limit of homodyne detection.
Bayesian global analysis of neutrino oscillation data
Bergstrom, Johannes; Maltoni, Michele; Schwetz, Thomas
2015-01-01
We perform a Bayesian analysis of current neutrino oscillation data. When estimating the oscillation parameters we find that the results generally agree with those of the $\\chi^2$ method, with some differences involving $s_{23}^2$ and CP-violating effects. We discuss the additional subtleties caused by the circular nature of the CP-violating phase, and how it is possible to obtain correlation coefficients with $s_{23}^2$. When performing model comparison, we find that there is no significant evidence for any mass ordering, any octant of $s_{23}^2$ or a deviation from maximal mixing, nor the presence of CP-violation.
A Bayesian Framework for Combining Valuation Estimates
Yee, Kenton K
2007-01-01
Obtaining more accurate equity value estimates is the starting point for stock selection, value-based indexing in a noisy market, and beating benchmark indices through tactical style rotation. Unfortunately, discounted cash flow, method of comparables, and fundamental analysis typically yield discrepant valuation estimates. Moreover, the valuation estimates typically disagree with market price. Can one form a superior valuation estimate by averaging over the individual estimates, including market price? This article suggests a Bayesian framework for combining two or more estimates into a superior valuation estimate. The framework justifies the common practice of averaging over several estimates to arrive at a final point estimate.
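Under the standard Gaussian-error assumption, the Bayesian combination of independent, unbiased estimates reduces to a precision-weighted average. The three valuation estimates and standard errors below are made-up numbers, not from the article; the formula is the familiar inverse-variance weighting, a special case of the framework the author describes.

```python
# Combining independent, unbiased valuation estimates v_i with standard errors s_i:
# under Gaussian errors, the posterior mean is the precision-weighted average.
estimates = [("DCF", 52.0, 8.0), ("comparables", 47.0, 5.0), ("market price", 50.0, 2.0)]

weights = [1.0 / s**2 for _, _, s in estimates]
combined = sum(w * v for (_, v, _), w in zip(estimates, weights)) / sum(weights)
combined_se = (1.0 / sum(weights)) ** 0.5
print(round(combined, 2), round(combined_se, 2))  # prints 49.71 1.81
```

Note how the most precise input (here the market price) dominates the result, and the combined standard error is smaller than any single input's, which is the sense in which the combined estimate is "superior".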
TNT Equivalency of Unconfined Aerosols of Propylene Oxide
Directory of Open Access Journals (Sweden)
A. Apparao
2014-09-01
Full Text Available The unconfined aerosols of propylene oxide (PO) are formed by dispersing the fuel in air. These aerosols undergo detonation by suitable initiation and produce a high-impulse blast. Trinitrotoluene (TNT) equivalence is an important parameter used to represent the power of explosive materials and compare their relative damage effects with respect to TNT. The parameters commonly used for estimation of TNT equivalency are the total energy of the explosive source and the properties of the resulting blast wave, viz., blast peak overpressure and positive impulse. In the present study, the unconfined aerosols of 4.2 kg PO were formed by breaking open a cylindrical canister with the help of an axially positioned central burster charge and then detonated using a secondary explosive charge after a preset time delay. The resulting blast profiles were recorded and the blast parameters were analysed. Being a non-ideal explosive source, the TNT equivalency depends on the fraction of total energy utilised for blast formation, the rate of energy release, cloud dimensions, and concentration of fuel. Hence, various approaches based on energy release, experimental blast profiles, triangulated blast parameters, and ground-reflected blast parameters were considered to determine the TNT equivalency of unconfined PO aerosols. It was observed that the TNT equivalency is not a single value but varies with distance. The paper provides various options for the weapon designer to choose a suitable approach for considering TNT equivalency. The scaling laws established from the experimental data of unconfined aerosols of PO for blast peak overpressure and scaled impulse help in predicting the performance for different values of fuel weight and distance. Defence Science Journal, Vol. 64, No. 5, September 2014, pp. 431-437, DOI: http://dx.doi.org/10.14429/dsj.64.6851
The GRAPE aerosol retrieval algorithm
Directory of Open Access Journals (Sweden)
G. E. Thomas
2009-11-01
Full Text Available The aerosol component of the Oxford-Rutherford Aerosol and Cloud (ORAC) combined cloud and aerosol retrieval scheme is described and the theoretical performance of the algorithm is analysed. ORAC is an optimal estimation retrieval scheme for deriving cloud and aerosol properties from measurements made by imaging satellite radiometers and, when applied to cloud-free radiances, provides estimates of aerosol optical depth at a wavelength of 550 nm, aerosol effective radius and surface reflectance at 550 nm. The aerosol retrieval component of ORAC has several incarnations – this paper addresses the version which operates in conjunction with the cloud retrieval component of ORAC (described by Watts et al., 1998), as applied in producing the Global Retrieval of ATSR Cloud Parameters and Evaluation (GRAPE) data-set.
The algorithm is described in detail and its performance examined. This includes a discussion of errors resulting from the formulation of the forward model, sensitivity of the retrieval to the measurements and a priori constraints, and errors resulting from assumptions made about the atmospheric/surface state.
Bayesian item selection in constrained adaptive testing using shadow tests
Veldkamp, Bernard P.
2010-01-01
Application of Bayesian item selection criteria in computerized adaptive testing might result in improvement of bias and MSE of the ability estimates. The question remains how to apply Bayesian item selection criteria in the context of constrained adaptive testing, where large numbers of specifications have to be taken into account.
Some Quantum Information Inequalities from a Quantum Bayesian Networks Perspective
Tucci, Robert R.
2012-01-01
This is primarily a pedagogical paper. The paper re-visits some well-known quantum information theory inequalities. It does this from a quantum Bayesian networks perspective. The paper illustrates some of the benefits of using quantum Bayesian networks to discuss quantum SIT (Shannon Information Theory).
Bayesian Compressed Sensing with Unknown Measurement Noise Level
DEFF Research Database (Denmark)
Hansen, Thomas Lundgaard; Jørgensen, Peter Bjørn; Pedersen, Niels Lovmand;
2013-01-01
In sparse Bayesian learning (SBL) approximate Bayesian inference is applied to find sparse estimates from observations corrupted by additive noise. Current literature only vaguely considers the case where the noise level is unknown a priori. We show that for most state-of-the-art reconstruction a...
Universal Darwinism As a Process of Bayesian Inference.
Campbell, John O
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature. PMID:27375438
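The formal equivalence the paper invokes is easy to make concrete: a discrete replicator step in evolutionary dynamics is the same map as a Bayesian update, with fitness playing the role of likelihood. The population shares and fitness values below are illustrative.

```python
# Discrete replicator step and Bayes update are the same map:
# p'_i = p_i * f_i / sum_j p_j * f_j,
# where fitness f_i plays the role of the likelihood P(E | H_i).
def update(p, f):
    z = sum(pi * fi for pi, fi in zip(p, f))   # mean fitness / model evidence
    return [pi * fi / z for pi, fi in zip(p, f)]

prior   = [0.5, 0.3, 0.2]   # population shares / prior beliefs
fitness = [1.0, 2.0, 4.0]   # relative fitness / likelihoods

posterior = update(prior, fitness)
print([round(x, 3) for x in posterior])  # -> [0.263, 0.316, 0.421]
```

Iterating the map concentrates the distribution on the fittest type, just as repeated Bayesian updating concentrates belief on the best-supported hypothesis.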
A SEMIPARAMETRIC BAYESIAN MODEL FOR CIRCULAR-LINEAR REGRESSION
We present a Bayesian approach to regress a circular variable on a linear predictor. The regression coefficients are assumed to have a nonparametric distribution with a Dirichlet process prior. The semiparametric Bayesian approach gives added flexibility to the model and is usefu...
Non-homogeneous dynamic Bayesian networks for continuous data
Grzegorczyk, Marco; Husmeier, Dirk
2011-01-01
Classical dynamic Bayesian networks (DBNs) are based on the homogeneous Markov assumption and cannot deal with non-homogeneous temporal processes. Various approaches to relax the homogeneity assumption have recently been proposed. The present paper presents a combination of a Bayesian network with c
What Is the Probability You Are a Bayesian?
Wulff, Shaun S.; Robinson, Timothy J.
2014-01-01
Bayesian methodology continues to be widely used in statistical applications. As a result, it is increasingly important to introduce students to Bayesian thinking at early stages in their mathematics and statistics education. While many students in upper level probability courses can recite the differences in the Frequentist and Bayesian…
Using Alien Coins to Test Whether Simple Inference Is Bayesian
Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.
2016-01-01
Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
Bayesian Data-Model Fit Assessment for Structural Equation Modeling
Levy, Roy
2011-01-01
Bayesian approaches to modeling are receiving an increasing amount of attention in the areas of model construction and estimation in factor analysis, structural equation modeling (SEM), and related latent variable models. However, model diagnostics and model criticism remain relatively understudied aspects of Bayesian SEM. This article describes…
Bayesian Item Selection in Constrained Adaptive Testing Using Shadow Tests
Veldkamp, Bernard P.
2010-01-01
Application of Bayesian item selection criteria in computerized adaptive testing might result in improvement of bias and MSE of the ability estimates. The question remains how to apply Bayesian item selection criteria in the context of constrained adaptive testing, where large numbers of specifications have to be taken into account in the item…
Bayesian Learning and the Psychology of Rule Induction
Endress, Ansgar D.
2013-01-01
In recent years, Bayesian learning models have been applied to an increasing variety of domains. While such models have been criticized on theoretical grounds, the underlying assumptions and predictions are rarely made concrete and tested experimentally. Here, I use Frank and Tenenbaum's (2011) Bayesian model of rule-learning as a case study to…
Universal Darwinism as a process of Bayesian inference
Directory of Open Access Journals (Sweden)
John Oberon Campbell
2016-06-01
Full Text Available Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an 'experiment' in the external world environment, and the results of that 'experiment' or the 'surprise' entailed by predicted and actual outcomes of the 'experiment'. Minimization of free energy implies that the implicit measure of 'surprise' experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.
Statistical assignment of DNA sequences using Bayesian phylogenetics
DEFF Research Database (Denmark)
Terkelsen, Kasper Munch; Boomsma, Wouter Krogh; Huelsenbeck, John P;
2008-01-01
We provide a new automated statistical method for DNA barcoding based on a Bayesian phylogenetic analysis. The method is based on automated database sequence retrieval, alignment, and phylogenetic analysis using a custom-built program for Bayesian phylogenetic analysis. We show on real data that ...
Survey of Bayesian Models for Modelling of Stochastic Temporal Processes
Energy Technology Data Exchange (ETDEWEB)
Ng, B
2006-10-12
This survey gives an overview of popular generative models used in the modeling of stochastic temporal systems. In particular, this survey is organized into two parts. The first part discusses the discrete-time representations of dynamic Bayesian networks and dynamic relational probabilistic models, while the second part discusses the continuous-time representation of continuous-time Bayesian networks.
Shipborne aerosol IR decoy modulated by laser
Institute of Scientific and Technical Information of China (English)
叶晓英; 吴刚; 邓盼; 范宁
2004-01-01
Working principles, features, current situation and future development of aerosol IR decoys are summarized in this paper, with emphasis on a new type of aerosol IR decoy modulated by laser. The simulation results show that, compared with traditional IR decoys, this new aerosol IR decoy effectively enhances the capability of protecting targets and countering IR-guided weapons. It is a new direction for aerosol IR decoys.
Instrumentation for tropospheric aerosol characterization
Energy Technology Data Exchange (ETDEWEB)
Shi, Z.; Young, S.E.; Becker, C.H.; Coggiola, M.J. [SRI International, Menlo Park, CA (United States); Wollnik, H. [Giessen Univ. (Germany)
1997-12-31
A new instrument has been developed that determines the abundance, size distribution, and chemical composition of tropospheric and lower stratospheric aerosols with diameters down to 0.2 μm. In addition to aerosol characterization, the instrument also monitors the chemical composition of the ambient gas. More than 25,000 aerosol particle mass spectra were recorded during the NASA-sponsored Subsonic Aircraft: Contrail and Cloud Effects Special Study (SUCCESS) field program using NASA's DC-8 research aircraft. (author) 7 refs.
Energy Technology Data Exchange (ETDEWEB)
Venzie, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)
2015-10-13
The eDPS Aerosol Collection project studies the fundamental physics of electrostatic aerosol collection for national security applications. The interpretation of aerosol data requires understanding and correcting for biases introduced from particle genesis through collection and analysis. The research and development undertaken in this project provides the basis both for the statistical correction of existing equipment and techniques and for the development of new collectors and analytical techniques designed to minimize unwanted biases while improving the efficiency of locating and measuring individual particles of interest.
A fixed frequency aerosol albedometer.
Thompson, Jonathan E; Barta, Nick; Policarpio, Danielle; Duvall, Richard
2008-02-01
A new method for the measurement of aerosol single-scatter albedo (ω) at 532 nm was developed. The method employs cavity ring-down spectroscopy (CRDS) for measurement of the aerosol extinction coefficient (b_ext) and an integrating sphere nephelometer for determination of the aerosol scattering coefficient (b_scat). A unique feature of this method is that the extinction and scattering measurements are conducted simultaneously, on the exact same sample volume. Limits of detection (3σ) for the extinction and scattering channels were 0.61 Mm⁻¹ and 2.7 Mm⁻¹, respectively. PMID:18542299
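With extinction and scattering measured on the same sample volume, the single-scatter albedo follows by simple division. The coefficient values below are illustrative, not measurements from the paper.

```python
# Single-scatter albedo from simultaneously measured scattering and extinction,
# with illustrative values in inverse megametres (Mm^-1), not measured data.
b_ext, b_scat = 55.0, 41.0
omega = b_scat / b_ext          # omega = b_scat / b_ext
b_abs = b_ext - b_scat          # absorption coefficient follows by difference
print(round(omega, 3), round(b_abs, 1))  # prints 0.745 14.0
```

Measuring both channels on the identical volume avoids the sampling-mismatch error that arises when b_ext and b_scat come from separate instruments.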
Aerosol growth in Titan's ionosphere.
Lavvas, Panayotis; Yelle, Roger V; Koskinen, Tommi; Bazin, Axel; Vuitton, Véronique; Vigren, Erik; Galand, Marina; Wellbrock, Anne; Coates, Andrew J; Wahlund, Jan-Erik; Crary, Frank J; Snowden, Darci
2013-02-19
Photochemically produced aerosols are common among the atmospheres of our solar system and beyond. Observations and models have shown that photochemical aerosols have direct consequences on atmospheric properties as well as important astrobiological ramifications, but the mechanisms involved in their formation remain unclear. Here we show that the formation of aerosols in Titan's upper atmosphere is directly related to ion processes, and we provide a complete interpretation of observed mass spectra by the Cassini instruments from small to large masses. Because all planetary atmospheres possess ionospheres, we anticipate that the mechanisms identified here will be efficient in other environments as well, modulated by the chemical complexity of each atmosphere. PMID:23382231
Directory of Open Access Journals (Sweden)
A. Teller
2012-03-01
Full Text Available This study focuses on the effects of aerosol particles on the formation of convective clouds and precipitation over the Eastern Mediterranean Sea, with special emphasis on the role of mineral dust particles in these processes. We used a new detailed numerical cloud microphysics scheme implemented in the Weather Research and Forecasting (WRF) model in order to study aerosol-cloud interaction in a 3-D configuration based on realistic meteorological data. Using a number of case studies, we tested the contribution of mineral dust particles and different ice nucleation parameterizations to precipitation development. In this study we also investigated the importance of recycled (regenerated) aerosols that had been released to the atmosphere following the evaporation of cloud droplets.
The results showed that increased aerosol concentration due to the presence of mineral dust enhanced the formation of ice crystals. The dynamic evolution of the cloud system sets the time periods and regions in which heavy or light precipitation occurred in the domain. The precipitation rate and the time and duration of precipitation were affected by the aerosol properties only at small spatial scales (areas of about 20 km²). Changing the ice nucleation scheme from an ice-supersaturation-dependent parameterization to a recent approach dependent on aerosol concentration and temperature modified the ice crystal concentrations but did not affect the total precipitation in the domain. Aerosol regeneration modified the concentration of cloud droplets at cloud base by dynamic recirculation of the aerosols, but also had only a minor effect on precipitation.
The major conclusion from this study is that the effect of mineral dust particles on clouds and total precipitation is limited by the properties of the atmospheric dynamics and the only effect of aerosol on precipitation may come from significant increase in the concentration
Bayesian Approach to Neuro-Rough Models for Modelling HIV
Marwala, Tshilidzi
2007-01-01
This paper proposes a new neuro-rough model for modelling the risk of HIV from demographic data. The model is formulated using a Bayesian framework and trained using the Markov chain Monte Carlo method with the Metropolis criterion. When tested on estimating the risk of HIV infection from demographic data, the model gave an accuracy of 62%, compared with 58% from a Bayesian-formulated rough set model trained using Markov chain Monte Carlo and 62% from a Bayesian-formulated multi-layered perceptron (MLP) model trained using hybrid Monte Carlo. The proposed model thus combines the accuracy of the Bayesian MLP model with the transparency of the Bayesian rough set model.
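The Metropolis-criterion MCMC training mentioned above can be sketched with a minimal random-walk Metropolis sampler. The quadratic target below is purely illustrative (a standard normal log-density), not the paper's neuro-rough posterior:

```python
import math
import random

def metropolis(log_post, x0, n_steps, step=0.5, seed=0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    probability min(1, post(x') / post(x)), i.e. the Metropolis criterion."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(n_steps):
        xp = x + rng.gauss(0.0, step)
        lpp = log_post(xp)
        if math.log(rng.random()) < lpp - lp:  # accept the proposal
            x, lp = xp, lpp
        chain.append(x)
    return chain

# Illustrative target: standard normal log-density (up to a constant).
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
samples = chain[5000:]                      # discard burn-in
posterior_mean = sum(samples) / len(samples)
```

The same loop applies to any model whose unnormalised log-posterior can be evaluated pointwise, which is all the Metropolis criterion requires.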
3-Layered Bayesian Model Using in Text Classification
Directory of Open Access Journals (Sweden)
Chang Jiayu
2013-01-01
Full Text Available Naive Bayes is one of the more effective classification methods among text classification models. In practice, however, the computed results can deviate substantially from the true values because of attribute dependence and similar effects. Starting from the degree of correlation, this study defines node degrees and the relations between nodes, and proposes a 3-layered Bayesian model. Theoretical support for the 3-layered Bayesian model is obtained from the recurrence formula for conditional probability. Both the theoretical analysis and the empirical comparison with Naive Bayes show that the model achieves better attribute selection and classification. It can also be extended to a multi-layer Bayesian model for text classification.
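For reference, the Naive Bayes baseline that the layered model is compared against can be sketched as a Laplace-smoothed multinomial classifier; the corpus below is a hypothetical toy example, not data from the paper:

```python
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label). Returns class counts, per-class
    token counts, and the vocabulary."""
    class_counts = Counter()
    token_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        token_counts[label].update(tokens)
        vocab.update(tokens)
    return class_counts, token_counts, vocab

def classify(tokens, class_counts, token_counts, vocab):
    """argmax over classes of log P(c) + sum log P(t|c), Laplace-smoothed,
    assuming conditional independence of tokens given the class."""
    total = sum(class_counts.values())
    best, best_lp = None, -math.inf
    for c, n_c in class_counts.items():
        lp = math.log(n_c / total)
        denom = sum(token_counts[c].values()) + len(vocab)
        for t in tokens:
            lp += math.log((token_counts[c][t] + 1) / denom)
        if lp > best_lp:
            best, best_lp = c, lp
    return best

# Hypothetical toy corpus.
docs = [("cheap pills now".split(), "spam"),
        ("meeting agenda attached".split(), "ham"),
        ("cheap offer now".split(), "spam"),
        ("project meeting notes".split(), "ham")]
model = train_nb(docs)
label = classify("cheap pills".split(), *model)
```

The layered model's contribution is precisely to relax the independence assumption baked into `classify` by modelling the relations between attribute nodes.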
Application of Bayesian Network Learning Methods to Land Resource Evaluation
Institute of Scientific and Technical Information of China (English)
HUANG Jiejun; HE Xiaorong; WAN Youchuan
2006-01-01
Bayesian networks have a powerful ability for reasoning and semantic representation, combining qualitative and quantitative analysis and prior knowledge with observed data, and provide an effective way to deal with prediction, classification and clustering. This paper first presents an overview of Bayesian networks and their characteristics, discusses how to learn a Bayesian network structure from given data, and then constructs a Bayesian network model for land resource evaluation from expert knowledge and the dataset. On the test dataset, evaluation accuracy is 87.5% and the Kappa index is 0.826. These results show that the method is feasible and efficient, and indicate that Bayesian networks are a promising approach for land resource evaluation.
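The Kappa index reported above is Cohen's kappa, which corrects raw accuracy for chance agreement; a minimal computation from a confusion matrix (the matrix values here are hypothetical, not the paper's):

```python
def cohens_kappa(confusion):
    """confusion[i][j] = count of items with true class i predicted as j.
    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement and
    p_e is the agreement expected by chance from the marginals."""
    k = len(confusion)
    n = sum(sum(row) for row in confusion)
    p_o = sum(confusion[i][i] for i in range(k)) / n
    row_tot = [sum(confusion[i]) for i in range(k)]
    col_tot = [sum(confusion[i][j] for i in range(k)) for j in range(k)]
    p_e = sum(row_tot[i] * col_tot[i] for i in range(k)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 2-class confusion matrix: 90% accuracy, balanced classes.
kappa = cohens_kappa([[45, 5], [5, 45]])
```

With balanced classes, 90% accuracy yields kappa = 0.8, illustrating why kappa (0.826 in the paper) is reported alongside accuracy (87.5%).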
Bayesian signal processing classical, modern, and particle filtering methods
Candy, James V
2016-01-01
This book aims to give readers a unified Bayesian treatment starting from the basics (Bayes' rule) to the more advanced (Monte Carlo sampling), evolving to the next-generation model-based techniques (sequential Monte Carlo sampling). This edition incorporates a new chapter on "Sequential Bayesian Detection" and a new section on "Ensemble Kalman Filters," as well as an expansion of case studies that detail Bayesian solutions for a variety of applications. These studies illustrate Bayesian approaches to real-world problems incorporating detailed particle filter designs, adaptive particle filters and sequential Bayesian detectors. In addition to these major developments, a variety of sections are expanded to fill in the gaps of the first edition. Here metrics for particle filter (PF) designs, with emphasis on classical "sanity testing," lead to ensemble techniques as a basic requirement for performance analysis. The expansion of information theory metrics and their application to PF designs is fully developed an...
Approximation methods for efficient learning of Bayesian networks
Riggelsen, C
2008-01-01
This publication offers and investigates efficient Monte Carlo simulation methods in order to realize a Bayesian approach to approximate learning of Bayesian networks from both complete and incomplete data. For large amounts of incomplete data when Monte Carlo methods are inefficient, approximations are implemented, such that learning remains feasible, albeit non-Bayesian. The topics discussed are: basic concepts about probabilities, graph theory and conditional independence; Bayesian network learning from data; Monte Carlo simulation techniques; and, the concept of incomplete data. In order to provide a coherent treatment of matters, thereby helping the reader to gain a thorough understanding of the whole concept of learning Bayesian networks from (in)complete data, this publication combines in a clarifying way all the issues presented in the papers with previously unpublished work.
Two-player conflicting interest Bayesian games and Bell nonlocality
Situ, Haozhen
2016-01-01
Nonlocality, one of the most remarkable aspects of quantum mechanics, is closely related to Bayesian game theory. Quantum mechanics can offer advantages in some Bayesian games if the payoff functions are related to Bell inequalities in some way; most of the Bayesian games discussed so far are common interest games. Recently, the first conflicting interest Bayesian game was proposed in Phys. Rev. Lett. 114, 020401 (2015). In the present paper, we present three new conflicting interest Bayesian games where quantum mechanics offers advantages. The first game is linked with Cereceda inequalities, the second with a generalized Bell inequality with three possible measurement outcomes, and the third with a generalized Bell inequality with three possible measurement settings.
Computational statistics using the Bayesian Inference Engine
Weinberg, Martin D
2012-01-01
This paper introduces the Bayesian Inference Engine (BIE), a general parallel-optimised software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organise and reuse expensive derived data. I describe key concepts that illustrate the power of Bayesian inference to address these needs and outline the computational challenge. The techniques presented are based on experience gained in modelling star-counts and stellar populations, analysing the morphology of galaxy images, and performing Bayesian investigations of semi-analytic models of galaxy formation. These inference problems require advanced Markov chain Monte Carlo (MCMC) algorithms that expedite sampling, mixing, and the analysis of the Bayesian posterior distribution. The BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. By providing a variety of statistical algorithms for all phases of the inference problem, a u...
Bayesian Models of Graphs, Arrays and Other Exchangeable Random Structures.
Orbanz, Peter; Roy, Daniel M
2015-02-01
The natural habitat of most Bayesian methods is data represented by exchangeable sequences of observations, for which de Finetti's theorem provides the theoretical foundation. Dirichlet process clustering, Gaussian process regression, and many other parametric and nonparametric Bayesian models fall within the remit of this framework; many problems arising in modern data analysis do not. This article provides an introduction to Bayesian models of graphs, matrices, and other data that can be modeled by random structures. We describe results in probability theory that generalize de Finetti's theorem to such data and discuss their relevance to nonparametric Bayesian modeling. With the basic ideas in place, we survey example models available in the literature; applications of such models include collaborative filtering, link prediction, and graph and network analysis. We also highlight connections to recent developments in graph theory and probability, and sketch the more general mathematical foundation of Bayesian methods for other types of data beyond sequences and arrays. PMID:26353253
Semisupervised learning using Bayesian interpretation: application to LS-SVM.
Adankon, Mathias M; Cheriet, Mohamed; Biem, Alain
2011-04-01
Bayesian reasoning provides an ideal basis for representing and manipulating uncertain knowledge, with the result that many interesting algorithms in machine learning are based on Bayesian inference. In this paper, we use the Bayesian approach with one and two levels of inference to model the semisupervised learning problem and give its application to the successful kernel classifier support vector machine (SVM) and its variant least-squares SVM (LS-SVM). Taking advantage of Bayesian interpretation of LS-SVM, we develop a semisupervised learning algorithm for Bayesian LS-SVM using our approach based on two levels of inference. Experimental results on both artificial and real pattern recognition problems show the utility of our method.
Bayesian nonparametric regression with varying residual density.
Pati, Debdeep; Dunson, David B
2014-02-01
We consider the problem of robust Bayesian inference on the mean regression function allowing the residual density to change flexibly with predictors. The proposed class of models is based on a Gaussian process prior for the mean regression function and mixtures of Gaussians for the collection of residual densities indexed by predictors. Initially considering the homoscedastic case, we propose priors for the residual density based on probit stick-breaking (PSB) scale mixtures and symmetrized PSB (sPSB) location-scale mixtures. Both priors restrict the residual density to be symmetric about zero, with the sPSB prior more flexible in allowing multimodal densities. We provide sufficient conditions to ensure strong posterior consistency in estimating the regression function under the sPSB prior, generalizing existing theory focused on parametric residual distributions. The PSB and sPSB priors are generalized to allow residual densities to change nonparametrically with predictors through incorporating Gaussian processes in the stick-breaking components. This leads to a robust Bayesian regression procedure that automatically down-weights outliers and influential observations in a locally-adaptive manner. Posterior computation relies on an efficient data augmentation exact block Gibbs sampler. The methods are illustrated using simulated and real data applications. PMID:24465053
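The stick-breaking construction underlying the PSB prior can be sketched directly: each stick proportion is a probit transform of a Gaussian draw, and the weights are the broken-off fractions of the remaining stick. This is a generic sketch of the construction, not the paper's full regression model:

```python
import math
import random

def probit(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def psb_weights(n_components, mu=0.0, sd=1.0, seed=1):
    """Probit stick-breaking: V_h = Phi(alpha_h) with alpha_h ~ N(mu, sd^2);
    the h-th mixture weight is w_h = V_h * prod_{l<h} (1 - V_l)."""
    rng = random.Random(seed)
    weights, remaining = [], 1.0
    for _ in range(n_components):
        v = probit(rng.gauss(mu, sd))
        weights.append(remaining * v)
        remaining *= 1.0 - v
    return weights

weights = psb_weights(50)
total = sum(weights)
```

In the predictor-dependent extension, `alpha_h` becomes a Gaussian process in the predictors, so the mixture weights, and hence the residual density, vary smoothly across the covariate space.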
Fully Bayesian Experimental Design for Pharmacokinetic Studies
Directory of Open Access Journals (Sweden)
Elizabeth G. Ryan
2015-03-01
Full Text Available Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future dataset drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature, which rapidly obtains samples from the posterior, is importance sampling, using the prior as the importance distribution. However, importance sampling from the prior will tend to break down if there is a reasonable number of experimental observations. In this paper, we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study, which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times that produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
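The Laplace-approximation-as-importance-distribution idea can be sketched on a conjugate normal-mean example where the exact posterior is known; the model and data below are illustrative stand-ins, not the pharmacokinetic model from the study:

```python
import math
import random

def log_post(theta, data, prior_mu=0.0, prior_sd=10.0, noise_sd=1.0):
    """Unnormalised log posterior for a normal mean with a normal prior."""
    lp = -0.5 * ((theta - prior_mu) / prior_sd) ** 2
    for y in data:
        lp += -0.5 * ((y - theta) / noise_sd) ** 2
    return lp

def laplace(data, lo=-20.0, hi=20.0, h=1e-4):
    """Laplace approximation: mode by grid search, curvature by finite
    differences; returns the N(mode, sd^2) approximation parameters."""
    mode = max((lo + i * 1e-3 for i in range(int((hi - lo) / 1e-3))),
               key=lambda t: log_post(t, data))
    curv = (log_post(mode + h, data) - 2.0 * log_post(mode, data)
            + log_post(mode - h, data)) / h ** 2
    return mode, math.sqrt(-1.0 / curv)

def importance_mean(data, n=5000, seed=2):
    """Posterior mean via self-normalised importance sampling, using the
    Laplace approximation as the importance distribution."""
    mode, sd = laplace(data)
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        t = rng.gauss(mode, sd)
        logq = -0.5 * ((t - mode) / sd) ** 2 - math.log(sd)
        w = math.exp(log_post(t, data) - logq)  # importance weight
        num += w * t
        den += w
    return num / den

data = [1.2, 0.8, 1.1, 0.9, 1.0]  # hypothetical observations
est = importance_mean(data)
```

Because the importance distribution is centred at the posterior mode with matched curvature, the weights stay well behaved even as observations accumulate, which is exactly where importance sampling from the prior breaks down.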
Bayesian Discovery of Linear Acyclic Causal Models
Hoyer, Patrik O
2012-01-01
Methods for automated discovery of causal relationships from non-interventional data have received much attention recently. A widely used and well understood model family is given by linear acyclic causal models (recursive structural equation models). For Gaussian data both constraint-based methods (Spirtes et al., 1993; Pearl, 2000) (which output a single equivalence class) and Bayesian score-based methods (Geiger and Heckerman, 1994) (which assign relative scores to the equivalence classes) are available. In contrast, all current methods able to utilize non-Gaussianity in the data (Shimizu et al., 2006; Hoyer et al., 2008) always return only a single graph or a single equivalence class, and so are fundamentally unable to express the degree of certainty attached to that output. In this paper we develop a Bayesian score-based approach able to take advantage of non-Gaussianity when estimating linear acyclic causal models, and we empirically demonstrate that, at least on very modest size networks, its accur...
Bayesian Cosmic Web Reconstruction: BARCODE for Clusters
Patrick Bos, E. G.; van de Weygaert, Rien; Kitaura, Francisco; Cautun, Marius
2016-10-01
We describe the Bayesian BARCODE formalism, designed for the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. It is based on the realization that the massive compact clusters are responsible for the major share of the large scale tidal force field shaping the anisotropic and in particular filamentary features in the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster configurations, we resort to a state-of-the-art constrained reconstruction technique to find a properly statistically sampled realization of the original initial density and velocity field in the same cosmic region. Ultimately, the subsequent gravitational evolution of these initial conditions towards the implied Cosmic Web configuration can be followed on the basis of a proper analytical model or an N-body computer simulation. The BARCODE formalism includes an implicit treatment of redshift space distortions. This enables a direct reconstruction on the basis of observational data, without the need for a correction of redshift space artifacts. In this contribution we provide a general overview of the Cosmic Web connection with clusters and a description of the Bayesian BARCODE formalism. We conclude with a presentation of its successful workings in test runs based on a simulated large scale matter distribution, in physical space as well as in redshift space.
Bayesian Analysis of High Dimensional Classification
Mukhopadhyay, Subhadeep; Liang, Faming
2009-12-01
Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables may be much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these settings there is considerable interest in searching for sparse models in the high-dimensional regression or classification setup. We first discuss two common challenges in analyzing high dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, so the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. To make Bayesian analysis operational in high dimensions we propose a novel Hierarchical Stochastic Approximation Monte Carlo (HSAMC) algorithm, which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high dimensional complex model spaces.
Bayesian nonparametric adaptive control using Gaussian processes.
Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A
2015-03-01
Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element is fixed a priori, often through expert judgment. An example of such an adaptive element is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become noneffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.
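The GP posterior mean that replaces the fixed RBF centers can be sketched in a few lines: build the kernel matrix over observed points, solve one linear system, and predict anywhere. This is a generic, dependency-free GP regression sketch with hypothetical data, not the GP-MRAC controller itself:

```python
import math

def rbf(x1, x2, ls=1.0):
    """Squared-exponential kernel with length scale ls."""
    return math.exp(-0.5 * ((x1 - x2) / ls) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def gp_predict(xs, ys, x_star, noise=1e-6):
    """GP posterior mean at x_star: k_*^T (K + noise * I)^-1 y.
    Unlike an RBFN, the 'centers' are simply the observed points xs."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    alpha = solve(K, ys)
    return sum(rbf(x_star, xs[i]) * alpha[i] for i in range(n))

xs, ys = [0.0, 1.0, 2.0, 3.0], [0.0, 0.8, 0.9, 0.1]  # hypothetical data
pred = gp_predict(xs, ys, 1.0)
```

Because the kernel matrix grows with the data, practical online use (as in GP-MRAC) relies on sparse or budgeted GP inference rather than this exact solve.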
Bayesian Kinematic Finite Fault Source Models (Invited)
Minson, S. E.; Simons, M.; Beck, J. L.
2010-12-01
Finite fault earthquake source models are inherently under-determined: there is no unique solution to the inverse problem of determining the rupture history at depth as a function of time and space when our data are only limited observations at the Earth's surface. Traditional inverse techniques rely on model constraints and regularization to generate one model from the possibly broad space of all possible solutions. However, Bayesian methods allow us to determine the ensemble of all possible source models which are consistent with the data and our a priori assumptions about the physics of the earthquake source. Until now, Bayesian techniques have been of limited utility because they are computationally intractable for problems with as many free parameters as kinematic finite fault models. We have developed a methodology called Cascading Adaptive Tempered Metropolis In Parallel (CATMIP) which allows us to sample very high-dimensional problems in a parallel computing framework. The CATMIP algorithm combines elements of simulated annealing and genetic algorithms with the Metropolis algorithm to dynamically optimize the algorithm's efficiency as it runs. We will present synthetic performance tests of finite fault models made with this methodology as well as a kinematic source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. This earthquake was well recorded by multiple ascending and descending interferograms and a network of high-rate GPS stations whose records can be used as near-field seismograms.
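The tempering-with-resampling idea behind CATMIP can be illustrated with a serial, one-dimensional sketch: start from prior samples, raise the likelihood exponent in stages, and at each stage importance-resample and rejuvenate with a few Metropolis moves. This is a simplified transitional-MCMC sketch under stated toy assumptions, not the parallel CATMIP implementation:

```python
import math
import random

def tempered_sample(prior_draw, log_like, n=2000,
                    betas=(0.0, 0.1, 0.3, 0.6, 1.0), seed=3):
    """Transitional MCMC sketch: particles follow prior * likelihood^beta
    as beta steps from 0 to 1, with resampling and Metropolis rejuvenation
    at each stage. The prior here is fixed to N(0, 10^2)."""
    rng = random.Random(seed)
    particles = [prior_draw(rng) for _ in range(n)]
    for b_prev, b in zip(betas, betas[1:]):
        # Importance-reweight from beta_prev to beta, then resample.
        ws = [math.exp((b - b_prev) * log_like(x)) for x in particles]
        particles = rng.choices(particles, weights=ws, k=n)
        # Rejuvenate each particle with a few random-walk Metropolis moves.
        out = []
        for x in particles:
            for _ in range(5):
                xp = x + rng.gauss(0.0, 0.5)
                d = (b * log_like(xp) - 0.005 * xp * xp) - \
                    (b * log_like(x) - 0.005 * x * x)  # log N(0,10^2) prior
                if math.log(rng.random()) < d:
                    x = xp
            out.append(x)
        particles = out
    return particles

# Illustrative: prior N(0, 10^2), likelihood N(2, 1) => posterior near N(2, 1).
log_like = lambda x: -0.5 * (x - 2.0) ** 2
parts = tempered_sample(lambda rng: rng.gauss(0.0, 10.0), log_like)
post_mean = sum(parts) / len(parts)
```

The real algorithm additionally adapts the beta schedule and proposal distribution on the fly and runs the particle updates in parallel, which is what makes high-dimensional finite fault problems tractable.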
A Hierarchical Bayesian Model for Crowd Emotions
Urizar, Oscar J.; Baig, Mirza S.; Barakova, Emilia I.; Regazzoni, Carlo S.; Marcenaro, Lucio; Rauterberg, Matthias
2016-01-01
Estimation of emotions is an essential aspect in developing intelligent systems intended for crowded environments. However, emotion estimation in crowds remains a challenging problem due to the complexity in which human emotions are manifested and the capability of a system to perceive them in such conditions. This paper proposes a hierarchical Bayesian model to learn, in an unsupervised manner, the behavior of individuals and of the crowd as a single entity, and explores the relation between behavior and emotions to infer emotional states. Information about the motion patterns of individuals is described using a self-organizing map, and a hierarchical Bayesian network builds probabilistic models to identify behaviors and infer the emotional state of individuals and the crowd. This model is trained and tested using data produced from simulated scenarios that resemble real-life environments. The conducted experiments tested the efficiency of our method to learn, detect and associate behaviors with emotional states, yielding accuracy levels of 74% for individuals and 81% for the crowd, similar in performance to existing methods for pedestrian behavior detection but with novel concepts regarding the analysis of crowds. PMID:27458366
Refining gene signatures: a Bayesian approach
Directory of Open Access Journals (Sweden)
Labbe Aurélie
2009-12-01
Full Text Available Abstract Background In high density arrays, the identification of relevant genes for disease classification is complicated by not only the curse of dimensionality but also the highly correlated nature of the array data. In this paper, we are interested in the question of how many and which genes should be selected for a disease class prediction. Our work consists of a Bayesian supervised statistical learning approach to refine gene signatures with a regularization which penalizes for the correlation between the variables selected. Results Our simulation results show that we can most often recover the correct subset of genes that predict the class as compared to other methods, even when accuracy and subset size remain the same. On real microarray datasets, we show that our approach can refine gene signatures to obtain either the same or better predictive performance than other existing methods with a smaller number of genes. Conclusions Our novel Bayesian approach includes a prior which penalizes highly correlated features in model selection and is able to extract key genes in the highly correlated context of microarray data. The methodology in the paper is described in the context of microarray data, but can be applied to any array data (such as microRNA, for example) as a first step towards predictive modeling of cancer pathways. A user-friendly software implementation of the method is available.
A Bayesian framework for active artificial perception.
Ferreira, João Filipe; Lobo, Jorge; Bessière, Pierre; Castelo-Branco, Miguel; Dias, Jorge
2013-04-01
In this paper, we present a Bayesian framework for the active multimodal perception of 3-D structure and motion. The design of this framework finds its inspiration in the role of the dorsal perceptual pathway of the human brain. Its composing models build upon a common egocentric spatial configuration that is naturally fitting for the integration of readings from multiple sensors using a Bayesian approach. In the process, we will contribute with efficient and robust probabilistic solutions for cyclopean geometry-based stereovision and auditory perception based only on binaural cues, modeled using a consistent formalization that allows their hierarchical use as building blocks for the multimodal sensor fusion framework. We will explicitly or implicitly address the most important challenges of sensor fusion using this framework, for vision, audition, and vestibular sensing. Moreover, interaction and navigation require maximal awareness of spatial surroundings, which, in turn, is obtained through active attentional and behavioral exploration of the environment. The computational models described in this paper will support the construction of a simultaneously flexible and powerful robotic implementation of multimodal active perception to be used in real-world applications, such as human-machine interaction or mobile robot navigation. PMID:23014760
Phycas: software for Bayesian phylogenetic analysis.
Lewis, Paul O; Holder, Mark T; Swofford, David L
2015-05-01
Phycas is open source, freely available Bayesian phylogenetics software written primarily in C++ but with a Python interface. Phycas specializes in Bayesian model selection for nucleotide sequence data, particularly the estimation of marginal likelihoods, central to computing Bayes Factors. Marginal likelihoods can be estimated using newer methods (Thermodynamic Integration and Generalized Steppingstone) that are more accurate than the widely used Harmonic Mean estimator. In addition, Phycas supports two posterior predictive approaches to model selection: Gelfand-Ghosh and Conditional Predictive Ordinates. The General Time Reversible family of substitution models, as well as a codon model, are available, and data can be partitioned with all parameters unlinked except tree topology and edge lengths. Phycas provides for analyses in which the prior on tree topologies allows polytomous trees as well as fully resolved trees, and provides for several choices for edge length priors, including a hierarchical model as well as the recently described compound Dirichlet prior, which helps avoid overly informative induced priors on tree length. PMID:25577605
Bayesian Integration of multiscale environmental data
Energy Technology Data Exchange (ETDEWEB)
2016-08-22
The software is designed for efficiently integrating large volumes of multi-scale environmental data within a Bayesian framework. Suppose we need to estimate the spatial distribution of variable X with high spatial resolution. The available data include (1) direct measurements Z of the unknowns with high resolution in a subset of the spatial domain (small spatial coverage), (2) measurements C of the unknowns at the median scale, and (3) measurements A of the unknowns at the coarsest scale but with large spatial coverage. The goal is to estimate the unknowns on the fine grid by conditioning on all the available data. We first consider all the unknowns as random variables and estimate the conditional probability distribution of those variables by conditioning on the limited high-resolution observations (Z). We then treat the estimated probability distribution as the prior distribution. Within the Bayesian framework, we combine the median- and large-scale measurements (C and A) through likelihood functions. Since we assume that all the relevant multivariate distributions are Gaussian, the resulting posterior distribution is a multivariate Gaussian distribution. The developed software provides numerical solutions of the posterior probability distribution. The software can be extended in several different ways to solve more general multi-scale data integration problems.
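Under the all-Gaussian assumption described above, conditioning on successive measurement scales reduces to conjugate Gaussian updates in which precisions add. A scalar sketch for a single fine-grid cell (values hypothetical; the real software works with multivariate fields):

```python
def gaussian_update(prior_mean, prior_var, obs, obs_var):
    """Conjugate update of a Gaussian prior with one Gaussian observation:
    precisions add, and the posterior mean is the precision-weighted
    average of the prior mean and the observation."""
    post_prec = 1.0 / prior_var + 1.0 / obs_var
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + obs / obs_var)
    return post_mean, post_var

# Prior at a fine grid cell from the high-resolution data Z, then
# condition on the coarser measurements C and A in turn (hypothetical).
m, v = 10.0, 4.0                            # prior from Z
m, v = gaussian_update(m, v, 12.0, 2.0)     # median-scale measurement C
m, v = gaussian_update(m, v, 9.0, 8.0)      # coarse measurement A
```

Each added scale tightens the posterior variance regardless of ordering, which is why the framework can fold in coarse, wide-coverage data without degrading the fine-scale estimates.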
Bayesian estimation of isotopic age differences
International Nuclear Information System (INIS)
Isotopic dating is subject to uncertainties arising from counting statistics and experimental errors. These uncertainties are additive when an isotopic age difference is calculated. If large, they can lead to no significant age difference by classical statistics. In many cases, relative ages are known because of stratigraphic order or other clues. Such information can be used to establish a Bayes estimate of age difference which will include prior knowledge of age order. Age measurement errors are assumed to be log-normal and a noninformative but constrained bivariate prior for two true ages in known order is adopted. True-age ratio is distributed as a truncated log-normal variate. Its expected value gives an age-ratio estimate, and its variance provides credible intervals. Bayesian estimates of ages are different and in correct order even if measured ages are identical or reversed in order. For example, age measurements on two samples might both yield 100 ka with coefficients of variation of 0.2. Bayesian estimates are 22.7 ka for age difference with a 75% credible interval of [4.4, 43.7] ka
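The worked example (two measured ages of 100 ka with coefficients of variation of 0.2) can be reproduced approximately by Monte Carlo: draw true-age pairs consistent with log-normal measurement error and keep only those in the known stratigraphic order. This is a simplified rejection-sampling sketch of the idea, not the exact Bayesian derivation in the abstract:

```python
import math
import random

def age_difference(meas1, meas2, cv, n=200000, seed=4):
    """Draw true-age pairs with log-normal error around the measured ages,
    keep pairs satisfying the prior order constraint (age 2 older than
    age 1), and summarise the posterior age difference."""
    rng = random.Random(seed)
    sigma = math.sqrt(math.log(1.0 + cv * cv))  # log-scale sd from the CV
    diffs = []
    for _ in range(n):
        t1 = meas1 * math.exp(rng.gauss(0.0, sigma))
        t2 = meas2 * math.exp(rng.gauss(0.0, sigma))
        if t2 > t1:                  # prior knowledge: sample 2 is older
            diffs.append(t2 - t1)
    diffs.sort()
    mean = sum(diffs) / len(diffs)
    lo = diffs[int(0.125 * len(diffs))]      # central 75% credible interval
    hi = diffs[int(0.875 * len(diffs))]
    return mean, (lo, hi)

mean_diff, interval = age_difference(100.0, 100.0, 0.2)
```

The resulting mean difference lands near the 22.7 ka quoted in the abstract, with a central 75% interval close to the quoted [4.4, 43.7] ka.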
Bayesian estimation of isotopic age differences
Energy Technology Data Exchange (ETDEWEB)
Curl, R.L.
1988-08-01
Isotopic dating is subject to uncertainties arising from counting statistics and experimental errors. These uncertainties are additive when an isotopic age difference is calculated. If large, they can lead to no significant age difference by classical statistics. In many cases, relative ages are known because of stratigraphic order or other clues. Such information can be used to establish a Bayes estimate of age difference which will include prior knowledge of age order. Age measurement errors are assumed to be log-normal and a noninformative but constrained bivariate prior for two true ages in known order is adopted. True-age ratio is distributed as a truncated log-normal variate. Its expected value gives an age-ratio estimate, and its variance provides credible intervals. Bayesian estimates of ages are different and in correct order even if measured ages are identical or reversed in order. For example, age measurements on two samples might both yield 100 ka with coefficients of variation of 0.2. Bayesian estimates are 22.7 ka for age difference with a 75% credible interval of (4.4, 43.7) ka.
Measure Transformer Semantics for Bayesian Machine Learning
Borgström, Johannes; Gordon, Andrew D.; Greenberg, Michael; Margetson, James; van Gael, Jurgen
The Bayesian approach to machine learning amounts to inferring posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define combinators for measure transformers, based on theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid measures, and, in particular, for observations of zero-probability events. We compile our core language to a small imperative language that has a straightforward semantics via factor graphs, data structures that enable many efficient inference algorithms. We use an existing inference engine for efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models.
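The sample-and-observe core of such probabilistic programs can be illustrated, in the discrete case only, by enumeration: every execution path carries a weight, observations scale the weight, and normalising the surviving weights yields the posterior. This toy sketch omits what the paper actually contributes, the measure-transformer semantics for continuous, hybrid, and zero-probability observations:

```python
def enumerate_posterior(model):
    """Toy discrete semantics: `model` yields (weight, value) pairs, one per
    execution path. Normalising the accumulated weights gives the posterior
    over values, as a factor-graph inference engine would compute it."""
    dist = {}
    for w, v in model():
        dist[v] = dist.get(v, 0.0) + w
    total = sum(dist.values())
    return {v: w / total for v, w in dist.items()}

def coin_model():
    # Prior: three possible coin biases, equally likely (hypothetical).
    for bias in (0.3, 0.5, 0.9):
        prior_w = 1.0 / 3.0
        # Observe: two independent flips both came up heads; the
        # observation multiplies the path weight by its likelihood.
        likelihood = bias * bias
        yield prior_w * likelihood, bias

posterior = enumerate_posterior(coin_model)
```

Enumeration scales exponentially in the number of choices, which is why the compiled factor-graph representation and approximate inference engines mentioned in the abstract matter for realistic models.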
National Aeronautics and Space Administration — The OMI/Aura level-2 near UV Aerosol data product 'OMAERUV', recently re-processed using an enhanced algorithm, is now released (April 2012) to the public. The data...
Surrogate/spent fuel sabotage: aerosol ratio test program and Phase 2 test results.
Energy Technology Data Exchange (ETDEWEB)
Borek, Theodore Thaddeus III; Thompson, N. Slater (U.S. Department of Energy); Sorenson, Ken Bryce; Hibbs, R.S. (U.S. Department of Energy); Nolte, Oliver (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Molecke, Martin Alan; Autrusson, Bruno (Institut de Radioprotection et de Surete Nucleaire, France); Young, F. I. (U.S. Nuclear Regulatory Commission); Koch, Wolfgang (Fraunhofer Institut fur Toxikologie und Experimentelle Medizin, Germany); Brochard, Didier (Institut de Radioprotection et de Surete Nucleaire, France); Pretzsch, Gunter Guido (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany); Lange, Florentin (Gesellschaft fur Anlagen- und Reaktorsicherheit, Germany)
2004-05-01
A multinational test program is in progress to quantify the aerosol particulates produced when a high energy density device, HEDD, impacts surrogate material and actual spent fuel test rodlets. This program provides needed data that are relevant to some sabotage scenarios in relation to spent fuel transport and storage casks, and associated risk assessments; the program also provides significant political benefits in international cooperation. We are quantifying the spent fuel ratio, SFR, the ratio of the aerosol particles released from HEDD-impacted actual spent fuel to the aerosol particles produced from surrogate materials, measured under closely matched test conditions. In addition, we are measuring the amounts, nuclide content, and size distribution of the released aerosol materials, and the enhanced sorption of volatile fission product nuclides onto specific aerosol particle size fractions. These data are crucial for predicting radiological impacts. This document includes a thorough description of the test program, including the current, detailed test plan, concept and design, plus a description of all test components, and requirements for future components and related nuclear facility needs. It also serves as a program status report as of the end of FY 2003. All available test results, observations, and analyses - primarily for surrogate material Phase 2 tests using cerium oxide sintered ceramic pellets - are included. This spent fuel sabotage - aerosol test program is coordinated with the international Working Group for Sabotage Concerns of Transport and Storage Casks, WGSTSC, and supported by both the U.S. Department of Energy and Nuclear Regulatory Commission.
Aerosol particle transport modeling for preclosure safety studies of nuclear waste repositories
International Nuclear Information System (INIS)
An important concern for preclosure safety analysis of a nuclear waste repository is the potential release to the environment of respirable aerosol particles. Such particles, less than 10 μm in aerodynamic diameter, may have significant adverse health effects if inhaled. To assess the potential health effects of these particles, it is not sufficient to determine the mass fraction of respirable aerosol. The chemical composition of the particles is also of importance since different radionuclides may pose vastly different health hazards. Thus, models are needed to determine under normal and accident conditions the particle size and the chemical composition distributions of aerosol particles as a function of time and of position in the repository. In this work a multicomponent sectional aerosol model is used to determine the aerosol particle size and composition distributions in the repository. A range of aerosol mass releases with varying mean particle sizes and chemical compositions is used to demonstrate the sensitivities and uncertainties of the model. Decontamination factors for some locations in the repository are presented. 8 refs., 1 tab
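The kind of bookkeeping a multicomponent sectional model performs can be sketched as follows: mass is tracked per size section and per chemical component, each section is depleted by a size-dependent first-order deposition rate during transport, and decontamination factors and respirable fractions fall out of the remaining mass. All sizes, rates, and masses below are illustrative assumptions, not values from the report, and real sectional models also treat coagulation and condensation, which this sketch omits.

```python
import numpy as np

# Illustrative size sections (aerodynamic diameter, micrometers) and a
# gravitational-settling-like first-order deposition rate ~ d^2 (Stokes regime).
sections_um = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 50.0])
dep_rate = 0.01 * sections_um**2          # 1/s, hypothetical

components = ["U", "Pu", "Cs"]
# Hypothetical initial mass (kg) per component (rows) per size section (columns).
mass = np.array([
    [1e-4, 2e-4, 3e-4, 5e-4, 4e-4, 1e-3],   # U
    [1e-6, 2e-6, 3e-6, 5e-6, 4e-6, 1e-5],   # Pu
    [5e-5, 5e-5, 5e-5, 5e-5, 5e-5, 5e-5],   # Cs
])

t = 600.0  # assumed transport time (s) through the repository drift
remaining = mass * np.exp(-dep_rate * t)   # deposition depletes each section

exiting = remaining.sum(axis=1)
df = mass.sum(axis=1) / exiting            # decontamination factor per component
# Fraction of the exiting mass that is respirable (< 10 micrometers here).
resp = remaining[:, sections_um < 10.0].sum(axis=1) / exiting

for c, d, r in zip(components, df, resp):
    print(f"{c}: DF = {d:.1f}, respirable fraction at exit = {r:.2f}")
```

The component-resolved output is the point made in the abstract: because large sections deposit preferentially, the exiting aerosol is enriched in respirable sizes, and components distributed differently across sections (Cs vs. U here) end up with different decontamination factors.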
Stratospheric aerosol geoengineering
Robock, Alan
2015-03-01
The Geoengineering Model Intercomparison Project, conducting climate model experiments with standard stratospheric aerosol injection scenarios, has found that insolation reduction could keep the global average temperature constant, but global average precipitation would decrease, particularly in summer monsoon regions around the world. Temperature changes would also not be uniform; the tropics would cool, but high latitudes would warm, with continued, though reduced, melting of sea ice and ice sheets. Temperature extremes would still increase, but not as much as without geoengineering. If geoengineering were halted all at once, temperature and precipitation would rise rapidly, at 5-10 times the rates associated with gradual global warming. The prospect that geoengineering might work may also weaken the current drive to reduce greenhouse gas emissions, and there are concerns about commercial or military control. Because geoengineering cannot safely address climate change, global efforts to reduce greenhouse gas emissions and to adapt remain crucial for addressing anthropogenic global warming.