A Bayesian Analysis of the Radioactive Releases of Fukushima
DEFF Research Database (Denmark)
Tomioka, Ryota; Mørup, Morten
2012-01-01
The Fukushima Daiichi disaster of 11 March 2011 is considered the largest nuclear accident since the 1986 Chernobyl disaster and has been rated at level 7 on the International Nuclear Event Scale. As different radioactive materials have different effects on the human body, it is important to know the types of nuclides and their levels of concentration from the recorded mixture of radiation in order to take the necessary measures. We presently formulate a Bayesian generative model for the data available on radioactive releases from the Fukushima Daiichi disaster across Japan. From the sparsely sampled ... Fukushima Daiichi plant we establish that the model is able to account for the data. We further demonstrate how the model extends to include all the available measurements recorded throughout Japan. The model can be considered a first attempt to apply Bayesian learning unsupervised in order to give a more ...
Status of the ORNL Aerosol Release and Transport Project
International Nuclear Information System (INIS)
The behavior of aerosols assumed to be characteristic of those generated during light water reactor (LWR) accident sequences and released into containment is being studied. Recent activities in the ORNL Aerosol Release and Transport Project include studies of (1) the thermal hydraulic conditions existing during Nuclear Safety Pilot Plant (NSPP) aerosol tests in steam-air environments, (2) the thermal output and aerosol mass generation rates for plasma torch aerosol generators, and (3) the influence of humidity on the shape of agglomerated aerosols of various materials. A new Aerosol-Moisture Interaction Test (AMIT) facility was prepared at the NSPP site to accommodate the aerosol shape studies; several tests with Fe2O3 aerosol have been conducted. In addition to the above activities a special study was conducted to determine the suitability of the technique of aerosol production by plasma torch under the operating conditions of future tests of the LWR Aerosol Containment Experiments (LACE) at the Hanford Engineering Development Laboratory. 3 refs., 2 figs., 7 tabs
Nanostructured Aerosol Particles: Fabrication, Pulmonary Drug Delivery, and Controlled Release
Directory of Open Access Journals (Sweden)
Xingmao Jiang
2011-01-01
Pulmonary drug delivery is the preferred route of administration in the treatment of respiratory diseases and some nonrespiratory diseases. Recent research has focused on developing structurally stable, high-dosage drug delivery systems without premature release. To maximize deposition in the desired lung regions, several factors must be considered in the formulation. The special issue includes seven papers dealing with aerosol-assisted fabrication of nanostructured particles, aerosol deposition, pulmonary exposure to nanoparticles, and controlled release.
Release of Free DNA by Membrane-Impaired Bacterial Aerosols Due to Aerosolization and Air Sampling
Zhen, Huajun; Han, Taewon; Fennell, Donna E.; Mainelis, Gediminas
2013-01-01
We report here that stress experienced by bacteria due to aerosolization and air sampling can result in severe membrane impairment, leading to the release of DNA as free molecules. Escherichia coli and Bacillus atrophaeus bacteria were aerosolized and then either collected directly into liquid or collected using other collection media and then transferred into liquid. The amount of DNA released was quantified as the cell membrane damage index (ID), i.e., the number of 16S rRNA gene copies in ...
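The damage index described above can be sketched numerically. The exact definition is truncated in the abstract, so the sketch below assumes the index is simply the fraction of total 16S rRNA gene copies recovered as free (extracellular) DNA; the function name and the numbers are illustrative, not the authors' code.

```python
# Hedged sketch of a membrane-damage index of the kind described above,
# assuming it is the fraction of total 16S rRNA gene copies found as free DNA.
def damage_index(free_gene_copies: float, total_gene_copies: float) -> float:
    """Fraction of gene copies released as free DNA (0 = intact, 1 = fully lysed)."""
    if total_gene_copies <= 0:
        raise ValueError("total gene copies must be positive")
    return free_gene_copies / total_gene_copies

# Hypothetical qPCR counts: 2e5 free copies out of 1e6 total -> index 0.2
idx = damage_index(2.0e5, 1.0e6)
```

A higher index for a given sampler would indicate harsher aerosolization or collection stress on the cells.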
Radon and aerosol release from open-pit uranium mining
International Nuclear Information System (INIS)
The quantity of 222Rn (hereafter called radon) released per unit of uranium produced from open pit mining has been determined. A secondary objective was to determine the nature and quantity of airborne particles resulting from mine operations. To accomplish these objectives, a comprehensive study of the release rates of radon and aerosol material to the atmosphere was made over a one-year period from April 1979 to May 1980 at the Morton Ranch Mine which was operated by United Nuclear Corporation (UNC) in partnership with Tennessee Valley Authority (TVA). The mine is now operated for TVA by Silver King Mines. Morton Ranch Mine was one of five open pit uranium mines studied in central Wyoming. Corroborative measurements were made of radon flux and 226Ra (hereafter called radium) concentrations of various surfaces at three of the other mines in October 1980 and again at these three mines plus a fourth in April of 1981. Three of these mines are located in the Powder River Basin, about 80 kilometers east by northeast of Casper. One is located in the Shirley Basin, about 60 km south of Casper, and the remaining one is located in the Gas Hills, approximately 100 km west of Casper. The one-year intensive study included simultaneous measurement of several parameters: continuous measurement of atmospheric radon concentration near the ground at three locations, monthly 24-hour radon flux measurements from various surfaces, radium analyses of soil samples collected under each of the flux monitoring devices, monthly integrations of aerosols on dichotomous aerosol samplers, analysis of aerosol samplers for total dust loading, aerosol elemental and radiochemical composition, aerosol elemental composition by particle size, wind speed, wind direction, temperature, barometric pressure, and rainfall
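The study's headline quantity, radon released per unit of uranium produced, is a simple normalization of flux measurements over the disturbed area. A minimal sketch of that normalization follows; all input values are hypothetical placeholders, not the study's data.

```python
# Sketch of normalizing a measured radon flux to uranium production.
# Inputs are hypothetical placeholder values, not Morton Ranch measurements.
def radon_release_per_u(flux_bq_m2_s: float, area_m2: float,
                        seconds: float, uranium_kg: float) -> float:
    """Radon released (Bq) per kg of uranium produced over the period."""
    total_bq = flux_bq_m2_s * area_m2 * seconds  # total activity released
    return total_bq / uranium_kg

# e.g. flux 1 Bq/(m^2 s) over 1e4 m^2 for 1e5 s, 1e3 kg U produced
release = radon_release_per_u(1.0, 1.0e4, 1.0e5, 1.0e3)
```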
Rapid Detection and Identification of Biogenic Aerosol Releases and Sources
Wagner, J.; Macher, J.; Ghosal, S.; Ahmed, K.; Hemati, K.; Wall, S.; Kumagai, K.
2011-12-01
Biogenic aerosols can be important contributors to aerosol chemistry, cloud droplet and ice nucleation, absorption and scattering of radiation, human health and comfort, and plant, animal, and microbial ecology. Many types of bioaerosols, e.g., fungal spores, are released into the atmosphere in response to specific climatological and meteorological conditions. The rapid identification of bioaerosol releases is thus important for better characterization of the above phenomena, as well as enabling public officials to respond quickly and appropriately to releases of infectious agents or biological toxins. One approach to rapid and accurate bioaerosol detection is to employ sequential, automated samples that can be fed directly into an image acquisition and data analysis device. Raman spectroscopy-based identification of bioaerosols, automated analysis of microscopy images, and automated detection of near-monodisperse peaks in aerosol size-distribution data were investigated as complementary approaches to traditional, manual methods for the identification and counting of fungal and actinomycete spores. Manual light microscopy is a widely used analytical technique that is compatible with a number of air sample formats and requires minimal sample preparation. However, a major drawback is its dependence on a human analyst's ability to distinguish particles and accurately count, size, and identify them. Therefore, automated methods, such as those evaluated in this study, have the potential to provide cost-effective and rapid alternatives if demonstrated to be accurate and reliable. An exploratory examination of individual spores for several macro- and microfungi (those with and without large fruiting bodies) by Raman microspectroscopy found unique spectral features that were used to identify fungi to the genus level. Automated analyses of digital spore images accurately recognized and counted single fungal spores and clusters. An automated procedure to discriminate near
Aerosols released from solvent fire accidents in reprocessing plants
International Nuclear Information System (INIS)
Thermodynamic, aerosol-characterizing, and radiological data on solvent fires in reprocessing plants have been established in experiments. These are the main results: Depending on the ventilation in the containment, kerosene-TBP mixtures burn at a rate of up to 120 kg/m2 h. The aqueous phase of inorganic-organic mixtures might be released during the fire. The gaseous reaction products contain unburnable acidic compounds. Solvents with TBP-nitrate complex show higher (up to 25%) burning rates than pure solvents (kerosene-TBP). The nitrate complex decomposes violently at about 130 °C with a release of acid and unburnable gases. Up to 20% of the burned kerosene-TBP solvent is released during the fire in the form of soot particles, phosphoric acid, and TBP decomposition products. The particles have an aerodynamic mass median diameter of about 0.5 μm, and up to 1.5% of the uranium fixed in the TBP-nitrate complex is released during solvent fires. (orig.)
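The quoted figures support a back-of-the-envelope release estimate: burned mass is burning rate times pool area times duration, and released aerosol mass is a fraction of that. The pool area and fire duration below are hypothetical; only the 120 kg/(m2 h) rate and 20% release fraction come from the abstract's upper bounds.

```python
# Back-of-the-envelope solvent-fire aerosol release, using the abstract's
# upper-bound figures (120 kg/m^2 h burn rate, 20% released as aerosol).
# Pool area and duration are hypothetical assumptions.
def aerosol_release_kg(burn_rate_kg_m2_h: float, pool_area_m2: float,
                       duration_h: float, release_fraction: float) -> float:
    """Mass of aerosol released (kg) from a burning solvent pool."""
    burned_kg = burn_rate_kg_m2_h * pool_area_m2 * duration_h
    return burned_kg * release_fraction

# 2 m^2 pool burning 0.5 h: 120 kg burned, 24 kg released as aerosol
released = aerosol_release_kg(120.0, 2.0, 0.5, 0.20)
```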
Tang, Qingxin; Bo, Yanchen; Zhu, Yuxin
2016-04-01
Merging multisensor aerosol optical depth (AOD) products is an effective way to produce more spatiotemporally complete and accurate AOD products. A spatiotemporal statistical data fusion framework based on a Bayesian maximum entropy (BME) method was developed for merging satellite AOD products in East Asia. The advantages of the presented merging framework are that it not only utilizes the spatiotemporal autocorrelations but also explicitly incorporates the uncertainties of the AOD products being merged. The satellite AOD products used for merging are the Moderate Resolution Imaging Spectroradiometer (MODIS) Collection 5.1 Level-2 AOD products (MOD04_L2) and the Sea-viewing Wide Field-of-view Sensor (SeaWiFS) Deep Blue Level 2 AOD products (SWDB_L2). The results show that the average completeness of the merged AOD data is 95.2%, which is significantly superior to the completeness of MOD04_L2 (22.9%) and SWDB_L2 (20.2%). Comparison of the merged AOD against Aerosol Robotic Network AOD records shows that the correlation coefficient (0.75), root-mean-square error (0.29), and mean bias (0.068) of the merged AOD are close to those of the MODIS AOD (0.82, 0.19, and 0.059, respectively). In regions where both MODIS and SeaWiFS have valid observations, the accuracy of the merged AOD is higher than those of the MODIS and SeaWiFS AODs. Even in regions where both MODIS and SeaWiFS AODs are missing, the accuracy of the merged AOD remains close to that in regions where both sensors have valid observations.
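The three validation statistics quoted above (correlation coefficient, root-mean-square error, mean bias) are standard and easy to reproduce. A minimal sketch, with hypothetical placeholder arrays standing in for merged-AOD and ground-based records:

```python
import numpy as np

# Illustrative sketch (not the authors' code) of the validation statistics
# reported above. The input arrays are hypothetical placeholder data.
def validation_stats(aod_est, aod_ref):
    """Correlation coefficient, RMSE, and mean bias of estimates vs. reference."""
    est = np.asarray(aod_est, dtype=float)
    ref = np.asarray(aod_ref, dtype=float)
    r = np.corrcoef(est, ref)[0, 1]
    rmse = float(np.sqrt(np.mean((est - ref) ** 2)))
    bias = float(np.mean(est - ref))
    return r, rmse, bias

r, rmse, bias = validation_stats([0.1, 0.2, 0.3], [0.1, 0.2, 0.4])
```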
Experimental program in core melt aerosol release and transport
International Nuclear Information System (INIS)
A survey of the requirements for an experimental demonstration of core-melt aerosol release has indicated that the most practical technique is that referred to as skull melting by rf induction. The implied skull would be a preformed ZrO2 or ThO2 shell composed of presintered powdered oxide. The advantages of this method include freedom from foreign container materials, a cold wall environment that ensures furnace integrity, and an almost unrestricted use of steam or other atmosphere as the cover gas. The major emphases of the project will be first to investigate chemical states and adsorption processes for simulant fission products, particularly iodine and cesium, and second, to measure the coagglomeration and total attenuation rate of all vaporized species with the structural material aerosols. The initial part of the effort has been dedicated to the development of a demonstration scale (1.0-kg), water-cooled, skull container with segmented copper components. A second part of the effort has been concerned with the design of a full 10- to 20-kg scale furnace and the selection of a 250-kW-rf power unit to match the furnace
Small-Scale Spray Releases: Initial Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Mahoney, Lenna A.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, Garrett N.; Kurath, Dean E.; Buchmiller, William C.; Smith, Dennese M.; Blanchard, Jeremy; Song, Chen; Daniel, Richard C.; Wells, Beric E.; Tran, Diana N.; Burns, Carolyn A.
2013-05-29
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and net generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of antifoam agents was assessed with most of the simulants. Orifices included round holes and
Small-Scale Spray Releases: Initial Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Mahoney, Lenna A.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, Garrett N.; Kurath, Dean E.; Buchmiller, William C.; Smith, Dennese M.; Blanchard, Jeremy; Song, Chen; Daniel, Richard C.; Wells, Beric E.; Tran, Diana N.; Burns, Carolyn A.
2012-11-01
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and
Large-Scale Spray Releases: Initial Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.
2012-12-01
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and
Characteristics of the aerosols released to the environment after a severe PWR accident
International Nuclear Information System (INIS)
In the event of a postulated severe accident in a pressurized water reactor (PWR) involving fuel degradation, gases and aerosols containing radioactive products could be released, with short-, medium- and long-term consequences for the population and the environment. Under such accident conditions, the ESCADRE code system, developed at IPSN (Institute for Protection and Nuclear Safety), can be used to calculate the properties of the substances released and, especially with the AEROSOLS/B2 code, the main characteristics of the aerosols (concentration, size distribution, composition). For conditions representative of severe PWR accidents, by varying the main parameters (structural material aerosols, steam condensation in the containment, etc.), indications are given on the range of characteristics of the aerosols (containing notably Cs, Te, Sr, Ru, etc.) released to the atmosphere. Information is also given on how more accurate data (especially on chemical forms) will be obtainable in the framework of current or planned experimental programs (HEVA, PITEAS, PHEBUS PF, etc.)
The US Nuclear Regulatory Commission aerosol release and the transport program
International Nuclear Information System (INIS)
An overview is presented of the U.S.N.R.C. research program for providing experimentally verified, quantitative methods for estimating the release and transport of sodium and radionuclide aerosols following postulated accidents. The program is directed towards radiological consequence assessment, however a number of aerosol behavior mechanisms being studied are applicable to LMFBR operational considerations. Related theoretical and experimental work on aerosol formation, agglomeration, settling and plating is noted. (author)
LMFBR aerosol release and transport program. Quarterly progress report, July-September 1981
International Nuclear Information System (INIS)
This report summarizes progress for the Aerosol Release and Transport Program sponsored by the Office of Nuclear Regulatory Research, Division of Accident Evaluation of the Nuclear Regulatory Commission for the period July-September 1981. Topics discussed include (1) preparations for under-sodium tests at the Fast Aerosol Simulant Test Facility, (2) progress in interpretation of Oak Ridge National Laboratory-Sandia Laboratory normalization test results, (3) U3O8 in steam (light-water reactor accident) aerosol experiments conducted in the Nuclear Safety Power Plant, (4) experiments on B2O3 and SiO2 aerosols at the Containment Research Installation-II Facility, (5) fuel-melting tests in small-scale experimental facilities for the core-melt aerosol program, (6) analytical comparison of simple adiabatic nonlinear and linear analytical models of bubble oscillation phenomena with experimental data
Fission product partitioning in aerosol release from simulated spent nuclear fuel
Di Lemma, F. G.; Colle, J. Y.; Rasmussen, G.; Konings, R. J. M.
2014-01-01
Aerosols created by the vaporization of simulated spent nuclear fuel (simfuel) were produced by laser heating techniques and characterised by a wide range of post-test analyses. In particular, attention was focused on determining fission product behaviour in the aerosols, in order to improve evaluation of the source term and, consequently, of the risk associated with releases from spent fuel sabotage or accidents. Different simulated spent fuels were tested, with burn-ups of up to 8 at.%. The re...
Energy Technology Data Exchange (ETDEWEB)
Losert, Sabrina; Hess, Adrian [Empa Swiss Federal Laboratories for Materials Science and Technology, Laboratory for Analytical Chemistry (Switzerland); Ilari, Gabriele [Empa Swiss Federal Laboratories for Materials Science and Technology, Electron Microscopy Center (Switzerland); Goetz, Natalie von, E-mail: natalie.von.goetz@chem.ethz.ch; Hungerbuehler, Konrad [ETH Zürich Swiss Federal Institute of Technology Zürich, Institute for Chemical and Bioengineering (Switzerland)
2015-07-15
Nanoparticle-containing sprays are a critical class of consumer products, since human exposure may occur by inhalation of nanoparticles (NP) in the generated aerosols. In this work, the suspension and the released aerosol of six different commercially available consumer spray products were analyzed. Next to a broad spectrum of analytical methods for the characterization of the suspension, a standardized setup for the analysis of aerosol has been used. In addition, a new online coupling technique (SMPS–ICPMS) for the simultaneous analysis of particle size and elemental composition of aerosol particles has been applied. Results obtained with this new method were confirmed by other well-established techniques. Comparison of particles in the original suspensions and in the generated aerosol showed that during spraying single particles of size less than 20 nm had been formed, even though in none of the suspensions particles of size less than 280 nm were present (Aerosol size range scanned: 7–300 nm). Both pump sprays and propellant gas sprays were analyzed and both released particles in the nm size range. Also, both water-based and organic solvent-based sprays released NP. However, a trend was observed that spraying an aqueous suspension contained in a pump spray dispenser after drying resulted in bigger agglomerates than spraying organic suspensions in propellant gas dispensers.
Large-Scale Spray Releases: Additional Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.
2013-08-01
One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used
Small-Scale Spray Releases: Additional Aerosol Test Results
Energy Technology Data Exchange (ETDEWEB)
Schonewill, Philip P.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, G. N.; Mahoney, Lenna A.; Tran, Diana N.; Burns, Carolyn A.; Kurath, Dean E.
2013-08-01
One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are largely absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale. The small-scale testing and resultant data are described in Mahoney et al. (2012b) and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used to mimic the
Aerosol Release and Transport Program. Semiannual progress report, April-September 1985
International Nuclear Information System (INIS)
This report summarizes progress for the Aerosol Release and Transport Program sponsored by the Nuclear Regulatory Commission, Office of Nuclear Regulatory Research, Division of Accident Evaluation, for the period April 1985-September 1985. Topics discussed include (1) a steam-only test performed in the NSPP vessel; (2) development tests to study thermal input and generation efficiency; (3) Aerosol-Moisture Interaction Test (AMIT) preliminary and development tests to check various features of the AMIT facility; (4) data from the first two tests in the AMIT program; (5) an analysis of changes necessary in Andersen Mark-III impactor design for AMIT experiments; (6) the theory of capillary condensation on aerosols at nominally undersaturated humidity levels; (7) work in modifying data processing codes to accommodate data retrieval equipment changes; (8) correction of sample volume calculations for NSPP experiments on aerosols in steam-air environments; and (9) implementation and application of the CONTAIN code. 2 refs., 9 figs., 8 tabs
Pereira, Gabriel; Freitas, Saulo R.; Moraes, Elisabete Caria; Ferreira, Nelson Jesus; Shimabukuro, Yosio Edemir; Rao, Vadlamudi Brahmananda; Longo, Karla M.
2009-12-01
Contemporary human activities such as tropical deforestation, land clearing for agriculture, pest control, and grassland management lead to biomass burning, which in turn leads to land-cover changes. However, biomass burning emissions are not measured accurately, and methods to assess these emissions are an active research area. Traditional methods for estimating the aerosols and trace gases released into the atmosphere generally use emission factors associated with fuel loading, moisture characteristics, and other parameters that are hard to estimate in near-real-time applications. In this paper, fire radiative power (FRP) products were extracted from the Moderate Resolution Imaging Spectroradiometer (MODIS) and the Geostationary Operational Environmental Satellites (GOES) fire products, and new FRE-based smoke aerosol emission coefficients for generic South American biomes were derived and applied to the 2002 South America fire season. The inventory estimated from MODIS and GOES FRP measurements was included in the Coupled Aerosol-Tracer Transport model coupled to the Brazilian developments on the Regional Atmospheric Modeling System (CATT-BRAMS) and evaluated against ground truth collected in the Large Scale Biosphere-Atmosphere Smoke, Aerosols, Clouds, rainfall, and Climate (SMOCC) and Radiation, Cloud, and Climate Interactions (RaCCI) campaigns. Although linear regression showed that GOES FRP overestimates MODIS FRP observations, the use of a common external parameter such as the MODIS aerosol optical depth product could minimize the difference between sensors. The modeled relationship between PM2.5 (particulate matter with diameter less than 2.5 μm) and CO (carbon monoxide) shows good agreement with SMOCC/RaCCI data in the general pattern of temporal evolution. The results showed high correlations, with values between 0.80 and 0.95 (significant at the 0.05 level by Student's t test), for the CATT-BRAMS simulations of PM2.5 and CO.
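The FRE-based approach described above takes emitted smoke mass as proportional to fire radiative energy (FRE), the time integral of FRP. A minimal sketch with trapezoidal integration follows; the coefficient value is a hypothetical placeholder, not one of the paper's derived biome coefficients.

```python
# Sketch of an FRE-based smoke emission estimate: emitted aerosol mass is
# taken proportional to fire radiative energy (time-integrated FRP).
# The emission coefficient below is a hypothetical placeholder.
def smoke_emission_kg(frp_mw, times_s, coeff_kg_per_mj):
    """Smoke aerosol mass (kg) from FRP samples (MW) at times (s)."""
    # Trapezoidal integration of FRP over time: MW * s = MJ
    fre_mj = sum(0.5 * (frp_mw[i] + frp_mw[i + 1]) * (times_s[i + 1] - times_s[i])
                 for i in range(len(frp_mw) - 1))
    return coeff_kg_per_mj * fre_mj

# Constant 10 MW fire for 100 s: FRE = 1000 MJ; coeff 0.02 kg/MJ -> 20 kg
emitted = smoke_emission_kg([10.0, 10.0], [0.0, 100.0], 0.02)
```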
International Nuclear Information System (INIS)
This paper reports on a series of tests conducted to study the mechanical release behavior of sodium aerosols containing nonvolatile fission products during a sodium-concrete reaction, in which release due to hydrodynamic breakup of the hydrogen bubbles is predominant at the sodium pool surface. In the tests, nonradioactive materials, namely, strontium oxide, europium oxide, and ruthenium particles, whose sizes range from a few microns to several tens of microns, are used as nonvolatile fission product simulants. The following results are obtained: The sodium aerosol release rate during the sodium-concrete reaction is larger than that of natural evaporation; the difference, however, becomes smaller with increasing sodium temperature, the release rate being nearly ten times the natural-evaporation rate at 400 degrees C and three times at 700 degrees C. The retention factors for the nonvolatile materials in the sodium pool increase to the range of 0.5 to 10^4 with an increase in the sodium temperature from 400 to 700 degrees C
New data for aerosols generated by releases of pressurized powders and solutions in static air
International Nuclear Information System (INIS)
Safety assessments and environmental impact statements for nuclear fuel cycle facilities require an estimate of potential airborne releases. Aerosols generated by accidents are being investigated by Pacific Northwest Laboratory to develop radioactive source-term estimation methods. Experiments measuring the mass airborne and particle size distribution of aerosols produced by pressurized releases were run. Carbon dioxide was used to pressurize uranine solutions to 50, 250, and 500 psig before release. The mass airborne from these experiments was higher than for comparable air-pressurized systems, but not as great as expected based on the amount of gas dissolved in the liquid and the volume of liquid ejected from the release equipment. Flashing sprays of uranine at 60, 125, and 240 psig produced a much larger source term than all other pressurized releases performed under this program. Low-pressure releases of depleted uranium dioxide at 9, 17.5, and 24.5 psig provided data in the energy region between 3-m spills and 50-psig pressurized releases
Radiation-hygienic significance of gaseous aerosol releases from coal thermoelectric plants (review)
International Nuclear Information System (INIS)
Review data are presented on the radiation exposure of the population from gaseous-aerosol releases of coal-fired thermoelectric plants. It is shown that the effective dose equivalents to the population from the releases of a 1 GW (nominal) thermoelectric plant considerably exceed the radiation dose caused by the releases of an NPP of similar power. The mean individual dose equivalent to the lungs of the population living near a rated-power thermoelectric plant is about 0.42 μSv. Polonium-210 is the most typical representative of the natural radionuclides contained in the releases. The carcinogenic effect of polonium on animals and man is considered. The importance of accounting for the synergism between the radiation factor and the main components of the ash releases is stressed
Aerosols generated by releases of pressurized powders and solutions in static air
International Nuclear Information System (INIS)
Safety assessments and environmental impact statements for nuclear fuel cycle facilities require an estimate of potential airborne releases caused by accidents. Aerosols generated by accidents are being investigated by Pacific Northwest Laboratory to develop the source terms for these releases. An upper boundary accidental release event would be a pressurized release of powder or liquid in static air. Experiments were run using various source sizes and pressures and measuring the mass airborne and the particle size distribution of aerosols produced by these pressurized releases. Two powder and two liquid sources were used: TiO2 and depleted uranium dioxide (DUO); and aqueous uranine (sodium fluorescein) and uranyl nitrate solutions. Results of the experiments showed that pressurization level and source size were significant variables for the airborne powder releases. For this experimental configuration, the liquid releases were a function of pressure, but volume did not appear to be a significant variable. During the experiments 100 g and 350 g of DUO (1 μm dia) and TiO2 (1.7 μm dia) powders and 100 cm3 and 350 cm3 of uranine and uranyl nitrate solutions were released at pressures ranging from 50 to 500 psig. The average of the largest fractions of powder airborne was about 24%. The maximum amount of liquid source airborne was significantly less, about 0.15%. The median aerodynamic equivalent diameters (AED) for collected airborne powders ranged from 5 to 19 μm; liquids ranged from 2 to 29 μm. All of the releases produced a significant fraction of respirable particles of 10 μm and less. 12 references, 10 figures, 23 tables
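The aerodynamic equivalent diameters (AED) reported above relate a particle's geometric size and density to the diameter of a unit-density sphere with the same settling behavior. A minimal sketch of the standard conversion, assuming spherical particles (dynamic shape factor χ = 1):

```python
import math

def aerodynamic_diameter(geometric_um, density_g_cm3, shape_factor=1.0):
    """Aerodynamic equivalent diameter: the diameter of a unit-density
    (1 g/cm3) sphere with the same settling velocity as the particle.
    AED = d_g * sqrt(rho_p / (rho_0 * chi))."""
    return geometric_um * math.sqrt(density_g_cm3 / shape_factor)

# DUO (UO2, density ~10.97 g/cm3) at 1 um geometric diameter:
aed = aerodynamic_diameter(1.0, 10.97)  # ~3.3 um aerodynamically
```

This is why a 1 μm geometric-diameter uranium dioxide particle behaves aerodynamically like a particle several times larger, which matters when judging the respirable fraction.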
Ofner, J.; Balzer, N.; Buxmann, J.; Grothe, H.; Krüger, H.; Platt, U.; Schmitt-Kopplin, P.; Zetzsch, C.
2011-12-01
Reactive halogen species are released by various sources such as photo-activated sea-salt aerosol or salt pans and salt lakes. These heterogeneous release mechanisms have been overlooked so far, although their potential for interaction with organic aerosols like Secondary Organic Aerosol (SOA), Biomass Burning Organic Aerosol (BBOA) or Atmospheric Humic LIke Substances (HULIS) is completely unknown. Such reactions can constitute sources of gaseous organo-halogen compounds or halogenated organic particles in the atmospheric boundary layer. To study the interaction of organic aerosols with reactive halogen species (RHS), SOA was produced from α-pinene, catechol and guaiacol using an aerosol smog-chamber. The model SOAs were characterized in detail using a variety of physico-chemical methods (Ofner et al., 2011). Those aerosols were exposed to molecular halogens in the presence of UV/VIS irradiation and to halogens released from simulated natural halogen sources like salt pans, in order to study the complex aerosol-halogen interaction. The heterogeneous reaction of RHS with these model aerosols leads to different gaseous species like CO2, CO and small reactive/toxic molecules like phosgene (COCl2). Hydrogen-containing groups on the aerosol particles are destroyed to form HCl or HBr, and a significant formation of C-Br bonds could be verified in the particle phase. Carbonyl-containing functional groups of the aerosol are strongly affected by the halogenation process. While changes of functional groups and gaseous species were visible using FTIR spectroscopy, optical properties were studied using Diffuse Reflectance UV/VIS spectroscopy. Overall, the optical properties of the processed organic aerosols are significantly changed. While chlorine causes a "bleaching" of the aerosol particles, bromine shifts the maximum of UV/VIS absorption to the red end of the UV/VIS spectrum. Further physico-chemical changes were recognized according to the aerosol size-distributions or the
Control of releases of radioactive aerosols from object ''Ukryttya'' in 2014
International Nuclear Information System (INIS)
The results of monitoring of radioactive aerosol releases from the object ''Ukryttya'' in 2014 are presented. The maximal rate of unorganized releases of beta-emitting products of the Chernobyl accident occurred in the winter period and reached 3.6 MBq/day. The concentration of long-lived beta-emitting aerosols released into the atmosphere through the ''Bypass'' system was within the range 0.3 - 5 Bq/m3 (the maximal concentration was 14 Bq/m3). Their carriers were particles with an activity median aerodynamic diameter (AMAD) of 0.6 - 6 μm. The mean concentration ratios were 137Cs/241Am = 97 and 241Am/154Eu = 6.2. The concentration of 212Pb, a daughter product of thoron, was as a rule 0.8 - 4 Bq/m3; the maximal concentration of 212Pb aerosols was 9 Bq/m3. The ratio of the concentrations of radon daughter products to those of thoron (212Pb) was about 4. These have an AMAD of 0.06 - 0.3 μm. The volume activity and dispersity of the radioactive aerosols in the releases from the object ''Ukryttya'' have remained constant over the last ten years
Aerosol Deposition Efficiencies and Upstream Release Positions for Different Inhalation Modes in an Upper Bronchial Airway Model Zhe Zhang, Clement Kleinstreuer, and Chong S. KimCenter for Environmental Medicine and Lung Biology, University of North Carolina at Ch...
Results of aerosol code comparisons with releases from ACE MCCI tests
International Nuclear Information System (INIS)
Results of aerosol release calculations by six groups from six countries are compared with the releases from ACE MCCI Test L6. The codes used for these calculations included: SOLGASMIX-PV, SOLGASMIX Reactor 1986, CORCON.UW, VANESA 1.01, and CORCON mod2.04/VANESA 1.01. Calculations were performed with the standard VANESA 1.01 code and with modifications to the VANESA code such as the inclusion of various zirconium-silica chemical reactions. Comparisons of results from these calculations were made with Test L6 release fractions for U, Zr, Si, the fission-product elements Te, Ba, Sr, Ce, La, Mo, Ru and the control materials Ag and In. Reasonable agreement was obtained between calculations and Test L6 results for the volatile elements Ag, In and Te. Calculated releases of the low-volatility fission products ranged from within an order of magnitude to five orders of magnitude of the Test L6 values; calculations both over- and underestimated the releases. The poorest agreement was obtained for Mo and Si. In summary: Results of this code comparison effort are useful in assessing progress on fission-product release calculations and in providing guidance with respect to databases and further model development. Conclusions and recommendations are given below. 1. Significant progress has been made by the development of various SOLGASMIX chemical equilibrium codes with extensive databases and the development of the CORCON.UW code, which gives better agreement with Test L6 than the CORCON mod2.04/VANESA 1.01 codes. The SOLGASMIX calculations on Test L6 and other ACE MCCI tests have provided valuable contributions on the importance of various species in the melt chemistry and the effects of various test parameters on the release. 2. Although some possible causes for discrepancies between calculated and measured releases have been proposed in this paper, the combined efforts of specialists are needed to identify the causes of the discrepancies between the calculated and measured releases for each
Beta experiments on zirconium oxidation and aerosol release during melt-concrete interaction
International Nuclear Information System (INIS)
Three experiments on melt-concrete interaction have been carried out in the BETA facility to investigate the zirconium oxidation processes during concrete attack and their influence on concrete erosion and aerosol release. The results clearly show the dominance of the condensed phase chemistry, that is the chemical reaction of Zr and SiO2 leading to the rapid oxidation of 80 kg of Zr and the formation of Si in the metallic melt within a few minutes only. The high chemical energy release from this reaction produces fast concrete erosion and a pronounced gas spike dominated by hydrogen release. After the completion of Zr oxidation the erosion is determined by the much lower internal decay heat level with moderate interaction processes. The temperature of the melt is measured to decrease very fast to the freezing temperature which can be explained by the very effective heat removal to the melting concrete. The overall downward erosion of 40 to 50 cm of the concrete crucible produces characteristic 2-dimensional cavity shapes. Aerosol release including simulated fission product behavior is reported with respect to aerosol rates, chemical composition, and characteristic particle size. In conclusion: The three tests investigated the interaction of predominantly metallic melts of high initial Zr concentration with siliceous concrete in a cylindrical crucible. They give clear and consistent data on Zr oxidation and related processes which may be summarized as follows: - Oxidation of 80 kg Zry-4 in 300 kg metallic melt dominates the interaction during the first 2 or 3 minutes. Material investigation shows the depletion of Zr within only 1 minute and a simultaneous increase of Si concentration in the metallic melt as described by the condensed phase chemical reaction Zr + SiO2 → ZrO2 + Si. - In spite of the high energy deposition from Zr oxidation and from electric heating the temperature of the metal in all three BETA tests drops to its freezing temperature within some 150 s
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef; Fast P. (Lawrence Livermore National Laboratory, Livermore, CA); Kraus, M. (Peterson AFB, CO); Ray, J. P.
2006-01-01
Terrorist attacks using an aerosolized pathogen preparation have gained credibility as a national security concern after the anthrax attacks of 2001. The ability to characterize such attacks, i.e., to estimate the number of people infected, the time of infection, and the average dose received, is important when planning a medical response. We address this question of characterization by formulating a Bayesian inverse problem predicated on a short time-series of diagnosed patients exhibiting symptoms. To be of relevance to response planning, we limit ourselves to 3-5 days of data. In tests performed with anthrax as the pathogen, we find that these data are usually sufficient, especially if the model of the outbreak used in the inverse problem is an accurate one. In some cases the scarcity of data may initially support outbreak characterizations at odds with the true one, but with sufficient data the correct inferences are recovered; in other words, the inverse problem posed and its solution methodology are consistent. We also explore the effect of model error: situations for which the model used in the inverse problem is only a partially accurate representation of the outbreak; here, the model predictions and the observations differ by more than a random noise. We find that while there is a consistent discrepancy between the inferred and the true characterizations, they are also close enough to be of relevance when planning a response.
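A toy version of such a Bayesian inverse problem can be set up as a grid posterior over the number infected N and the release day t0, with a lognormal incubation period and Poisson observation noise. This is a minimal sketch under assumed parameter values, not the authors' formulation; the incubation parameters and case counts are hypothetical.

```python
import math
import numpy as np

def lognorm_cdf(x, mu, sigma):
    """CDF of a lognormal incubation-period distribution (mu, sigma in log-days)."""
    if x <= 0:
        return 0.0
    return 0.5 * (1.0 + math.erf((math.log(x) - mu) / (sigma * math.sqrt(2.0))))

def log_likelihood(counts, n_infected, t0, mu=2.0, sigma=0.4):
    """Poisson log-likelihood of daily diagnosis counts, assuming n_infected
    people were exposed on day t0 and incubation times are lognormal."""
    ll = 0.0
    for day, k in enumerate(counts, start=1):
        # probability that an exposed person is diagnosed on this day
        p = lognorm_cdf(day - t0, mu, sigma) - lognorm_cdf(day - 1 - t0, mu, sigma)
        lam = max(n_infected * p, 1e-12)
        ll += k * math.log(lam) - lam - math.lgamma(k + 1)
    return ll

def posterior(counts, n_grid, t0_grid):
    """Normalized grid posterior over (N, t0) under flat priors."""
    logp = np.array([[log_likelihood(counts, n, t0) for t0 in t0_grid]
                     for n in n_grid])
    p = np.exp(logp - logp.max())
    return p / p.sum()

# Hypothetical 5-day case series following an unobserved release:
counts = [0, 0, 1, 5, 10]
post = posterior(counts, n_grid=range(20, 301, 20), t0_grid=[-2, -1, 0])
```

With only a few days of data the posterior stays broad, mirroring the paper's observation that 3-5 days of counts usually suffice for a rough characterization but not a precise one.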
Large methane releases lead to strong aerosol forcing and reduced cloudiness
Directory of Open Access Journals (Sweden)
T. Kurtén
2011-07-01
Full Text Available The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH_{4}) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations is predicted to significantly decrease hydroxyl radical (OH) concentrations, while moderately increasing ozone (O_{3}). These changes lead to a 70 % increase in the atmospheric lifetime of methane, and an 18 % decrease in global mean cloud droplet number concentrations (CDNC). The CDNC change causes a radiative forcing that is comparable in magnitude to the longwave radiative forcing ("enhanced greenhouse effect") of the added methane. Together, the indirect CH_{4}-O_{3} and CH_{4}-OH-aerosol forcings could more than double the warming effect of large methane increases. Our findings may help explain the anomalously large temperature changes associated with historic methane releases.
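For scale, the longwave "enhanced greenhouse effect" term that the indirect aerosol forcing is compared against can be estimated with the classic simplified forcing expression ΔF = 0.036(√M − √M0) W/m2. This is an illustrative back-of-envelope (it neglects the N2O overlap correction and is not the model machinery used in the study):

```python
import math

def ch4_longwave_forcing(m_ppb, m0_ppb=1750.0):
    """Simplified longwave radiative forcing of methane relative to a
    reference concentration: dF = 0.036 * (sqrt(M) - sqrt(M0)) in W/m2,
    N2O overlap term neglected."""
    return 0.036 * (math.sqrt(m_ppb) - math.sqrt(m0_ppb))

# A 10-fold increase over ~1750 ppb, as in the study's scenario:
f = ch4_longwave_forcing(10 * 1750.0)  # ~3.3 W/m2
```

The study's point is that the CDNC-driven indirect forcing is of comparable magnitude to this direct term, so the two together could more than double the warming effect.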
Large methane releases lead to strong aerosol forcing and reduced cloudiness
DEFF Research Database (Denmark)
Kurten, T.; Zhou, L.; Makkonen, R.;
2011-01-01
contributions from the associated carbon dioxide or ozone increases. Here, we study the effect of strongly elevated methane (CH4) levels on oxidant and aerosol particle concentrations using a combination of chemistry-transport and general circulation models. A 10-fold increase in methane concentrations is......The release of vast quantities of methane into the atmosphere as a result of clathrate destabilization is a potential mechanism for rapid amplification of global warming. Previous studies have calculated the enhanced warming based mainly on the radiative effect of the methane itself, with smaller...... forcing that is comparable in magnitude to the long-wave radiative forcing ("enhanced greenhouse effect") of the added methane. Together, the indirect CH4-O3 and CH4-OH-aerosol forcings could more than double the warming effect of large methane increases. Our findings may help explain the anomalously...
Kajino, Mizuo; Adachi, Kouji; Sekiyama, Tsuyoshi; Zaizen, Yuji; Igarashi, Yasuhito
2014-05-01
We recently revealed that the microphysical properties of the aerosols carrying the radioactive Cs released from the Fukushima Daiichi Nuclear Power Plant (FDNPP) at an early stage (March 14-15, 2011) of the accident could be very different from what we assumed previously: super-micron and non-hygroscopic at the early stage, whereas sub-micron and hygroscopic afterwards (at least later than March 20-22). In this study, two sensitivity simulations with the two different aerosol microphysical properties were conducted using a regional-scale meteorology-chemical transport model (NHM-Chem). The impact of the difference was quite significant. Under the assumption that the Cs-bearing aerosols are non-hygroscopic and super-micron, 17% (0.001%) of the radioactive Cs fell onto the ground by dry (wet) deposition processes, and the rest was deposited into the ocean or transported out of the model domain, which covers the central and northern parts of the main island of Japan. On the other hand, under the assumption that the Cs-bearing aerosols are hygroscopic and sub-micron, 5.7% (11.3%) fell onto the ground by dry (wet) deposition. For accurate simulation of the deposition of radionuclides, knowledge of the aerosol microphysical properties is as essential as the accuracy of the simulated wind fields and precipitation patterns.
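One reason the super-micron versus sub-micron assumption matters so much for dry deposition is the strong size dependence of gravitational settling. A minimal sketch using the Stokes terminal velocity (slip correction neglected, so sub-micron values are only indicative; the particle density is an assumed illustrative value, not from the study):

```python
def stokes_settling_velocity(d_um, rho_p=1800.0):
    """Stokes terminal settling velocity of a spherical particle in air,
    v = rho_p * g * d^2 / (18 * mu), in m/s. Slip correction neglected."""
    g, mu = 9.81, 1.8e-5          # m/s2; dynamic viscosity of air, Pa*s
    d = d_um * 1e-6               # diameter in metres
    return rho_p * g * d * d / (18.0 * mu)

v_super = stokes_settling_velocity(3.0)   # super-micron particle
v_sub = stokes_settling_velocity(0.3)     # sub-micron particle
# settling scales with d^2, so the 3 um particle settles 100x faster
```

A hundredfold difference in settling speed, combined with the weak wet scavenging of non-hygroscopic particles, is qualitatively consistent with the very different dry/wet deposition splits reported above.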
Fission product partitioning in aerosol release from simulated spent nuclear fuel
Di Lemma, F.G.; Colle, J.Y.; Rasmussen, G.; Konings, R.J.M.
2015-01-01
Aerosols created by the vaporization of simulated spent nuclear fuel (simfuel) were produced by laser heating techniques and characterised by a wide range of post-analyses. In particular attention has been focused on determining the fission product behaviour in the aerosols, in order to improve the
International Nuclear Information System (INIS)
A program of laboratory investigations has been undertaken at Argonne National Laboratory, under sponsorship of the Electric Power Research Institute, in which the interaction between molten core materials and concrete is studied, with particular emphasis on measurements of the magnitude and chemical species present in the aerosol releases. The experimental technique used in these investigations is direct electrical heating, in which a high electric current is passed through the core debris to sustain the high-temperature melt condition for potentially long periods of time. In the scoping experiments completed to date, this technique has been successfully used for corium masses of 5 and 20 kg, generating an internal heating rate of 1 kW/kg and achieving melt temperatures of 2000 degrees C. Experiments have been performed both with a concrete base and also with a cooled base with the addition of H2/CO sparging gas to represent chemical processes in a stratified layer. An aerosol and gas sampling system is being used to collect aerosol samples. Test results are now becoming available, including masses of aerosols, x-ray diffraction, and scanning electron microscope analyses
International Nuclear Information System (INIS)
The event 'earthquake with subsequent solvent fire' in the low level active waste management area of the Wackersdorf reprocessing plant has been modelled by two different computer codes, FIPLOC-M (GRS) and FIRAC (Los Alamos). The results have been compared and are described. The fire itself, with its gas and aerosol generation rates, has been simulated by the FIRIN code, which is integrated in FIRAC. The results were fed identically into FIPLOC and FIRAC. FIRAC underestimated the quantity of aerosols released to the environment by about 20%. The reasons for this result have been determined and analysed. Other differences between the results and models are described too. For the simulation of transportation pathways which comprise one or more larger volumes, and which are therefore not typical for the application of FIRAC, the FIPLOC code gives better results concerning aerosol transport and deposition effects. Also, aerosol modelling is more precise in FIPLOC (MAEROS module) than in FIRAC. On the other hand, ventilation components such as filters or blowers are simulated much better in the FIRAC code. (orig./HP)
Containment behaviour in the event of core melt with gaseous and aerosol releases (CONGA)
Energy Technology Data Exchange (ETDEWEB)
Friesen, E. E-mail: Eckart.friesen@off1.siemens.de; Meseth, J.; Guentay, S.; Suckow, D.; Lopez Jimenez, J.; Herranz, L.; Peyres, V.; De Santi, G.F.; Krasenbrink, A.; Valisi, M.; Mazzocchi, L
2001-11-01
The CONGA project concentrated on theoretical and experimental studies investigating the behaviour of advanced light water reactor containments containing passive containment heat removal systems and catalytic recombiners expected to be effectively operational during a hypothetical severe accident involving large quantities of aerosol particles and noncondensable gases. The central point of interest was the investigation of the effect of aerosol deposition on the condensation heat transfer of specially designed finned-type heat exchangers (HX) as well as the recombination efficiency of catalytic recombiners. A conceptual double-wall Italian PWR design and a SWR1000 design from Siemens were considered specifically as the reference Pressurized Water Reactor (PWR) and Boiling Water Reactor (BWR) designs. An assessment of selected accident scenarios was performed in order to define the range of boundary conditions necessary to perform the experimental studies of the other work packages. Experimental investigations indicated that aerosol deposition accounted for up to 37% loss in the heat removal capacity of the two-tube-layer BWR HX units. However, no significant heat transfer degradation could be observed for the PWR HX units. These results can be attributed to the important differences in the designs and operating conditions of the two units. The tests to study the effect of hydrogen (simulated by helium) on the heat transfer rate for heat exchanger units designed for BWR and PWR applications indicated a degradation less than 30% under various conditions. This was found to be acceptable within the over capacity designed for the heat exchangers or containment characteristics. The tests performed to study the long-term aerosol behaviour in the pressure suppression chamber of the current operating BWRs indicated that the water pool scrubs the aerosol particles effectively and reduces the ultimate aerosol load expected on the off-gas system. The efficiency of the
Energy Technology Data Exchange (ETDEWEB)
Bourham, Mohamed A.; Gilligan, John G.
1999-08-14
Safety considerations in large future fusion reactors like ITER are important before licensing the reactor. Several scenarios are considered hazardous, including safety of plasma-facing components during hard disruptions, high heat fluxes and thermal stresses during normal operation, accidental energy release, and aerosol formation and transport. Disruption events in large tokamaks like ITER are expected to produce local heat fluxes on plasma-facing components which may exceed 100 GW/m2 over a period of about 0.1 ms. As a result, the surface temperature dramatically increases, which results in surface melting and vaporization and produces thermal stresses and surface erosion. Plasma-facing component safety issues extend to cover a wide range of possible scenarios, including disruption severity and the impact of plasma-facing components on disruption parameters, accidental energy release and short/long-term LOCAs, and the formation of airborne particles by convective current transport during a LOVA (water/air ingress disruption) accident scenario. Study and evaluation of disruption-induced aerosol generation and mobilization are essential to build a database on particulate formation and distribution for a large future fusion tokamak reactor like ITER. In order to provide a database relevant to ITER, the SIRENS electrothermal plasma facility at NCSU has been modified to closely simulate the heat fluxes expected in ITER.
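The severity of such a heat pulse can be illustrated with the standard semi-infinite-solid result for a constant surface heat flux, ΔT = 2q√(t/(πρck)). The material properties below are rough room-temperature tungsten values assumed for illustration, and the 1-D constant-property model is only a back-of-envelope check, not a disruption simulation.

```python
import math

def surface_temp_rise(q_w_m2, t_s, k=170.0, rho=19300.0, c=134.0):
    """Surface temperature rise of a semi-infinite solid under a constant
    heat flux q applied for time t: dT = 2*q*sqrt(t / (pi*rho*c*k)).
    Defaults are approximate room-temperature tungsten properties."""
    return 2.0 * q_w_m2 * math.sqrt(t_s / (math.pi * rho * c * k))

dT = surface_temp_rise(100e9, 0.1e-3)  # 100 GW/m2 for 0.1 ms
# dT comes out in the tens of thousands of kelvin, far above any
# melting point, hence the melting, vaporization and aerosol formation
```

The absurdly large nominal ΔT simply signals that the surface cannot stay solid: the energy goes into melting and vaporizing material, which is the erosion and aerosol source the facility is designed to study.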
International Nuclear Information System (INIS)
During some postulated loss of coolant and loss of emergency coolant injection accidents, vapours of fission products and structural (fuel and cladding) materials may be released into steam-hydrogen mixtures flowing in a CANDU fuel channel. These vapours will condense into aerosol particles in the cooler parts of the primary heat transport system. As part of an on-going program to study aerosol formation, transport, deposition and associated fission product retention in the PHTS, experiments were conducted to measure aerosol mass release rates from a Zircaloy-4 clad UO2 pellet inductively heated to temperatures up to 2000 degrees C, in a forced-flow argon environment. A description of these experiments and the obtained results, including fractional mass release rates of structural materials, are presented in this paper
Possible way to prepare nanoparticles from aerosols released at plasma deposition
Czech Academy of Sciences Publication Activity Database
Brožek, Vlastimil; Mastný, L.; Moravec, Pavel; Ždímal, Vladimír
Ostrava: Tanger, 2012, s. 87-92. ISBN 978-80-87294-32-1. [NANOCON 2012. International Conference /4./. Brno (CZ), 23.10.2012-25.10.2012] Institutional research plan: CEZ:AV0Z20430508; CEZ:AV0Z40720504 Keywords : plasma spraying * aerosol technology * nanoparticles production * measurement of particle size distribution Subject RIV: BL - Plasma and Gas Discharge Physics; CI - Industrial Chemistry, Chemical Engineering (UCHP-M) http://www.nanocon.cz/files/proceedings/04/reports/780.pdf
BETA experiments on aerosol release during melt-concrete interaction and filtering of the offgas
International Nuclear Information System (INIS)
The BETA facility is a test rig for core melt accident experiments. This rig is described. Up to now, 7 melt-concrete interaction experiments have been carried out. Results of sampling and analysis are given for the aerosol size distribution and the chemical components of the simulated fission products added in the offgas line. The size distribution ranges from 0.1 to 1 μm. Highly volatile aerosols are found in the samplers. The erosion data in the downward and radial directions are summed up. The initial melt used in the tests was produced by a thermite chemical reaction of 300 kg steel, 80 kg Zircaloy 4 and 50 kg oxides with Al2O3, SiO2 and CaO. The starting temperature is typically 2250 K. In induction heating the net power inputs may differ between 200 kW and 1000 kW. A metal fiber filter is installed in the offgas line to protect the environment. The data of the filter will be presented. The amount of collected aerosols is in the range of 1.5 to 3.7 kg per experiment. A video clip will be presented showing the melt and the optically visible difference in the offgas with and without filtering
Directory of Open Access Journals (Sweden)
Hong Lei
2016-05-01
Full Text Available Microencapsulation is highly attractive for oral drug delivery. Microparticles are a common form of drug carrier for this purpose. There is still a high demand for efficient methods to fabricate microparticles with uniform sizes and well-controlled particle properties. In this paper, uniform hydroxypropyl methylcellulose phthalate (HPMCP)-based pharmaceutical microparticles loaded with either hydrophobic or hydrophilic model drugs have been directly formulated by using a unique aerosol technique, microfluidic spray drying. A series of microparticles of controllable particle sizes, shapes, and structures are fabricated by tuning the solvent composition and drying temperature. It is found that a more volatile solvent and a higher drying temperature can result in fast evaporation rates, forming microparticles of larger lateral size, more irregular shape, and denser matrix. The nature of the model drugs also plays an important role in determining particle properties. The drug release behaviors of the pharmaceutical microparticles depend on their structural properties and the nature of the specific drug, and are sensitive to the pH value of the release medium. Most importantly, drugs in microparticles obtained using a more volatile solvent or a higher drying temperature can be well protected from degradation in harsh simulated gastric fluids due to the dense structures of the microparticles, while they can be rapidly released in simulated intestinal fluids through particle dissolution. These pharmaceutical microparticles are potentially useful for site-specific (enteric) delivery of orally administered drugs.
The Plinius/Colima CA-U3 test on fission-product aerosol release over a VVER-type corium pool
International Nuclear Information System (INIS)
In the hypothetical case of a severe accident in a VVER-440 type PWR, a complex corium pool could be formed and fission products could be released. In order to study aerosol release in terms of mechanisms, kinetics, nature and quantity, and to better specify the source term of the VVER-440, a series of experiments has been performed in the Colima facility; the Colima CA-U3 test was successfully performed thanks to technological modifications allowing a prototypical corium to be melted at 2760 degrees C. Specific instrumentation allowed us to follow the evolution of the corium melt and the release, transport and deposition of the fission products. The main conclusions are: -) there is a large release of Cr, Te, Sr, Pr and Rh (>95 wt%); -) there is a significant release of Fe (50 wt%); -) there is a small release of Ba, Ce, La, Nb, Nd and Y (<90 wt%); -) there is a very small release of U in proportion (<5 wt%), although it is one of the major released species by mass; and -) there is no release of Zr. The Colima experimental results are consistent with previous experiments on irradiated fuels except for the Ba, Fe and U releases. (A.C.)
International Nuclear Information System (INIS)
In this paper we estimate the radiological influence of gaseous and aerosol releases on the population during normal operation of the Juragua Nuclear Power Station. We determined the behaviour of the dilution factor and of other factors that cause ground contamination. The equivalent doses via different exposure pathways were also calculated, and the individual and collective radiological risks were evaluated
Lesaffre, Emmanuel
2012-01-01
The growth of biostatistics has been phenomenal in recent years and has been marked by considerable technical innovation in both methodology and computational practicality. One area that has experienced significant growth is Bayesian methods. The growing use of Bayesian methodology has taken place partly due to an increasing number of practitioners valuing the Bayesian paradigm as matching that of scientific discovery. In addition, computational advances have allowed for more complex models to be fitted routinely to realistic data sets. Through examples, exercises and a combination of introd
Draper, D.
2001-01-01
© 2012 Springer Science+Business Media, LLC. All rights reserved. Article Outline: Glossary Definition of the Subject and Introduction The Bayesian Statistical Paradigm Three Examples Comparison with the Frequentist Statistical Paradigm Future Directions Bibliography
International Nuclear Information System (INIS)
The behaviour of 134Cs, 110mAg and 85Sr was studied in different soil-plant systems, using two types of Mediterranean soil with contrasting properties (sandy and sandy-loam soils). The plant species used was lettuce (Lactuca sativa). Contamination was induced at different stages of plant growth, using a synthetic aerosol which simulated a distant contamination source. Characterisation of the aerosol and soils, interception factors at the various growth stages, foliar and root uptake, leaching from leaves by irrigation, and the distribution and migration of radionuclides in soils were studied, in an attempt to understand the key factors involved in radionuclide soil-to-plant transfer. (author)
International Nuclear Information System (INIS)
This study investigates via a parametric survey the essential factors determining the magnitude of the release mitigation to be obtained from aerosol deposition and thereby develops analytical and graphical representations which enable a rapid estimation of these attenuations over a range of source magnitudes, leak rates and containment geometries. The benefits obtained from aerosol deposition are conveniently exhibited by expressing airborne mass concentration and cumulative leaked mass in terms of non-dimensional attenuation factors which relate the quantities arising with deposition to those arising without deposition. Agglomeration is seen to play a crucial role in promoting deposition, and the reduction in release can amount to orders of magnitude on the time scale of a day for accidents involving high initial concentrations of airborne particulate. In the context of source magnitude variation for instantaneous releases the appearance of an envelope decay curve for the airborne mass concentration enables simple representations of airborne mass and leaked mass to be developed. The leaked mass for constant fractional leak rate reaches an asymptotic limit at long times, a limit which varies only weakly with injected mass concentration. With regard to variation in containment size, the domination of deposition processes by gravitational settling results in attenuation factors for a given injected mass concentration (for containments of given height) being virtually independent of the area of walls (and roof). In addition, although the process coefficient for gravitational settling is inversely proportional to height, the net effect of a change in height on the attenuation factors for a given initial concentration is remarkably small where the attenuations are significantly different from unity on the time scale of a day. Thus, within the range of interest, attenuation factors for instantaneous releases are only a function of time and initial concentration to fair accuracy
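The attenuation-factor bookkeeping above can be sketched with a deliberately simplified linear model. Agglomeration, which the study identifies as crucial at high concentrations, is omitted here, and the settling velocity, containment height and leak rate below are illustrative assumptions, not values from the study.

```python
import numpy as np

# Minimal linear sketch: a well-mixed containment with gravitational
# settling (rate v_s/H) and a constant fractional leak rate lam.
v_s = 1e-4      # settling velocity, m/s (assumed)
H = 10.0        # containment height, m (assumed)
lam = 1e-5      # fractional leak rate, 1/s (assumed)
beta = v_s / H  # settling (deposition) rate coefficient, 1/s

t = 86400.0     # one day, in seconds
# Airborne concentration relative to the initial value, with/without deposition
c_with = np.exp(-(lam + beta) * t)
c_without = np.exp(-lam * t)
# Cumulative leaked fraction of the initial inventory
leak_with = lam / (lam + beta) * (1.0 - np.exp(-(lam + beta) * t))
leak_without = 1.0 - np.exp(-lam * t)

# Non-dimensional attenuation factors (quantity without / with deposition)
af_airborne = c_without / c_with
af_leaked = leak_without / leak_with
# As t -> infinity the leaked-mass attenuation tends to (lam + beta)/lam,
# the asymptotic limit the abstract notes for constant fractional leak rate.
print(af_airborne, af_leaked, (lam + beta) / lam)
```

In this linear sketch the attenuation factors depend only on the rate coefficients, not on the injected concentration; the concentration dependence reported in the study enters through the neglected agglomeration term.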
International Nuclear Information System (INIS)
Analytical methods are described for (a) sodium; (b) the following anions of sodium aerosols: OH-, CO3(2-) and HCO3-; (c) the fission products Cs and Sr. For sodium, an ion-selective electrode was used. The anions were determined by a titration method using phenolphthalein and methyl orange as indicators. Atomic absorption spectroscopy was used for Cs and Sr. (U.K.)
Effect of aerosolization on subsequent bacterial survival.
Walter, M V; Marthi, B; Fieland, V P; Ganio, L M
1990-01-01
To determine whether aerosolization could impair bacterial survival, Pseudomonas syringae and Erwinia herbicola were aerosolized in a greenhouse, the aerosol was sampled at various distances from the site of release by using all-glass impingers, and bacterial survival was followed in the impingers for 6 h. Bacterial survival subsequent to aerosolization of P. syringae and E. herbicola was not impaired 1 m from the site of release. P. syringae aerosolized at 3 to 15 m from the site of release ...
Kirstein, Roland
2005-01-01
This paper presents a modification of the inspection game: the 'Bayesian Monitoring' model rests on the assumption that judges are interested in enforcing compliant behavior and making correct decisions. They may base their judgements on an informative but imperfect signal which can be generated costlessly. In the original inspection game, monitoring is costly and generates a perfectly informative signal. While the inspection game has only one mixed strategy equilibrium, three Perfect Bayesia...
Bessiere, Pierre; Ahuactzin, Juan Manuel; Mekhnacha, Kamel
2013-01-01
Probability as an Alternative to Boolean Logic. While logic is the mathematical foundation of rational reasoning and the fundamental principle of computing, it is restricted to problems where information is both complete and certain. However, many real-world problems, from financial investments to email filtering, are incomplete or uncertain in nature. Probability theory and Bayesian computing together provide an alternative framework to deal with incomplete and uncertain data. Decision-Making Tools and Methods for Incomplete and Uncertain Data. Emphasizing probability as an alternative to Boolean
International Nuclear Information System (INIS)
A comprehensive review has been undertaken of appropriate analytical techniques to monitor and measure the chemical effects that occur in large-scale tests designed to study severe reactor accidents. Various methods have been developed to determine the chemical forms of the vapours, aerosols and deposits generated during and after such integral experiments. Other specific techniques have the long-term potential to provide some of the desired data in greater detail, although considerable efforts are still required to apply these techniques to the study of radioactive debris. Such in-situ and post-test methods of analysis have been also assessed in terms of their applicability to the analysis of samples from the Phebus-FP tests. The recommended in-situ methods of analysis are gamma-ray spectroscopy, potentiometry, mass spectrometry, and Raman/UV-visible absorption spectroscopy. Vapour/aerosol and deposition samples should also be obtained at well-defined time intervals during each experiment for subsequent post-test analysis. No single technique can provide all the necessary chemical data from these samples, and the most appropriate method of analysis involves a complementary combination of autoradiography, AES, IR, MRS, SEMS/EDS, SIMS/LMIS, XPS and XRD
International Nuclear Information System (INIS)
The behaviour of aerosols in LMFBR plant systems is of great importance for a number of problems, under both normal operational and accident conditions. This paper covers the following: aerosol modelling for LMFBR containment systems; aerosol size spectrometry by laser light scattering; experimental facilities and experimental results concerning aerosol release under accident conditions; and filtration of sodium oxide aerosols by multilayer sand bed filters
Bayesian artificial intelligence
Korb, Kevin B
2010-01-01
Updated and expanded, Bayesian Artificial Intelligence, Second Edition provides a practical and accessible introduction to the main concepts, foundation, and applications of Bayesian networks. It focuses on both the causal discovery of networks and Bayesian inference procedures. Adopting a causal interpretation of Bayesian networks, the authors discuss the use of Bayesian networks for causal modeling. They also draw on their own applied research to illustrate various applications of the technology.New to the Second EditionNew chapter on Bayesian network classifiersNew section on object-oriente
DEFF Research Database (Denmark)
Jensen, Finn Verner; Nielsen, Thomas Dyhre
2016-01-01
Mathematically, a Bayesian graphical model is a compact representation of the joint probability distribution for a set of variables. The most frequently used type of Bayesian graphical models are Bayesian networks. The structural part of a Bayesian graphical model is a graph consisting of nodes and...... largely due to the availability of efficient inference algorithms for answering probabilistic queries about the states of the variables in the network. Furthermore, to support the construction of Bayesian network models, learning algorithms are also available. We give an overview of the Bayesian network...
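As a toy illustration of the factorization just described, the joint distribution of a small Bayesian network is the product of per-node conditional tables, and probabilistic queries can be answered by enumeration. The node names and conditional probability values below are textbook-style assumptions, not taken from the overview.

```python
from itertools import product

# Classic four-node sprinkler network: Cloudy -> Sprinkler, Cloudy -> Rain,
# (Sprinkler, Rain) -> WetGrass. All CPT numbers are illustrative assumptions.
P_c = {True: 0.5, False: 0.5}                       # P(Cloudy)
P_s = {True: 0.1, False: 0.5}                       # P(Sprinkler=T | Cloudy)
P_r = {True: 0.8, False: 0.2}                       # P(Rain=T | Cloudy)
P_w = {(True, True): 0.99, (True, False): 0.90,
       (False, True): 0.90, (False, False): 0.0}    # P(Wet=T | Sprinkler, Rain)

def joint(c, s, r, w):
    """Joint probability as the product of the per-node conditionals."""
    ps = P_s[c] if s else 1 - P_s[c]
    pr = P_r[c] if r else 1 - P_r[c]
    pw = P_w[(s, r)] if w else 1 - P_w[(s, r)]
    return P_c[c] * ps * pr * pw

# Query P(Rain = True | WetGrass = True) by summing out the other variables
num = sum(joint(c, s, True, True) for c, s in product([True, False], repeat=2))
den = sum(joint(c, s, r, True) for c, s, r in product([True, False], repeat=3))
print(num / den)
```

Enumeration is exponential in the number of variables; the efficient inference algorithms the overview mentions (e.g. junction-tree methods) exploit the graph structure to avoid this full sum.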
Gelman, Andrew; Stern, Hal S; Dunson, David B; Vehtari, Aki; Rubin, Donald B
2013-01-01
FUNDAMENTALS OF BAYESIAN INFERENCE: Probability and Inference; Single-Parameter Models; Introduction to Multiparameter Models; Asymptotics and Connections to Non-Bayesian Approaches; Hierarchical Models. FUNDAMENTALS OF BAYESIAN DATA ANALYSIS: Model Checking; Evaluating, Comparing, and Expanding Models; Modeling Accounting for Data Collection; Decision Analysis. ADVANCED COMPUTATION: Introduction to Bayesian Computation; Basics of Markov Chain Simulation; Computationally Efficient Markov Chain Simulation; Modal and Distributional Approximations. REGRESSION MODELS: Introduction to Regression Models; Hierarchical Linear
Yuan, Ying; MacKinnon, David P.
2009-01-01
This article proposes Bayesian analysis of mediation effects. Compared to conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian mediation analysis, inference is straightforward and exact, which makes it appealing for studies with small samples. Third, the Bayesian approach is conceptua...
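A minimal sketch of the approach on synthetic data, assuming diffuse priors and normal approximations to the slope posteriors; this illustrates the idea of simulating the posterior of the indirect effect a*b, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic mediation data: x -> m (slope a), m -> y controlling for x (slope b)
n = 200
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)                 # true a = 0.5
y = 0.4 * m + 0.2 * x + rng.normal(size=n)       # true b = 0.4, so a*b = 0.2

def slope_and_se(X, target):
    """OLS coefficients and their standard errors."""
    coef, res, *_ = np.linalg.lstsq(X, target, rcond=None)
    sigma2 = res[0] / (len(target) - X.shape[1])
    cov = sigma2 * np.linalg.inv(X.T @ X)
    return coef, np.sqrt(np.diag(cov))

Xa = np.column_stack([np.ones(n), x])            # mediator model
Xb = np.column_stack([np.ones(n), m, x])         # outcome model
ca, sa = slope_and_se(Xa, m)                     # a is ca[1]
cb, sb = slope_and_se(Xb, y)                     # b is cb[1]

# Approximate posterior draws of a and b (normal approximation, diffuse
# priors), then of the indirect effect a*b; the credible interval is exact
# in the Monte Carlo sense rather than relying on asymptotic product SEs.
draws = rng.normal(ca[1], sa[1], 20000) * rng.normal(cb[1], sb[1], 20000)
lo, hi = np.percentile(draws, [2.5, 97.5])
print(lo, hi)
```

The simulated posterior of a*b is skewed even when a and b have symmetric posteriors, which is one reason interval estimates for mediation benefit from this simulation-based treatment.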
Bayesian Games with Intentions
Bjorndahl, Adam; Halpern, Joseph Y.; Pass, Rafael
2016-01-01
We show that standard Bayesian games cannot represent the full spectrum of belief-dependent preferences. However, by introducing a fundamental distinction between intended and actual strategies, we remove this limitation. We define Bayesian games with intentions, generalizing both Bayesian games and psychological games, and prove that Nash equilibria in psychological games correspond to a special class of equilibria as defined in our setting.
Salama, Rania O; Traini, Daniela; Chan, Hak-Kim; Young, Paul M
2008-09-01
Three in vitro methodologies were evaluated as models for the analysis of drug release from controlled release (CR) microparticulates for inhalation. USP Apparatus 2 (dissolution model), USP Apparatus 4 (flow-through model) and a modified Franz cell (diffusion model) were investigated using identical sink volumes and temperatures (1000 ml and 37 degrees C). Microparticulates containing DSCG and different percentages of PVA (0%, 30%, 50%, 70% and 90%) were used as model CR formulations. Evaluation of the release profiles of DSCG from the modified PVA formulations suggested that all data fitted a Weibull distribution model with R2 ≥ 0.942. Statistical analysis of the t(d) (time for 63.2% drug release) indicated that all methodologies could distinguish between microparticles that did or did not contain PVA (Student's t-test, p [...] ≥ 0.862 for the diffusion methodology data set). Due to the relatively low water content in the respiratory tract and the lack of differentiation between formulations for USP Apparatus 2 and 4, it is concluded that the diffusion model is more applicable for the evaluation of CR inhalation medicines. PMID:18534832
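The Weibull analysis described above can be sketched on synthetic data. The cumulative release is modelled as F(t) = 1 - exp(-(t/td)^b), so td is by definition the time at which F = 1 - 1/e ≈ 63.2%. The time points, noise level and true parameters below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def weibull(t, td, b):
    """Cumulative fraction released at time t (Weibull release model)."""
    return 1.0 - np.exp(-(t / td) ** b)

# Synthetic release profile: true td = 45 min, shape b = 1.2, small noise
t = np.array([5, 10, 20, 40, 60, 90, 120, 180], dtype=float)  # minutes
rng = np.random.default_rng(1)
f_obs = weibull(t, 45.0, 1.2) + rng.normal(0, 0.01, t.size)

# Nonlinear least squares fit of (td, b)
(td, b), _ = curve_fit(weibull, t, f_obs, p0=[30.0, 1.0])
print(td, b)
```

Comparing fitted td values across methodologies, as in the study, then reduces to a t-test on the per-replicate td estimates.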
Energy Technology Data Exchange (ETDEWEB)
Journeau, Ch.; Piluso, P.; Correggio, P.; Godin-Jacqmin, L
2007-07-01
In the hypothetical case of a severe accident in a VVER-440 type PWR, a complex corium pool could be formed and fission products could be released. In order to study aerosol release in terms of mechanisms, kinetics, nature and quantity, and to better specify the source term of the VVER-440, a series of experiments has been performed in the Colima facility; the Colima CA-U3 test was successfully performed thanks to technological modifications allowing a prototypical corium to be melted at 2760 degrees C. Specific instrumentation allowed us to follow the evolution of the corium melt and the release, transport and deposition of the fission products. The main conclusions are: -) there is a large release of Cr, Te, Sr, Pr and Rh (>95 wt%); -) there is a significant release of Fe (50 wt%); -) there is a small release of Ba, Ce, La, Nb, Nd and Y (<90 wt%); -) there is a very small release of U in proportion (<5 wt%), although it is one of the major released species by mass; and -) there is no release of Zr. The Colima experimental results are consistent with previous experiments on irradiated fuels except for the Ba, Fe and U releases. (A.C.)
International Nuclear Information System (INIS)
The US LACE program (LWR Aerosol Containment Experiments), in which Italy participates together with several European countries, Canada and Japan, aims to evaluate, by means of large-scale experiments at HEDL, the retention in the piping and primary containment of the radioactive aerosols released during severe accidents in light water reactors. These experiments will also provide data for validating the codes used to analyse aerosol behaviour in the containment, and for verifying whether thermohydraulic computation codes can evaluate with sufficient accuracy the variables influencing aerosol behaviour. This report presents and compares the results obtained by the LACE participants with the aerosol containment codes NAUA 5 and CONTAIN for the pre-test computations of test LA1, in which a containment-bypass accident is simulated
Rubin, Donald B.
1981-01-01
The Bayesian bootstrap is the Bayesian analogue of the bootstrap. Instead of simulating the sampling distribution of a statistic estimating a parameter, the Bayesian bootstrap simulates the posterior distribution of the parameter; operationally and inferentially the methods are quite similar. Because both methods of drawing inferences are based on somewhat peculiar model assumptions and the resulting inferences are generally sensitive to these assumptions, neither method should be applied wit...
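The operational similarity to the ordinary bootstrap can be seen in a minimal sketch for the posterior of a mean: instead of resampling observations with replacement, each replicate reweights the data with flat Dirichlet weights. The data and replicate count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(loc=5.0, scale=2.0, size=100)  # synthetic sample

def bayesian_bootstrap_means(x, n_rep=5000):
    """Draws from the Bayesian-bootstrap posterior of the mean of x.

    Each replicate draws weights w ~ Dirichlet(1, ..., 1) over the n
    observations and returns the weighted mean, rather than resampling
    observations with replacement as the classical bootstrap does.
    """
    w = rng.dirichlet(np.ones(len(x)), size=n_rep)
    return w @ x

posterior_means = bayesian_bootstrap_means(data)
lo, hi = np.percentile(posterior_means, [2.5, 97.5])
print(data.mean(), lo, hi)
```

As the abstract notes, the resulting interval is operationally close to a classical bootstrap percentile interval; the difference is the interpretation as a posterior under a particular (and somewhat peculiar) model assumption.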
Bayesian statistics an introduction
Lee, Peter M
2012-01-01
Bayesian Statistics is the school of thought that combines prior beliefs with the likelihood of a hypothesis to arrive at posterior beliefs. The first edition of Peter Lee’s book appeared in 1989, but the subject has moved ever onwards, with increasing emphasis on Monte Carlo based techniques. This new fourth edition looks at recent techniques such as variational methods, Bayesian importance sampling, approximate Bayesian computation and Reversible Jump Markov Chain Monte Carlo (RJMCMC), providing a concise account of the way in which the Bayesian approach to statistics develops as wel
Understanding Computational Bayesian Statistics
Bolstad, William M
2011-01-01
A hands-on introduction to computational statistics from a Bayesian point of view Providing a solid grounding in statistics while uniquely covering the topics from a Bayesian perspective, Understanding Computational Bayesian Statistics successfully guides readers through this new, cutting-edge approach. With its hands-on treatment of the topic, the book shows how samples can be drawn from the posterior distribution when the formula giving its shape is all that is known, and how Bayesian inferences can be based on these samples from the posterior. These ideas are illustrated on common statistic
Frühwirth-Schnatter, Sylvia
1990-01-01
In the paper at hand we apply fuzzy set theory to Bayesian statistics to obtain "Fuzzy Bayesian Inference". In the subsequent sections we discuss a fuzzy-valued likelihood function, Bayes' theorem for both fuzzy data and fuzzy priors, a fuzzy Bayes estimator, fuzzy predictive densities and distributions, and fuzzy H.P.D. regions. (author's abstract)
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Granade, Christopher; Cory, D G
2015-01-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we solve all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby and by Ferrie, to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first informative priors on quantum states and channels. Finally, we develop a method that allows online tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Bayesian exploratory factor analysis
Gabriella Conti; Sylvia Frühwirth-Schnatter; James Heckman; Rémi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study c...
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...
Bayesian Exploratory Factor Analysis
Gabriella Conti; Sylvia Fruehwirth-Schnatter; Heckman, James J.; Remi Piatek
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo s...
Bayesian exploratory factor analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo st...
Bayesian exploratory factor analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study co...
Carbonetto, Peter; Kisynski, Jacek; De Freitas, Nando; Poole, David L
2012-01-01
The Bayesian Logic (BLOG) language was recently developed for defining first-order probability models over worlds with unknown numbers of objects. It handles important problems in AI, including data association and population estimation. This paper extends BLOG by adopting generative processes over function spaces - known as nonparametrics in the Bayesian literature. We introduce syntax for reasoning about arbitrary collections of objects, and their properties, in an intuitive manner. By expl...
Bayesian default probability models
Andrlíková, Petra
2014-01-01
This paper proposes a methodology for default probability estimation for low default portfolios, where the statistical inference may become troublesome. The author suggests using logistic regression models with the Bayesian estimation of parameters. The piecewise logistic regression model and Box-Cox transformation of credit risk score is used to derive the estimates of probability of default, which extends the work by Neagu et al. (2009). The paper shows that the Bayesian models are more acc...
International Nuclear Information System (INIS)
Organic aerosols scatter solar radiation. They may also either enhance or decrease concentrations of cloud condensation nuclei. This paper summarizes observed concentrations of aerosols in remote continental and marine locations and provides estimates for the sources of organic aerosol matter. The anthropogenic sources of organic aerosols may be as large as the anthropogenic sources of sulfate aerosols, implying a similar magnitude of direct forcing of climate. The source estimates are highly uncertain and subject to revision in the future. A slow secondary source of organic aerosols of unknown origin may contribute to the observed oceanic concentrations. The role of organic aerosols acting as cloud condensation nuclei (CCN) is described and it is concluded that they may either enhance or decrease the ability of anthropogenic sulfate aerosols to act as CCN
Saltzman, Es
2009-01-01
The aerosol over the world oceans plays an important role in determining the physical and chemical characteristics of the Earth's atmosphere and its interactions with the climate system. The oceans contribute to the aerosols in the overlying atmosphere by the production and emission of aerosol particles and precursor gases. The marine aerosol, in turn, influences the biogeochemistry of the surface ocean through long distance transport and deposition of terrestrial and marine-derived nutrients...
Review of models applicable to accident aerosols
International Nuclear Information System (INIS)
Estimations of potential airborne-particle releases are essential in safety assessments of nuclear-fuel facilities. This report is a review of aerosol behavior models that have potential applications for predicting aerosol characteristics in compartments containing accident-generated aerosol sources. Such characterization of the accident-generated aerosols is a necessary step toward estimating their eventual release in any accident scenario. Existing aerosol models can predict the size distribution, concentration, and composition of aerosols as they are acted on by ventilation, diffusion, gravity, coagulation, and other phenomena. Models developed in the fields of fluid mechanics, indoor air pollution, and nuclear-reactor accidents are reviewed with this nuclear fuel facility application in mind. The various capabilities of modeling aerosol behavior are tabulated and discussed, and recommendations are made for applying the models to problems of differing complexity
Bayesian least squares deconvolution
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
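The classical least squares step that the Bayesian treatment above generalizes reduces, for non-overlapping lines that are scaled copies of one common profile, to a noise-weighted average of the observed lines. The following is a minimal sketch on synthetic lines; the line depths, noise levels and Gaussian profile are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
v = np.linspace(-30, 30, 121)                    # velocity grid, km/s
z_true = 0.1 * np.exp(-(v / 8.0) ** 2)           # common line profile (depth 0.1)

weights = np.array([1.0, 0.6, 0.3])              # line-mask depths (assumed)
sigma = np.array([0.01, 0.01, 0.02])             # per-line noise (assumed)
# Observed lines: scaled copies of the common profile plus noise
lines = [w * z_true + rng.normal(0, s, v.size)
         for w, s in zip(weights, sigma)]

# Least squares solution for the common profile Z when lines do not overlap:
#   Z = sum_i(w_i V_i / s_i^2) / sum_i(w_i^2 / s_i^2)
num = sum(w / s**2 * L for w, s, L in zip(weights, sigma, lines))
z_lsd = num / np.sum(weights**2 / sigma**2)

print(np.max(np.abs(z_lsd - z_true)))  # max pointwise error of the estimate
```

Combining many lines drives the effective noise of the common profile down; the Bayesian version described in the abstract additionally places a Gaussian process prior on Z, giving per-bin uncertainties.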
Bayesian least squares deconvolution
Ramos, A Asensio
2015-01-01
Aims. To develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods. We consider LSD under the Bayesian framework and we introduce a flexible Gaussian Process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results. We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
Loredo, T J
2004-01-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on the fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative "maximum entropy sampling" strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two "toy" problems with simulated data (measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object) show the approach can significantly improve observational eff...
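For Gaussian posteriors, predictive entropy is monotone in predictive variance, so maximum entropy sampling reduces to choosing the candidate observation with the largest predictive variance. The following toy sketch uses Bayesian linear regression with a quadratic feature map; the feature map and all numbers are assumptions, not the paper's planet-orbit setup.

```python
import numpy as np

sigma2 = 0.1**2                                   # noise variance (assumed)
phi = lambda x: np.array([1.0, x, x**2])          # feature map (assumed)

# Designs observed so far, deliberately clustered in one region
X_obs = np.array([0.2, 0.4, 0.5])
Phi = np.stack([phi(x) for x in X_obs])

# Posterior covariance of the weights under a N(0, I) prior:
#   S = (I + Phi^T Phi / sigma2)^{-1}
S = np.linalg.inv(np.eye(3) + Phi.T @ Phi / sigma2)

# Maximum entropy sampling: evaluate predictive variance on a candidate grid
# and pick its maximizer as the next observation.
candidates = np.linspace(0.0, 1.0, 101)
pred_var = np.array([phi(x) @ S @ phi(x) + sigma2 for x in candidates])
x_next = candidates[np.argmax(pred_var)]
print(x_next)
```

The chosen design lands far from the clustered data, where the model is most uncertain, which is exactly the "teach us the most" criterion of the Observation-Inference-Design cycle in this Gaussian setting.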
Bayesian and frequentist inequality tests
David M. Kaplan; Zhuo, Longhao
2016-01-01
Bayesian and frequentist criteria are fundamentally different, but often posterior and sampling distributions are asymptotically equivalent (and normal). We compare Bayesian and frequentist hypothesis tests of inequality restrictions in such cases. For finite-dimensional parameters, if the null hypothesis is that the parameter vector lies in a certain half-space, then the Bayesian test has (frequentist) size α; if the null hypothesis is any other convex subspace, then the Bayesian test...
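The half-space result can be checked in the simplest one-dimensional setting, assuming a flat prior and a unit-variance normal sampling distribution: the posterior probability of the null equals the one-sided p-value, so rejecting when it falls below α gives the Bayesian test exact frequentist size α.

```python
from math import erf, sqrt

# Standard normal CDF via the error function
Phi = lambda z: 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Setting (assumed): x | theta ~ N(theta, 1), flat prior on theta,
# half-space null H0: theta <= 0.
x = 1.7                        # observed value (assumption)
posterior_p_h0 = Phi(-x)       # P(theta <= 0 | x), since theta | x ~ N(x, 1)
p_value = 1.0 - Phi(x)         # one-sided p-value, P(X >= x | theta = 0)
print(posterior_p_h0, p_value)
```

The two quantities coincide by the symmetry of the normal distribution, which is the mechanism behind the exact-size claim for half-space nulls; for other convex nulls this identity breaks down, as the abstract indicates.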
Bayesian multiple target tracking
Streit, Roy L
2013-01-01
This second edition has undergone substantial revision from the 1999 first edition, recognizing that a lot has changed in the multiple target tracking field. One of the most dramatic changes is in the widespread use of particle filters to implement nonlinear, non-Gaussian Bayesian trackers. This book views multiple target tracking as a Bayesian inference problem. Within this framework it develops the theory of single target tracking, multiple target tracking, and likelihood ratio detection and tracking. In addition to providing a detailed description of a basic particle filter that implements
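A bootstrap particle filter of the kind the book describes fits in a few lines. This toy scalar model (AR(1) state, Gaussian observation, hypothetical parameters) shows the predict-weight-resample cycle; it is a sketch of the basic algorithm, not the book's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy state-space model: x_t = 0.9 x_{t-1} + q*w_t,  y_t = x_t + r*v_t
T, N = 50, 2000
q, r = 0.5, 0.5

x_true = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + q * rng.normal()
    y[t] = x_true[t] + r * rng.normal()

particles = rng.normal(0, 1, N)
est = np.zeros(T)
for t in range(1, T):
    particles = 0.9 * particles + q * rng.normal(size=N)   # predict
    logw = -0.5 * ((y[t] - particles) / r) ** 2            # weight by likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    est[t] = np.sum(w * particles)                         # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]      # resample

rmse = np.sqrt(np.mean((est[1:] - x_true[1:]) ** 2))
```

Because the update uses the likelihood directly, the same loop handles nonlinear, non-Gaussian models where a Kalman filter does not apply.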
Bayesian Exploratory Factor Analysis
DEFF Research Database (Denmark)
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.;
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the...... corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...
Nuclear aerosol test facility studies using plasma torch aerosol generator
International Nuclear Information System (INIS)
In order to study the behavior of aerosols released into the reactor containment following accidents, an experimental simulation facility, called the Nuclear Aerosol Test Facility (NATF), has recently been built and commissioned in BARC. It mainly consists of a test vessel for simulating the containment, a plasma torch aerosol generator (PTAG) system for generating metal-based aerosols, and aerosol monitoring instrumentation. The main component of the PTAG is a 40 kW dc plasma torch, powered by a constant-current power supply, operating in a non-transferred arc mode. Optimal operating conditions of the PTAG have been established. Experiments consist of injecting the aerosols of a given material for about 20 minutes into the vessel while simultaneously monitoring the concentrations at various points in the vessel. Measurements of the size distribution and mass concentrations in the vessel are carried out at periodic intervals. Various combinations of experiments with different metals such as zinc, tin and manganese, under varying turbulence conditions (with and without keeping the fan continuously on), have been performed. The aerosols were generally found to be fractal aggregates with low fractal dimension (∼1.6). The mass depletion data have been subjected to theoretical analysis and validation exercises with available aerosol behavior codes. The results are further discussed. (author)
Bayesian Geostatistical Design
DEFF Research Database (Denmark)
Diggle, Peter; Lophaven, Søren Nymand
2006-01-01
locations to, or deletion of locations from, an existing design, and prospective design, which consists of choosing positions for a new set of sampling locations. We propose a Bayesian design criterion which focuses on the goal of efficient spatial prediction whilst allowing for the fact that model...
Czech Academy of Sciences Publication Activity Database
Krejsa, Jiří; Věchet, S.
Bratislava: Slovak University of Technology in Bratislava, 2010, s. 217-222. ISBN 978-80-227-3353-3. [Robotics in Education . Bratislava (SK), 16.09.2010-17.09.2010] Institutional research plan: CEZ:AV0Z20760514 Keywords : mobile robot localization * bearing only beacons * Bayesian filters Subject RIV: JD - Computer Applications, Robotics
DEFF Research Database (Denmark)
Antoniou, Constantinos; Harrison, Glenn W.; Lau, Morten I.;
2015-01-01
A large literature suggests that many individuals do not apply Bayes’ Rule when making decisions that depend on them correctly pooling prior information and sample data. We replicate and extend a classic experimental study of Bayesian updating from psychology, employing the methods of experimenta...
Bayesian Independent Component Analysis
DEFF Research Database (Denmark)
Winther, Ole; Petersen, Kaare Brandt
2007-01-01
In this paper we present an empirical Bayesian framework for independent component analysis. The framework provides estimates of the sources, the mixing matrix and the noise parameters, and is flexible with respect to choice of source prior and the number of sources and sensors. Inside the engine...
Noncausal Bayesian Vector Autoregression
DEFF Research Database (Denmark)
Lanne, Markku; Luoto, Jani
We propose a Bayesian inferential procedure for the noncausal vector autoregressive (VAR) model that is capable of capturing nonlinearities and incorporating effects of missing variables. In particular, we devise a fast and reliable posterior simulator that yields the predictive distribution as a...
Loredo, Thomas J.
2004-04-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data-measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object-show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
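The maximum-entropy-sampling strategy can be illustrated on a toy version of the hidden-object problem: keep a grid posterior, and probe wherever the predicted outcome is most uncertain. The detection model and all numbers below are assumptions for illustration, not Loredo's actual setup:

```python
import numpy as np

rng = np.random.default_rng(2)

# The object sits at unknown position theta; a probe at x detects it with
# probability exp(-(x - theta)^2 / 2)  (hypothetical measurement model).
grid = np.linspace(0, 10, 201)               # candidate theta values
theta_true = 6.3
post = np.full(grid.size, 1.0 / grid.size)   # uniform prior

def p_detect(x, theta):
    return np.exp(-0.5 * (x - theta) ** 2)

probes = np.linspace(0, 10, 41)              # admissible probe locations
for _ in range(30):
    # Predictive detection probability at each candidate probe
    pred = np.array([np.sum(post * p_detect(x, grid)) for x in probes])
    # Maximum entropy sampling: probe where the outcome is most uncertain
    H = -(pred * np.log(pred + 1e-12) + (1 - pred) * np.log(1 - pred + 1e-12))
    x = probes[np.argmax(H)]
    detected = rng.random() < p_detect(x, theta_true)   # simulate measurement
    like = p_detect(x, grid) if detected else 1 - p_detect(x, grid)
    post *= like
    post /= post.sum()

theta_map = grid[np.argmax(post)]
```

Each cycle is exactly the Observation-Inference-Design loop: observe, update the grid posterior by Bayes' theorem, then design the next probe.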
Bayesian logistic regression analysis
Van Erp, H.R.N.; Van Gelder, P.H.A.J.M.
2012-01-01
In this paper we present a Bayesian logistic regression analysis. It is found that if one wishes to derive the posterior distribution of the probability of some event, then, together with the traditional Bayes Theorem and the integrating out of nuisance parameters, the Jacobian transformation is an
International Nuclear Information System (INIS)
This experimental study concerns the consequences for the environment of a PWR severe accident. A preliminary bibliographical survey was undertaken in order to determine the elements to study and the experimental protocols to use. Four fission products (Cs, Sr, Ru, Ce) and three structure materials (Ag, Fe, In) were chosen. Tests of cation (Cs+) retention by soils were performed. They highlighted the great variability of the results according to experimental procedures (contact time, agitation, solid phase concentration...). The adoption of a standard procedure that would enable comparison of the different results is suggested. Then, the dissolution of powders of the 7 elements was studied in different solutions. Two different phenomena occur for some elements. We observed a partial dissolution of Ag, In and Ce, depending on solution composition, but the presence of fine particles or colloids may contribute to the total activity of the solution. The dissolution of Cs is more important but never complete, because of an amalgam formed with structure materials during calcination. Ru does not dissolve; the activity of the solution is due to the presence of fine particles. Soil retention is minimal for elements that are neutral, like Ru, and maximal for cations, especially Cs+. High contents of organic matter and clay in soils enhance retention. Using the new theoretical source term values, the fabrication of multi-element aerosols has begun. The installation used (an induction oven with an aerosol maturation enclosure) reaches temperatures as high as 2800-3000 °C and volatilizes 13 of the 16 elements present. Suggestions are made that may increase the Ru, Ce and Zr emissions
International Nuclear Information System (INIS)
As part of the continuing studies of the effects of very severe reactor accidents, an effort was made to develop, test, and improve simple, effective, and inexpensive methods by which the average citizen, using only materials readily available, could protect his residence, himself, and his family from injury by toxic aerosols. The methods for protection against radioactive aerosols should be equally effective against a clandestine biological attack by terrorists. The results of the tests to date are limited to showing that spores of the harmless bacterium, Bacillus globigii (BG), can be used as a simulant for the radioactive aerosols. An aerosol generator of the Lauterbach type was developed which will produce an essentially monodisperse aerosol at the rate of 10^9 spores/min. Analytical techniques have been established which give reproducible results. Preliminary field tests have been conducted to check out the components of the system. Preliminary tests of protective devices, such as ordinary vacuum sweepers, have given protection factors of over 1000
International Nuclear Information System (INIS)
Stratospheric aerosol measurements can provide both spatial and temporal data of sufficient resolution to be of use in climate models. Relatively recent results from a wide range of instrument techniques for measuring stratospheric aerosol parameters are described. Such techniques include impactor sampling, lidar system sensing, filter sampling, photoelectric particle counting, satellite extinction-sensing using the sun as a source, and optical depth probing, at sites mainly removed from tropospheric aerosol sources. Some of these techniques have also had correlative and intercomparison studies. The main methods for determining the vertical profiles of stratospheric aerosols are outlined: lidar extinction measurements from satellites; impactor measurements from balloons and aircraft; and photoelectric particle counter measurements from balloons, aircraft, and rockets. The conversion of the lidar backscatter to stratospheric aerosol mass loading is referred to. Absolute measurements of total solar extinction from satellite orbits can be used to extract the aerosol extinction, and several examples of vertical profiles of extinction obtained with the SAGE satellite are given. Stratospheric mass loading can be inferred from extinction using approximate linear relationships but under restrictive conditions. Impactor sampling is essentially the only method in which the physical nature of the stratospheric aerosol is observed visually. Vertical profiles of stratospheric aerosol number concentration using impactor data are presented. Typical profiles using a dual-size-range photoelectric dustsonde particle counter are given for volcanically disturbed and inactive periods. Some measurements of the global distribution of stratospheric aerosols are also presented. Volatility measurements are described, indicating that stratospheric aerosols are composed primarily of about 75% sulfuric acid and 25% water
THE EFFECT OF AEROSOLIZATION ON SUBSEQUENT BACTERIAL SURVIVAL
To determine whether aerosolization could impair bacterial survival, Pseudomonas syringae and Erwinia herbicola were aerosolized in a greenhouse, the aerosol was sampled at various distances from the site of release by using all-glass impingers, and bacterial survival was followed...
Probability and Bayesian statistics
1987-01-01
This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...
Bayesian Magic in Asteroseismology
Kallinger, T.
2015-09-01
Only a few years ago asteroseismic observations were so rare that scientists had plenty of time to work on individual data sets. They could tune their algorithms in any possible way to squeeze out the last bit of information. Nowadays this is impossible. With missions like MOST, CoRoT, and Kepler we basically drown in new data every day. To handle this adequately, statistical methods become more and more important. This is why Bayesian techniques began their triumphal march across asteroseismology. I will go with you on a journey through Bayesian Magic Land, which brings us to the sea of granulation background, the forest of peakbagging, and the stony alley of model comparison.
Bayesian Nonparametric Graph Clustering
Banerjee, Sayantan; Akbani, Rehan; Baladandayuthapani, Veerabhadran
2015-01-01
We present clustering methods for multivariate data exploiting the underlying geometry of the graphical structure between variables. As opposed to standard approaches that assume known graph structures, we first estimate the edge structure of the unknown graph using Bayesian neighborhood selection approaches, wherein we account for the uncertainty of graphical structure learning through model-averaged estimates of the suitable parameters. Subsequently, we develop a nonparametric graph cluster...
Approximate Bayesian recursive estimation
Czech Academy of Sciences Publication Activity Database
Kárný, Miroslav
2014-01-01
Roč. 285, č. 1 (2014), s. 100-111. ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Approximate parameter estimation * Bayesian recursive estimation * Kullback–Leibler divergence * Forgetting Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.038, year: 2014 http://library.utia.cas.cz/separaty/2014/AS/karny-0425539.pdf
Bayesian Benchmark Dose Analysis
Fang, Qijun; Piegorsch, Walter W.; Barnes, Katherine Y.
2014-01-01
An important objective in environmental risk assessment is estimation of minimum exposure levels, called Benchmark Doses (BMDs) that induce a pre-specified Benchmark Response (BMR) in a target population. Established inferential approaches for BMD analysis typically involve one-sided, frequentist confidence limits, leading in practice to what are called Benchmark Dose Lower Limits (BMDLs). Appeal to Bayesian modeling and credible limits for building BMDLs is far less developed, however. Indee...
Bayesian Generalized Rating Curves
Helgi Sigurðarson 1985
2014-01-01
A rating curve is a curve or a model that describes the relationship between water elevation, or stage, and discharge in an observation site in a river. The rating curve is fit from paired observations of stage and discharge. The rating curve then predicts discharge given observations of stage and this methodology is applied as stage is substantially easier to directly observe than discharge. In this thesis a statistical rating curve model is proposed working within the framework of Bayesian...
Heteroscedastic Treed Bayesian Optimisation
Assael, John-Alexander M.; Wang, Ziyu; Shahriari, Bobak; De Freitas, Nando
2014-01-01
Optimising black-box functions is important in many disciplines, such as tuning machine learning models, robotics, finance and mining exploration. Bayesian optimisation is a state-of-the-art technique for the global optimisation of black-box functions which are expensive to evaluate. At the core of this approach is a Gaussian process prior that captures our belief about the distribution over functions. However, in many cases a single Gaussian process is not flexible enough to capture non-stat...
Efficient Bayesian Phase Estimation
Wiebe, Nathan; Granade, Chris
2016-07-01
We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
Brody, Samuel; Lapata, Mirella
2009-01-01
Sense induction seeks to automatically identify word senses directly from a corpus. A key assumption underlying previous work is that the context surrounding an ambiguous word is indicative of its meaning. Sense induction is thus typically viewed as an unsupervised clustering problem where the aim is to partition a word’s contexts into different classes, each representing a word sense. Our work places sense induction in a Bayesian context by modeling the contexts of the ambiguous word as samp...
Bayesian Neural Word Embedding
Barkan, Oren
2016-01-01
Recently, several works in the domain of natural language processing presented successful methods for word embedding. Among them, the Skip-gram (SG) with negative sampling, known also as Word2Vec, advanced the state-of-the-art of various linguistics tasks. In this paper, we propose a scalable Bayesian neural word embedding algorithm that can be beneficial to general item similarity tasks as well. The algorithm relies on a Variational Bayes solution for the SG objective and a detailed step by ...
Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory
2016-04-01
Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short term (vector field) and long term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and we define a joint log-likelihood that consists of two terms, one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
Bayesian theory and applications
Dellaportas, Petros; Polson, Nicholas G; Stephens, David A
2013-01-01
The development of hierarchical models and Markov chain Monte Carlo (MCMC) techniques forms one of the most profound advances in Bayesian analysis since the 1970s and provides the basis for advances in virtually all areas of applied and theoretical Bayesian statistics. This volume guides the reader along a statistical journey that begins with the basic structure of Bayesian theory, and then provides details on most of the past and present advances in this field. The book has a unique format. There is an explanatory chapter devoted to each conceptual advance followed by journal-style chapters that provide applications or further advances on the concept. Thus, the volume is both a textbook and a compendium of papers covering a vast range of topics. It is appropriate for a well-informed novice interested in understanding the basic approach, methods and recent applications. Because of its advanced chapters and recent work, it is also appropriate for a more mature reader interested in recent applications and devel...
Unbounded Bayesian Optimization via Regularization
Shahriari, Bobak; Bouchard-Côté, Alexandre; De Freitas, Nando
2015-01-01
Bayesian optimization has recently emerged as a popular and efficient tool for global optimization and hyperparameter tuning. Currently, the established Bayesian optimization practice requires a user-defined bounding box which is assumed to contain the optimizer. However, when little is known about the probed objective function, it can be difficult to prescribe such bounds. In this work we modify the standard Bayesian optimization framework in a principled way to allow automatic resizing of t...
Bayesian optimization for materials design
Frazier, Peter I.; Wang, Jialei
2015-01-01
We introduce Bayesian optimization, a technique developed for optimizing time-consuming engineering simulations and for fitting machine learning models on large datasets. Bayesian optimization guides the choice of experiments during materials design and discovery to find good material designs in as few experiments as possible. We focus on the case when materials designs are parameterized by a low-dimensional vector. Bayesian optimization is built on a statistical technique called Gaussian pro...
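A minimal sketch of the loop the abstract describes: fit a Gaussian process to the evaluations so far, then pick the next design by maximizing expected improvement. The kernel, noise level, and toy one-dimensional objective are assumptions; a real materials-design code would use a vetted GP library rather than this hand-rolled version:

```python
import numpy as np
from scipy.stats import norm

# Expensive black-box stand-in with its maximum at x = 0.65 (invented)
f = lambda x: -(x - 0.65) ** 2
Xgrid = np.linspace(0, 1, 201)

def gp_posterior(X, y, Xs, ell=0.15, sf=1.0, noise=1e-6):
    # Squared-exponential GP posterior mean/std on Xs given data (X, y)
    k = lambda a, b: sf * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = sf - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.maximum(var, 1e-12))

X = np.array([0.1, 0.9])          # two initial designs
y = f(X)
for _ in range(8):
    mu, sd = gp_posterior(X, y, Xgrid)
    best = y.max()
    z = (mu - best) / sd
    ei = (mu - best) * norm.cdf(z) + sd * norm.pdf(z)   # expected improvement
    x_next = Xgrid[np.argmax(ei)]
    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

x_best = X[y.argmax()]
```

The acquisition function trades off exploring uncertain regions against exploiting the current best, which is why few evaluations of the expensive function are needed.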
Bayesian Network Based XP Process Modelling
Directory of Open Access Journals (Sweden)
Mohamed Abouelela
2010-07-01
A Bayesian Network based mathematical model has been used for modelling the Extreme Programming software development process. The model is capable of predicting the expected finish time and the expected defect rate for each XP release. Therefore, it can be used to determine the success/failure of any XP project. The model takes into account the effect of three XP practices, namely: Pair Programming, Test Driven Development and Onsite Customer practices. The model's predictions were validated against two case studies. Results show the precision of our model, especially in predicting the project finish time.
MODIS 3 km aerosol product: algorithm and global perspective
Remer, L. A.; Mattoo, S; R. C. Levy; L. A. Munchak
2013-01-01
After more than a decade of producing a nominal 10 km aerosol product based on the dark target method, the MODerate resolution Imaging Spectroradiometer (MODIS) aerosol team will be releasing a nominal 3 km product as part of their Collection 6 release. The new product differs from the original 10 km product only in the manner in which reflectance pixels are ingested, organized and selected by the aerosol algorithm. Overall, the 3 km product closely mirrors the 10 km product. H...
Buseck, P. R.; Schwartz, S. E.
2003-12-01
It is widely believed that "On a clear day you can see forever," as proclaimed in the 1965 Broadway musical of the same name. While an admittedly beautiful thought, we all know that this concept is only figurative. Aside from Earth's curvature and Rayleigh scattering by air molecules, aerosols - colloidal suspensions of solid or liquid particles in a gas - limit our vision. Even on the clearest day, there are billions of aerosol particles per cubic meter of air. Atmospheric aerosols are commonly referred to as smoke, dust, haze, and smog, terms that are loosely reflective of their origin and composition. Aerosol particles have arisen naturally for eons from sea spray, volcanic emissions, wind entrainment of mineral dust, wildfires, and gas-to-particle conversion of hydrocarbons from plants and dimethylsulfide from the oceans. However, over the industrial period, the natural background aerosol has been greatly augmented by anthropogenic contributions, i.e., those produced by human activities. One manifestation of this impact is reduced visibility (Figure 1). Thus, perhaps more than in other realms of geochemistry, when considering the composition of the troposphere one must consider the effects of these activities. The atmosphere has become a reservoir for vast quantities of anthropogenic emissions that exert important perturbations on it and on the planetary ecosystem in general. Consequently, much recent research focuses on the effects of human activities on the atmosphere and, through them, on the environment and Earth's climate. For these reasons consideration of the geochemistry of the atmosphere, and of atmospheric aerosols in particular, must include the effects of human activities. Figure 1. Impairment of visibility by aerosols. Photographs at Yosemite National Park, California, USA. (a) Low aerosol concentration (particulate matter of aerodynamic diameter less than 2.5 μm, PM2.5 = 0.3 μg m⁻³; particulate matter of aerodynamic diameter less than 10
Bayesian nonparametric data analysis
Müller, Peter; Jara, Alejandro; Hanson, Tim
2015-01-01
This book reviews nonparametric Bayesian methods and models that have proven useful in the context of data analysis. Rather than providing an encyclopedic review of probability models, the book’s structure follows a data analysis perspective. As such, the chapters are organized by traditional data analysis problems. In selecting specific nonparametric models, simpler and more traditional models are favored over specialized ones. The discussed methods are illustrated with a wealth of examples, including applications ranging from stylized examples to case studies from recent literature. The book also includes an extensive discussion of computational methods and details on their implementation. R code for many examples is included in on-line software pages.
Decentralized Distributed Bayesian Estimation
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil; Sečkárová, Vladimíra
Praha: ÚTIA AVČR, v.v.i, 2011 - (Janžura, M.; Ivánek, J.). s. 16-16 [7th International Workshop on Data–Algorithms–Decision Making. 27.11.2011-29.11.2011, Mariánská] R&D Projects: GA ČR 102/08/0567; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : estimation * distributed estimation * model Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2011/AS/dedecius-decentralized distributed bayesian estimation.pdf
Congdon, Peter
2014-01-01
This book provides an accessible approach to Bayesian computing and data analysis, with an emphasis on the interpretation of real data sets. Following in the tradition of the successful first edition, this book aims to make a wide range of statistical modeling applications accessible using tested code that can be readily adapted to the reader's own applications. The second edition has been thoroughly reworked and updated to take account of advances in the field. A new set of worked examples is included. The novel aspect of the first edition was the coverage of statistical modeling using WinBU
Computationally efficient Bayesian tracking
Aughenbaugh, Jason; La Cour, Brian
2012-06-01
In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
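Stripped of the polynomial representation and sonar models of the paper, the grid-based idea reduces to a discrete Bayes filter: convolve the belief for time evolution, multiply by a measurement likelihood, renormalize. All numbers here are hypothetical:

```python
import numpy as np

# Toy grid-based Bayesian tracker on a 1-D discretised state space
cells = np.arange(100)
belief = np.full(100, 0.01)                 # uniform prior over the grid

def predict(belief, drift=1, spread=(0.25, 0.5, 0.25)):
    # Target drifts one cell per step, with local process uncertainty
    shifted = np.roll(belief, drift)
    out = (spread[0] * np.roll(shifted, -1) + spread[1] * shifted
           + spread[2] * np.roll(shifted, 1))
    return out / out.sum()

def update(belief, z, sigma=3.0):
    # Gaussian measurement likelihood around the reported cell z
    like = np.exp(-0.5 * ((cells - z) / sigma) ** 2)
    post = belief * like
    return post / post.sum()

true_pos = 20
for t in range(30):
    true_pos += 1
    belief = predict(belief)
    noisy_z = true_pos + np.random.default_rng(t).integers(-2, 3)
    belief = update(belief, noisy_z)

est = cells[np.argmax(belief)]
```

The paper's contribution is making this tractable in higher dimensions, where a naive cell grid explodes combinatorially.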
Improved iterative Bayesian unfolding
D'Agostini, G
2010-01-01
This paper reviews the basic ideas behind a Bayesian unfolding published some years ago and improves their implementation. In particular, uncertainties are now treated at all levels by probability density functions and their propagation is performed by Monte Carlo integration. Thus, small numbers are better handled and the final uncertainty does not rely on the assumption of normality. Theoretical and practical issues concerning the iterative use of the algorithm are also discussed. The new program, implemented in the R language, is freely available, together with sample scripts to play with toy models.
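The core iteration is easy to state: fold the current estimate through the response matrix, compare with the observed spectrum, and reweight via Bayes' theorem. A noise-free toy with an invented 3-bin response matrix follows; it assumes unit efficiency and omits the paper's Monte Carlo uncertainty propagation entirely:

```python
import numpy as np

# Toy response matrix: R[j, i] = P(observed bin j | true bin i).
# Columns sum to one, i.e. unit efficiency (a simplifying assumption).
R = np.array([[0.8, 0.15, 0.0],
              [0.2, 0.70, 0.2],
              [0.0, 0.15, 0.8]])

true_counts = np.array([100.0, 300.0, 150.0])
observed = R @ true_counts                 # idealised, noise-free data

est = np.full(3, observed.sum() / 3.0)     # flat starting prior
for _ in range(100):
    expected = R @ est                     # fold the current estimate
    # Bayes' theorem per cell: P(true i | obs j) ∝ R[j, i] * est[i];
    # with unit efficiency this is the multiplicative D'Agostini update.
    est = est * (R.T @ (observed / expected))
```

Each iteration conserves the total count, and for exact data the estimate converges to the true spectrum; with noisy data one stops early, which is where the iteration-control issues the paper discusses come in.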
Bayesian Inference on Gravitational Waves
Directory of Open Access Journals (Sweden)
Asad Ali
2015-12-01
The Bayesian approach is increasingly becoming popular among the astrophysics data analysis communities. However, the Pakistan statistics communities are unaware of this fertile interaction between the two disciplines. Bayesian methods have been in use to address astronomical problems since the very birth of Bayes probability in the eighteenth century. Today the Bayesian methods for the detection and parameter estimation of gravitational waves have solid theoretical grounds with a strong promise for realistic applications. This article aims to introduce the Pakistan statistics communities to the applications of Bayesian Monte Carlo methods in the analysis of gravitational wave data, with an overview of the Bayesian signal detection and estimation methods and a demonstration by a couple of simplified examples.
Toxicity of atmospheric aerosols on marine phytoplankton
Paytan, A.; Mackey, K.R.M.; Chen, Y.; Lima, I.D.; Doney, S.C.; Mahowald, N.; Labiosa, R.; Post, A.F.
2009-01-01
Atmospheric aerosol deposition is an important source of nutrients and trace metals to the open ocean that can enhance ocean productivity and carbon sequestration and thus influence atmospheric carbon dioxide concentrations and climate. Using aerosol samples from different back trajectories in incubation experiments with natural communities, we demonstrate that the response of phytoplankton growth to aerosol additions depends on specific components in aerosols and differs across phytoplankton species. Aerosol additions enhanced growth by releasing nitrogen and phosphorus, but not all aerosols stimulated growth. Toxic effects were observed with some aerosols, where the toxicity affected picoeukaryotes and Synechococcus but not Prochlorococcus. We suggest that the toxicity could be due to high copper concentrations in these aerosols and support this by laboratory copper toxicity tests performed with Synechococcus cultures. However, it is possible that other elements present in the aerosols or unknown synergistic effects between these elements could have also contributed to the toxic effect. Anthropogenic emissions are increasing atmospheric copper deposition sharply, and based on coupled atmosphere-ocean calculations, we show that this deposition can potentially alter patterns of marine primary production and community structure in high aerosol, low chlorophyll areas, particularly in the Bay of Bengal and downwind of South and East Asia.
Adaptive Dynamic Bayesian Networks
Energy Technology Data Exchange (ETDEWEB)
Ng, B M
2007-10-26
A discrete-time Markov process can be compactly modeled as a dynamic Bayesian network (DBN)--a graphical model with nodes representing random variables and directed edges indicating causality between variables. Each node has a probability distribution, conditional on the variables represented by the parent nodes. A DBN's graphical structure encodes fixed conditional dependencies between variables. But in real-world systems, conditional dependencies between variables may be unknown a priori or may vary over time. Model errors can result if the DBN fails to capture all possible interactions between variables. Thus, we explore the representational framework of adaptive DBNs, whose structure and parameters can change from one time step to the next: a distribution's parameters and its set of conditional variables are dynamic. This work builds on recent work in nonparametric Bayesian modeling, such as hierarchical Dirichlet processes, infinite-state hidden Markov networks and structured priors for Bayes net learning. In this paper, we will explain the motivation for our interest in adaptive DBNs, show how popular nonparametric methods are combined to formulate the foundations for adaptive DBNs, and present preliminary results.
Bayesian analysis toolkit - BAT
International Nuclear Information System (INIS)
Statistical treatment of data is an essential part of any data analysis and interpretation. Different statistical methods and approaches can be used; however, the implementation of these approaches is complicated and at times inefficient. The Bayesian analysis toolkit (BAT) is a software package developed in a C++ framework that facilitates the statistical analysis of data using Bayes' theorem. The tool evaluates the posterior probability distributions for models and their parameters using Markov Chain Monte Carlo, which in turn provides straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as simulated annealing, allow extraction of the global mode of the posterior. BAT sets a well-tested environment for flexible model definition and also includes a set of predefined models for standard statistical problems. The package is interfaced to other software packages commonly used in high energy physics, such as ROOT, Minuit, RooStats and CUBA. We present a general overview of BAT and its algorithms. A few physics examples are shown to introduce the spectrum of its applications. In addition, new developments and features are summarized.
Book review: Bayesian analysis for population ecology
Link, William A.
2011-01-01
Brian Dennis described the field of ecology as “fertile, uncolonized ground for Bayesian ideas.” He continued: “The Bayesian propagule has arrived at the shore. Ecologists need to think long and hard about the consequences of a Bayesian ecology. The Bayesian outlook is a successful competitor, but is it a weed? I think so.” (Dennis 2004)
DEFF Research Database (Denmark)
Hartelius, Karsten; Carstensen, Jens Michael
2003-01-01
A method for locating distorted grid structures in images is presented. The method is based on the theories of template matching and Bayesian image restoration. The grid is modeled as a deformable template. Prior knowledge of the grid is described through a Markov random field (MRF) model which...... represents the spatial coordinates of the grid nodes. Knowledge of how grid nodes are depicted in the observed image is described through the observation model. The prior consists of a node prior and an arc (edge) prior, both modeled as Gaussian MRFs. The node prior models variations in the positions of grid...... nodes and the arc prior models variations in row and column spacing across the grid. Grid matching is done by placing an initial rough grid over the image and applying an ensemble annealing scheme to maximize the posterior distribution of the grid. The method can be applied to noisy images with missing...
DEFF Research Database (Denmark)
Morawska, L.; Afshari, Alireza; Bae, G. N.;
2013-01-01
understanding of the risks posed by personal exposure to indoor aerosols. Limited studies assessing integrated daily residential exposure to just one particle size fraction, ultrafine particles, show that the contribution of indoor sources ranged from 19% to 76%. This indicates a strong dependence on resident...
Current trends in Bayesian methodology with applications
Upadhyay, Satyanshu K; Dey, Dipak K; Loganathan, Appaia
2015-01-01
Collecting Bayesian material scattered throughout the literature, Current Trends in Bayesian Methodology with Applications examines the latest methodological and applied aspects of Bayesian statistics. The book covers biostatistics, econometrics, reliability and risk analysis, spatial statistics, image analysis, shape analysis, Bayesian computation, clustering, uncertainty assessment, high-energy astrophysics, neural networking, fuzzy information, objective Bayesian methodologies, empirical Bayes methods, small area estimation, and many more topics.Each chapter is self-contained and focuses on
Portfolio Allocation for Bayesian Optimization
Brochu, Eric; Hoffman, Matthew W.; De Freitas, Nando
2010-01-01
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It uses Bayesian methods to sample the objective efficiently using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several differen...
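One widely used acquisition function of the kind this paper allocates a portfolio over is expected improvement. A minimal sketch (here `mu` and `sigma` stand in for a Gaussian-process posterior mean and standard deviation at a candidate point; the numbers and the `xi` exploration margin are invented):

```python
from statistics import NormalDist

# Expected improvement (EI): trades off the model's estimate of the
# objective (mu) against its uncertainty (sigma) at a candidate point.
def expected_improvement(mu, sigma, best, xi=0.01):
    if sigma == 0.0:
        return 0.0  # no uncertainty left at this point
    z = (mu - best - xi) / sigma
    nd = NormalDist()
    return (mu - best - xi) * nd.cdf(z) + sigma * nd.pdf(z)

# A candidate whose posterior mean already beats the incumbent scores higher.
print(expected_improvement(mu=1.2, sigma=0.5, best=1.0))
```

The portfolio idea in the abstract arises because no single such criterion (EI, probability of improvement, upper confidence bound, ...) dominates on all problems.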
Neuroanatomy, neurology and Bayesian networks
Bielza Lozoya, Maria Concepcion
2014-01-01
Bayesian networks are data mining models with clear semantics and a sound theoretical foundation. In this keynote talk we will pinpoint a number of neuroscience problems that can be addressed using Bayesian networks. In neuroanatomy, we will show computer simulation models of dendritic trees and classification of neuron types, both based on morphological features. In neurology, we will present the search for genetic biomarkers in Alzheimer's disease and the prediction of health-related qualit...
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Probabilistic networks, also known as Bayesian networks and influence diagrams, have become one of the most promising technologies in the area of applied artificial intelligence, offering intuitive, efficient, and reliable methods for diagnosis, prediction, decision making, classification......, troubleshooting, and data mining under uncertainty. Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. Intended...
Dale Poirier
2008-01-01
This paper provides Bayesian rationalizations for White’s heteroskedastic consistent (HC) covariance estimator and various modifications of it. An informed Bayesian bootstrap provides the statistical framework.
Bayesian Stochastic Search for the Best Predictors: Nowcasting GDP Growth
Nikolaus Hautsch; Fuyu Yang
2014-01-01
We propose a Bayesian framework for nowcasting GDP growth in real time. Using vintage data on macroeconomic announcements, we set up a state space system connecting latent GDP growth rates to agencies' releases of GDP and other economic indicators. We propose a Gibbs sampling scheme to filter out daily GDP growth rates using all available macroeconomic information. We sample draws from the resulting posterior distribution, thereby allowing us to simulate backcasting, nowcasting, and forecasti...
Bayesian disclosure risk assessment: predicting small frequencies in contingency tables
Forster, Jonathan J.; Webb, Emily L
2007-01-01
We propose an approach for assessing the risk of individual identification in the release of categorical data. This requires the accurate calculation of predictive probabilities for those cells in a contingency table which have small sample frequencies, making the problem somewhat different from usual contingency table estimation, where interest is generally focussed on regions of high probability. Our approach is Bayesian and provides posterior predictive probabilities of identification risk...
International Nuclear Information System (INIS)
This report summarizes the work on the development of fibre metallic prefilters to be placed upstream of HEPA filters for the exhaust gases of nuclear process plants. Investigations at ambient and high temperature were carried out. Measurements of the filtration performance of Bekipor porous webs and sintered mats were performed in the AFLT (aerosol filtration at low temperature) unit with a throughput of 15 m3/h. A parametric study on the influence of particle size, fibre diameter, number of layers and superficial velocity led to the optimum choice of the working parameters. Three selected filter types were then tested with polydisperse aerosols using a candle-type filter configuration or a flat-type filter configuration. The small-diameter candle type is not well suited to a spray-nozzle regeneration system, so only the flat-type filter was retained for high-temperature tests. A high-temperature test unit (AFHT) with a throughput of 8 to 10 m3/h at 400 °C was used to test the three filter types with an aerosol generated by high-temperature calcination of a simulated nitric acid waste solution traced with 134Cs. The regeneration of the filter by spray washing and the effect of the regeneration on the filter performance was studied for the three filter types. The porous mats have a higher dust loading capacity than the sintered web, which means that their regeneration frequency can be kept lower
Electrically Driven Technologies for Radioactive Aerosol Abatement
Energy Technology Data Exchange (ETDEWEB)
David W. DePaoli; Ofodike A. Ezekoye; Costas Tsouris; Valmor F. de Almeida
2003-01-28
The purpose of this research project was to develop an improved understanding of how electrically driven processes, including electrocoalescence, acoustic agglomeration, and electric filtration, may be employed to efficiently treat problems caused by the formation of aerosols during DOE waste treatment operations. The production of aerosols during treatment and retrieval operations in radioactive waste tanks and during thermal treatment operations such as calcination presents a significant problem of cost, worker exposure, potential for release, and increased waste volume.
Washington University St Louis — TOMS_AI_G is an aerosol related dataset derived from the Total Ozone Monitoring Satellite (TOMS) Sensor. The TOMS aerosol index arises from absorbing aerosols such...
Nonparametric Bayesian Classification
Coram, M A
2002-01-01
A Bayesian approach to the classification problem is proposed in which random partitions play a central role. It is argued that the partitioning approach has the capacity to take advantage of a variety of large-scale spatial structures, if they are present in the unknown regression function $f_0$. An idealized one-dimensional problem is considered in detail. The proposed nonparametric prior uses random split points to partition the unit interval into a random number of pieces. This prior is found to provide a consistent estimate of the regression function in the $L^p$ topology, for any $1 \leq p < \infty$, and for arbitrary measurable $f_0:[0,1] \rightarrow [0,1]$. A Markov chain Monte Carlo (MCMC) implementation is outlined and analyzed. Simulation experiments are conducted to show that the proposed estimate compares favorably with a variety of conventional estimators. A striking resemblance between the posterior mean estimate and the bagged CART estimate is noted and discussed. For higher dimensions, a ...
BAT - Bayesian Analysis Toolkit
International Nuclear Information System (INIS)
One of the most vital steps in any data analysis is the statistical analysis and comparison with the prediction of a theoretical model. The many uncertainties associated with the theoretical model and the observed data require a robust statistical analysis tool. The Bayesian Analysis Toolkit (BAT) is a powerful statistical analysis software package based on Bayes' Theorem, developed to evaluate the posterior probability distribution for models and their parameters. It implements Markov Chain Monte Carlo to get the full posterior probability distribution, which in turn provides straightforward parameter estimation, limit setting and uncertainty propagation. Additional algorithms, such as Simulated Annealing, allow evaluation of the global mode of the posterior. BAT is developed in C++ and allows for a flexible definition of models. A set of predefined models covering standard statistical cases is also included in BAT. It has been interfaced to other commonly used software packages such as ROOT, Minuit, RooStats and CUBA. An overview of the software and its algorithms is provided along with several physics examples to cover a range of applications of this statistical tool. Future plans, new features and recent developments are briefly discussed.
The behaviour of sodium fire aerosols
International Nuclear Information System (INIS)
The knowledge of the behaviour of aerosols released in nuclear accidents is of great importance for nuclear reactor safety and environmental protection. In sodium-cooled fast breeder reactors, accidents may occur through leaking pipes, resulting in a considerable spill of sodium. Major amounts of airborne sodium fire aerosols will be formed due to the evaporation and reaction of hot sodium with oxygen. For estimating the environmental impact of these aerosols it is necessary to know their chemical and physical behaviour in containments and in the free atmosphere
Oak Ridge National Laboratory — The aerosol observation system (AOS) is the primary Atmospheric Radiation Measurement (ARM) platform for in situ aerosol measurements at the surface. The principal...
Bayesian seismic AVO inversion
Energy Technology Data Exchange (ETDEWEB)
Buland, Arild
2002-07-01
A new linearized AVO inversion technique is developed in a Bayesian framework. The objective is to obtain posterior distributions for P-wave velocity, S-wave velocity and density. Distributions for other elastic parameters can also be assessed, for example acoustic impedance, shear impedance and P-wave to S-wave velocity ratio. The inversion algorithm is based on the convolutional model and a linearized weak contrast approximation of the Zoeppritz equation. The solution is represented by a Gaussian posterior distribution with explicit expressions for the posterior expectation and covariance, hence exact prediction intervals for the inverted parameters can be computed under the specified model. The explicit analytical form of the posterior distribution provides a computationally fast inversion method. Tests on synthetic data show that all inverted parameters were almost perfectly retrieved when the noise approached zero. With realistic noise levels, acoustic impedance was the best determined parameter, while the inversion provided practically no information about the density. The inversion algorithm has also been tested on a real 3-D dataset from the Sleipner Field. The results show good agreement with well logs but the uncertainty is high. The stochastic model includes uncertainties of both the elastic parameters, the wavelet and the seismic and well log data. The posterior distribution is explored by Markov chain Monte Carlo simulation using the Gibbs sampler algorithm. The inversion algorithm has been tested on a seismic line from the Heidrun Field with two wells located on the line. The uncertainty of the estimated wavelet is low. In the Heidrun examples the effect of including uncertainty of the wavelet and the noise level was marginal with respect to the AVO inversion results. We have developed a 3-D linearized AVO inversion method with spatially coupled model parameters where the objective is to obtain posterior distributions for P-wave velocity, S
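The "Gaussian posterior with explicit expressions" the abstract describes is an instance of the standard linear-Gaussian update for a problem d = Gm + e. A minimal sketch with an invented forward operator G and made-up dimensions (the real method builds G from the convolutional model and the linearized Zoeppritz approximation):

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic linear-Gaussian inverse problem d = G m + e; sizes and G are
# invented for illustration only.
n_model, n_data = 4, 10
G = rng.normal(size=(n_data, n_model))
mu_prior = np.zeros(n_model)
C_prior = np.eye(n_model)
C_noise = 0.1 * np.eye(n_data)

m_true = rng.normal(size=n_model)
d = G @ m_true + rng.multivariate_normal(np.zeros(n_data), C_noise)

# Explicit Gaussian posterior: no sampling needed in the linear case.
S = G @ C_prior @ G.T + C_noise
K = C_prior @ G.T @ np.linalg.inv(S)
mu_post = mu_prior + K @ (d - G @ mu_prior)
C_post = C_prior - K @ G @ C_prior

print(mu_post)           # posterior expectation of the model parameters
print(np.diag(C_post))   # posterior variances -> exact prediction intervals
```

The closed form is what makes the method "computationally fast": the posterior covariance diagonal directly yields the per-parameter uncertainty the abstract reports (well resolved for acoustic impedance, poorly for density).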
Bayesian modeling using WinBUGS
Ntzoufras, Ioannis
2009-01-01
A hands-on introduction to the principles of Bayesian modeling using WinBUGS Bayesian Modeling Using WinBUGS provides an easily accessible introduction to the use of WinBUGS programming techniques in a variety of Bayesian modeling settings. The author provides an accessible treatment of the topic, offering readers a smooth introduction to the principles of Bayesian modeling with detailed guidance on the practical implementation of key principles. The book begins with a basic introduction to Bayesian inference and the WinBUGS software and goes on to cover key topics, including: Markov Chain Monte Carlo algorithms in Bayesian inference Generalized linear models Bayesian hierarchical models Predictive distribution and model checking Bayesian model and variable evaluation Computational notes and screen captures illustrate the use of both WinBUGS as well as R software to apply the discussed techniques. Exercises at the end of each chapter allow readers to test their understanding of the presented concepts and all ...
Probability biases as Bayesian inference
Directory of Open Access Journals (Sweden)
Andre; C. R. Martins
2006-11-01
Full Text Available In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.
Bayesian Methods and Universal Darwinism
Campbell, John
2010-01-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of Science as a Darwinian process consisting of a 'copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian Methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that system...
Bayesian methods for proteomic biomarker development
Directory of Open Access Journals (Sweden)
Belinda Hernández
2015-12-01
In this review we provide an introduction to Bayesian inference and demonstrate some of the advantages of using a Bayesian framework. We summarize how Bayesian methods have been used previously in proteomics and other areas of bioinformatics. Finally, we describe some popular and emerging Bayesian models from the statistical literature and provide a worked tutorial including code snippets to show how these methods may be applied for the evaluation of proteomic biomarkers.
A survey of aerosol research in European community programmes
International Nuclear Information System (INIS)
In the European Commission's (EC) 3rd Framework Programme (1990-1994) of community research and technological development, aerosol problems are of particular importance in the specific programmes Environment, Nuclear Fission Safety (with emphasis on Reactor Safety and on the Decommissioning of Nuclear Installations), Industrial and Materials Technologies, and Measurement and Testing. Under Environment, significant efforts are directed towards monitoring natural and anthropogenic aerosols in the atmosphere, understanding the role played by aerosols in ecosystem regulation, and the development of techniques to reduce aerosol emission from industrial plants. To ensure Nuclear Fission Safety, investigations are necessary to identify the mechanisms and determine the quantities of fission product aerosols released in the event of an accident and to develop measures for aerosol retention in such cases. The release of radioactive aerosols from nuclear installations in case of fire has been studied, and methods of aerosol abatement by acoustic techniques are under investigation. In decommissioning of nuclear installations the problem of aerosol formation and dispersion arises during dismantling operations. Industrial and Materials Technologies require information on aerosols ranging from welding fumes, asbestos fibres, lead compounds and quartz particles to aerosol/vapour mixtures of toxic products, aerosols from biotechnology industries and airborne micro-organisms. Finally, for Measurement and Testing, reference aerosols are needed for calibration purposes and to improve and harmonize particle counting characterisation. A brief summary of examples for each of the above activities, carried out in the form of EC cost shared actions or at the Commission's Joint Research Centre, will be given, together with a description of some aerosol problems still to be solved. (Author)
Bayesian test and Kuhn's paradigm
Institute of Scientific and Technical Information of China (English)
Chen Xiaoping
2006-01-01
Kuhn's theory of paradigm reveals a pattern of scientific progress, in which normal science alternates with scientific revolution. But Kuhn greatly underrated the function of scientific tests in his pattern, because he focused all his attention on the hypothetico-deductive schema instead of the Bayesian schema. This paper employs the Bayesian schema to re-examine Kuhn's theory of paradigm, to uncover its logical and rational components, and to illustrate the tensional structure of logic and belief, rationality and irrationality, in the process of scientific revolution.
3D Bayesian contextual classifiers
DEFF Research Database (Denmark)
Larsen, Rasmus
2000-01-01
We extend a series of multivariate Bayesian 2-D contextual classifiers to 3-D by specifying a simultaneous Gaussian distribution for the feature vectors as well as a prior distribution of the class variables of a pixel and its 6 nearest 3-D neighbours.
Bayesian Model Averaging for Propensity Score Analysis
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
Bayesian networks and food security - An introduction
Stein, A.
2004-01-01
This paper gives an introduction to Bayesian networks. Networks are defined and put into a Bayesian context. Directed acyclic graphs play a crucial role here. Two simple examples from food security are addressed. Possible uses of Bayesian networks for implementation and further use in decision sup
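The factorization along a directed acyclic graph that such an introduction builds on can be shown on a two-node toy network. A minimal sketch (the food-security-flavoured variables and all probabilities are invented for illustration):

```python
# Hypothetical two-node network: Rain -> Harvest.
p_rain = {True: 0.3, False: 0.7}
p_good_harvest = {True: 0.8, False: 0.4}   # P(good harvest | rain)

# The joint factorizes along the DAG: P(R, H) = P(R) * P(H | R)
def joint(rain, good):
    ph = p_good_harvest[rain]
    return p_rain[rain] * (ph if good else 1 - ph)

# Query by enumeration: P(Rain | Harvest = good), via Bayes' theorem
num = joint(True, True)
den = sum(joint(r, True) for r in (True, False))
print(round(num / den, 3))  # → 0.462
```

Observing a good harvest raises the probability of rain from the prior 0.3 to about 0.46; larger networks for decision support work the same way, just with more nodes and smarter inference algorithms than brute-force enumeration.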
Directory of Open Access Journals (Sweden)
K. Hara
2013-10-01
Full Text Available Unusual aerosol enhancement is often observed at Syowa Station, Antarctica during winter through spring. Simultaneous aerosol measurements near the surface and in the upper atmosphere were conducted twice, using a ground-based optical particle counter, a balloon-borne optical particle counter, and micro-pulse LIDAR (MPL), in August and September 2012. During 13–15 August, aerosol enhancement occurred immediately after a storm condition. A high backscatter ratio and high aerosol concentrations were observed from the surface to ca. 2.5 km over Syowa Station. Clouds appeared occasionally at the top of the aerosol-enhanced layer during the episode. Aerosol enhancement was terminated on 15 August by strong winds caused by a cyclone's approach. In the second case, on 5–7 September, aerosol number concentrations for Dp > 0.3 μm near the surface reached > 10⁴ L⁻¹ at about 15:00 UT on 5 September in spite of calm wind conditions, whereas MPL measurements showed that aerosols were enhanced at about 04:00 UT at 1000–1500 m above Syowa Station. The aerosol enhancement occurred from near the surface to ca. 4 km. In both cases, air masses with high aerosol enhancement below 2.5–3 km were transported mostly from the boundary layer over the sea-ice area. In addition, air masses at 3–4 km in the second case came from the boundary layer over the open-sea area. This air-mass history strongly suggests that dispersion of sea-salt particles from the sea-ice surface contributes considerably to the aerosol enhancement in the lower free troposphere (about 3 km), and that the release of sea-salt particles from the ocean surface engenders high aerosol concentrations in the free troposphere (3–4 km).
Bayesian variable order Markov models: Towards Bayesian predictive state representations
C. Dimitrakakis
2009-01-01
We present a Bayesian variable order Markov model that shares many similarities with predictive state representations. The resulting models are compact and much easier to specify and learn than classical predictive state representations. Moreover, we show that they significantly outperform a more st
Containment aerosol behaviour simulation studies in the BARC nuclear aerosol test facility
International Nuclear Information System (INIS)
A Nuclear Aerosol Test Facility (NATF) has been built and commissioned at Bhabha Atomic Research Centre to carry out simulation studies on the behaviour of aerosols released into the reactor containment under accident conditions. This report also discusses some new experimental techniques for estimating the density of metallic aggregates. The experimental studies have shown that the dynamic densities of aerosol aggregates are far lower than their material densities, as expected from the well-known fractal theory of aggregates. In the context of codes, this has a significant bearing, as it provides a mechanistic basis for the input density parameter used in estimating the aerosol evolution characteristics. The data generated under quiescent and turbulent conditions and the information on aggregate densities are now being used to validate the aerosol behaviour codes. (author)
Institute of Scientific and Technical Information of China (English)
Fengfu Fu; Liangjun Xu; Wei Ye; Yiquan Chen; Mingyu Jiang; Xueqin Xu
2006-01-01
Different-sized aerosols were collected by an Andersen air sampler to observe the detailed morphology of the black carbon (BC) aerosols which were separated chemically from the other accompanying aerosols, using a Scanning Electron Microscope equipped with an Energy Dispersive X-ray Spectrometer (SEM-EDX). The results indicate that most BC aerosols are spherical particles of about 50 nm in diameter and with a homogeneous surface. Results also show that these particles aggregate with other aerosols or with themselves to form larger agglomerates in the micrometer range. The shape of these 50-nm BC spherical particles was found to be very similar to that of BC particles released from petroleum-powered vehicular internal combustion engines. These spherical BC particles were shown to be different from the previously reported fullerenes found using Matrix-Assisted Laser Desorption/Ionization Time-Of-Flight Mass Spectrometry (MALDI-TOF-MS).
Fracture prediction of cardiac lead medical devices using Bayesian networks
International Nuclear Information System (INIS)
A novel Bayesian network methodology has been developed to enable the prediction of fatigue fracture of cardiac lead medical devices. The methodology integrates in-vivo device loading measurements, patient demographics, patient activity level, in-vitro fatigue strength measurements, and cumulative damage modeling techniques. Many plausible combinations of these variables can be simulated within a Bayesian network framework to generate a family of fatigue fracture survival curves, enabling sensitivity analyses and the construction of confidence bounds on reliability predictions. The method was applied to the prediction of conductor fatigue fracture near the shoulder for two market-released cardiac defibrillation leads which had different product performance histories. The case study used recently published data describing the in-vivo curvature conditions and the in-vitro fatigue strength. The prediction results from the methodology aligned well with the observed qualitative ranking of field performance, as well as the quantitative field survival from fracture. This initial success suggests that study of further extension of this method to other medical device applications is warranted. - Highlights: • A new method to simulate the fatigue experience of an implanted cardiac lead. • Fatigue strength and use conditions are incorporated within a Bayesian network. • Confidence bounds reflect the uncertainty in all input parameters. • A case study is presented using market released cardiac leads
Aerosol Climate Time Series in ESA Aerosol_cci
Popp, Thomas; de Leeuw, Gerrit; Pinnock, Simon
2016-04-01
Within the ESA Climate Change Initiative (CCI), Aerosol_cci (2010 - 2017) conducts intensive work to improve algorithms for the retrieval of aerosol information from European sensors. Meanwhile, full mission time series of 2 GCOS-required aerosol parameters are completely validated and released: Aerosol Optical Depth (AOD) from dual view ATSR-2 / AATSR radiometers (3 algorithms, 1995 - 2012), and stratospheric extinction profiles from star occultation GOMOS spectrometer (2002 - 2012). Additionally, a 35-year multi-sensor time series of the qualitative Absorbing Aerosol Index (AAI) together with sensitivity information and an AAI model simulator is available. Complementary aerosol properties requested by GCOS are in a "round robin" phase, where various algorithms are inter-compared: fine mode AOD, mineral dust AOD (from the thermal IASI spectrometer, but also from ATSR instruments and the POLDER sensor), absorption information and aerosol layer height. As a quasi-reference for validation in a few selected regions with sparse ground-based observations, the multi-pixel GRASP algorithm for the POLDER instrument is used. Validation of first dataset versions (vs. AERONET, MAN) and inter-comparison to other satellite datasets (MODIS, MISR, SeaWIFS) proved the high quality of the available datasets, comparable to other satellite retrievals, and revealed needs for algorithm improvement (for example for higher AOD values) which were taken into account for a reprocessing. The datasets contain pixel level uncertainty estimates which were also validated and improved in the reprocessing. For the three ATSR algorithms the use of an ensemble method was tested. The paper will summarize and discuss the status of dataset reprocessing and validation. The focus will be on the ATSR, GOMOS and IASI datasets. Pixel level uncertainties validation will be summarized and discussed including unknown components and their potential usefulness and limitations. Opportunities for time series extension
Bayesian Analysis of Experimental Data
Directory of Open Access Journals (Sweden)
Lalmohan Bhar
2013-10-01
Full Text Available Analysis of experimental data from a Bayesian point of view has been considered. Appropriate methodology has been developed for application to designed experiments. A Normal-Gamma distribution has been considered as the prior distribution. The developed methodology has been applied to real experimental data taken from long-term fertilizer experiments.
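The Normal-Gamma prior mentioned here is conjugate for normal data with unknown mean and precision, so the posterior hyperparameters have a closed form. A minimal sketch of that update (the data and the prior hyperparameters are invented for illustration):

```python
import numpy as np

# Conjugate Normal-Gamma update for normal data with unknown mean mu and
# precision tau: mu | tau ~ N(mu0, 1/(kappa0*tau)), tau ~ Gamma(alpha0, beta0).
def normal_gamma_update(x, mu0=0.0, kappa0=1.0, alpha0=1.0, beta0=1.0):
    n = len(x)
    xbar = np.mean(x)
    ss = np.sum((x - xbar) ** 2)                       # within-sample scatter
    kappa_n = kappa0 + n
    mu_n = (kappa0 * mu0 + n * xbar) / kappa_n         # precision-weighted mean
    alpha_n = alpha0 + n / 2
    beta_n = beta0 + 0.5 * ss + kappa0 * n * (xbar - mu0) ** 2 / (2 * kappa_n)
    return mu_n, kappa_n, alpha_n, beta_n

rng = np.random.default_rng(2)
x = rng.normal(5.0, 2.0, size=50)                      # hypothetical plot yields
mu_n, kappa_n, alpha_n, beta_n = normal_gamma_update(x)
print(mu_n)  # posterior mean estimate, shrunk slightly toward the prior mean
```

The posterior mean `mu_n` is a precision-weighted average of the prior mean and the sample mean, which is the basic shrinkage effect such an analysis brings to designed experiments.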
Bayesian image restoration, using configurations
DEFF Research Database (Denmark)
Thorarinsdottir, Thordis Linda
2006-01-01
configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for the salt and pepper noise. The inference in the model is discussed...
ANALYSIS OF BAYESIAN CLASSIFIER ACCURACY
Directory of Open Access Journals (Sweden)
Felipe Schneider Costa
2013-01-01
Full Text Available The naïve Bayes classifier is considered one of the most effective classification algorithms today, competing with more modern and sophisticated classifiers. Despite being based on the unrealistic (naïve) assumption that all variables are independent given the output class, the classifier provides proper results. However, depending on the scenario (network structure, number of samples or training cases, number of variables), the network may not provide appropriate results. This study uses a variable-selection process, based on the chi-squared test, to verify the existence of dependence between variables in the data model in order to identify the reasons which prevent a Bayesian network from providing good performance. A detailed analysis of the data is also proposed, unlike in other existing work, as well as adjustments in the case of limit values between two adjacent classes. Furthermore, variable weights, calculated with the mutual information function, are used in the calculation of the posterior probabilities. Tests were applied to both a naïve Bayesian network and a hierarchical Bayesian network. After testing, a significant reduction in the error rate was observed. The naïve Bayesian network showed a drop in error rate from twenty-five percent to five percent, relative to the initial results of the classification process. In the hierarchical network, the fifteen percent error rate not only dropped, but the final result reached zero.
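The chi-squared screening step described above can be sketched as follows (a minimal illustration with a hardcoded df=1 critical value; the mutual-information weighting and the hierarchical network are not shown):

```python
import numpy as np

def chi2_stat(x, y):
    """Pearson chi-squared statistic for the contingency table of two
    discrete variables (no continuity correction)."""
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    obs = np.zeros((len(xv), len(yv)))
    np.add.at(obs, (xi, yi), 1)
    exp = obs.sum(1, keepdims=True) * obs.sum(0, keepdims=True) / obs.sum()
    return ((obs - exp) ** 2 / exp).sum()

def select_features(X, y, crit=3.841):
    """Keep columns whose statistic against the class exceeds the critical
    value (3.841 = chi-squared, df=1, alpha=0.05; illustrative default)."""
    return [j for j in range(X.shape[1]) if chi2_stat(X[:, j], y) > crit]
```

Columns that fail the test carry no detectable dependence on the class and would only dilute the classifier.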
Bayesian Agglomerative Clustering with Coalescents
Teh, Yee Whye; Daumé III, Hal; Roy, Daniel
2009-01-01
We introduce a new Bayesian model for hierarchical clustering based on a prior over trees called Kingman's coalescent. We develop novel greedy and sequential Monte Carlo inferences which operate in a bottom-up agglomerative fashion. We show experimentally the superiority of our algorithms over others, and demonstrate our approach in document clustering and phylolinguistics.
Bayesian Networks and Influence Diagrams
DEFF Research Database (Denmark)
Kjærulff, Uffe Bro; Madsen, Anders Læsø
Bayesian Networks and Influence Diagrams: A Guide to Construction and Analysis, Second Edition, provides a comprehensive guide for practitioners who wish to understand, construct, and analyze intelligent systems for decision support based on probabilistic networks. This new edition contains six new...
Topics in Bayesian statistics and maximum entropy
International Nuclear Information System (INIS)
Notions of Bayesian decision theory and maximum entropy methods are reviewed with particular emphasis on probabilistic inference and Bayesian modeling. The axiomatic approach is considered as the best justification of Bayesian analysis and maximum entropy principle applied in natural sciences. Particular emphasis is put on solving the inverse problem in digital image restoration and Bayesian modeling of neural networks. Further topics addressed briefly include language modeling, neutron scattering, multiuser detection and channel equalization in digital communications, genetic information, and Bayesian court decision-making. (author)
Uncertainty quantification in aerosol dynamics
International Nuclear Information System (INIS)
The influence of uncertainty in coagulation and deposition mechanisms, as well as in the initial conditions, on the solution of the aerosol dynamic equation has been assessed using polynomial chaos theory. In this way, large uncertainties can be incorporated into the equations and their propagation studied as a function of space and time. We base our calculations on the simplified point-model dynamic equation, which includes coagulation and deposition removal mechanisms. Results are given for the stochastic mean aerosol density as a function of time, as well as its variance. The stochastic mean and deterministic mean are shown to differ, and the associated uncertainty, in the form of a sensitivity coefficient, is obtained as a function of time. In addition, we obtain the probability density function of the aerosol density and show how this varies with time. In view of the generally uncertain nature of an accidental aerosol release in a nuclear reactor accident, the polynomial chaos method is a particularly useful technique, as it allows one to deal with a very large spread of input data and examine the effect this has on the quantities of interest. Convergence matters are studied and numerical values given.
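The point-model equation with constant coefficients, dn/dt = -K n^2 - lam n, has a closed-form solution, which makes a plain Monte Carlo stand-in for the paper's polynomial-chaos propagation easy to sketch (the coefficient values and lognormal spreads below are illustrative, not values from the paper):

```python
import numpy as np

def n_t(t, n0, K, lam):
    """Closed-form solution of dn/dt = -K n^2 - lam n
    (coagulation plus first-order deposition removal)."""
    e = np.exp(-lam * t)
    return lam * n0 * e / (lam + K * n0 * (1.0 - e))

# Sample uncertain coagulation (K) and deposition (lam) coefficients and
# propagate them through the solution at one time point.
rng = np.random.default_rng(0)
K = rng.lognormal(np.log(1e-9), 0.3, 20000)
lam = rng.lognormal(np.log(1e-3), 0.3, 20000)
n = n_t(600.0, 1e8, K, lam)
print(n.mean(), n.std())            # stochastic mean and spread
print(n_t(600.0, 1e8, 1e-9, 1e-3))  # solution at the mean parameters
```

As in the abstract, the stochastic mean differs from the solution evaluated at the deterministic mean parameters, because the solution is nonlinear in K and lam.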
Bayesian analysis of rare events
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
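The rejection-sampling reinterpretation that BUS builds on can be sketched in a few lines (the prior, likelihood and constant c below are illustrative; the reliability methods FORM, IS and SuS that make BUS efficient in practice are not shown):

```python
import numpy as np

def rejection_posterior(log_like, prior_sampler, log_c, n, seed=0):
    """Accept a prior sample theta when log(u) <= log L(theta) - log c,
    with c >= max L; the accepted samples follow the posterior."""
    rng = np.random.default_rng(seed)
    theta = prior_sampler(rng, n)
    accept = np.log(rng.uniform(size=n)) <= log_like(theta) - log_c
    return theta[accept]

# Standard-normal prior, one observation x = 1 with unit-variance Gaussian
# likelihood; the exact conjugate posterior is N(0.5, 0.5).
post = rejection_posterior(
    lambda t: -0.5 * (1.0 - t) ** 2,   # log-likelihood, maximum 0 at t = 1
    lambda rng, n: rng.normal(size=n),
    0.0,                               # log c = 0 since max L = 1
    200_000,
)
print(post.mean(), post.var())
```

The acceptance event plays the role of a "failure" event in structural reliability, which is what lets rare-event estimators be reused for Bayesian updating.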
Bayesian methods for measures of agreement
Broemeling, Lyle D
2009-01-01
Using WinBUGS to implement Bayesian inferences of estimation and testing hypotheses, Bayesian Methods for Measures of Agreement presents useful methods for the design and analysis of agreement studies. It focuses on agreement among the various players in the diagnostic process.The author employs a Bayesian approach to provide statistical inferences based on various models of intra- and interrater agreement. He presents many examples that illustrate the Bayesian mode of reasoning and explains elements of a Bayesian application, including prior information, experimental information, the likelihood function, posterior distribution, and predictive distribution. The appendices provide the necessary theoretical foundation to understand Bayesian methods as well as introduce the fundamentals of programming and executing the WinBUGS software.Taking a Bayesian approach to inference, this hands-on book explores numerous measures of agreement, including the Kappa coefficient, the G coefficient, and intraclass correlation...
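For reference, the (frequentist) point estimate of the Kappa coefficient mentioned above can be computed in a few lines; the book's Bayesian treatment would instead place a posterior on the underlying agreement-table probabilities:

```python
import numpy as np

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters over the same items."""
    a, b = np.asarray(a), np.asarray(b)
    labels = np.unique(np.concatenate([a, b]))
    po = np.mean(a == b)                                          # observed agreement
    pe = sum(np.mean(a == l) * np.mean(b == l) for l in labels)   # chance agreement
    return (po - pe) / (1.0 - pe)
```

Kappa is 1 for perfect agreement and 0 when agreement equals what the raters' marginal rates would produce by chance.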
Plug & Play object oriented Bayesian networks
DEFF Research Database (Denmark)
Bangsø, Olav; Flores, J.; Jensen, Finn Verner
2003-01-01
Object oriented Bayesian networks have proven themselves useful in recent years. The idea of applying an object oriented approach to Bayesian networks has extended their scope to larger domains that can be divided into autonomous but interrelated entities. Object oriented Bayesian networks have...... been shown to be quite suitable for dynamic domains as well. However, processing object oriented Bayesian networks in practice does not take advantage of their modular structure. Normally the object oriented Bayesian network is transformed into a Bayesian network and, inference is performed...... by constructing a junction tree from this network. In this paper we propose a method for translating directly from object oriented Bayesian networks to junction trees, avoiding the intermediate translation. We pursue two main purposes: firstly, to maintain the original structure organized in an instance tree...
Aerosol typing - key information from aerosol studies
Mona, Lucia; Kahn, Ralph; Papagiannopoulos, Nikolaos; Holzer-Popp, Thomas; Pappalardo, Gelsomina
2016-04-01
Aerosol typing is a key source of aerosol information from ground-based and satellite-borne instruments. Depending on the specific measurement technique, aerosol typing can be used as input for retrievals or represents an output for other applications. Typically, aerosol retrievals require some a priori or external aerosol type information. The accuracy of the derived aerosol products strongly depends on the reliability of these assumptions. Different sensors can make use of different aerosol type inputs. A critical review and harmonization of these procedures could significantly reduce related uncertainties. On the other hand, satellite measurements in recent years are providing valuable information about the global distribution of aerosol types, showing for example the main source regions and typical transport paths. Climatological studies of aerosol load at global and regional scales often rely on inferred aerosol type. There is still a high degree of inhomogeneity among satellite aerosol typing schemes, which makes it difficult to use different sensor datasets in a consistent way. Knowledge of the 4D aerosol type distribution at these scales is essential for understanding the impact of different aerosol sources on climate, precipitation and air quality. All this information is needed for planning upcoming aerosol emissions policies. The exchange of expertise and the communication between satellite and ground-based measurement communities is fundamental for improving long-term dataset consistency and for reducing aerosol type distribution uncertainties. Aerosol typing has been recognized as one of the high-priority activities of the AEROSAT (International Satellite Aerosol Science Network, http://aero-sat.org/) initiative. In the AEROSAT framework, a first critical review of aerosol typing procedures has been carried out. The review underlines the high heterogeneity in many aspects: approach, nomenclature, assumed number of components and parameters used for the
The DRAGON aerosol research facility to study aerosol behaviour for reactor safety applications
International Nuclear Information System (INIS)
During a severe accident in a nuclear power plant, fission products are expected to be released in the form of aerosol particles and droplets. To study the behaviour of safety-relevant reactor components under aerosol loads and prototypical severe accident conditions, the multi-purpose aerosol generation facility DRAGON has been used since 1994 for several projects. DRAGON can generate aerosol particles by the evaporation-condensation technique using a plasma torch system, by fluidized bed, and by atomization of particles suspended in a liquid. Soluble, hygroscopic aerosols (e.g. CsOH) and insoluble aerosol particles (e.g. SnO2, TiO2), or mixtures of them, can be used. DRAGON uses state-of-the-art thermal-hydraulic, data acquisition and aerosol measurement techniques and is mainly composed of a mixing chamber, the plasma torch system, a steam generator, nitrogen gas and compressed air delivery systems, aerosol delivery piping, gas heaters and several auxiliary systems to provide vacuum, coolant and off-gas treatment. The facility can be operated at a system pressure of 5 bar, temperatures of 300 deg. C, and flow rates of non-condensable gas of 900 kg/h and steam of 270 kg/h, respectively. A test section under investigation is attached to DRAGON. The paper summarizes and demonstrates, with the help of two project examples, the capabilities of DRAGON for reactor safety studies. (authors)
MODIS 3 km aerosol product: algorithm and global perspective
Remer, L. A.; Mattoo, S.; Levy, R. C.; Munchak, L.
2013-01-01
After more than a decade of producing a nominal 10 km aerosol product based on the dark target method, the MODIS aerosol team will be releasing a nominal 3 km product as part of their Collection 6 release. The new product differs from the original 10 km product only in the manner in which reflectance pixels are ingested, organized and selected by the aerosol algorithm. Overall, the 3 km product closely mirrors the 10 km product. However, the finer resolution product is able to retrieve over o...
Flexible Bayesian Nonparametric Priors and Bayesian Computational Methods
Zhu, Weixuan
2016-01-01
The definition of vectors of dependent random probability measures is a topic of interest in Bayesian nonparametrics. They represent dependent nonparametric prior distributions that are useful for modelling observables for which specific covariate values are known. Our first contribution is the introduction of novel multivariate vectors of two-parameter Poisson-Dirichlet process. The dependence is induced by applying a Lévy copula to the marginal Lévy intensities. Our attenti...
Protection of air in premises and environment against beryllium aerosols
Energy Technology Data Exchange (ETDEWEB)
Bitkolov, N.Z.; Vishnevsky, E.P.; Krupkin, A.V. [Research Inst. of Industrial and Marine Medicine, St. Petersburg (Russian Federation)
1998-01-01
First and foremost, the danger of beryllium aerosols concerns the possibility of their inhalation. The situation is aggravated by the high biological activity of beryllium in the human lung. The small allowable concentration of beryllium aerosols in air poses a rather complex and expensive problem of pollution prevention and air cleaning. The delivery and transport of beryllium aerosols from the sites of their formation are defined by the ventilation circuit, which forms the aerodynamics of air flows in premises, and by the aerodynamic links between premises. Aerosols can be released into the air of premises from hoods and from isolated, hermetically sealed vessels by vibrations, as well as by pulses of temperature and pressure. Furthermore, redispersion of aerosols from dirty surfaces is possible. Effective protection of air against beryllium aerosols at industrial plants is provided by a complex of hygienic measures: from individual means of respiratory protection up to collective means of preventing air pollution. (J.P.N.)
Aerosol mobility size spectrometer
Wang, Jian; Kulkarni, Pramod
2007-11-20
A device for measuring aerosol size distribution within a sample containing aerosol particles. The device generally includes a spectrometer housing defining an interior chamber and a camera for recording aerosol size streams exiting the chamber. The housing includes an inlet for introducing a flow medium into the chamber in a flow direction, an aerosol injection port adjacent the inlet for introducing a charged aerosol sample into the chamber, a separation section for applying an electric field to the aerosol sample across the flow direction and an outlet opposite the inlet. In the separation section, the aerosol sample becomes entrained in the flow medium and the aerosol particles within the aerosol sample are separated by size into a plurality of aerosol flow streams under the influence of the electric field. The camera is disposed adjacent the housing outlet for optically detecting a relative position of at least one aerosol flow stream exiting the outlet and for optically detecting the number of aerosol particles within the at least one aerosol flow stream.
Directory of Open Access Journals (Sweden)
A. Määttä
2013-09-01
Full Text Available We study uncertainty quantification in remote sensing of aerosols in the atmosphere with top-of-the-atmosphere reflectance measurements from the nadir-viewing Ozone Monitoring Instrument (OMI). The focus is on the uncertainty in aerosol model selection among pre-calculated aerosol models and on the statistical modelling of the model inadequacies. The aim is to apply statistical methodologies that improve the uncertainty estimates of the aerosol optical thickness (AOT) retrieval by propagating model selection and model error related uncertainties more realistically. We utilise Bayesian model selection and model averaging methods for the model selection problem and use Gaussian processes to model the smooth systematic discrepancies between the modelled and observed reflectance. The systematic model error is learned from an ensemble of operational retrievals. The operational OMI multi-wavelength aerosol retrieval algorithm OMAERO is used for cloud-free, over-land pixels of the OMI instrument with the additional Bayesian model selection and model discrepancy techniques. The method is demonstrated with four examples with different aerosol properties: weakly absorbing aerosols, forest fires over Greece and Russia, and Sahara desert dust. The presented statistical methodology is general; it is not restricted to this particular satellite retrieval application.
Bayesian approach to rough set
Marwala, Tshilidzi
2007-01-01
This paper proposes an approach to training rough set models within a Bayesian framework, trained using the Markov chain Monte Carlo (MCMC) method. The prior probabilities are constructed from the prior knowledge that good rough set models have fewer rules. Markov chain Monte Carlo sampling is conducted by sampling in the rough set granule space, and the Metropolis algorithm is used as the acceptance criterion. The proposed method is tested by estimating the risk of HIV given demographic data. The results obtained show that the proposed approach is able to achieve an average accuracy of 58%, with the accuracy varying up to 66%. In addition, the Bayesian rough set gives the probabilities of the estimated HIV status as well as the linguistic rules describing how the demographic parameters drive the risk of HIV.
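The Metropolis acceptance step described above is generic; a minimal sketch (with a standard-normal target as a stand-in for the rough-set posterior, and a Gaussian random-walk proposal) looks like:

```python
import numpy as np

def metropolis(log_target, x0, n_steps, step=1.0, seed=0):
    """Random-walk Metropolis sampler for an unnormalized log density."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    out = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step * rng.normal()          # symmetric proposal
        lp_prop = log_target(prop)
        if np.log(rng.uniform()) < lp_prop - lp:  # accept with prob min(1, ratio)
            x, lp = prop, lp_prop
        out[i] = x
    return out

samples = metropolis(lambda x: -0.5 * x * x, 0.0, 50_000)
print(samples.mean(), samples.std())
```

In the paper's setting, `log_target` would combine the data likelihood with the prior that penalizes rough set models with many rules.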
Attention in a bayesian framework
DEFF Research Database (Denmark)
Whiteley, Louise Emma; Sahani, Maneesh
2012-01-01
include both selective phenomena, where attention is invoked by cues that point to particular stimuli, and integrative phenomena, where attention is invoked dynamically by endogenous processing. However, most previous Bayesian accounts of attention have focused on describing relatively simple experimental...... settings, where cues shape expectations about a small number of upcoming stimuli and thus convey "prior" information about clearly defined objects. While operationally consistent with the experiments it seeks to describe, this view of attention as prior seems to miss many essential elements of both its......The behavioral phenomena of sensory attention are thought to reflect the allocation of a limited processing resource, but there is little consensus on the nature of the resource or why it should be limited. Here we argue that a fundamental bottleneck emerges naturally within Bayesian models of...
Bayesian Sampling using Condition Indicators
DEFF Research Database (Denmark)
Faber, Michael H.; Sørensen, John Dalsgaard
2002-01-01
allows for a Bayesian formulation of the indicators whereby the experience and expertise of the inspection personnel may be fully utilized and consistently updated as frequentistic information is collected. The approach is illustrated on an example considering a concrete structure subject to corrosion......The problem of control quality of components is considered for the special case where the acceptable failure rate is low, the test costs are high and where it may be difficult or impossible to test the condition of interest directly. Based on the classical control theory and the concept of...... condition indicators introduced by Benjamin and Cornell (1970) a Bayesian approach to quality control is formulated. The formulation is then extended to the case where the quality control is based on sampling of indirect information about the condition of the components, i.e. condition indicators. This...
BAYESIAN IMAGE RESTORATION, USING CONFIGURATIONS
Directory of Open Access Journals (Sweden)
Thordis Linda Thorarinsdottir
2011-05-01
Full Text Available In this paper, we develop a Bayesian procedure for removing noise from images that can be viewed as noisy realisations of random sets in the plane. The procedure utilises recent advances in configuration theory for noise free random sets, where the probabilities of observing the different boundary configurations are expressed in terms of the mean normal measure of the random set. These probabilities are used as prior probabilities in a Bayesian image restoration approach. Estimation of the remaining parameters in the model is outlined for salt and pepper noise. The inference in the model is discussed in detail for 3 × 3 and 5 × 5 configurations and examples of the performance of the procedure are given.
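Stripped of the configuration-based prior, the Bayesian restoration step reduces to a per-pixel posterior under flip noise; the sketch below uses a single scalar prior per pixel as a simplified stand-in for the mean-normal-measure configuration probabilities:

```python
def restore_prob(y, prior1, flip):
    """Posterior P(true pixel = 1 | observed y) for a binary image under
    symmetric salt-and-pepper noise that flips a pixel with probability flip."""
    like1 = 1.0 - flip if y == 1 else flip   # P(y | true = 1)
    like0 = flip if y == 1 else 1.0 - flip   # P(y | true = 0)
    num = like1 * prior1
    return num / (num + like0 * (1.0 - prior1))
```

With a flat prior the posterior simply trusts the observation (0.9 for a 10% flip rate); an informative prior, such as one derived from neighbouring configurations, can pull it back towards 0.5 or below.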
Bayesian Seismology of the Sun
Gruberbauer, Michael
2013-01-01
We perform a Bayesian grid-based analysis of the solar l=0,1,2 and 3 p modes obtained via BiSON in order to deliver the first Bayesian asteroseismic analysis of the solar composition problem. We do not find decisive evidence to prefer either of the contending chemical compositions, although the revised solar abundances (AGSS09) are more probable in general. We do find indications for systematic problems in standard stellar evolution models, unrelated to the consequences of inadequate modelling of the outer layers on the higher-order modes. The seismic observables are best fit by solar models that are several hundred million years older than the meteoritic age of the Sun. Similarly, meteoritic age calibrated models do not adequately reproduce the observed seismic observables. Our results suggest that these problems will affect any asteroseismic inference that relies on a calibration to the Sun.
Bayesian priors for transiting planets
Kipping, David M
2016-01-01
As astronomers push towards discovering ever-smaller transiting planets, it is increasingly common to deal with low signal-to-noise ratio (SNR) events, where the choice of priors plays an influential role in Bayesian inference. In the analysis of exoplanet data, the selection of priors is often treated as a nuisance, with observers typically defaulting to uninformative distributions. Such treatments miss a key strength of the Bayesian framework, especially in the low SNR regime, where even weak a priori information is valuable. When estimating the parameters of a low-SNR transit, two key pieces of information are known: (i) the planet has the correct geometric alignment to transit and (ii) the transit event exhibits sufficient signal-to-noise to have been detected. These represent two forms of observational bias. Accordingly, when fitting transits, the model parameter priors should not follow the intrinsic distributions of said terms, but rather those of both the intrinsic distributions and the observational ...
Bayesian Inference for Radio Observations
Lochner, Michelle; Zwart, Jonathan T L; Smirnov, Oleg; Bassett, Bruce A; Oozeer, Nadeem; Kunz, Martin
2015-01-01
(Abridged) New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inaccurate uncertainty estimates and biased results because such methods ignore any correlations between parameters. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realisation of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. Thi...
Bayesian inference on proportional elections.
Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio
2015-01-01
Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
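A Monte Carlo sketch of the seat-probability question (using Dirichlet posterior sampling over vote shares and a highest-averages D'Hondt allocation as a simplified stand-in for the exact Brazilian seat-distribution rule):

```python
import numpy as np

def dhondt(votes, seats):
    """Highest-averages (D'Hondt) seat allocation."""
    votes = np.asarray(votes, float)
    alloc = np.zeros(len(votes), dtype=int)
    for _ in range(seats):
        alloc[np.argmax(votes / (alloc + 1))] += 1  # next-highest quotient wins
    return alloc

def prob_representation(poll_counts, seats, n_sim=10_000, seed=0):
    """Posterior probability that each party wins at least one seat,
    with a Dirichlet(1 + counts) posterior over vote shares."""
    rng = np.random.default_rng(seed)
    shares = rng.dirichlet(1 + np.asarray(poll_counts), size=n_sim)
    return np.mean([dhondt(s, seats) > 0 for s in shares], axis=0)
```

This illustrates the abstract's point: a small party can have a non-negligible vote percentage yet a near-zero posterior probability of representation.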
A Bayesian Nonparametric IRT Model
Karabatsos, George
2015-01-01
This paper introduces a flexible Bayesian nonparametric Item Response Theory (IRT) model, which applies to dichotomous or polytomous item responses, and which can apply to either unidimensional or multidimensional scaling. This is an infinite-mixture IRT model, with person ability and item difficulty parameters, and with a random intercept parameter that is assigned a mixing distribution, with mixing weights a probit function of other person and item parameters. As a result of its flexibility...
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Mohammad-Djafari, Ali
2007-01-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Bayesian segmentation of hyperspectral images
Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali
2004-11-01
In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.
Bayesian Stable Isotope Mixing Models
Parnell, Andrew C.; Phillips, Donald L.; Bearhop, Stuart; Semmens, Brice X.; Ward, Eric J.; Moore, Jonathan W.; Andrew L Jackson; Inger, Richard
2012-01-01
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixture. The most widely used application is quantifying the diet of organisms based on the food sources they have been observed to consume. At the centre of the multivariate statistical model we propose is a compositional m...
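A minimal sketch of the simplest SIMM case, two sources and a single isotope tracer, with a grid posterior over the proportion of source 1 (the isotope values and noise level are illustrative):

```python
import numpy as np

def mixing_posterior(d_mix, d1, d2, sigma=0.5):
    """Grid posterior over the proportion p of source 1 in a two-source,
    single-isotope mixing model: d_mix ~ N(p*d1 + (1-p)*d2, sigma)."""
    p = np.linspace(0.0, 1.0, 1001)
    loglik = -0.5 * ((d_mix - (p * d1 + (1.0 - p) * d2)) / sigma) ** 2
    w = np.exp(loglik - loglik.max())
    return p, w / w.sum()
```

The full Bayesian SIMMs reviewed in the paper extend this idea to many sources and tracers, with compositional priors and extra variance terms for trophic fractionation.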
Bayesian Network--Response Regression
WANG, LU; Durante, Daniele; Dunson, David B.
2016-01-01
There is an increasing interest in learning how human brain networks vary with continuous traits (e.g., personality, cognitive abilities, neurological disorders), but flexible procedures to accomplish this goal are limited. We develop a Bayesian semiparametric model, which combines low-rank factorizations and Gaussian process priors to allow flexible shifts of the conditional expectation for a network-valued random variable across the feature space, while including subject-specific random eff...
Bayesian estimation of turbulent motion
Héas, P.; Herzet, C.; Mémin, E.; Heitz, D.; P. D. Mininni
2013-01-01
Based on physical laws describing the multi-scale structure of turbulent flows, this article proposes a regularizer for fluid motion estimation from an image sequence. Regularization is achieved by imposing some scale invariance property between histograms of motion increments computed at different scales. By reformulating this problem from a Bayesian perspective, an algorithm is proposed to jointly estimate motion, regularization hyper-parameters, and to select the ...
Elements of Bayesian experimental design
Energy Technology Data Exchange (ETDEWEB)
Sivia, D.S. [Rutherford Appleton Lab., Oxon (United Kingdom)
1997-09-01
We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.
Skill Rating by Bayesian Inference
Di Fatta, Giuseppe; Haworth, Guy McCrossan; Regan, Kenneth W.
2009-01-01
Systems Engineering often involves computer modelling the behaviour of proposed systems and their components. Where a component is human, fallibility must be modelled by a stochastic agent. The identification of a model of decision-making over quantifiable options is investigated using the game-domain of Chess. Bayesian methods are used to infer the distribution of players’ skill levels from the moves they play rather than from their competitive results. The approach is used on large sets of ...
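A toy version of inferring skill from moves rather than results (the softmax move model and the two-level skill grid are illustrative assumptions, not the authors' model):

```python
import numpy as np

def update_skill(prior, move_values, chosen, skills):
    """One Bayesian update of a discrete posterior over skill levels.
    P(move | skill) is a softmax over engine move evaluations whose
    sharpness grows with skill (a common simplified choice)."""
    post = np.asarray(prior, float).copy()
    v = np.asarray(move_values, float)
    for i, s in enumerate(skills):
        z = np.exp(s * (v - v.max()))      # stable softmax numerator
        post[i] *= z[chosen] / z.sum()     # likelihood of the chosen move
    return post / post.sum()

# A player who keeps choosing the engine's best move: belief shifts
# towards the higher skill level.
post = np.array([0.5, 0.5])
for _ in range(5):
    post = update_skill(post, [1.0, 0.0, 0.0], 0, skills=[0.1, 5.0])
print(post)
```

Each observed move multiplies the posterior by how plausible that move is at each skill level, so consistent strong play concentrates belief on high skill without using any game outcomes.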
Topics in Nonparametric Bayesian Statistics
2003-01-01
The intersection set of Bayesian and nonparametric statistics was almost empty until about 1973, but now seems to be growing at a healthy rate. This chapter gives an overview of various theoretical and applied research themes inside this field, partly complementing and extending recent reviews of Dey, Müller and Sinha (1998) and Walker, Damien, Laud and Smith (1999). The intention is not to be complete or exhaustive, but rather to touch on research areas of interest, partly by example.
Cover Tree Bayesian Reinforcement Learning
Tziortziotis, Nikolaos; Dimitrakakis, Christos; Blekas, Konstantinos
2013-01-01
This paper proposes an online tree-based Bayesian approach for reinforcement learning. For inference, we employ a generalised context tree model. This defines a distribution on multivariate Gaussian piecewise-linear models, which can be updated in closed form. The tree structure itself is constructed using the cover tree method, which remains efficient in high dimensional spaces. We combine the model with Thompson sampling and approximate dynamic programming to obtain effective exploration po...
Bayesian kinematic earthquake source models
Minson, S. E.; Simons, M.; Beck, J. L.; Genrich, J. F.; Galetzka, J. E.; Chowdhury, F.; Owen, S. E.; Webb, F.; Comte, D.; Glass, B.; Leiva, C.; Ortega, F. H.
2009-12-01
Most coseismic, postseismic, and interseismic slip models are based on highly regularized optimizations which yield one solution which satisfies the data given a particular set of regularizing constraints. This regularization hampers our ability to answer basic questions such as whether seismic and aseismic slip overlap or instead rupture separate portions of the fault zone. We present a Bayesian methodology for generating kinematic earthquake source models with a focus on large subduction zone earthquakes. Unlike classical optimization approaches, Bayesian techniques sample the ensemble of all acceptable models presented as an a posteriori probability density function (PDF), and thus we can explore the entire solution space to determine, for example, which model parameters are well determined and which are not, or what is the likelihood that two slip distributions overlap in space. Bayesian sampling also has the advantage that all a priori knowledge of the source process can be used to mold the a posteriori ensemble of models. Although very powerful, Bayesian methods have up to now been of limited use in geophysical modeling because they are only computationally feasible for problems with a small number of free parameters due to what is called the "curse of dimensionality." However, our methodology can successfully sample solution spaces of many hundreds of parameters, which is sufficient to produce finite fault kinematic earthquake models. Our algorithm is a modification of the tempered Markov chain Monte Carlo (tempered MCMC or TMCMC) method. In our algorithm, we sample a "tempered" a posteriori PDF using many MCMC simulations running in parallel and evolutionary computation in which models which fit the data poorly are preferentially eliminated in favor of models which better predict the data. We present results for both synthetic test problems as well as for the 2007 Mw 7.8 Tocopilla, Chile earthquake, the latter of which is constrained by InSAR, local high
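The selection-and-mutation idea behind the tempered MCMC algorithm described above can be sketched in a few lines. The sketch below is a toy 1-D illustration, not the authors' implementation: the Gaussian prior and likelihood standing in for a slip parameter, and all tuning constants, are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D stand-in for a slip parameter: broad Gaussian prior, Gaussian likelihood.
def log_prior(th):
    return -0.5 * th**2 / 25.0          # N(0, 5^2)

def log_like(th):
    return -0.5 * (th - 2.0) ** 2       # pseudo-data favour th ~ 2

def tmcmc(n=500, stages=10, steps=30, step_sd=0.8):
    th = rng.normal(0.0, 5.0, n)        # start from the prior
    betas = np.linspace(0.0, 1.0, stages + 1)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Selection: reweight by the tempering increment and resample, so
        # models that fit the data poorly are preferentially eliminated.
        w = np.exp((b1 - b0) * log_like(th))
        th = rng.choice(th, size=n, p=w / w.sum())
        # Mutation: parallel Metropolis moves targeting prior * likelihood^b1.
        for _ in range(steps):
            prop = th + rng.normal(0.0, step_sd, n)
            log_a = (log_prior(prop) - log_prior(th)
                     + b1 * (log_like(prop) - log_like(th)))
            th = np.where(np.log(rng.random(n)) < log_a, prop, th)
    return th

samples = tmcmc()
# For this conjugate toy problem the analytic posterior mean is 50/26 ~ 1.92.
```

Real finite-fault problems replace the scalar `th` with hundreds of slip parameters, which is exactly the regime the curse of dimensionality makes hard for naive sampling.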
Bayesian Kernel Mixtures for Counts
Canale, Antonio; David B Dunson
2011-01-01
Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviatio...
Bayesian Optimization for Adaptive MCMC
Mahendran, Nimalan; Wang, Ziyu; Hamze, Firas; De Freitas, Nando
2011-01-01
This paper proposes a new randomized strategy for adaptive MCMC using Bayesian optimization. This approach applies to non-differentiable objective functions and trades off exploration and exploitation to reduce the number of potentially costly objective function evaluations. We demonstrate the strategy in the complex setting of sampling from constrained, discrete and densely connected probabilistic graphical models where, for each variation of the problem, one needs to adjust the parameters o...
Inference in hybrid Bayesian networks
DEFF Research Database (Denmark)
Langseth, Helge; Nielsen, Thomas Dyhre; Rumí, Rafael
2009-01-01
and reliability block diagrams). However, limitations in the BNs' calculation engine have prevented BNs from becoming equally popular for domains containing mixtures of both discrete and continuous variables (so-called hybrid domains). In this paper we focus on these difficulties, and summarize some of the last...... decade's research on inference in hybrid Bayesian networks. The discussions are linked to an example model for estimating human reliability....
Quantile pyramids for Bayesian nonparametrics
2009-01-01
P\\'{o}lya trees fix partitions and use random probabilities in order to construct random probability measures. With quantile pyramids we instead fix probabilities and use random partitions. For nonparametric Bayesian inference we use a prior which supports piecewise linear quantile functions, based on the need to work with a finite set of partitions, yet we show that the limiting version of the prior exists. We also discuss and investigate an alternative model based on the so-called substitut...
Space Shuttle RTOS Bayesian Network
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirement to increase the reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure of the network and its prior probabilities are extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
Bayesian analysis of contingency tables
Gómez Villegas, Miguel A.; González Pérez, Beatriz
2005-01-01
The display of the data by means of contingency tables is used in different approaches to statistical inference, for example, to broach the test of homogeneity of independent multinomial distributions. We develop a Bayesian procedure to test simple null hypotheses versus bilateral alternatives in contingency tables. Given independent samples of two binomial distributions and taking a mixed prior distribution, we calculate the posterior probability that the proportion of successes in the first...
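The calculation the abstract describes — a posterior probability for a point null against a bilateral alternative, under a mixed prior over two binomial proportions — is analytic. The sketch below is a minimal illustration under assumed Beta(1,1) priors and prior null mass 0.5; it is not the authors' exact procedure.

```python
from math import lgamma, exp

def log_beta(a, b):
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def post_prob_H0(x1, n1, x2, n2, pi0=0.5, a=1.0, b=1.0):
    """Posterior probability of H0: p1 = p2 for two independent binomials,
    with prior mass pi0 on H0 (shared p ~ Beta(a,b)) and independent
    Beta(a,b) priors on (p1, p2) under the alternative."""
    # Marginal likelihood under H0 (shared p integrated out analytically).
    log_m0 = log_beta(a + x1 + x2, b + n1 - x1 + n2 - x2) - log_beta(a, b)
    # Marginal likelihood under H1 (two independent Beta-binomial factors).
    log_m1 = (log_beta(a + x1, b + n1 - x1) - log_beta(a, b)
              + log_beta(a + x2, b + n2 - x2) - log_beta(a, b))
    # The binomial coefficients appear in both marginals and cancel.
    bf = exp(log_m0 - log_m1)           # Bayes factor in favour of H0
    return pi0 * bf / (pi0 * bf + 1 - pi0)

p_same = post_prob_H0(5, 10, 5, 10)   # identical samples: H0 favoured
p_diff = post_prob_H0(1, 10, 9, 10)   # very different samples: H0 unlikely
```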
Bayesian Credit Ratings (new version)
Paola Cerchiello; Paolo Giudici
2013-01-01
In this contribution we aim at improving ordinal variable selection in the context of causal models. In this regard, we propose an approach that provides a formal inferential tool to compare the explanatory power of each covariate, and, therefore, to select an effective model for classification purposes. Our proposed model is Bayesian nonparametric, and, thus, keeps the amount of model specification to a minimum. We consider the case in which information from the covariates is at the ordinal ...
Bayesian second law of thermodynamics
Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_m, ρ) + ⟨Q⟩_F|m ≥ 0, where ΔH(ρ_m, ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_m, and ⟨Q⟩_F|m is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.
Quantum Inference on Bayesian Networks
Yoder, Theodore; Low, Guang Hao; Chuang, Isaac
2014-03-01
Because quantum physics is naturally probabilistic, it seems reasonable to expect physical systems to describe probabilities and their evolution in a natural fashion. Here, we use quantum computation to speed up sampling from a graphical probability model, the Bayesian network. A specialization of this sampling problem is approximate Bayesian inference, where the distribution on query variables is sampled given the values e of evidence variables. Inference is a key part of modern machine learning and artificial intelligence tasks, but is known to be NP-hard. Classically, a single unbiased sample is obtained from a Bayesian network on n variables with at most m parents per node in time O(nm P(e)^-1), depending critically on P(e), the probability that the evidence occurs in the first place. However, by implementing a quantum version of rejection sampling, we obtain a square-root speedup, taking O(n 2^m P(e)^-1/2) time per sample. The speedup is the result of amplitude amplification, which is proving to be broadly applicable in sampling and machine learning tasks. In particular, we provide an explicit and efficient circuit construction that implements the algorithm without the need for oracle access.
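The classical baseline the quantum algorithm improves on is ordinary rejection sampling from a Bayesian network: sample the joint, keep only draws consistent with the evidence, so cost per accepted sample scales as 1/P(e). A minimal sketch on a hypothetical two-node network (the CPT numbers are made up for illustration):

```python
import random

random.seed(1)

# Hypothetical network: Rain -> WetGrass, with illustrative CPTs.
P_RAIN = 0.2
P_WET = {True: 0.9, False: 0.3}      # P(WetGrass=True | Rain)

def rejection_sample(n):
    """Classical rejection sampling. The fraction of kept draws estimates
    P(e); the quantum algorithm improves the 1/P(e) cost to 1/sqrt(P(e))."""
    accepted = rainy = 0
    for _ in range(n):
        rain = random.random() < P_RAIN
        wet = random.random() < P_WET[rain]
        if wet:                       # evidence e: WetGrass = True
            accepted += 1
            rainy += rain
    return rainy / accepted, accepted / n

post, acc_rate = rejection_sample(200_000)
# Exact values for these CPTs: P(e) = 0.2*0.9 + 0.8*0.3 = 0.42,
# and P(Rain | e) = 0.18/0.42 ~ 0.429.
```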
12th Brazilian Meeting on Bayesian Statistics
Louzada, Francisco; Rifo, Laura; Stern, Julio; Lauretto, Marcelo
2015-01-01
Through refereed papers, this volume focuses on the foundations of the Bayesian paradigm; their comparison to objectivistic or frequentist Statistics counterparts; and the appropriate application of Bayesian foundations. This research in Bayesian Statistics is applicable to data analysis in biostatistics, clinical trials, law, engineering, and the social sciences. EBEB, the Brazilian Meeting on Bayesian Statistics, is held every two years by the ISBrA, the International Society for Bayesian Analysis, one of the most active chapters of the ISBA. The 12th meeting took place March 10-14, 2014 in Atibaia. Interest in foundations of inductive Statistics has grown recently in accordance with the increasing availability of Bayesian methodological alternatives. Scientists need to deal with the ever more difficult choice of the optimal method to apply to their problem. This volume shows how Bayes can be the answer. The examination and discussion on the foundations work towards the goal of proper application of Bayesia...
Facility of aerosol filtration
International Nuclear Information System (INIS)
The invention relates to a facility for aerosol filtration, particularly for sodium aerosols. The facility is of special interest for fast reactors, where sodium fires can produce high concentrations of sodium aerosols that quickly clog conventional filters. Intended for continuous operation, the facility includes at the pre-filtering stage means for increasing the size of the aerosol particles and for separating the clustered particles (cyclone separator)
Bayesian Posterior Distributions Without Markov Chains
Cole, Stephen R.; Chu, Haitao; Greenland, Sander; Hamra, Ghassan; Richardson, David B.
2012-01-01
Bayesian posterior parameter distributions are often simulated using Markov chain Monte Carlo (MCMC) methods. However, MCMC methods are not always necessary and do not help the uninitiated understand Bayesian inference. As a bridge to understanding Bayesian inference, the authors illustrate a transparent rejection sampling method. In example 1, they illustrate rejection sampling using 36 cases and 198 controls from a case-control study (1976–1983) assessing the relation between residential ex...
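The transparent rejection-sampling idea is easy to reproduce on a toy problem (the data below are invented for illustration, not the study's case-control counts): draw parameters from the prior and accept each with probability proportional to its likelihood.

```python
import random

random.seed(7)

# Toy data: x successes in n Bernoulli trials (not the study's data).
x, n = 7, 10

def likelihood(theta):
    return theta**x * (1 - theta) ** (n - x)

# Envelope constant: the likelihood is maximised at its mode theta = x/n.
L_max = likelihood(x / n)

def sample_posterior(n_draws):
    """Draw theta from the Uniform(0,1) prior and accept with probability
    L(theta)/L_max; accepted draws follow the Beta(x+1, n-x+1) posterior."""
    out = []
    while len(out) < n_draws:
        theta = random.random()
        if random.random() < likelihood(theta) / L_max:
            out.append(theta)
    return out

draws = sample_posterior(20_000)
mean = sum(draws) / len(draws)
# Analytic check: the posterior mean is (x+1)/(n+2) = 8/12 ~ 0.667.
```

No Markov chain is involved: every accepted draw is an independent sample from the posterior, which is the pedagogical point of the paper.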
Aerosol satellite remote sensing
Veefkind, Joris Pepijn
2001-01-01
Aerosols are important for many processes in the atmosphere. Aerosols are a leading uncertainty in predicting global climate change. To a large extent this uncertainty is caused by a lack of knowledge of the occurrence and concentration of aerosols. On the global scale, this information can only be o
Investigation on aerosol transport in containment cracks
International Nuclear Information System (INIS)
Under severe accident conditions, the containment leak-tightness could be threatened by energetic phenomena that could yield a release of nuclear aerosols to the environment through penetrating concrete cracks. As few data are presently available to quantify this aerosol leakage, a specific action was launched in the framework of the Santar Project of the European 6th Framework Programme. In this context, both theoretical and experimental investigations have been carried out to develop a model that can readily be applied within a code like ASTEC (Accident Source Term Evaluation Code). Particle diffusion, settling, turbulent deposition, diffusiophoresis and thermophoresis have been considered as deposition mechanisms inside the crack path. They have been encapsulated in numerical models set up to reproduce experiments with small tubes and capillaries and to simulate plug formation. An original Lagrangian approach has then been used to evaluate the crack retention under typical PWR accident conditions, comparing its predictions with those given by the Eulerian approach implemented in the ECART code. On the experimental side, the paper illustrates an aerosol production and measurement system developed to validate aerosol deposition models in cracks, and the results that can be obtained: a series of tests was performed with monodispersed fluorescein aerosols injected into a cracked concrete sample. A key result that should be explored further is the strong enhancement of aerosol retention that could be due to steam condensation. Recommendations concerning future experimentation are also given in the paper. (author)
Bayesian networks with applications in reliability analysis
Langseth, Helge
2002-01-01
A common goal of the papers in this thesis is to propose, formalize and exemplify the use of Bayesian networks as a modelling tool in reliability analysis. The papers span work in which Bayesian networks are merely used as a modelling tool (Paper I), work where models are specially designed to utilize the inference algorithms of Bayesian networks (Paper II and Paper III), and work where the focus has been on extending the applicability of Bayesian networks to very large domains (Paper IV and ...
Aerosol behavior in a steam-air environment
International Nuclear Information System (INIS)
The behavior of aerosols assumed to be characteristic of those generated during accident sequences and released into containment is being studied in the Nuclear Safety Pilot Plant (NSPP). Observations on the behavior of U3O8 aerosol, Fe2O3 aerosol, concrete aerosol, and various mixtures of these aerosols in a dry air environment and in a steam-air environment within the NSPP vessel are reported. Under dry conditions, the aerosols agglomerated in the form of branched chains; the aerodynamic mass median diameter (AMMD) of the U3O8, Fe2O3 and mixed U3O8-Fe2O3 aerosols ranged between 1.5 and 3 μm, while that of the concrete aerosol was about 1 μm. A steam-air environment, which would be present in LWR containment during and following an accident, causes the U3O8, Fe2O3, and mixed U3O8-Fe2O3 aerosols to behave differently than in a dry atmosphere; the primary effect is an enhanced rate of removal of the aerosol from the vessel atmosphere. Steam does not have a significant effect on the removal rate of a concrete aerosol. Electron microscopy showed the agglomerated U3O8, Fe2O3, and mixed U3O8-Fe2O3 aerosols to be in the form of spherical clumps of particles, differing from the intermingled branched chains observed in the dry air tests; the AMMD was in the range of 1 to 2 μm. Steam had a lesser influence on the physical shape of the concrete aerosol, with the shape being intermediate between branched chains and spherical clumps. 9 figures
Tackett, J. L.; Getzewich, B. J.; Winker, D. M.; Vaughan, M. A.
2015-12-01
With nine years of retrievals, the CALIOP level 3 aerosol profile product provides an unprecedented synopsis of aerosol extinction in three dimensions and the potential to quantify changes in aerosol distributions over time. The CALIOP level 3 aerosol profile product, initially released as a beta product in 2011, reports monthly averages of quality-screened aerosol extinction profiles on a uniform latitude/longitude grid for different cloud-cover scenarios, called "sky conditions". This presentation demonstrates improvements to the second version of the product which will be released in September 2015. The largest improvements are the new sky condition definitions which parse the atmosphere into "cloud-free" views accessible to passive remote sensors, "all-sky" views accessible to active remote sensors and "cloudy-sky" views for opaque and transparent clouds which were previously inaccessible to passive remote sensors. Taken together, the new sky conditions comprehensively summarize CALIOP aerosol extinction profiles for a broad range of scientific queries. In addition to dust-only extinction profiles, the new version will include polluted-dust and smoke-only extinction averages. A new method is adopted for averaging dust-only extinction profiles to reduce high biases which exist in the beta version of the level 3 aerosol profile product. This presentation justifies the new averaging methodology and demonstrates vertical profiles of dust and smoke extinction over Africa during the biomass burning season. Another crucial advancement demonstrated in this presentation is a new approach for computing monthly mean aerosol optical depth which removes low biases reported in the beta version - a scenario unique to lidar datasets.
Characterization of Sodium Spray Aerosols
International Nuclear Information System (INIS)
The consequences of pool and spray fires require evaluation in the safety analysis of liquid metal-cooled fast breeder reactors. Sodium spray fires are characterized by high temperature and pressure, produced during the rapid combustion of sodium in air. Following the initial energy release, some fraction of the reaction products are available as aerosols which follow the normal laws of agglomeration, growth, settling, and plating. An experimental study is underway at Atomics International to study the characteristics of high concentration sprays of liquid sodium in reduced oxygen atmospheres and in air. The experiments are conducted in a 31.5 ft3 (2 ft diam. by 10 ft high) vessel, certified for a pressure of 100 lb/in2 (gauge). The spray injection apparatus consists of a heated sodium supply pot and a spray nozzle through which liquid sodium is driven by nitrogen pressure. Spray rate and droplet size can be varied by the injection velocity (nozzle size, nitrogen pressure, and sodium temperature). Aerosols produced in 0, 4, and 10 vol. % oxygen environments have been studied. The concentration and particle size distribution of the material remaining in the air after the spray injection and reaction period are measured. Fallout rates are found to be proportional to the concentration of aerosol which remains airborne following the spray period. (author)
Bayesian phylogeography finds its roots.
Directory of Open Access Journals (Sweden)
Philippe Lemey
2009-09-01
Full Text Available As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization and hypothesis testing of phylogeographic history. By implementing character mapping in a Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies the parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can easily be generalized to infer biogeography from genetic data for many organisms.
Bayesian Methods and Universal Darwinism
Campbell, John
2009-12-01
Bayesian methods since the time of Laplace have been understood by their practitioners as closely aligned to the scientific method. Indeed a recent champion of Bayesian methods, E. T. Jaynes, titled his textbook on the subject Probability Theory: the Logic of Science. Many philosophers of science including Karl Popper and Donald Campbell have interpreted the evolution of science as a Darwinian process consisting of a `copy with selective retention' algorithm abstracted from Darwin's theory of Natural Selection. Arguments are presented for an isomorphism between Bayesian methods and Darwinian processes. Universal Darwinism, as the term has been developed by Richard Dawkins, Daniel Dennett and Susan Blackmore, is the collection of scientific theories which explain the creation and evolution of their subject matter as due to the operation of Darwinian processes. These subject matters span the fields of atomic physics, chemistry, biology and the social sciences. The principle of Maximum Entropy states that systems will evolve to states of highest entropy subject to the constraints of scientific law. This principle may be inverted to provide illumination as to the nature of scientific law. Our best cosmological theories suggest the universe contained much less complexity during the period shortly after the Big Bang than it does at present. The scientific subject matter of atomic physics, chemistry, biology and the social sciences has been created since that time. An explanation is proposed for the existence of this subject matter as due to the evolution of constraints in the form of adaptations imposed on Maximum Entropy. It is argued these adaptations were discovered and instantiated through the operation of a succession of Darwinian processes.
Bayesian Query-Focused Summarization
Daumé, Hal
2009-01-01
We present BayeSum (for ``Bayesian summarization''), a model for sentence extraction in query-focused summarization. BayeSum leverages the common case in which multiple documents are relevant to a single query. Using these documents as reinforcement for query terms, BayeSum is not afflicted by the paucity of information in short queries. We show that approximate inference in BayeSum is possible on large data sets and results in a state-of-the-art summarization system. Furthermore, we show how BayeSum can be understood as a justified query expansion technique in the language modeling for IR framework.
Numeracy, frequency, and Bayesian reasoning
Directory of Open Access Journals (Sweden)
Gretchen B. Chapman
2009-02-01
Full Text Available Previous research has demonstrated that Bayesian reasoning performance is improved if uncertainty information is presented as natural frequencies rather than single-event probabilities. A questionnaire study of 342 college students replicated this effect but also found that the performance-boosting benefits of the natural frequency presentation occurred primarily for participants who scored high in numeracy. This finding suggests that even comprehension and manipulation of natural frequencies requires a certain threshold of numeracy abilities, and that the beneficial effects of natural frequency presentation may not be as general as previously believed.
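The probability-versus-frequency contrast is easy to make concrete. The numbers below are assumed, textbook-style values (1% base rate, 80% sensitivity, 10% false-positive rate), not figures from the study; the point is that the two formats are arithmetically identical, yet the counting version is easier for most people.

```python
from fractions import Fraction

# Assumed illustrative numbers for a diagnostic test.
base, sens, fpr = Fraction(1, 100), Fraction(80, 100), Fraction(10, 100)

# Single-event probability format (Bayes' rule).
p_pos = base * sens + (1 - base) * fpr
p_disease_given_pos = base * sens / p_pos

# Natural-frequency format: imagine 10,000 people.
sick = 10_000 * base                   # 100 are sick
sick_pos = sick * sens                 # 80 of them test positive
healthy_pos = (10_000 - sick) * fpr    # 990 healthy people also test positive
answer = sick_pos / (sick_pos + healthy_pos)   # 80 out of 1070
```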
Bayesian inference for Hawkes processes
DEFF Research Database (Denmark)
Rasmussen, Jakob Gulddahl
2013-01-01
The Hawkes process is a practically and theoretically important class of point processes, but parameter-estimation for such a process can pose various problems. In this paper we explore and compare two approaches to Bayesian inference. The first approach is based on the so-called conditional...... intensity function, while the second approach is based on an underlying clustering and branching structure in the Hawkes process. For practical use, MCMC (Markov chain Monte Carlo) methods are employed. The two approaches are compared numerically using three examples of the Hawkes process....
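The conditional-intensity approach mentioned above rests on evaluating the intensity and its integral (the compensator). A minimal sketch for the common exponentially decaying kernel, with parameter values chosen purely for illustration:

```python
from math import exp, log

# Assumed exponential kernel: lambda(t) = mu + sum_i alpha*beta*exp(-beta*(t - t_i)).
MU, ALPHA, BETA = 0.5, 0.4, 1.2   # illustrative; alpha < 1 keeps the process stable

def intensity(t, events):
    """Conditional intensity at time t given past event times."""
    return MU + sum(ALPHA * BETA * exp(-BETA * (t - ti))
                    for ti in events if ti < t)

def log_likelihood(events, T):
    """Log-likelihood on [0, T]: log-intensity summed over the events,
    minus the integrated intensity (compensator)."""
    ll = 0.0
    for i, ti in enumerate(events):
        ll += log(intensity(ti, events[:i]))
    compensator = MU * T + sum(ALPHA * (1 - exp(-BETA * (T - ti)))
                               for ti in events)
    return ll - compensator
```

This log-likelihood is what a Metropolis-within-MCMC scheme would evaluate at each proposed parameter value; the clustering-based alternative instead augments the data with a latent branching structure.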
Collaborative Kalman Filtration: Bayesian Perspective
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil
Lisbon, Portugal: Institute for Systems and Technologies of Information, Control and Communication (INSTICC), 2014, s. 468-474. ISBN 978-989-758-039-0. [11th International Conference on Informatics in Control, Automation and Robotics - ICINCO 2014. Vienna (AT), 01.09.2014-03.09.2014] R&D Projects: GA ČR(CZ) GP14-06678P Institutional support: RVO:67985556 Keywords: Bayesian analysis * Kalman filter * distributed estimation Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2014/AS/dedecius-0431324.pdf
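The Bayesian reading of the Kalman filter is that each measurement update is a conjugate Gaussian posterior computation. A scalar sketch with illustrative numbers (not the paper's distributed setting, where such updates are combined across collaborating nodes):

```python
def kalman_predict(m, P, Q):
    """Time update for a random-walk state: x_{k+1} = x_k + w, w ~ N(0, Q)."""
    return m, P + Q

def kalman_update(m, P, y, R):
    """Conjugate Gaussian measurement update: prior x ~ N(m, P),
    observation y = x + v with v ~ N(0, R)."""
    K = P / (P + R)            # Kalman gain = weight given to the innovation
    m_post = m + K * (y - m)
    P_post = (1 - K) * P       # posterior variance never exceeds the prior's
    return m_post, P_post

# One predict/update cycle with illustrative numbers.
m, P = kalman_predict(0.0, 0.5, 0.5)    # prior becomes N(0, 1)
m, P = kalman_update(m, P, 2.0, 1.0)    # observe y = 2 with unit noise
```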
Bayesian credible interval construction for Poisson statistics
Institute of Scientific and Technical Information of China (English)
ZHU Yong-Sheng
2008-01-01
The construction of the Bayesian credible (confidence) interval for a Poisson observable including both signal and background, with and without systematic uncertainties, is presented. Introducing the conditional probability satisfying the requirement that the background not be larger than the observed events in order to construct the Bayesian credible interval is also discussed. A Fortran routine, BPOCI, has been developed to implement the calculation.
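The construction can be reproduced numerically without the Fortran routine: for observed counts n with known background b and a flat prior on the signal s >= 0, the posterior is proportional to (s+b)^n e^-(s+b), and the credible bound is read off the normalized cumulative. The grid sketch below is an illustration of that calculation, not the BPOCI implementation:

```python
import numpy as np

def upper_limit(n_obs, bkg, cl=0.90, s_max=30.0, ds=1e-3):
    """Bayesian upper limit on a Poisson signal s >= 0 with known
    background, flat prior: posterior(s) ~ (s + b)^n * exp(-(s + b))."""
    s = np.arange(0.0, s_max, ds)
    post = (s + bkg) ** n_obs * np.exp(-(s + bkg))
    cdf = np.cumsum(post)
    cdf /= cdf[-1]                       # normalize on the grid
    return s[np.searchsorted(cdf, cl)]

# Sanity check: n = 0, b = 0 gives posterior e^{-s},
# so the 90% upper limit is -ln(0.1) ~ 2.303.
limit = upper_limit(0, 0.0)
```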
Bayesian Decision Theoretical Framework for Clustering
Chen, Mo
2011-01-01
In this thesis, we establish a novel probabilistic framework for the data clustering problem from the perspective of Bayesian decision theory. The Bayesian decision theory view justifies the important questions: what is a cluster and what a clustering algorithm should optimize. We prove that the spectral clustering (to be specific, the…
Bayesian Statistics for Biological Data: Pedigree Analysis
Stanfield, William D.; Carlton, Matthew A.
2004-01-01
The use of Bayes' formula is applied to the biological problem of pedigree analysis to show that the Bayes' formula and non-Bayesian or "classical" methods of probability calculation give different answers. First year college students of biology can be introduced to the Bayesian statistics.
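A standard pedigree example makes the contrast concrete (the scenario is a generic textbook case, not necessarily the one used in the article): a woman whose mother is a known carrier of an X-linked recessive allele has prior carrier probability 1/2; her three sons are all unaffected.

```python
from fractions import Fraction

# Prior: her mother is a carrier, so she inherited the allele with prob 1/2.
prior_carrier = Fraction(1, 2)

# Likelihood of three unaffected sons: each son of a carrier escapes the
# allele with probability 1/2; sons of a non-carrier are always unaffected.
like_carrier = Fraction(1, 2) ** 3
like_noncarrier = Fraction(1, 1)

# Bayes' formula combines prior and evidence; a naive calculation that
# ignores the sons would still report the prior value 1/2.
num = prior_carrier * like_carrier
posterior = num / (num + (1 - prior_carrier) * like_noncarrier)
# posterior = 1/9: the unaffected sons sharply reduce the carrier probability.
```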
Using Bayesian Networks to Improve Knowledge Assessment
Millan, Eva; Descalco, Luis; Castillo, Gladys; Oliveira, Paula; Diogo, Sandra
2013-01-01
In this paper, we describe the integration and evaluation of an existing generic Bayesian student model (GBSM) into an existing computerized testing system within the Mathematics Education Project (PmatE--Projecto Matematica Ensino) of the University of Aveiro. This generic Bayesian student model had been previously evaluated with simulated…
Nonparametric Bayesian Modeling of Complex Networks
DEFF Research Database (Denmark)
Schmidt, Mikkel Nørgaard; Mørup, Morten
2013-01-01
Modeling structure in complex networks using Bayesian nonparametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data. This article provides a gentle introduction to nonparametric Bayesian modeling of complex networks: Using...... for complex networks can be derived and point out relevant literature....
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Chavira, Mark; Darwiche, Adnan
2004-01-01
We describe a system for exact inference with relational Bayesian networks as defined in the publicly available \\primula\\ tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by evaluating and ...
Compiling Relational Bayesian Networks for Exact Inference
DEFF Research Database (Denmark)
Jaeger, Manfred; Darwiche, Adnan; Chavira, Mark
We describe in this paper a system for exact inference with relational Bayesian networks as defined in the publicly available PRIMULA tool. The system is based on compiling propositional instances of relational Bayesian networks into arithmetic circuits and then performing online inference by eva...
Bayesian analysis of exoplanet and binary orbits
Schulze-Hartung, Tim; Henning, Thomas
2012-01-01
We introduce BASE (Bayesian astrometric and spectroscopic exoplanet detection and characterisation tool), a novel program for the combined or separate Bayesian analysis of astrometric and radial-velocity measurements of potential exoplanet hosts and binary stars. The capabilities of BASE are demonstrated using all publicly available data of the binary Mizar A.
Computational methods for Bayesian model choice
Robert, Christian P.; Wraith, Darren
2009-01-01
In this note, we briefly survey some recent approaches to approximating the Bayes factor used in Bayesian hypothesis testing and Bayesian model choice. In particular, we reassess importance sampling, harmonic mean sampling, and nested sampling from a unified perspective.
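To make the comparison concrete, the sketch below (not from the note itself) contrasts the exact log marginal likelihood of a toy conjugate Normal model with the naive Monte Carlo estimate that averages the likelihood over prior draws, the simplest member of the importance sampling family:

```python
import math
import random

random.seed(1)
y = [0.3, -0.4, 1.1, 0.2, -0.7]   # toy data, assumed N(theta, 1)
n = len(y)

def log_norm(x, mu, var):
    return -0.5 * math.log(2 * math.pi * var) - (x - mu) ** 2 / (2 * var)

def log_lik(theta):
    return sum(log_norm(yi, theta, 1.0) for yi in y)

# Exact log marginal likelihood via the candidate's identity:
# log m(y) = log p(theta) + log p(y|theta) - log p(theta|y), at any theta.
post_var = 1.0 / (n + 1)           # prior N(0,1), unit observation variance
post_mean = post_var * sum(y)
theta0 = 0.0
log_m_exact = (log_norm(theta0, 0.0, 1.0) + log_lik(theta0)
               - log_norm(theta0, post_mean, post_var))

# Naive Monte Carlo: average the likelihood over draws from the prior.
S = 100_000
acc = 0.0
for _ in range(S):
    acc += math.exp(log_lik(random.gauss(0.0, 1.0)))
log_m_mc = math.log(acc / S)

print(log_m_exact, log_m_mc)  # the two estimates should agree closely
```

The harmonic mean estimator discussed in the note replaces the prior draws with posterior draws and averages reciprocal likelihoods; it is notoriously unstable, which is part of the note's motivation for a unified reassessment.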
2nd Bayesian Young Statisticians Meeting
Bitto, Angela; Kastner, Gregor; Posekany, Alexandra
2015-01-01
The Second Bayesian Young Statisticians Meeting (BAYSM 2014) and the research presented here facilitate connections among researchers using Bayesian Statistics by providing a forum for the development and exchange of ideas. WU Vienna University of Business and Economics hosted BAYSM 2014 from September 18th to 19th. The guidance of renowned plenary lecturers and senior discussants is a critical part of the meeting and this volume, which follows publication of contributions from BAYSM 2013. The meeting's scientific program reflected the variety of fields in which Bayesian methods are currently employed or could be introduced in the future. Three brilliant keynote lectures by Chris Holmes (University of Oxford), Christian Robert (Université Paris-Dauphine), and Mike West (Duke University), were complemented by 24 plenary talks covering the major topics Dynamic Models, Applications, Bayesian Nonparametrics, Biostatistics, Bayesian Methods in Economics, and Models and Methods, as well as a lively poster session ...
International Nuclear Information System (INIS)
This paper presents the results and assessment of the 'open' ISP37, which deals with containment thermal-hydraulics and aerosol behavior during an unmitigated severe LWR accident with core melt-down and steam and aerosol release into the containment. Representatives of 22 organizations participated in ISP37, using the codes CONTAIN, FIPLOC, MELCOR, RALOC, FUMO, MACRES, REMOVAL, etc. The containment and aerosol behavior experiment VANAM M3 was selected as the experimental comparison basis. The main phenomena investigated are the thermal behavior of a multi-compartment containment, e.g. pressure and temperature, and the distribution and depletion of a soluble aerosol. ISP37 has demonstrated that the codes used could calculate the thermal-hydraulic containment behavior in general with sufficient accuracy. With respect to the needs of aerosol behavior analysis, however, the accuracy of specific thermal-hydraulic variables, both analytical and experimental, should be improved. Although considerable progress has been made in simulating aerosol behavior in multi-compartment geometries, the calculated local aerosol concentrations scatter widely; the aerosol source term to the environment is, however, generally overestimated. The largest uncertainty in the aerosol results is caused by a limited number of thermal-hydraulic variables such as relative humidity, volume condensation rate and atmospheric flow rate. Some codes also lack a solubility model
BAYESIAN BICLUSTERING FOR PATIENT STRATIFICATION.
Khakabimamaghani, Sahand; Ester, Martin
2016-01-01
The move from Empirical Medicine towards Personalized Medicine has attracted attention to Stratified Medicine (SM). Some methods for patient stratification, the central task of SM, are provided in the literature; however, significant open issues remain. First, it is still unclear whether integrating different datatypes will help in detecting disease subtypes more accurately and, if not, which datatype(s) are most useful for this task. Second, it is not clear how we can compare different methods of patient stratification. Third, as most of the proposed stratification methods are deterministic, there is a need to investigate the potential benefits of applying probabilistic methods. To address these issues, we introduce a novel integrative Bayesian biclustering method, called B2PS, for patient stratification, and propose methods for evaluating the results. Our experimental results demonstrate the superiority of B2PS over a popular state-of-the-art method and the benefits of Bayesian approaches. Our results agree with the intuition that transcriptomic data forms a better basis for patient stratification than genomic data. PMID:26776199
Aerosol Chemistry Between Two Oceans: Auckland’s Urban Aerosol
Czech Academy of Sciences Publication Activity Database
Coulson, G.; Olivares, G.; Salmond, J.; Talbot, Nicholas
-: Italian Aerosol Society, 2015. ISBN N. [European Aerosol Conference EAC 2015. Milano (IT), 06.09.2015-11.09.2015] Institutional support: RVO:67985858 Keywords : urban pollution * aerosol processing * New Zealand Subject RIV: CF - Physical ; Theoretical Chemistry
Sensitivity of aerosol direct radiative forcing to aerosol vertical profile
Chung, Chul E.; Choi, Jung-Ok
2014-01-01
Aerosol vertical profile significantly affects the aerosol direct radiative forcing at the TOA level. The degree to which the aerosol profile impacts the aerosol forcing depends on many factors such as presence of cloud, surface albedo and aerosol single scattering albedo (SSA). Using a radiation model, we show that for absorbing aerosols (with an SSA of 0.7–0.8) whether aerosols are located above cloud or below induces at least one order of magnitude larger changes of the aerosol forcing tha...
Monitoring biological aerosols using UV fluorescence
Eversole, Jay D.; Roselle, Dominick; Seaver, Mark E.
1999-01-01
An apparatus has been designed and constructed to continuously monitor the number density, size, and fluorescent emission of ambient aerosol particles. Applying fluorescence to biological particles suspended in the atmosphere requires laser excitation in the UV spectral region. In this study, a Nd:YAG laser is quadrupled to provide a 266 nm wavelength to excite emission from single micrometer-sized particles in air. Fluorescent emission is used to continuously identify aerosol particles of biological origin. For calibration, biological samples of Bacillus subtilis spores and vegetative cells, and Escherichia coli, Bacillus thuringiensis and Erwinia herbicola vegetative cells, were prepared as suspensions in water and nebulized to produce aerosols. Detection of single aerosol particles provides an elastic scattering response as well as fluorescent emission in two spectral bands simultaneously. Our efforts have focused on empirical characterization of the emission and scattering characteristics of various bacterial samples to determine the feasibility of optical discrimination between different cell types. Preliminary spectroscopic evidence suggests that different samples can be distinguished as separate bio-aerosol groups. In addition to controlled sample results, we will also discuss the most recent results on the effectiveness of detecting outdoor releases and on variations in environmental backgrounds.
Aerosols Science and Technology
Agranovski, Igor
2011-01-01
This self-contained handbook and ready reference examines aerosol science and technology in depth, providing a detailed insight into this progressive field. As such, it covers fundamental concepts, experimental methods, and a wide variety of applications, ranging from aerosol filtration to biological aerosols, and from the synthesis of carbon nanotubes to aerosol reactors.Written by a host of internationally renowned experts in the field, this is an essential resource for chemists and engineers in the chemical and materials disciplines across multiple industries, as well as ideal supplementary
Deposition and retention of radioactive aerosols on desert vegetation
International Nuclear Information System (INIS)
Deposition velocities and retention times were obtained for submicron aerosols of 134Cs and 141Ce on a shrub species (Artemisia tridentata) and a grass (Elymus elymoides) in a natural desert environment. Submicron aerosols of these two nuclides were artificially generated and released over a sagebrush community in southeast Idaho during each of three seasons (spring, summer and winter) to determine the effects of weathering and plant development on aerosol deposition and retention. Information on friction velocities, roughness lengths, and particle size was also obtained
Evaluation of a radioactive aerosol surveillance system
Energy Technology Data Exchange (ETDEWEB)
Scripsick, R.C.; Stafford, R.G.; Beckman, R.J.; Tillery, M.I.; Romero, P.O.
1978-06-26
Measurements of the dilution of air contaminants between worker breathing zone and area air samplers were made by releasing a test aerosol in a workroom equipped with an aerosol surveillance system. The data were used to evaluate performance, and suggest improvements in design of the workroom's alarming air monitor system. It was found that a breathing zone concentration of 960 times the maximum permissible concentration in air (MPC/sub a/) for a half-hour was required to trigger alarms of the existing monitoring system under some release conditions. Alternative air monitor placement, suggested from dilution measurements, would reduce this average triggering concentration to 354 MPC/sub a/. Deployment of additional air monitors could further reduce the average triggering concentration to 241 MPC/sub a/. The relation between number of monitors and triggering concentration was studied. No significant decrease in average triggering concentration was noted for arrays containing greater than five monitors.
Bayesian networks in educational assessment
Almond, Russell G; Steinberg, Linda S; Yan, Duanli; Williamson, David M
2015-01-01
Bayesian inference networks, a synthesis of statistics and expert systems, have advanced reasoning under uncertainty in medicine, business, and social sciences. This innovative volume is the first comprehensive treatment exploring how they can be applied to design and analyze innovative educational assessments. Part I develops Bayes nets’ foundations in assessment, statistics, and graph theory, and works through the real-time updating algorithm. Part II addresses parametric forms for use with assessment, model-checking techniques, and estimation with the EM algorithm and Markov chain Monte Carlo (MCMC). A unique feature is the volume’s grounding in Evidence-Centered Design (ECD) framework for assessment design. This “design forward” approach enables designers to take full advantage of Bayes nets’ modularity and ability to model complex evidentiary relationships that arise from performance in interactive, technology-rich assessments such as simulations. Part III describes ECD, situates Bayes nets as ...
Quantum Bayesianism at the Perimeter
Fuchs, Christopher A
2010-01-01
The author summarizes the Quantum Bayesian viewpoint of quantum mechanics, developed originally by C. M. Caves, R. Schack, and himself. It is a view crucially dependent upon the tools of quantum information theory. Work at the Perimeter Institute for Theoretical Physics continues the development and is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when one gambles on the consequences of interactions with physical systems. The article ends by outlining some directions for future work.
Bayesian Kernel Mixtures for Counts.
Canale, Antonio; Dunson, David B
2011-12-01
Although Bayesian nonparametric mixture models for continuous data are well developed, there is a limited literature on related approaches for count data. A common strategy is to use a mixture of Poissons, which unfortunately is quite restrictive in not accounting for distributions having variance less than the mean. Other approaches include mixing multinomials, which requires finite support, and using a Dirichlet process prior with a Poisson base measure, which does not allow smooth deviations from the Poisson. As a broad class of alternative models, we propose to use nonparametric mixtures of rounded continuous kernels. An efficient Gibbs sampler is developed for posterior computation, and a simulation study is performed to assess performance. Focusing on the rounded Gaussian case, we generalize the modeling framework to account for multivariate count data, joint modeling with continuous and categorical variables, and other complications. The methods are illustrated through applications to a developmental toxicity study and marketing data. This article has supplementary material online. PMID:22523437
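The rounding idea can be illustrated in a few lines: a latent Gaussian is mapped to counts through thresholds a_0 = -infinity and a_j = j, so the probability of count j is the Gaussian mass between consecutive thresholds. This is a sketch of the kernel only, not the authors' full mixture model or Gibbs sampler:

```python
import math

def rounded_gaussian_pmf(j, mu, sigma):
    """P(Y = j) for a count Y obtained by rounding a latent N(mu, sigma^2)
    variable through thresholds a_0 = -inf and a_j = j for j >= 1."""
    def Phi(x):
        # Gaussian CDF with mean mu and std sigma, evaluated at x.
        return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))
    lower = float('-inf') if j == 0 else float(j)
    return Phi(float(j + 1)) - Phi(lower)

# The kernel is a proper pmf: probabilities are nonnegative and sum to one.
total = sum(rounded_gaussian_pmf(j, mu=3.0, sigma=1.5) for j in range(200))
print(total)
```

Because the latent kernel is continuous, mixtures of such rounded kernels can place smooth, flexible distributions on counts, including underdispersed ones that a Poisson mixture cannot capture.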
Hedging Strategies for Bayesian Optimization
Brochu, Eric; de Freitas, Nando
2010-01-01
Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
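The bandit layer of GP-Hedge can be separated from the Gaussian process machinery. The sketch below shows only the Hedge weight update over a portfolio of acquisition functions; the rewards here are invented stand-ins for what GP-Hedge actually uses, namely the GP posterior mean at each acquisition function's proposed point:

```python
import math
import random

def hedge_select(weights, rng):
    """Sample an arm with probability proportional to exp(weight)."""
    probs = [math.exp(w) for w in weights]
    total = sum(probs)
    r = rng.random() * total
    for i, p in enumerate(probs):
        r -= p
        if r < 0.0:
            return i
    return len(probs) - 1

rng = random.Random(0)
eta = 0.5                      # Hedge learning rate
weights = [0.0, 0.0, 0.0]      # one weight per acquisition function
counts = [0, 0, 0]             # how often each arm was actually selected

for _ in range(100):
    chosen = hedge_select(weights, rng)   # acquisition used this round
    counts[chosen] += 1
    # Stand-in rewards; GP-Hedge would use the GP posterior mean at each
    # acquisition function's nominee. All arms are updated every round.
    rewards = [0.2, 0.5, 0.1]
    for i, r in enumerate(rewards):
        weights[i] += eta * r

print(counts)  # selection concentrates on the arm with the best rewards
```

In the full algorithm the chosen arm's proposed point is the one actually evaluated on the objective, and the GP posterior is refit after each evaluation.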
Nonparametric Bayesian inference in biostatistics
Müller, Peter
2015-01-01
Nonparametric Bayesian (BNP) approaches play an ever-expanding role in biostatistical inference, from proteomics to clinical trials. As chapters in this book demonstrate, BNP has important uses in the clinical sciences and in inference for issues like unknown partitions in genomics. Many research problems involve an abundance of data and require flexible, complex probability models beyond traditional parametric approaches. As this book's expert contributors show, BNP approaches can be the answer. Survival analysis, in particular survival regression, has traditionally used BNP, but BNP's potential is now very broad. This applies to important tasks like arranging patients into clinically meaningful subpopulations and segmenting the genome into functionally distinct regions. This book is designed both to review and to introduce application areas for BNP. While existing books provide theoretical foundations, this book connects theory to practice through engaging examples and research questions. Chapters c...
On Bayesian System Reliability Analysis
International Nuclear Information System (INIS)
The view taken in this thesis is that reliability, the probability that a system will perform a required function for a stated period of time, depends on a person's state of knowledge. Reliability changes as this state of knowledge changes, i.e. when new relevant information becomes available. Most existing models for system reliability prediction are developed in a classical framework of probability theory and overlook some information that is always present. Probability is just an analytical tool to handle uncertainty, based on judgement and subjective opinions. It is argued that the Bayesian approach gives a much more comprehensive understanding of the foundations of probability than the so-called frequentist school. A new model for system reliability prediction is given in two papers. The model incorporates the fact that component failures are dependent because of a shared operational environment. The suggested model also naturally permits learning from failure data of similar components in non-identical environments. 85 refs
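The knowledge-updating view described above is conveniently illustrated by the standard conjugate Beta-Binomial update for a component failure probability. This is a textbook mechanism, not the thesis's dependent-failure model, and all numbers are hypothetical:

```python
# Conjugate Beta-Binomial update for a component's failure probability.
# Hypothetical numbers: prior Beta(1, 19) encodes a prior mean failure
# probability of 0.05; we then observe 2 failures in 100 demands.
a, b = 1.0, 19.0
failures, demands = 2, 100

a_post = a + failures               # posterior Beta(a + x, b + n - x)
b_post = b + (demands - failures)
posterior_mean = a_post / (a_post + b_post)
print(posterior_mean)  # prints 0.025
```

The posterior mean sits between the prior mean (0.05) and the observed rate (0.02), with the balance determined by how much data has been seen, which is exactly the "reliability changes with the state of knowledge" point.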
Elvira, Clément; Dobigeon, Nicolas
2015-01-01
Sparse representations have proven their efficiency in solving a wide class of inverse problems encountered in signal and image processing. Conversely, enforcing the information to be spread uniformly over representation coefficients exhibits relevant properties in various applications such as digital communications. Anti-sparse regularization can be naturally expressed through an $\\ell_{\\infty}$-norm penalty. This paper derives a probabilistic formulation of such representations. A new probability distribution, referred to as the democratic prior, is first introduced. Its main properties as well as three random variate generators for this distribution are derived. Then this probability distribution is used as a prior to promote anti-sparsity in a Gaussian linear inverse problem, yielding a fully Bayesian formulation of anti-sparse coding. Two Markov chain Monte Carlo (MCMC) algorithms are proposed to generate samples according to the posterior distribution. The first one is a standard Gibbs sampler. The seco...
State Information in Bayesian Games
Cuff, Paul
2009-01-01
Two-player zero-sum repeated games are well understood. Computing the value of such a game is straightforward. Additionally, if the payoffs are dependent on a random state of the game known to one, both, or neither of the players, the resulting value of the game has been analyzed under the framework of Bayesian games. This investigation considers the optimal performance in a game when a helper is transmitting state information to one of the players. Encoding information for an adversarial setting (game) requires a different result than rate-distortion theory provides. Game theory has accentuated the importance of randomization (mixed strategy), which does not find a significant role in most communication modems and source coding codecs. Higher rates of communication, used in the right way, allow the message to include the necessary random component useful in games.
Cooperative extensions of the Bayesian game
Ichiishi, Tatsuro
2006-01-01
This is the very first comprehensive monograph in a burgeoning, new research area - the theory of cooperative game with incomplete information with emphasis on the solution concept of Bayesian incentive compatible strong equilibrium that encompasses the concept of the Bayesian incentive compatible core. Built upon the concepts and techniques in the classical static cooperative game theory and in the non-cooperative Bayesian game theory, the theory constructs and analyzes in part the powerful n -person game-theoretical model characterized by coordinated strategy-choice with individualistic ince
Bayesian models a statistical primer for ecologists
Hobbs, N Thompson
2015-01-01
Bayesian modeling has become an indispensable tool for ecological research because it is uniquely suited to deal with complexity in a statistically coherent way. This textbook provides a comprehensive and accessible introduction to the latest Bayesian methods, in language ecologists can understand. Unlike other books on the subject, this one emphasizes the principles behind the computations, giving ecologists a big-picture understanding of how to implement this powerful statistical approach. Bayesian Models is an essential primer for non-statisticians. It begins with a definition of probabili
Supra-Bayesian Combination of Probability Distributions
Czech Academy of Sciences Publication Activity Database
Sečkárová, Vladimíra
Veszprém : University of Pannonia, 2010, s. 112-117. ISBN 978-615-5044-00-7. [11th International PhD Workshop on Systems and Control. Veszprém (HU), 01.09.2010-03.09.2010] R&D Projects: GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Supra-Bayesian approach * sharing of probabilistic information * Bayesian decision making Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2010/AS/seckarova-supra-bayesian combination of probability distributions.pdf
Bayesian Soft Sensing in Cold Sheet Rolling
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil; Jirsa, Ladislav
Praha: ÚTIA AV ČR, v.v.i, 2010. s. 45-45. [6th International Workshop on Data–Algorithms–Decision Making. 2.12.2010-4.12.2010, Jindřichův Hradec] R&D Projects: GA MŠk(CZ) 7D09008 Institutional research plan: CEZ:AV0Z10750506 Keywords : soft sensor * bayesian statistics * bayesian model averaging Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2010/AS/dedecius-bayesian soft sensing in cold sheet rolling.pdf
Development of Multi-Wavelength Raman Lidar and its Application on Aerosol and Cloud Research
Liu, Dong; Wang, Yingjian; Wang, Zhenzhu; Tao, Zongming; Wu, Decheng; Wang, Bangxin; Zhong, Zhiqing; Xie, Chenbo
2016-06-01
A movable multi-wavelength Raman lidar (TMPRL) was built in Hefei, China. Emitting at three wavelengths (1064, 532, and 355 nm), and receiving the three corresponding Mie scattering signals, two nitrogen Raman signals at 386 and 607 nm, and a depolarization signal at 532 nm, TMPRL can investigate the height-resolved optical and microphysical properties of aerosol and cloud. Retrieval algorithms for the optical parameters, based on the Mie-Raman technique, and for the microphysical parameters, based on a Bayesian optimization method, were also developed and applied to the observed lidar data. Designed for unattended operation and continuous 24/7 working, TMPRL has joined several field campaigns studying aerosols, clouds and their interactions. Some observed results on aerosol and cloud optical properties, and a first attempt to validate the vertical aerosol size distribution retrieved by TMPRL against in-situ airplane measurements, are presented and discussed.
Energy Technology Data Exchange (ETDEWEB)
Bauer, Susanne E.; Menon, Surabi; Koch, Dorothy; Bond, Tami; Tsigaridis, Kostas
2010-04-09
Recently, attention has been drawn towards black carbon aerosols as a likely short-term climate warming mitigation candidate. However, the global and regional impacts of the direct, cloud-indirect and semi-direct forcing effects are highly uncertain, due to the complex nature of aerosol evolution and its climate interactions. Black carbon is directly released as particles into the atmosphere, but then interacts with other gases and particles through condensation and coagulation processes, leading to further aerosol growth, aging and internal mixing. A detailed aerosol microphysical scheme, MATRIX, embedded within the global GISS modelE includes the above processes that determine the lifecycle and climate impact of aerosols. This study presents a quantitative assessment of the impact of microphysical processes involving black carbon, such as emission size distributions and optical properties, on aerosol cloud activation and radiative forcing. Our best estimate for the net direct and indirect aerosol radiative forcing change is -0.56 W/m² between 1750 and 2000. However, the direct and indirect aerosol effects are very sensitive to the black and organic carbon size distribution and the consequent mixing state: the net radiative forcing change can vary between -0.32 and -0.75 W/m² depending on these carbonaceous particle properties. Assuming that sulfates, nitrates and secondary organics form a coating shell around a black carbon core, rather than forming uniformly mixed particles, changes the overall net radiative forcing from a negative to a positive number. Black carbon mitigation scenarios generally showed a benefit when predominantly black carbon sources, such as diesel emissions, are reduced; reducing combined organic and black carbon sources, such as bio-fuels, does not lead to reduced warming.
The Diagnosis of Reciprocating Machinery by Bayesian Networks
Institute of Scientific and Technical Information of China (English)
Anonymous
2003-01-01
A Bayesian network is a reasoning tool based on probability theory and has many advantages that other reasoning tools lack. This paper discusses the basic theory of Bayesian networks and studies the problems in constructing them. The paper also constructs a Bayesian diagnosis network for a reciprocating compressor. The example supports the conclusion that Bayesian diagnosis networks can diagnose reciprocating machinery effectively.
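For a network as small as a two-symptom fault model, exact inference reduces to enumerating the joint distribution. The sketch below uses invented probabilities for a hypothetical compressor fault, not values from the paper:

```python
# A minimal two-symptom diagnosis network, solved by exact enumeration.
# All probabilities are hypothetical, chosen purely for illustration.
p_fault = 0.05
p_vib_given = {True: 0.9, False: 0.1}    # P(vibration high | fault state)
p_temp_given = {True: 0.7, False: 0.05}  # P(temperature high | fault state)

def posterior_fault(vib, temp):
    """P(fault | observed vibration and temperature states)."""
    def joint(fault):
        p = p_fault if fault else 1.0 - p_fault
        p *= p_vib_given[fault] if vib else 1.0 - p_vib_given[fault]
        p *= p_temp_given[fault] if temp else 1.0 - p_temp_given[fault]
        return p
    num = joint(True)
    return num / (num + joint(False))

print(round(posterior_fault(True, True), 3))  # prints 0.869
```

Observing both symptoms raises the fault probability from the 5% prior to roughly 87%, which is the kind of diagnostic update a full reciprocating-machinery network performs over many more variables.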
The Effect of Water Injection on the Fission Product Aerosol Behavior in Fukushima Unit 1
International Nuclear Information System (INIS)
The factor most important to human health is the fission products released from the plant. Fission products are usually released as aerosols and vapors. The amount of aerosol released from the plant is crucial, because it can be inhaled by people. In this study, the best-estimate scenario of the Fukushima unit 1 accident was modeled with MELCOR, and the amount of released fission product aerosol was estimated as a function of the amount of water added to the reactor pressure vessel (RPV). The analysis of the Fukushima unit 1 accident was conducted from the viewpoint of fission product aerosol release using MELCOR. First, the thermodynamic results of the plant were compared to the measured data, and then fission product aerosol (CsOH) behavior was calculated while varying the amount of water injection. Water injection affects the amount of aerosol released into the reactor building, because it decreases the temperature of the deposition surface. In this study only aerosol behavior was considered; a further study will be conducted including a hygroscopic model
Tsimpidi, A. P.; V. A. Karydis; Pandis, S. N.; Lelieveld, J.
2016-01-01
Emissions of organic compounds from biomass, biofuel and fossil fuel combustion strongly influence the global atmospheric aerosol load. Some of the organics are directly released as primary organic aerosol (POA). Most are emitted in the gas phase, undergo chemical transformations (e.g., oxidation by the hydroxyl radical), and form secondary organic aerosol (SOA). In this work we use the global chemistry climate model EMAC with a computation...
Atmospheric fallout of sodium combustion aerosols
International Nuclear Information System (INIS)
Five sodium combustion product release tests were conducted in the open atmosphere at INEL, Idaho. About 100 kg of sodium was burned in 5 min at 30 m elevation in two of the tests. Fallout distribution and combustion product species determinations were made. The principal fallout occurred near the release point and decreased exponentially as the plume moved downwind. The tests indicated that little fallout of combustion product aerosols occurred beyond a few hundred meters from the source under the given meteorological conditions. 2 refs
An Intuitive Dashboard for Bayesian Network Inference
Reddy, Vikas; Charisse Farr, Anna; Wu, Paul; Mengersen, Kerrie; Yarlagadda, Prasad K. D. V.
2014-03-01
Current Bayesian network software packages provide good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing and at times it could be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard, which provides an additional layer of abstraction, enabling the end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on the cause-and-effect relationship, making the user-interaction more intuitive and friendly. In addition to performing various types of inferences, the users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using QT and SMILE libraries in C++.
A Bayesian approach to model uncertainty
International Nuclear Information System (INIS)
A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
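For a finite set of alternative models, the Bayesian treatment reduces to computing posterior model probabilities, with P(M_i | D) proportional to P(D | M_i) P(M_i). A minimal sketch with hypothetical log marginal likelihoods:

```python
import math

# Posterior over a finite set of candidate models.
# The log marginal likelihoods are hypothetical illustrative numbers.
log_ml = [-104.2, -101.7, -103.1]   # log P(D | M_i)
prior = [1 / 3, 1 / 3, 1 / 3]       # equal prior model probabilities

m = max(log_ml)                      # subtract the max for stability
unnorm = [p * math.exp(l - m) for l, p in zip(log_ml, prior)]
total = sum(unnorm)
post = [u / total for u in unnorm]
print([round(p, 3) for p in post])  # prints [0.062, 0.753, 0.186]
```

Marginalizing any prediction over these weights is exactly the sense in which, as the abstract notes, model uncertainty becomes equivalent to parameter uncertainty over an indicator of which model holds.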
Bayesian Control for Concentrating Mixed Nuclear Waste
Welch, Robert L.; Smith, Clayton
2013-01-01
A control algorithm for batch processing of mixed waste is proposed based on conditional Gaussian Bayesian networks. The network is compiled during batch staging for real-time response to sensor input.
Learning Bayesian networks for discrete data
Liang, Faming
2009-02-01
Bayesian networks have received much attention in the recent literature. In this article, we propose an approach to learn Bayesian networks using the stochastic approximation Monte Carlo (SAMC) algorithm. Our approach has two nice features. Firstly, it possesses the self-adjusting mechanism and thus avoids essentially the local-trap problem suffered by conventional MCMC simulation-based approaches in learning Bayesian networks. Secondly, it falls into the class of dynamic importance sampling algorithms; the network features can be inferred by dynamically weighted averaging the samples generated in the learning process, and the resulting estimates can have much lower variation than the single model-based estimates. The numerical results indicate that our approach can mix much faster over the space of Bayesian networks than the conventional MCMC simulation-based approaches. © 2008 Elsevier B.V. All rights reserved.
DEFF Research Database (Denmark)
Butcher, Andrew Charles
Aerosols are climatically important. Their specific emissions are key to reducing the uncertainty in global climate models. Marine aerosols make up the largest source of primary aerosols to the Earth's atmosphere. Uncertainty in marine aerosol mass and number flux lies in separating primary...... emissions produced directly from bubble bursting as the result of air entrainment from breaking waves and particles generated from secondary emissions of volatile organic compounds. In the first paper, we study the chemical properties of particles produced from several sea water proxies with the use of a...... cloud condensation nuclei counter. Proxy solutions with high inorganic salt concentrations and some organics produce sea spray aerosol particles with little change in cloud condensation activity relative to pure salts. Comparison is made between a frit-based method for bubble production and a plunging...
Methods for Bayesian Power Spectrum Inference with Galaxy Surveys
Jasche, Jens; Wandelt, Benjamin D.
2013-12-01
We derive and implement a full Bayesian large-scale structure inference method aiming at precision recovery of the cosmological power spectrum from galaxy redshift surveys. Our approach improves upon previous Bayesian methods by performing a joint inference of the three-dimensional density field, the cosmological power spectrum, luminosity-dependent galaxy biases, and corresponding normalizations. We account for all joint and correlated uncertainties between all inferred quantities. Classes of galaxies with different biases are treated as separate subsamples. This method therefore also allows the combined analysis of more than one galaxy survey. In particular, it solves the problem of inferring the power spectrum from galaxy surveys with non-trivial survey geometries by exploring the joint posterior distribution with efficient implementations of multiple-block Markov chain and Hybrid Monte Carlo methods. Our Markov sampler achieves high statistical efficiency in low signal-to-noise regimes by using a deterministic reversible jump algorithm. This approach reduces the correlation length of the sampler by several orders of magnitude, turning the otherwise numerically unfeasible problem of joint parameter exploration into a numerically manageable task. We test our method on an artificial mock galaxy survey, emulating characteristic features of the Sloan Digital Sky Survey data release 7, such as its survey geometry and luminosity-dependent biases. These tests demonstrate the numerical feasibility of our large-scale Bayesian inference framework when the parameter space has millions of dimensions. This method reveals and correctly treats the anti-correlation between bias amplitudes and the power spectrum, which is not taken into account in current approaches to power spectrum estimation, a 20% effect across large ranges in k space. In addition, this method yields constrained realizations of density fields obtained without assuming the power spectrum or bias parameters.
Bayesian Variable Selection in Spatial Autoregressive Models
Jesus Crespo Cuaresma; Philipp Piribauer
2015-01-01
This paper compares the performance of Bayesian variable selection approaches for spatial autoregressive models. We present two alternative approaches which can be implemented using Gibbs sampling methods in a straightforward way and allow us to deal with the problem of model uncertainty in spatial autoregressive models in a flexible and computationally efficient way. In a simulation study we show that the variable selection approaches tend to outperform existing Bayesian model averaging tech...
Bayesian Analysis of Multivariate Probit Models
Siddhartha Chib; Edward Greenberg
1996-01-01
This paper provides a unified simulation-based Bayesian and non-Bayesian analysis of correlated binary data using the multivariate probit model. The posterior distribution is simulated by Markov chain Monte Carlo methods, and maximum likelihood estimates are obtained by a Markov chain Monte Carlo version of the E-M algorithm. Computation of Bayes factors from the simulation output is also considered. The methods are applied to a bivariate data set, to a 534-subject, four-year longitudinal dat...
Kernel Bayesian Inference with Posterior Regularization
Song, Yang; Jun ZHU; Ren, Yong
2016-01-01
We propose a vector-valued regression problem whose solution is equivalent to the reproducing kernel Hilbert space (RKHS) embedding of the Bayesian posterior distribution. This equivalence provides a new understanding of kernel Bayesian inference. Moreover, the optimization problem induces a new regularization for the posterior embedding estimator, which is faster and has comparable performance to the squared regularization in kernel Bayes' rule. This regularization coincides with a former th...
Fitness inheritance in the Bayesian optimization algorithm
Pelikan, Martin; Sastry, Kumara
2004-01-01
This paper describes how fitness inheritance can be used to estimate fitness for a proportion of newly sampled candidate solutions in the Bayesian optimization algorithm (BOA). The goal of estimating fitness for some candidate solutions is to reduce the number of fitness evaluations for problems where fitness evaluation is expensive. Bayesian networks used in BOA to model promising solutions and generate the new ones are extended to allow not only for modeling and sampling candidate solutions...
Bayesian Network Models for Adaptive Testing
Czech Academy of Sciences Publication Activity Database
Plajner, Martin; Vomlel, Jiří
Aachen: Sun SITE Central Europe, 2016 - (Agosta, J.; Carvalho, R.), s. 24-33. (CEUR Workshop Proceedings. Vol 1565). ISSN 1613-0073. [The Twelfth UAI Bayesian Modeling Applications Workshop (BMAW 2015). Amsterdam (NL), 16.07.2015] R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords: Bayesian networks * Computerized adaptive testing Subject RIV: JD - Computer Applications, Robotics http://library.utia.cas.cz/separaty/2016/MTR/plajner-0458062.pdf
Nomograms for Visualization of Naive Bayesian Classifier
Možina, Martin; Demšar, Janez; Michael W Kattan; Zupan, Blaz
2004-01-01
Besides good predictive performance, the naive Bayesian classifier can also offer a valuable insight into the structure of the training data and effects of the attributes on the class probabilities. This structure may be effectively revealed through visualization of the classifier. We propose a new way to visualize the naive Bayesian model in the form of a nomogram. The advantages of the proposed method are simplicity of presentation, clear display of the effects of individual attribute value...
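A nomogram renders each attribute value's contribution to the class log-odds as a point on an aligned axis. The following sketch (not the authors' implementation; all conditional probabilities are invented) computes those per-attribute log-odds contributions for a two-class naive Bayesian classifier and combines them into posterior odds:

```python
import math

# Hypothetical conditional probabilities P(attribute value | class)
p_given_pos = {"fever": 0.8, "cough": 0.6}
p_given_neg = {"fever": 0.2, "cough": 0.5}
prior_pos, prior_neg = 0.3, 0.7

def nomogram_points(attrs):
    """Per-attribute log-odds contributions: what the nomogram axes display."""
    return {a: math.log(p_given_pos[a] / p_given_neg[a]) for a in attrs}

def posterior_odds(attrs):
    """Sum the points with the prior log-odds, then exponentiate."""
    log_odds = math.log(prior_pos / prior_neg) + sum(nomogram_points(attrs).values())
    return math.exp(log_odds)

pts = nomogram_points(["fever", "cough"])
print({k: round(v, 3) for k, v in pts.items()})
print(round(posterior_odds(["fever", "cough"]), 3))
```

The additivity of the log-odds contributions is exactly what makes the naive Bayesian model representable as a nomogram.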
Subjective Bayesian Analysis: Principles and Practice
Goldstein, Michael
2006-01-01
We address the position of subjectivism within Bayesian statistics. We argue, first, that the subjectivist Bayes approach is the only feasible method for tackling many important practical problems. Second, we describe the essential role of the subjectivist approach in scientific analysis. Third, we consider possible modifications to the Bayesian approach from a subjectivist viewpoint. Finally, we address the issue of pragmatism in implementing the subjectivist approach.
An Entropy Search Portfolio for Bayesian Optimization
Shahriari, Bobak; Wang, Ziyu; Hoffman, Matthew W.; Bouchard-Côté, Alexandre; De Freitas, Nando
2014-01-01
Bayesian optimization is a sample-efficient method for black-box global optimization. However, the performance of a Bayesian optimization method very much depends on its exploration strategy, i.e. the choice of acquisition function, and it is not clear a priori which choice will result in superior performance. While portfolio methods provide an effective, principled way of combining a collection of acquisition functions, they are often based on measures of past performance which can be misl...
A Bayesian Framework for Active Artificial Perception
Ferreira, Joao; Lobo, Jorge; Bessiere, Pierre; Castelo-Branco, M; Dias, Jorge
2012-01-01
In this text, we present a Bayesian framework for active multimodal perception of 3D structure and motion. The design of this framework finds its inspiration in the role of the dorsal perceptual pathway of the human brain. Its composing models build upon a common egocentric spatial configuration that is naturally fitting for the integration of readings from multiple sensors using a Bayesian approach. In the process, we will contribute with efficient and robust probabilistic solutions for cycl...
Bayesian Classification in Medicine: The Transferability Question *
Zagoria, Ronald J.; Reggia, James A.; Price, Thomas R.; Banko, Maryann
1981-01-01
Using probabilities derived from a geographically distant patient population, we applied Bayesian classification to categorize stroke patients by etiology. Performance was assessed both by error rate and with a new linear accuracy coefficient. This approach to patient classification was found to be surprisingly accurate when compared to classification by two neurologists and to classification by the Bayesian method using “low cost” local and subjective probabilities. We conclude that for some...
Fuzzy Functional Dependencies and Bayesian Networks
Institute of Scientific and Technical Information of China (English)
LIU WeiYi(刘惟一); SONG Ning(宋宁)
2003-01-01
Bayesian networks have become a popular technique for representing and reasoning with probabilistic information. The fuzzy functional dependency is an important kind of data dependency in relational databases with fuzzy values. The purpose of this paper is to set up a connection between these data dependencies and Bayesian networks. The connection is made through a set of methods that enable people to obtain the most information about independence conditions from fuzzy functional dependencies.
Evaluation System for a Bayesian Optimization Service
Dewancker, Ian; McCourt, Michael; Clark, Scott; Hayes, Patrick; Johnson, Alexandra; Ke, George
2016-01-01
Bayesian optimization is an elegant solution to the hyperparameter optimization problem in machine learning. Building a reliable and robust Bayesian optimization service requires careful testing methodology and sound statistical analysis. In this talk we will outline our development of an evaluation framework to rigorously test and measure the impact of changes to the SigOpt optimization service. We present an overview of our evaluation system and discuss how this framework empowers our resea...
Bayesian target tracking based on particle filter
Institute of Scientific and Technical Information of China (English)
[Anonymous]
2005-01-01
To deal with nonlinear or non-Gaussian problems, particle filters have been studied by many researchers. Based on the particle filter, the extended Kalman filter (EKF) proposal function is applied to Bayesian target tracking. Novel techniques such as the Markov chain Monte Carlo (MCMC) method and the resampling step are also introduced into Bayesian target tracking. The simulation results confirm that the improved particle filter with these techniques outperforms the basic one.
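For concreteness, here is a minimal bootstrap particle filter for a 1D random-walk target. It is a sketch of the basic filter only: the EKF proposal function the abstract describes is replaced here by the simpler transition prior, and all model parameters are invented.

```python
import math
import random

random.seed(0)

def particle_filter(observations, n=2000, q=1.0, r=1.0):
    """Bootstrap particle filter for x_t = x_{t-1} + N(0, q), y_t = x_t + N(0, r)."""
    particles = [random.gauss(0.0, 1.0) for _ in range(n)]
    estimates = []
    for y in observations:
        # propagate through the motion model (transition-prior proposal)
        particles = [p + random.gauss(0.0, math.sqrt(q)) for p in particles]
        # weight by the Gaussian measurement likelihood
        weights = [math.exp(-(y - p) ** 2 / (2.0 * r)) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        estimates.append(sum(w * p for w, p in zip(weights, particles)))
        # multinomial resampling step
        particles = random.choices(particles, weights=weights, k=n)
    return estimates

obs = [0.1, 0.9, 2.1, 2.8, 4.2]   # noisy readings of a drifting target
est = particle_filter(obs)
print([round(e, 2) for e in est])
```

The resampling step shown here is the basic multinomial scheme; the improvements the abstract mentions (EKF proposals, MCMC moves) refine exactly these propagate/weight/resample stages.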
Bayesian Models of Brain and Behaviour
Penny, William
2012-01-01
This paper presents a review of Bayesian models of brain and behaviour. We first review the basic principles of Bayesian inference. This is followed by descriptions of sampling and variational methods for approximate inference, and forward and backward recursions in time for inference in dynamical models. The review of behavioural models covers work in visual processing, sensory integration, sensorimotor integration, and collective decision making. The review of brain models covers a range of...
Bayesian Approach to Handling Informative Sampling
Sikov, Anna
2015-01-01
In the case of informative sampling the sampling scheme explicitly or implicitly depends on the response variable. As a result, the sample distribution of the response variable cannot be used for making inference about the population. In this research I investigate the problem of informative sampling from the Bayesian perspective. Application of the Bayesian approach permits solving the problems which arise due to the complexity of the models being used for handling informative sampling. The main...
Interactions of fission product vapours with aerosols
Energy Technology Data Exchange (ETDEWEB)
Benson, C.G.; Newland, M.S. [AEA Technology, Winfrith (United Kingdom)]
1996-12-01
Reactions between structural and reactor materials aerosols and fission product vapours released during a severe accident in a light water reactor (LWR) will influence the magnitude of the radiological source term ultimately released to the environment. The interaction of cadmium aerosol with iodine vapour at different temperatures has been examined in a programme of experiments designed to characterise the kinetics of the system. Laser induced fluorescence (LIF) is a technique that is particularly amenable to the study of systems involving elemental iodine because of the high intensity of the fluorescence lines. Therefore this technique was used in the experiments to measure the decrease in the concentration of iodine vapour as the reaction with cadmium proceeded. Experiments were conducted over a range of temperatures (20-350 °C), using calibrated iodine vapour and cadmium aerosol generators that gave well-quantified sources. The LIF results provided information on the kinetics of the process, whilst examination of filter samples gave data on the composition and morphology of the aerosol particles that were formed. The results showed that the reaction of cadmium with iodine was relatively fast, giving reaction half-lives of approximately 0.3 s. This suggests that the assumption used by primary circuit codes such as VICTORIA, that reaction rates are mass-transfer limited, is justified for the cadmium-iodine reaction. The reaction was first order with respect to both cadmium and iodine, and was assigned as pseudo second order overall. However, there appeared to be a dependence of the overall rate constant on aerosol surface area, making the precise order of the reaction difficult to assign. The relatively high volatility of the cadmium iodide formed in the reaction played an important role in determining the composition of the particles. (author) 23 figs., 7 tabs., 22 refs.
Bayesian demography 250 years after Bayes.
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information alongside the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
Bayesian inference of mass segregation of open clusters
Shao, Zhengyi; Chen, Li; Lin, Chien-Cheng; Zhong, Jing; Hou, Jinliang
2015-08-01
Based on the Bayesian inference (BI) method, the mixture-modeling approach is improved to combine all kinematic data, including position, proper motion (PM) and radial velocity (RV), to separate the motion of the cluster from field stars in its area, as well as to describe its intrinsic kinematic status. Meanwhile, the membership probabilities of individual stars are determined as by-product results. This method has been tested on simulations of toy models, and it was found that the joint use of multiple kinematic data types can significantly reduce the missing rate of membership determination, from ~15% for a single data type to 1% when using all position, proper motion and radial velocity data. By combining kinematic data from multiple photometric and redshift surveys, such as WIYN and APOGEE, M67 and NGC188 are revisited. Mass segregation is identified clearly for both of these old open clusters, either in position or in PM space, since the Bayesian evidence (BE) of the model that includes the segregation parameters is much larger than that of the model without them. Ongoing work is applying this method to the LAMOST released data, which contain a large number of RVs covering ~200 nearby open clusters. If the coming GAIA data can be used, the accuracy of tangential velocities will be largely improved and the intrinsic kinematics of open clusters can be well investigated, even though these velocities are usually less than 1 km/s.
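The membership probabilities described above come from evaluating each star under a mixture of a narrow cluster component and a broad field component. A one-dimensional sketch (radial velocity only; all means, dispersions and the mixing fraction are invented for illustration, not taken from the paper):

```python
import math

def norm_pdf(x, mu, sigma):
    """Gaussian density, used for both mixture components."""
    return math.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

def membership_prob(v, f_cluster, mu_c, sig_c, mu_f, sig_f):
    """Posterior probability that a star with radial velocity v (km/s)
    belongs to the cluster component of a two-component mixture."""
    p_c = f_cluster * norm_pdf(v, mu_c, sig_c)          # cluster term
    p_f = (1.0 - f_cluster) * norm_pdf(v, mu_f, sig_f)  # field term
    return p_c / (p_c + p_f)

# Narrow cluster (sigma = 1 km/s) against a broad field (sigma = 30 km/s)
print(round(membership_prob(0.5, 0.5, 0.0, 1.0, 0.0, 30.0), 3))
print(round(membership_prob(50.0, 0.5, 0.0, 1.0, 0.0, 30.0), 3))
```

Combining several kinematic dimensions multiplies the corresponding likelihood terms, which is why joint use of position, PM and RV sharpens the membership determination.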
Bayesian analysis of inflationary features in Planck and SDSS data
Benetti, Micol
2016-01-01
We perform a Bayesian analysis to study possible features in the primordial inflationary power spectrum of scalar perturbations. In particular, we analyse the possibility of detecting the imprint of these primordial features in the anisotropy temperature power spectrum of the Cosmic Microwave Background (CMB) and also in the matter power spectrum P (k). We use the most recent CMB data provided by the Planck Collaboration and P (k) measurements from the eleventh data release of the Sloan Digital Sky Survey. We focus our analysis on a class of potentials whose features are localised at different intervals of angular scales, corresponding to multipoles in the ranges 10 < l < 60 (Oscill-1) and 150 < l < 300 (Oscill-2). Our results show that one of the step-potentials (Oscill-1) provides a better fit to the CMB data than does the featureless LCDM scenario, with a moderate Bayesian evidence in favor of the former. Adding the P (k) data to the analysis weakens the evidence of the Oscill-1 potential relat...
Barbaro, Elena; Kirchgeorg, Torben; Zangrando, Roberta; Vecchiato, Marco; Piazza, Rossano; Barbante, Carlo; Gambaro, Andrea
2015-10-01
The processes and transformations occurring in the Antarctic aerosol during atmospheric transport were described using selected sugars as source tracers. Monosaccharides (arabinose, fructose, galactose, glucose, mannose, ribose, xylose), disaccharides (sucrose, lactose, maltose, lactulose), alcohol-sugars (erythritol, mannitol, ribitol, sorbitol, xylitol, maltitol, galactitol) and anhydrosugars (levoglucosan, mannosan and galactosan) were measured in the Antarctic aerosol collected during four different sampling campaigns. For quantification, a sensitive high-pressure anion exchange chromatography was coupled with a single quadrupole mass spectrometer. The method was validated, showing good accuracy and low method quantification limits. This study describes the first determination of sugars in the Antarctic aerosol. The total mean concentration of sugars in the aerosol collected at the "Mario Zucchelli" coastal station was 140 pg m-3; as for the aerosol collected over the Antarctic plateau during two consecutive sampling campaigns, the concentration amounted to 440 and 438 pg m-3. The study of particle-size distribution allowed us to identify the natural emission from spores or from sea-spray as the main sources of sugars in the coastal area. The enrichment of sugars in the fine fraction of the aerosol collected on the Antarctic plateau is due to the degradation of particles during long-range atmospheric transport. The composition of sugars in the coarse fraction was also investigated in the aerosol collected during the oceanographic cruise.
Inverse problems in the Bayesian framework
International Nuclear Information System (INIS)
The history of Bayesian methods dates back to the original works of Reverend Thomas Bayes and Pierre-Simon Laplace: the former laid down some of the basic principles of inverse probability in his classic article ‘An essay towards solving a problem in the doctrine of chances’ that was read posthumously in the Royal Society in 1763. Laplace, on the other hand, in his ‘Memoirs on inverse probability’ of 1774 developed the idea of updating beliefs and wrote down the celebrated Bayes’ formula in the form we know today. Although not yet identified as a framework for investigating inverse problems, Laplace used the formalism very much in the spirit it is used today in the context of inverse problems, e.g., in his study of the distribution of comets. With the evolution of computational tools, Bayesian methods have become increasingly popular in all fields of human knowledge in which conclusions need to be drawn based on incomplete and noisy data. Needless to say, inverse problems, almost by definition, fall into this category. Systematic work on developing a Bayesian inverse problem framework can arguably be traced back to the 1980s (the original first edition being published by Elsevier in 1987), although articles on Bayesian methodology applied to inverse problems, in particular in geophysics, had appeared much earlier. Today, as testified by the articles in this special issue, the Bayesian methodology as a framework for considering inverse problems has gained a lot of popularity, and it has integrated very successfully with many traditional inverse problems ideas and techniques, providing novel ways to interpret and implement traditional procedures in numerical analysis, computational statistics, signal analysis and data assimilation. The range of applications where the Bayesian framework has been fundamental goes from geophysics, engineering and imaging to astronomy, life sciences and economy, and continues to grow. There is no question that Bayesian
Bayesian Vision for Shape Recovery
Jalobeanu, Andre
2004-01-01
We present a new Bayesian vision technique that aims at recovering a shape from two or more noisy observations taken under similar lighting conditions. The shape is parametrized by a piecewise linear height field, textured by a piecewise linear irradiance field, and we assume Gaussian Markovian priors for both shape vertices and irradiance variables. The observation process, also known as rendering, is modeled by a non-affine projection (e.g. perspective projection) followed by a convolution with a piecewise linear point spread function, and contamination by additive Gaussian noise. We assume that the observation parameters are calibrated beforehand. The major novelty of the proposed method consists of marginalizing out the irradiances, considered as nuisance parameters, which is achieved by Laplace approximations. This reduces the inference to minimizing an energy that depends only on the shape vertices, and therefore allows an efficient Iterated Conditional Mode (ICM) optimization scheme to be implemented. A Gaussian approximation of the posterior shape density is computed, thus providing estimates of both the geometry and its uncertainty. We illustrate the effectiveness of the new method with shape reconstruction results in a 2D case. A 3D version is currently under development and aims at recovering a surface from multiple images, reconstructing the topography by marginalizing out both albedo and shading.
Bayesian analysis of cosmic structures
Kitaura, Francisco-Shu
2011-01-01
We revise the Bayesian inference steps required to analyse the cosmological large-scale structure. Here we place special emphasis on the complications which arise due to the non-Gaussian character of the galaxy and matter distribution. In particular we investigate the advantages and limitations of the Poisson-lognormal model and discuss how to extend this work. With the lognormal prior, using the Hamiltonian sampling technique and on scales of about 4 h^{-1} Mpc, we find that over-dense regions are excellently reconstructed; however, under-dense regions (void statistics) are quantitatively poorly recovered. Contrary to the maximum a posteriori (MAP) solution, which was shown to over-estimate the density in under-dense regions, we obtain lower densities than in N-body simulations. This is due to the fact that the MAP solution is conservative, whereas the full posterior yields samples which are consistent with the prior statistics. The lognormal prior is not able to capture the full non-linear regime at scales ...
BAYESIAN APPROACH OF DECISION PROBLEMS
Directory of Open Access Journals (Sweden)
DRAGOŞ STUPARU
2010-01-01
Full Text Available Management is nowadays a basic vector of economic development, a concept frequently used in our country as well as all over the world. Regardless of the hierarchical level at which the managerial process takes place, decision represents its essential moment, the supreme act of managerial activity. Decisions are met in all fields of activity, with a practically unlimited degree of coverage, and in all the functions of management. It is common knowledge that the activity of any type of manager, no matter the hierarchical level he occupies, represents a chain of interdependent decisions, their aim being the elimination or limitation of the influence of disturbing factors that may endanger the achievement of predetermined objectives; the quality of managerial decisions conditions the progress and viability of any enterprise. Therefore, one of the principal characteristics of a successful manager is the ability to make high-quality decisions. The quality of managerial decisions is conditioned by the manager's general level of education and specialization, the degree to which they keep up with the latest information and innovations in the theory and practice of management, and their application of modern managerial methods and techniques. Below we present an analysis of decision problems under hazardous conditions in terms of Bayesian theory - a theory that uses probabilistic calculus.
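The Bayesian treatment of a decision problem comes down to two steps: update a prior over the states of nature with evidence via Bayes' rule, then choose the action with the highest expected payoff. A toy sketch, with entirely hypothetical states, probabilities and payoffs:

```python
# Two states of nature, one piece of evidence, two candidate actions.
prior = {"high_demand": 0.4, "low_demand": 0.6}
likelihood = {"good_survey": {"high_demand": 0.7, "low_demand": 0.2}}
payoff = {"expand": {"high_demand": 100.0, "low_demand": -40.0},
          "hold":   {"high_demand": 20.0,  "low_demand": 10.0}}

def posterior(evidence):
    """Bayes' rule: normalise prior-times-likelihood over the states."""
    joint = {s: prior[s] * likelihood[evidence][s] for s in prior}
    z = sum(joint.values())
    return {s: p / z for s, p in joint.items()}

def best_action(post):
    """Pick the action maximising expected payoff under the posterior."""
    expected = {a: sum(post[s] * payoff[a][s] for s in post) for a in payoff}
    return max(expected, key=expected.get), expected

post = posterior("good_survey")
action, expected = best_action(post)
print(action, round(expected[action], 1))
```

Here the favourable evidence shifts the posterior enough that the riskier action becomes optimal, illustrating how Bayesian updating limits the influence of the disturbing factors the paragraph describes.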
Bayesian analysis of volcanic eruptions
Ho, Chih-Hsiang
1990-10-01
The simple Poisson model generally gives a good fit to many volcanoes for volcanic eruption forecasting. Nonetheless, empirical evidence suggests that volcanic activity in successive equal time-periods tends to be more variable than a simple Poisson with constant eruptive rate. An alternative model is therefore examined in which the eruptive rate (λ) for a given volcano or cluster(s) of volcanoes is described by a gamma distribution (prior) rather than treated as a constant value, as in the assumptions of a simple Poisson model. Bayesian analysis is performed to link the two distributions together to give the aggregate behavior of the volcanic activity. When the Poisson process is expanded to accommodate a gamma mixing distribution on λ, a consequence of this mixed (or compound) Poisson model is that the frequency distribution of eruptions in any given time-period of equal length follows the negative binomial distribution (NBD). Applications of the proposed model and comparisons between the generalized model and the simple Poisson model are discussed based on the historical eruptive count data of the volcanoes Mauna Loa (Hawaii) and Etna (Italy). Several relevant facts lead to the conclusion that the generalized model is preferable for practical use both in space and time.
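The gamma-Poisson mixture described above can be checked numerically: drawing λ from a gamma prior and then a Poisson count given λ yields counts whose variance exceeds their mean, the over-dispersion signature of the negative binomial. A sketch with invented parameters (not the Mauna Loa or Etna fits):

```python
import math
import random

random.seed(1)

def poisson_draw(lam):
    """Poisson sample via Knuth's multiplication method (fine for modest lam)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

# Gamma prior on the eruptive rate, then a Poisson count given that rate:
# marginally a negative binomial, hence over-dispersed.
shape, scale = 2.0, 2.0          # prior mean rate = shape * scale = 4 per period
counts = [poisson_draw(random.gammavariate(shape, scale)) for _ in range(20000)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(round(mean, 2), round(var, 2))  # sample variance well above the mean
```

For a simple Poisson the mean and variance would agree; the gamma mixing inflates the variance to mean + mean²/shape, which is what the empirical eruption records exhibit.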
Contaminant source reconstruction by empirical Bayes and Akaike's Bayesian Information Criterion
Zanini, Andrea; Woodbury, Allan D.
2016-02-01
The objective of the paper is to present an empirical Bayesian method combined with Akaike's Bayesian Information Criterion (ABIC) to estimate the contaminant release history of a source in groundwater, starting from a few concentration measurements in space and/or in time. From the Bayesian point of view, the ABIC considers prior information on the unknown function, such as the prior distribution (assumed Gaussian) and the covariance function. The unknown statistical quantities, such as the noise variance and the covariance function parameters, are computed through the process; moreover, the method also quantifies the estimation error through confidence intervals. The methodology was successfully tested on three test cases: the classic Skaggs and Kabala release function, three sharp releases (both cases regard transport in a one-dimensional homogeneous medium), and data collected from laboratory equipment that consists of a two-dimensional homogeneous unconfined aquifer. The performance of the method was tested with two different covariance functions (Gaussian and exponential) and also with large measurement error. The obtained results were discussed and compared to the geostatistical approach of Kitanidis (1995).
International Nuclear Information System (INIS)
Submicron aerosols, ranging in particle diameter from 0.1 μm to 0.001 μm, and in number concentration from 10,000 to 100,000 per cm3, are more or less continuously suspended in the atmosphere we breathe. They usually require in situ measurement of concentration and size distribution with instruments such as diffusion batteries and condensation nucleus counters. Laboratory measurements require the development of submicron aerosol generators. The development of several of these devices and their use in the laboratory and field to measure radioactive as well as inactive aerosols is described
An introduction to Gaussian Bayesian networks.
Grzegorczyk, Marco
2010-01-01
The extraction of regulatory networks and pathways from postgenomic data is important for drug discovery and development, as the extracted pathways reveal how genes or proteins regulate each other. Following up on the seminal paper of Friedman et al. (J Comput Biol 7:601-620, 2000), Bayesian networks have been widely applied as a popular tool to this end in systems biology research. Their popularity stems from the tractability of the marginal likelihood of the network structure, which is a consistent scoring scheme in the Bayesian context. This score is based on an integration over the entire parameter space, for which highly expensive computational procedures have to be applied when using more complex models based on differential equations; for example, see (Bioinformatics 24:833-839, 2008). This chapter gives an introduction to reverse engineering regulatory networks and pathways with Gaussian Bayesian networks, that is, Bayesian networks with the probabilistic BGe scoring metric [see (Geiger and Heckerman 235-243, 1995)]. In the BGe model, the data are assumed to stem from a Gaussian distribution and a normal-Wishart prior is assigned to the unknown parameters. Gaussian Bayesian network methodology for analysing static observational, static interventional, as well as dynamic (observational) time series data will be described in detail in this chapter. Finally, we apply these Bayesian network inference methods (1) to observational and interventional flow cytometry (protein) data from the well-known RAF pathway, to evaluate the global network reconstruction accuracy of Bayesian network inference, and (2) to dynamic gene expression time series data of nine circadian genes in Arabidopsis thaliana, to reverse engineer the unknown regulatory network topology for this domain. PMID:20824469
Lessons learned from case studies of worker exposures to radioactive aerosols
International Nuclear Information System (INIS)
Considerable efforts in the aerosol science and health protection communities are devoted to developing a defensible technical basis for measuring, modeling, and mitigating toxic aerosols. These efforts involve understanding aerosol source terms, projecting potential aerosol releases, describing their behavior in the workplace and environment, developing instruments and techniques to measure the aerosols, designing ways to contain or control the aerosols, modeling and measuring uptake by workers and other people, estimating health effects, and planning appropriate responses. To help in this effort, we have compiled a data base of case studies involving releases of aerosols and worker exposures in a wide range of industries. Sources of information have included personal communications, limited distribution reports, open literature publications, and reports of abnormal occurrences in U.S. Department of Energy facilities and among licensees of the U.S. Nuclear Regulatory Commission. The data base currently includes more than 100 cases. The case studies have been organized according to the radionuclides involved and the circumstances and consequences of the release. This information has been used to address a number of important questions, such as the adequacy of current aerosol sampling and monitoring procedures, areas needing improvement, and strategies for planning for or responding to accidents. One area of particular interest is related to strategies for prospective or retrospective characterization of aerosol source terms. In some cases, worker exposures have involved aerosols that are similar in particle size distribution, composition, and solubility to aerosols routinely produced in the normal process activities. In such cases, prospective characterization of aerosol source terms has provided relevant and useful information
Improved global aerosol datasets for 2008 from Aerosol_cci
Holzer-Popp, Thomas; de Leeuw, Gerrit
2013-04-01
... Users (MACC/ECMWF, AEROCOM) confirmed the relevance of this additional information and encouraged Aerosol_cci to release the current uncertainties. A thorough comparison was conducted for the three AATSR algorithms. Care was taken to compare equal data amounts by common point filtering. It was found that in some cases different filtering led to contradictory validation results. This is not yet completely understood and needs further analysis. One aspect is evidently the anti-correlation between coverage and accuracy, and thus the importance of the applied quality control methods (in particular to avoid cloud contamination). Limitations of the available reference datasets over the open ocean and in the Southern hemisphere also became obvious. The validation showed that all three AATSR algorithms achieve almost equal accuracy but produce differing datasets (similar to the differences between MODIS and MISR). In conclusion, the team recommends using a combination of the three AATSR algorithms, since no single algorithm performs best under all conditions. The intensive validation provides a wealth of information which needs to be fully exploited and can be used to determine future algorithm development priorities. The paper will summarize and discuss the validation results and conclude with an outline of future steps for validation and algorithm improvement.
Preliminary results of the aerosol optical depth retrieval in Johor, Malaysia
International Nuclear Information System (INIS)
Monitoring of atmospheric aerosols over urban areas is important, as tremendous amounts of pollutants are released by industrial activities and heavy traffic flow. Air quality monitoring by satellite observation provides better spatial coverage; however, detailed retrieval of aerosol properties remains a challenge. This is due to the limitation of aerosol retrieval algorithms over high-reflectance (bright surface) areas. The aim of this study is to retrieve aerosol optical depth over urban areas of Iskandar Malaysia, the main southern development zone in Johor state, using Moderate Resolution Imaging Spectroradiometer (MODIS) 500 m resolution data. One of the important steps in the aerosol optical depth retrieval is to characterise the different types of aerosols in the study area. This information will be used to construct a Look Up Table containing the simulated aerosol reflectance and corresponding aerosol optical depth. Thus, in this study we have characterised different aerosol types in the study area using Aerosol Robotic Network (AERONET) data. These data were processed using cluster analysis, and the preliminary results show that the area's aerosol consists of coastal urban (65%), polluted urban (27.5%), dust (6%) and heavy-pollution (1.5%) types.
Aerosols from biomass combustion
Energy Technology Data Exchange (ETDEWEB)
Nussbaumer, T.
2001-07-01
This report is the proceedings of a seminar on biomass combustion and aerosol production organised jointly by the International Energy Agency's (IEA) Task 32 on bioenergy and the Swiss Federal Office of Energy (SFOE). This collection of 16 papers discusses the production of aerosols and fine particles by the burning of biomass and their effects. Expert knowledge is presented on the environmental impact of aerosols, formation mechanisms, measurement technologies, methods of analysis and measures to be taken to reduce such emissions. The seminar, attended by 50 participants from 11 countries, shows, according to the authors, that the reduction of aerosol emissions resulting from biomass combustion will remain a challenge for the future.
Sodium oxide aerosol filtration
International Nuclear Information System (INIS)
Within the scope of the sodium aerosol trapping research effort of the CEA/DSN, the retention capacity and yield were measured for very-high-efficiency fiberglass filters and several types of prefilters (cyclone agglomerator, fabric prefilters, water scrubbers). (author)
Aerosol removal by emergency spray in PWR containment: synthesis of the TOSQAN aerosol tests
International Nuclear Information System (INIS)
During the course of a severe accident in a nuclear Pressurized Water Reactor (PWR), the reactor containment is pressurized by steam and hydrogen released from a primary-circuit breach and distributed through the containment by convective flows and steam wall condensation. In addition, core degradation leads to fission product release into the containment. Water spraying is used in the containment as a mitigation measure to reduce pressure, remove fission products and enhance gas mixing when hydrogen is present. This paper presents the synthesis of the results of the TOSQAN aerosol program undertaken by the Institut de Radioprotection et de Surete Nucleaire (IRSN), devoted to the study of aerosol removal by a spray under thermal-hydraulic conditions typical of accidents in a PWR containment. (author)
Transfer of radioactive aerosol from unit shelter in boundary atmosphere layer
International Nuclear Information System (INIS)
The transfer of radioactive aerosol in the atmospheric boundary layer was evaluated for normal conditions of the Shelter unit and for different emergency scenarios. Under normal conditions of the Shelter unit, the additional radioactive contamination of surface air in the close ChNPP zone results from the simultaneous activity of two sources: unorganized removal of radioactive aerosols through gaps in the 'Shelter' and release of aerosol particles through the ventilating duct of power blocks 3 and 4. A software shell was created to implement computational mathematical models evaluating the transfer of radioactive aerosol from the 'Shelter' unit
Emergency protection from aerosols
International Nuclear Information System (INIS)
Expedient methods were developed that could be used by an average person, using only materials readily available, to protect himself and his family from injury by toxic (e.g., radioactive) aerosols. The most effective means of protection was the use of a household vacuum cleaner to maintain a small positive pressure on a closed house during passage of the aerosol cloud. Protection factors of 800 and above were achieved
Emergency Protection from Aerosols
Energy Technology Data Exchange (ETDEWEB)
Cristy, G.A.
2001-11-13
Expedient methods were developed that could be used by an average person, using only materials readily available, to protect himself and his family from injury by toxic (e.g., radioactive) aerosols. The most effective means of protection was the use of a household vacuum cleaner to maintain a small positive pressure on a closed house during passage of the aerosol cloud. Protection factors of 800 and above were achieved.
Kahn, Ralph A.
2014-01-01
AeroCom is an open international initiative of scientists interested in the advancement of the understanding of global aerosol properties and aerosol impacts on climate. A central goal is to more strongly tie and constrain modeling efforts to observational data. A major element for exchanges between data and modeling groups is the annual meeting. The meeting was held September 20 through October 2, 2014, and the organizers would like to post the presentations.
Computationally efficient Bayesian inference for inverse problems.
Energy Technology Data Exchange (ETDEWEB)
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
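The surrogate-accelerated workflow described in the abstract can be sketched in a one-dimensional toy problem: an inexpensive polynomial fit replaces the costly forward model before the posterior is explored. The forward model, prior range, and noise level below are illustrative assumptions, not taken from the report.

```python
import numpy as np

def forward(x):
    # Stand-in for an expensive forward model (illustrative, monotone).
    return np.exp(0.5 * x)

# Synthetic data: one perturbed observation of the true parameter.
x_true, sigma = 0.8, 0.05
y_obs = forward(x_true) + 0.01  # a fixed perturbation keeps this deterministic

# Offline stage: fit a cheap polynomial surrogate of the forward model
# over the support of a uniform prior on [0, 2].
x_train = np.linspace(0.0, 2.0, 20)
surrogate = np.poly1d(np.polyfit(x_train, forward(x_train), deg=6))

# Online stage: evaluate the (unnormalised) log-posterior on a grid,
# calling only the fast surrogate instead of the expensive model.
x_grid = np.linspace(0.0, 2.0, 2001)
log_post = -0.5 * ((y_obs - surrogate(x_grid)) / sigma) ** 2
x_map = x_grid[np.argmax(log_post)]
```

With a monotone forward model the surrogate posterior is unimodal and `x_map` lands close to `x_true`; in the high-dimensional spatiotemporal settings the abstract targets, the same offline/online split is what makes MCMC affordable.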
Dimensionality reduction in Bayesian estimation algorithms
Directory of Open Access Journals (Sweden)
G. W. Petty
2013-03-01
An idealized synthetic database loosely resembling 3-channel passive microwave observations of precipitation against a variable background is employed to examine the performance of a conventional Bayesian retrieval algorithm. For this dataset, algorithm performance is found to be poor owing to an irreconcilable conflict between the need to find matches in the dependent database and the need to exclude inappropriate matches. It is argued that the likelihood of such conflicts increases sharply with the dimensionality of the observation space of real satellite sensors, which may utilize 9 to 13 channels to retrieve precipitation, for example. An objective method is described for distilling the relevant information content from N real channels into a much smaller number (M) of pseudochannels while also regularizing the background (geophysical plus instrument) noise component. The pseudochannels are linear combinations of the original N channels obtained via a two-stage principal component analysis of the dependent dataset. Bayesian retrievals based on a single pseudochannel applied to the independent dataset yield striking improvements in overall performance. The differences between the conventional Bayesian retrieval and the reduced-dimensional Bayesian retrieval suggest that a major potential problem with conventional multichannel retrievals, whether Bayesian or not, lies in the common but often inappropriate assumption of diagonal error covariance. The dimensional reduction technique described herein avoids this problem by, in effect, recasting the retrieval problem in a coordinate system in which the desired covariance is lower-dimensional, diagonal, and of unit magnitude.
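The two-stage reduction can be sketched as follows: whiten the observations with the background-noise covariance, then take the leading principal components of the whitened data as pseudochannels. The 3-channel synthetic data, signal projection vector, and noise model below are all illustrative assumptions, not the paper's database.

```python
import numpy as np

rng = np.random.default_rng(1)
N, n = 3, 5000  # channels, samples

# Synthetic "dependent dataset": a scalar geophysical signal s projected
# onto 3 channels, plus correlated background noise (all illustrative).
proj = np.array([1.0, -0.5, 0.3])
s = rng.gamma(2.0, 1.0, n)
L = np.array([[0.5, 0.0, 0.0], [0.2, 0.4, 0.0], [0.1, 0.1, 0.3]])
noise = rng.normal(size=(n, N)) @ L.T
X = np.outer(s, proj) + noise

# Stage 1: whiten with the background-noise covariance so that the
# background component becomes isotropic with unit variance.
Sigma_b = L @ L.T
evals, evecs = np.linalg.eigh(Sigma_b)
W = evecs @ np.diag(evals ** -0.5) @ evecs.T
Xw = (X - X.mean(axis=0)) @ W

# Stage 2: PCA of the whitened data; the leading M << N components
# define the pseudochannels carrying the signal-related variance.
U, svals, Vt = np.linalg.svd(Xw, full_matrices=False)
M = 1
pseudo = Xw @ Vt[:M].T  # n x M pseudochannel values

# The single pseudochannel should track the signal closely.
r = np.corrcoef(pseudo[:, 0], s)[0, 1]
```

After whitening, the background contributes exactly unit variance in every direction, so any principal component with variance well above one is signal-dominated, which is the regularizing effect the abstract describes.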
Tactile length contraction as Bayesian inference.
Tong, Jonathan; Ngo, Vy; Goldreich, Daniel
2016-08-01
To perceive, the brain must interpret stimulus-evoked neural activity. This is challenging: The stochastic nature of the neural response renders its interpretation inherently uncertain. Perception would be optimized if the brain used Bayesian inference to interpret inputs in light of expectations derived from experience. Bayesian inference would improve perception on average but cause illusions when stimuli violate expectation. Intriguingly, tactile, auditory, and visual perception are all prone to length contraction illusions, characterized by the dramatic underestimation of the distance between punctate stimuli delivered in rapid succession; the origin of these illusions has been mysterious. We previously proposed that length contraction illusions occur because the brain interprets punctate stimulus sequences using Bayesian inference with a low-velocity expectation. A novel prediction of our Bayesian observer model is that length contraction should intensify if stimuli are made more difficult to localize. Here we report a tactile psychophysical study that tested this prediction. Twenty humans compared two distances on the forearm: a fixed reference distance defined by two taps with 1-s temporal separation and an adjustable comparison distance defined by two taps with temporal separation t ≤ 1 s. We observed significant length contraction: As t was decreased, participants perceived the two distances as equal only when the comparison distance was made progressively greater than the reference distance. Furthermore, the use of weaker taps significantly enhanced participants' length contraction. These findings confirm the model's predictions, supporting the view that the spatiotemporal percept is a best estimate resulting from a Bayesian inference process. PMID:27121574
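The shrinkage at the heart of the proposed observer model can be written down directly for a Gaussian likelihood and prior: the low-velocity expectation implies a zero-mean prior on length whose width grows with the inter-tap interval t. The parameter values below are illustrative assumptions, not the study's fitted values.

```python
def perceived_length(l_measured, t, sigma_m, sigma_v=10.0):
    """Posterior-mean length under a zero-mean low-velocity prior.

    A Gaussian likelihood N(l; l_measured, sigma_m^2) is combined with
    the prior l ~ N(0, (sigma_v * t)^2) implied by a low-velocity
    expectation v = l / t ~ N(0, sigma_v^2). All numbers here are
    illustrative, not fits from the psychophysical data.
    """
    prior_var = (sigma_v * t) ** 2
    shrink = prior_var / (prior_var + sigma_m ** 2)
    return shrink * l_measured

# Shorter inter-tap interval -> stronger contraction of the percept.
mild = perceived_length(10.0, t=1.0, sigma_m=2.0)
strong = perceived_length(10.0, t=0.1, sigma_m=2.0)
# Weaker taps (noisier localisation) -> stronger contraction as well.
weak_taps = perceived_length(10.0, t=0.1, sigma_m=4.0)
```

Both qualitative predictions tested in the paper fall out of the single shrinkage factor: decreasing t or increasing the localisation noise sigma_m pulls the percept further toward zero length.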
Bayesian Methods for Medical Test Accuracy
Directory of Open Access Journals (Sweden)
Lyle D. Broemeling
2011-05-01
Bayesian methods for medical test accuracy are presented, beginning with the basic measures for tests with binary scores: true positive fraction, false positive fraction, positive predictive value, and negative predictive value. The Bayesian approach is taken because of its efficient use of prior information, and the analysis is executed with the Bayesian software package WinBUGS®. The ROC (receiver operating characteristic) curve gives the intrinsic accuracy of medical tests that have ordinal or continuous scores, and the Bayesian approach is illustrated with many examples from cancer and other diseases. Medical tests include X-ray, mammography, ultrasound, computed tomography, magnetic resonance imaging, nuclear medicine and tests based on biomarkers, such as blood glucose values for diabetes. The presentation continues with more specialized methods suitable for measuring the accuracy of clinical studies that have verification bias, and of medical tests without a gold standard. Lastly, the review concludes with Bayesian methods for measuring the accuracy of a combination of two or more tests.
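For the binary-score measures listed above, the basic Bayesian analysis is a conjugate Beta-binomial update; a minimal sketch, with a hypothetical 2x2 study and uniform priors (not an example from the book):

```python
def beta_posterior(successes, failures, a=1.0, b=1.0):
    """Posterior mean and variance of a Beta(a, b)-binomial update."""
    a_post, b_post = a + successes, b + failures
    mean = a_post / (a_post + b_post)
    var = a_post * b_post / ((a_post + b_post) ** 2 * (a_post + b_post + 1))
    return mean, var

# Hypothetical study: 90/100 diseased subjects test positive (TPF),
# 160/200 healthy subjects test negative (specificity = 1 - FPF).
tpf, _ = beta_posterior(90, 10)
spec, _ = beta_posterior(160, 40)

# Positive predictive value at an assumed 10% disease prevalence,
# combining the posterior means via Bayes' rule.
prev = 0.1
ppv = prev * tpf / (prev * tpf + (1.0 - prev) * (1.0 - spec))
```

The same conjugate update is what WinBUGS effectively computes for these simple measures; the sampler only becomes necessary for the verification-bias and no-gold-standard models mentioned later in the review.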
Bayesian tomographic reconstruction of microsystems
Salem, Sofia Fekih; Vabre, Alexandre; Mohammad-Djafari, Ali
2007-11-01
X-ray transmission microtomography plays an increasingly dominant role in the study and understanding of microsystems. Within this framework, an experimental setup for high-resolution X-ray microtomography was developed at CEA-List to quantify the physical parameters related to fluid flow in microsystems. Several difficulties arise from the nature of the experimental data collected on this setup: increased measurement errors due to various physical phenomena occurring during image formation (diffusion, beam hardening), and specificities of the setup (limited angle, partial view of the object, weak contrast). To reconstruct the object we must solve an inverse problem, which is known to be ill-posed and therefore needs to be regularized by introducing prior information. The main prior information we account for is that the object is composed of a finite, known number of different materials distributed in compact regions. This a priori information is introduced via a Gauss-Markov field for the contrast distributions, with a hidden Potts-Markov field for the material classes, in the Bayesian estimation framework. The computations are done using an appropriate Markov Chain Monte Carlo (MCMC) technique. In this paper, we first present the basic steps of the proposed algorithms. Then we focus on one of the main steps in any iterative reconstruction method, the computation of the forward and adjoint operators (projection and backprojection). A fast implementation of these two operators is crucial for the real application of the method. We give some details on the fast computation of these steps and show some preliminary results of simulations.
Experimental investigations on nuclear aerosols in a severe accident
DELGADO TARDÁGUILA, ROSARIO
2016-01-01
[EN] In case of a severe accident in an NPP, fission products are released from the degraded fuel and may reach the environment if their confinement is lost and/or bypassed. Given the highly radio-toxic nature of nuclear aerosols for the environment and population, their unrestricted release must be absolutely avoided. One particular situation is the core meltdown sequence with steam generator tube rupture (SGTR). The containment bypass turns this sequence into an indispensable scenario to mode...
A Large Sample Study of the Bayesian Bootstrap
Lo, Albert Y.
1987-01-01
An asymptotic justification of the Bayesian bootstrap is given. Large-sample Bayesian bootstrap probability intervals for the mean, the variance and bands for the distribution, the smoothed density and smoothed rate function are also provided.
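The construction justified in the paper is simple enough to state in code: each replicate draws Dirichlet(1, ..., 1) weights over the observed values and recomputes the functional of interest, and quantiles of the replicates give the probability interval. The data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)
data = rng.normal(5.0, 2.0, size=200)  # synthetic sample

# Each Bayesian-bootstrap replicate reweights the observed values with
# flat-Dirichlet weights (rather than resampling them with replacement,
# as Efron's bootstrap does).
B = 4000
weights = rng.dirichlet(np.ones(data.size), size=B)  # B x n
boot_means = weights @ data                          # draws of the mean
lo, hi = np.quantile(boot_means, [0.025, 0.975])     # 95% interval
```

Every observation receives a strictly positive weight in every replicate, which is what underlies the large-sample Bayesian interpretation of these intervals.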
Organic aerosols from biomass burning in Amazonian rain forest and their impact onto the environment
International Nuclear Information System (INIS)
A field campaign performed in southern Brazilian Amazonia in 1993 proved that this region is subjected to fallout of particulate exhaust released by fires of forest biomass. In fact, the organic content of aerosols collected at urban sites located on the border of the rain forest, about 50 km from the fires, was similar to that of biomass-burning exhaust. The aerosol composition is indicative of the intentional origin of the fires. However, the organic content seems to be influenced by two additional sources, i.e. motor vehicles and emissions from high vegetation. The chemical pattern of the organic aerosols released by forest biomass burning seems to promote the occurrence of photochemical smog episodes in that region
Calculations of sodium aerosol concentrations at breeder reactor air intake ports
International Nuclear Information System (INIS)
This report describes the methodology used and results obtained in efforts to estimate the sodium aerosol concentrations at air intake ports of a liquid-metal cooled, fast-breeder nuclear reactor. A range of wind speeds from 2 to 10 m/s is assumed, and an effort is made to include building wake effects which in many cases dominate the dispersal of aerosols near buildings. For relatively small release rates on the order of 1 to 10 kg/s, it is suggested that the plume rise will be small and that estimates of aerosol concentrations may be derived using the methodology of Wilson and Britter (1982), which describes releases from surface vents. For more acute releases with release rates on the order of 100 kg/s, much higher release velocities are expected, and plume rise must be considered. Both momentum-driven and density-driven plume rise are considered. An effective increase in release height is computed using the Split-H methodology with a parameterization suggested by Ramsdell (1983), and the release source strength was transformed to rooftop level. Evaluation of the acute release aerosol concentration was then based on the methodology for releases from a surface release of this transformed source strength
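For the small-release case, surface-vent wake methodologies of the kind cited reduce to a building-wake dilution scaling of the form C ≈ Q / (K·U·A). The wake constant and building geometry below are illustrative assumptions, not values from the report.

```python
def wake_concentration(Q, U, A, K=1.0):
    """Building-wake dilution estimate C = Q / (K * U * A).

    Q: release rate (kg/s), U: wind speed (m/s), A: building
    cross-sectional area (m^2), K: empirical wake constant of order
    one. A generic scaling in the spirit of surface-vent wake models;
    the constant and the inputs below are illustrative assumptions.
    """
    return Q / (K * U * A)

# Small surface release (1 kg/s) at the low and high ends of the
# 2-10 m/s wind-speed range considered, for a 20 m x 30 m face.
c_low_wind = wake_concentration(1.0, 2.0, 600.0)    # kg/m^3 at 2 m/s
c_high_wind = wake_concentration(1.0, 10.0, 600.0)  # kg/m^3 at 10 m/s
```

The inverse dependence on U is why the low end of the assumed wind-speed range bounds the near-building concentration; for the 100 kg/s acute releases, plume rise breaks this simple scaling, as the report notes.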
Bayesian Methods for Radiation Detection and Dosimetry
Groer, Peter G
2002-01-01
We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high- and low-activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities. Graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic processes in a method to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual with radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed comp...
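The net-activity idea, reporting a full probability density for the net rate rather than a point value with error bars, can be sketched with Poisson likelihoods and flat priors on a grid. The counts and counting times below are invented for illustration and are not from the project.

```python
import numpy as np

# Gross and background counts over equal counting times (illustrative).
n_gross, n_bkg, t = 48, 30, 60.0  # counts, counts, seconds

# Joint Poisson likelihood for the net rate s and background rate b on
# a grid, with flat priors on s >= 0 and b >= 0; b is then marginalised
# numerically, leaving a density for the net rate alone.
s = np.linspace(0.0, 1.5, 301)
b = np.linspace(1e-4, 1.5, 301)           # start above 0 to avoid log(0)
S, B = np.meshgrid(s, b, indexing="ij")
log_like = (n_gross * np.log((S + B) * t) - (S + B) * t
            + n_bkg * np.log(B * t) - B * t)
post = np.exp(log_like - log_like.max()).sum(axis=1)  # marginalise b
ds = s[1] - s[0]
post /= post.sum() * ds                    # normalise the density

s_mean = (s * post).sum() * ds             # posterior mean net rate
```

Because the density lives on s >= 0 by construction, it stays meaningful even when the gross count barely exceeds (or falls below) background, exactly the regime where a point estimate with symmetric error bars misleads.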
BAMBI: blind accelerated multimodal Bayesian inference
Graff, Philip; Hobson, Michael P; Lasenby, Anthony
2011-01-01
In this paper we present an algorithm for rapid Bayesian analysis that combines the benefits of nested sampling and artificial neural networks. The blind accelerated multimodal Bayesian inference (BAMBI) algorithm implements the MultiNest package for nested sampling as well as the training of an artificial neural network (NN) to learn the likelihood function. In the case of computationally expensive likelihoods, this allows the substitution of a much more rapid approximation in order to increase significantly the speed of the analysis. We begin by demonstrating, with a few toy examples, the ability of a NN to learn complicated likelihood surfaces. BAMBI's ability to decrease running time for Bayesian inference is then demonstrated in the context of estimating cosmological parameters from WMAP and other observations. We show that valuable speed increases are achieved in addition to obtaining NNs trained on the likelihood functions for the different model and data combinations. These NNs can then be used for an...
Learning Bayesian Networks from Correlated Data
Bae, Harold; Monti, Stefano; Montano, Monty; Steinberg, Martin H.; Perls, Thomas T.; Sebastiani, Paola
2016-05-01
Bayesian networks are probabilistic models that represent complex distributions in a modular way and have become very popular in many fields. There are many methods to build Bayesian networks from a random sample of independent and identically distributed observations. However, many observational studies are designed using some form of clustered sampling that introduces correlations between observations within the same cluster and ignoring this correlation typically inflates the rate of false positive associations. We describe a novel parameterization of Bayesian networks that uses random effects to model the correlation within sample units and can be used for structure and parameter learning from correlated data without inflating the Type I error rate. We compare different learning metrics using simulations and illustrate the method in two real examples: an analysis of genetic and non-genetic factors associated with human longevity from a family-based study, and an example of risk factors for complications of sickle cell anemia from a longitudinal study with repeated measures.
Dynamic Bayesian Combination of Multiple Imperfect Classifiers
Simpson, Edwin; Psorakis, Ioannis; Smith, Arfon
2012-01-01
Classifier combination methods need to make best use of the outputs of multiple, imperfect classifiers to enable higher accuracy classifications. In many situations, such as when human decisions need to be combined, the base decisions can vary enormously in reliability. A Bayesian approach to such uncertain combination allows us to infer the differences in performance between individuals and to incorporate any available prior knowledge about their abilities when training data is sparse. In this paper we explore Bayesian classifier combination, using the computationally efficient framework of variational Bayesian inference. We apply the approach to real data from a large citizen science project, Galaxy Zoo Supernovae, and show that our method far outperforms other established approaches to imperfect decision combination. We go on to analyse the putative community structure of the decision makers, based on their inferred decision making strategies, and show that natural groupings are formed. Finally we present ...
Bayesian Inference Methods for Sparse Channel Estimation
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand
2013-01-01
This thesis deals with sparse Bayesian learning (SBL) with application to radio channel estimation. As opposed to the classical approach for sparse signal representation, we focus on the problem of inferring complex signals. Our investigations within SBL constitute the basis for the development of Bayesian inference algorithms for sparse channel estimation. Sparse inference methods aim at finding the sparse representation of a signal given in some overcomplete dictionary of basis vectors. Within this context, one of our main contributions to the field of SBL is a hierarchical representation of...... inference algorithms based on the proposed prior representation for sparse channel estimation in orthogonal frequency-division multiplexing receivers. The inference algorithms, which are mainly obtained from variational Bayesian methods, exploit the underlying sparse structure of wireless channel responses......
Bayesian Image Reconstruction Based on Voronoi Diagrams
Cabrera, G F; Hitschfeld, N
2007-01-01
We present a Bayesian Voronoi image reconstruction technique (VIR) for interferometric data. Bayesian analysis applied to the inverse problem allows us to derive the a posteriori probability of a novel parameterization of interferometric images. We use a variable Voronoi diagram as our model in place of the usual fixed pixel grid. A quantization of the intensity field allows us to calculate the likelihood function and a priori probabilities. The Voronoi image is optimized including the number of polygons as free parameters. We apply our algorithm to deconvolve simulated interferometric data. Residuals, restored images and chi^2 values are used to compare our reconstructions with fixed-grid models. VIR has the advantage of modeling the image with few parameters, obtaining a better image from a Bayesian point of view.
Bayesian Fusion of Multi-Band Images
Wei, Qi; Tourneret, Jean-Yves
2013-01-01
In this paper, a Bayesian fusion technique for remotely sensed multi-band images is presented. The observed images are related to the high spectral and high spatial resolution image to be recovered through physical degradations, e.g., spatial and spectral blurring and/or subsampling defined by the sensor characteristics. The fusion problem is formulated within a Bayesian estimation framework. An appropriate prior distribution exploiting geometrical consideration is introduced. To compute the Bayesian estimator of the scene of interest from its posterior distribution, a Markov chain Monte Carlo algorithm is designed to generate samples asymptotically distributed according to the target distribution. To efficiently sample from this high-dimension distribution, a Hamiltonian Monte Carlo step is introduced in the Gibbs sampling strategy. The efficiency of the proposed fusion method is evaluated with respect to several state-of-the-art fusion techniques. In particular, low spatial resolution hyperspectral and mult...
International Nuclear Information System (INIS)
A special committee on 'Research on the analysis methods for accident consequence of nuclear fuel facilities (NFFs)' was organized by the Atomic Energy Society of Japan under the entrustment of the Japan Atomic Energy Agency for research on state-of-the-art consequence analysis methods for Probabilistic Safety Assessment (PSA) of NFFs, such as fuel reprocessing and fuel fabrication facilities. The objective of this research is to obtain basic information useful for establishing quantitative performance requirements and for risk-informed regulation, by identifying the issues that need to be resolved in applying PSA to NFFs. The research activities of the committee were mainly focused on accidents with consequences more severe than the design basis, such as events of criticality, explosion, fire, and boiling of radioactive solution postulated in NFFs resulting in the release of radioactive materials into the environment. The research results are summarized in this technical report, including basic experimental data related to key physical and chemical phenomena postulated in a boiling event of a radioactive solution storage tank caused by the loss of the cooling function. (author)
Characterisation of Aerosols from Simulated Radiological Dispersion Events
Di Lemma, F.G.
2015-01-01
The research described in this thesis aims at improving the evaluation of the radioactive aerosol release from different Radiological Dispersion Events (RDEs), such as accidents and sabotage involving radioactive and nuclear materials. These studies help in a better assessment of the source term as
Comparison of the Bayesian and Frequentist Approach to the Statistics
Hakala, Michal
2015-01-01
The thesis deals with an introduction to Bayesian statistics and a comparison of the Bayesian approach with the frequentist approach to statistics. Bayesian statistics is a modern branch of statistics which provides a comprehensive alternative theory to the frequentist approach. Bayesian concepts provide solutions for problems not solvable by frequentist theory. The thesis compares the definitions, concepts and quality of statistical inference. The main interest is focused on a point estimation, an in...
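The contrast in point estimation that the thesis examines is easiest to see for a binomial proportion, where a conjugate Beta prior gives the Bayesian estimate in closed form. The numbers below are illustrative, not examples from the thesis.

```python
def estimates(successes, n, a=1.0, b=1.0):
    """Frequentist MLE vs Bayesian posterior mean for a proportion.

    The Bayesian estimate uses a conjugate Beta(a, b) prior; with the
    uniform Beta(1, 1) prior it reduces to the classic Laplace rule of
    succession, (successes + 1) / (n + 2).
    """
    mle = successes / n
    post_mean = (successes + a) / (n + a + b)
    return mle, post_mean

# The two approaches diverge most in small samples and at the edges of
# the parameter space, e.g. 0 successes out of 5 trials:
mle_small, bayes_small = estimates(0, 5)     # MLE is exactly 0; Bayes is not
mle_large, bayes_large = estimates(60, 100)  # nearly identical for large n
```

The zero-success case is a standard example of a problem where the frequentist MLE degenerates while the Bayesian estimate remains sensible, which is the kind of comparison the thesis develops.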
Revisiting k-means: New Algorithms via Bayesian Nonparametrics
Kulis, Brian; Jordan, Michael I.
2011-01-01
Bayesian models offer great flexibility for clustering applications---Bayesian nonparametrics can be used for modeling infinite mixtures, and hierarchical Bayesian models can be utilized for sharing clusters across multiple data sets. For the most part, such flexibility is lacking in classical clustering methods such as k-means. In this paper, we revisit the k-means clustering algorithm from a Bayesian nonparametric viewpoint. Inspired by the asymptotic connection between k-means and mixtures...
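The asymptotic connection alluded to above yields a concrete algorithm, widely known as DP-means: run k-means-style updates, but open a new cluster whenever a point's squared distance to every existing centroid exceeds a penalty lambda, so the number of clusters is inferred rather than fixed. A minimal sketch on synthetic blobs (all data and parameter values are illustrative):

```python
import numpy as np

def dp_means(X, lam, n_iter=25):
    """DP-means: the small-variance limit of a Dirichlet-process mixture.

    A new cluster is opened whenever a point's squared distance to every
    centroid exceeds the penalty lam; otherwise updates mimic k-means.
    """
    centroids = [X.mean(axis=0)]
    for _ in range(n_iter):
        # Assignment step, creating clusters on demand.
        labels = []
        for x in X:
            d2 = [np.sum((x - c) ** 2) for c in centroids]
            j = int(np.argmin(d2))
            if d2[j] > lam:
                centroids.append(x.copy())
                j = len(centroids) - 1
            labels.append(j)
        labels = np.array(labels)
        # Update step: recompute centroids of non-empty clusters.
        centroids = [X[labels == j].mean(axis=0)
                     for j in range(len(centroids)) if np.any(labels == j)]
    return np.array(centroids), labels

# Three well-separated Gaussian blobs; a lam between the within-cluster
# and between-cluster squared distances recovers k = 3 automatically.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2))
               for m in ([0, 0], [5, 0], [0, 5])])
centroids, labels = dp_means(X, lam=4.0)
```

The penalty lambda plays the role of the Dirichlet-process concentration parameter in the limit: larger lambda tolerates looser clusters and yields fewer of them.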
An Improved Algorithm of Bayesian Text Categorization
Directory of Open Access Journals (Sweden)
Tao Dong
2011-08-01
Text categorization is a fundamental methodology of text mining and has been a hot topic in data mining and web mining research in recent years. It plays an important role in traditional information retrieval, web indexing architecture, Web information retrieval, and so on. This paper presents an improved text categorization algorithm that combines a feature weighting technique with a Naïve Bayesian classifier. Experimental results show that using the improved Gini index algorithm for feature weighting can effectively improve the performance of the Naïve Bayesian classifier. The algorithm has found good application in a sensitive-information recognition system.
Bayesian Optimisation Algorithm for Nurse Scheduling
Li, Jingpeng
2008-01-01
Our research has shown that schedules can be built mimicking a human scheduler by using a set of rules that involve domain knowledge. This chapter presents a Bayesian Optimization Algorithm (BOA) for the nurse scheduling problem that chooses suitable scheduling rules from a set for each nurse's assignment. Based on the idea of using probabilistic models, the BOA builds a Bayesian network for the set of promising solutions and samples this network to generate new candidate solutions. Computational results from 52 real data instances demonstrate the success of this approach. It is also suggested that the learning mechanism in the proposed algorithm may be suitable for other scheduling problems.
Bayesian estimation and tracking a practical guide
Haug, Anton J
2012-01-01
A practical approach to estimating and tracking dynamic systems in real-world applications. Much of the literature on performing estimation for non-Gaussian systems is short on practical methodology, while Gaussian methods often lack a cohesive derivation. Bayesian Estimation and Tracking addresses the gap in the field on both accounts, providing readers with a comprehensive overview of methods for estimating both linear and nonlinear dynamic systems driven by Gaussian and non-Gaussian noise. Featuring a unified approach to Bayesian estimation and tracking, the book emphasizes the derivation
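For the linear-Gaussian special case, the Bayesian estimation-and-tracking machinery collapses to the Kalman filter recursions; a scalar sketch with illustrative variances (not an example from the book):

```python
def kalman_1d(zs, q=0.01, r=0.5, x0=0.0, p0=1.0):
    """Minimal 1-D Kalman filter (linear-Gaussian Bayesian tracking).

    Random-walk state with process variance q, measurements with noise
    variance r; returns the sequence of filtered state estimates. The
    variances and data below are illustrative assumptions.
    """
    x, p, out = x0, p0, []
    for z in zs:
        p = p + q                 # predict: state is a random walk
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the new measurement
        p = (1.0 - k) * p         # posterior variance
        out.append(x)
    return out

# Noisy measurements of a constant level near 1.0.
zs = [1.2, 0.8, 1.1, 0.9, 1.05, 0.95, 1.0, 1.02]
est = kalman_1d(zs)
```

Each pass through the loop is one predict-update cycle of Bayesian filtering: the prior from the motion model is combined with the measurement likelihood, and the gain k is exactly the precision-weighted compromise between them.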
Bayesian Just-So Stories in Psychology and Neuroscience
Bowers, Jeffrey S.; Davis, Colin J.
2012-01-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make 3 main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak.…
A Gentle Introduction to Bayesian Analysis : Applications to Developmental Research
Van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A G
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, t
A SAS Interface for Bayesian Analysis with WinBUGS
Zhang, Zhiyong; McArdle, John J.; Wang, Lijuan; Hamagami, Fumiaki
2008-01-01
Bayesian methods are becoming very popular despite some practical difficulties in implementation. To assist in the practical application of Bayesian methods, we show how to implement Bayesian analysis with WinBUGS as part of a standard set of SAS routines. This implementation procedure is first illustrated by fitting a multiple regression model…
A Fast Iterative Bayesian Inference Algorithm for Sparse Channel Estimation
DEFF Research Database (Denmark)
Pedersen, Niels Lovmand; Manchón, Carles Navarro; Fleury, Bernard Henri
2013-01-01
representation of the Bessel K probability density function; a highly efficient, fast iterative Bayesian inference method is then applied to the proposed model. The resulting estimator outperforms other state-of-the-art Bayesian and non-Bayesian estimators, either by yielding lower mean squared estimation error...
International Nuclear Information System (INIS)
When gaseous uranium hexafluoride (UF6) is released into the atmosphere, it rapidly reacts with ambient moisture to form an aerosol of uranyl fluoride (UO2F2) and hydrogen fluoride (HF). As part of our Safety Analysis program, we have performed several experimental releases of UF6 in contained volumes in order to investigate techniques for sampling and characterizing the aerosol materials. The aggregate particle morphology and size distribution have been found to be dependent upon several conditions, including the temperature of the UF6 at the time of its release, the relative humidity of the air into which it is released, and the elapsed time after the release. Aerosol composition and settling rate have been investigated using stationary samplers for the separate collection of UO2F2 and HF and via laser spectroscopic remote sensing (Mie scatter and infrared spectroscopy). 25 refs., 16 figs., 5 tabs
Physical metrology of aerosols; Metrologie physique des aerosols
Energy Technology Data Exchange (ETDEWEB)
Boulaud, D.; Vendel, J. [CEA Saclay, 91 - Gif-sur-Yvette (France). Inst. de Protection et de Surete Nucleaire
1996-12-31
The various detection and measuring methods for aerosols are presented, and their selection is related to aerosol characteristics (size range, concentration or mass range), thermo-hydraulic conditions (carrier fluid temperature, pressure and flow rate) and to the measuring system conditions (measuring frequency, data collection speed, cost...). Methods based on aerosol dynamic properties (inertial, diffusional and electrical methods) and aerosol optical properties (localized and integral methods) are described and their performances and applications are compared
Kopka, P.; Wawrzynczak, A.; Borysiewicz, M.
2015-09-01
In many areas of application, a central problem is the solution of an inverse problem, in particular the estimation of unknown model parameters so that the underlying dynamics of a physical system can be modelled precisely. In this situation, Bayesian inference is a powerful tool for combining observed data with prior knowledge to obtain the probability distribution of the searched parameters. We have applied the modern methodology named Sequential Approximate Bayesian Computation (S-ABC) to the problem of tracing an atmospheric contaminant source. ABC is a technique commonly used in the Bayesian analysis of complex models and dynamic systems; sequential methods can significantly increase its efficiency. In the presented algorithm, the input data are the on-line arriving concentrations of the released substance registered by a distributed sensor network from the OVER-LAND ATMOSPHERIC DISPERSION (OLAD) experiment. The algorithm outputs are the probability distributions of the contamination source parameters, i.e. its location, release rate, speed and direction of movement, start time and duration. The stochastic approach presented in this paper is completely general and can be used in other fields where the parameters of a model best fitted to observable data should be found.
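The heart of ABC can be illustrated with a plain rejection sampler; the sequential variant used in the paper refines this by evolving the accepted population through a decreasing sequence of tolerances. The 1-D sensor geometry, exponential-decay dispersion model, and priors below are purely hypothetical:

```python
import math
import random
random.seed(0)

# hypothetical 1-D setting: sensors on a line, exponential-decay "dispersion"
SENSORS = [0.0, 2.0, 4.0, 6.0, 8.0]

def predict(src, rate):
    return [rate * math.exp(-abs(x - src)) for x in SENSORS]

observed = predict(3.0, 5.0)        # synthetic measurements from a known source

def distance(a, b):
    return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

# ABC rejection: draw (location, rate) from the priors, keep draws whose
# simulated concentrations fall within tolerance eps of the observations
accepted, eps = [], 1.0
while len(accepted) < 200:
    src = random.uniform(0.0, 8.0)      # prior on source location
    rate = random.uniform(0.1, 10.0)    # prior on release rate
    if distance(predict(src, rate), observed) < eps:
        accepted.append((src, rate))

post_src = sum(s for s, _ in accepted) / len(accepted)
post_rate = sum(r for _, r in accepted) / len(accepted)
print(round(post_src, 2), round(post_rate, 2))   # close to the true (3.0, 5.0)
```

No likelihood is ever evaluated: only forward simulations and a distance to the data, which is what makes ABC attractive for complex dispersion models.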
Biological aerosol background characterization
Blatny, Janet; Fountain, Augustus W., III
2011-05-01
To provide useful information during military operations, or as part of other security situations, a biological aerosol detector has to respond within seconds or minutes to an attack by virulent biological agents, and with a low false-alarm rate. Within this time frame, measuring virulence of a known microorganism is extremely difficult, especially if the microorganism is of unknown antigenic or nucleic acid properties. Measuring "live" characteristics of an organism directly is not generally an option, yet only viable organisms are potentially infectious. Fluorescence-based instruments have been designed to optically determine if aerosol particles have viability characteristics. Still, such commercially available biological aerosol detection equipment needs to be improved for use in military and civil applications. Air has an endogenous population of microorganisms that may interfere with alarm software technologies. To design robust algorithms, a comprehensive knowledge of the airborne biological background content is essential. For this reason, there is a need to study ambient live bacterial populations in as many locations as possible. Doing so will permit collection of data to define diverse biological characteristics that in turn can be used to fine-tune alarm algorithms. To avoid false alarms, improving software technologies for biological detectors is a crucial feature requiring considerations of various parameters that can be applied to suppress alarm triggers. This NATO Task Group aims to develop reference methods for monitoring biological aerosol characteristics to improve alarm algorithms for biological detection. Additionally, it will focus on developing reference standard methodology for monitoring biological aerosol characteristics to reduce false alarm rates.
Combustion aerosols from potassium-containing fuels
Energy Technology Data Exchange (ETDEWEB)
Balzer Nielsen, Lars
1998-12-31
The scope of the work presented in this thesis is the formation and evolution of aerosol particles in the submicron range during combustion processes, in particular where biomass is used alone or co-fired with coal. An introduction to the formation processes of fly ash in general and submicron aerosol in particular during combustion is presented, along with some known problems related to combustion of biomass for power generation. The work falls in two parts. The first is the design of a laboratory setup for investigation of homogeneous nucleation and particle dynamics at high temperature. The central unit of the setup is a laminar flow aerosol condenser (LFAC), which essentially is a 173 cm long tubular furnace with an externally cooled wall. A mathematical model is presented which describes the formation and evolution of the aerosol in the LFAC, where the rate of formation of new nuclei is calculated using the so-called classical theory. The model includes mass and energy conservation equations and an expression for the description of particle growth by diffusion. The resulting set of nonlinear second-order partial differential equations are solved numerically using the method of orthogonal collocation. The model is implemented in the FORTRAN code MONAERO. The second part of this thesis describes a comprehensive investigation of submicron aerosol formation during co-firing of coal and straw carried out at a 380 MWth pulverized coal unit at Studstrup Power Plant, Aarhus. Three types of coal are used, and total boiler load and straw input is varied systematically. Straw contains large amounts of potassium, which is released during combustion. Submicron aerosol is sampled between the two banks of the economizer at a flue gas temperature of 350 deg. C using a novel ejector probe. The aerosol is characterized using the SMPS system and a Berner-type low pressure impactor. The chemical composition of the particles collected in the impactor is determined using
Jones, Matt; Love, Bradley C
2011-08-01
The prominence of Bayesian modeling of cognition has increased recently largely because of mathematical advances in specifying and deriving predictions from complex probabilistic models. Much of this research aims to demonstrate that cognitive behavior can be explained from rational principles alone, without recourse to psychological or neurological processes and representations. We note commonalities between this rational approach and other movements in psychology - namely, Behaviorism and evolutionary psychology - that set aside mechanistic explanations or make use of optimality assumptions. Through these comparisons, we identify a number of challenges that limit the rational program's potential contribution to psychological theory. Specifically, rational Bayesian models are significantly unconstrained, both because they are uninformed by a wide range of process-level data and because their assumptions about the environment are generally not grounded in empirical measurement. The psychological implications of most Bayesian models are also unclear. Bayesian inference itself is conceptually trivial, but strong assumptions are often embedded in the hypothesis sets and the approximation algorithms used to derive model predictions, without a clear delineation between psychological commitments and implementational details. Comparing multiple Bayesian models of the same task is rare, as is the realization that many Bayesian models recapitulate existing (mechanistic level) theories. Despite the expressive power of current Bayesian models, we argue they must be developed in conjunction with mechanistic considerations to offer substantive explanations of cognition. We lay out several means for such an integration, which take into account the representations on which Bayesian inference operates, as well as the algorithms and heuristics that carry it out. We argue this unification will better facilitate lasting contributions to psychological theory, avoiding the pitfalls
Integer variables estimation problems: the Bayesian approach
Directory of Open Access Journals (Sweden)
G. Venuti
1997-06-01
In geodesy as well as in geophysics there are a number of examples where the unknown parameters are partly constrained to be integer numbers, while other parameters have a continuous range of possible values. In all such situations the ordinary least squares principle, with integer variates fixed to the most probable integer value, can lead to paradoxical results, due to the strong non-linearity of the manifold of admissible values. On the contrary, an overall estimation procedure assigning the posterior distribution to all variables, discrete and continuous, conditional on the observed quantities (the so-called Bayesian approach), has the advantage of weighting correctly the possible errors in choosing different sets of integer values, thus providing a more realistic and stable estimate even of the continuous parameters. In this paper, after a brief review of the basics of Bayesian theory in section 2, we present the natural Bayesian solution to the problem of assessing the estimable signal from noisy observations in section 3 and the Bayesian solution to cycle-slip detection and repair for a stream of GPS measurements in section 4. An elementary synthetic example is discussed in section 3 to illustrate the theory presented, and more elaborate, though synthetic, examples are discussed in section 4, where realistic streams of GPS observations, with cycle slips, are simulated and then back-processed.
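A minimal mixed integer/continuous example makes the point concrete: fixing the integer to its most probable value discards the uncertainty that the Bayesian posterior retains by weighting all integer candidates. All numbers are illustrative, not a GPS model:

```python
import math

# toy model (illustrative numbers): y = x + N + noise, with integer
# ambiguity N (uniform prior), x ~ N(0, 1), noise ~ N(0, sigma^2)
sigma = 0.3
y = 2.8

def marginal_lik(N):
    # p(y | N) with x integrated out analytically: y - N ~ N(0, 1 + sigma^2)
    var = 1.0 + sigma ** 2
    return math.exp(-(y - N) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

Ns = range(-5, 6)
w = {N: marginal_lik(N) for N in Ns}
Z = sum(w.values())
posterior = {N: w[N] / Z for N in Ns}          # P(N | y)
best_N = max(posterior, key=posterior.get)

# estimate of x fixing N to its most probable value (the "ordinary" route) ...
x_fixed = (y - best_N) / (1.0 + sigma ** 2)
# ... versus the Bayesian estimate, which weights all integer candidates
x_bayes = sum(posterior[N] * (y - N) / (1.0 + sigma ** 2) for N in Ns)
print(best_N, round(x_fixed, 3), round(x_bayes, 3))
```

When the observation sits near the midpoint between two integers, `x_fixed` and `x_bayes` diverge noticeably, which is the paradox the abstract alludes to.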
Von Neumann was not a Quantum Bayesian.
Stacey, Blake C
2016-05-28
Wikipedia has claimed for over 3 years now that John von Neumann was the 'first quantum Bayesian'. In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported. PMID:27091166
Von Neumann Was Not a Quantum Bayesian
Blake C. Stacey
2014-01-01
Wikipedia has claimed for over three years now that John von Neumann was the "first quantum Bayesian." In context, this reads as stating that von Neumann inaugurated QBism, the approach to quantum theory promoted by Fuchs, Mermin and Schack. This essay explores how such a claim is, historically speaking, unsupported.
A Bayesian Approach to Interactive Retrieval
Tague, Jean M.
1973-01-01
A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…
Bayesian Averaging is Well-Temperated
DEFF Research Database (Denmark)
Hansen, Lars Kai
2000-01-01
Bayesian predictions are stochastic just like predictions of any other inference scheme that generalizes from a finite sample. While a simple variational argument shows that Bayes averaging is generalization optimal given that the prior matches the teacher parameter distribution, the situation is l...
Perfect Bayesian equilibrium. Part II: epistemic foundations
Bonanno, Giacomo
2011-01-01
In a companion paper we introduced a general notion of perfect Bayesian equilibrium which can be applied to arbitrary extensive-form games. The essential ingredient of the proposed definition is the qualitative notion of AGM-consistency. In this paper we provide an epistemic foundation for AGM-consistency based on the AGM theory of belief revision.
Explanation mode for Bayesian automatic object recognition
Hazlett, Thomas L.; Cofer, Rufus H.; Brown, Harold K.
1992-09-01
One of the more useful techniques to emerge from AI is the provision of an explanation modality used by the researcher to understand and subsequently tune the reasoning of an expert system. Such a capability, missing in the arena of statistical object recognition, is not that difficult to provide. Long-standing results show that the paradigm of Bayesian object recognition is truly optimal in a minimum probability of error sense. To a large degree, the Bayesian paradigm achieves optimality through adroit fusion of a wide range of lower-information data sources to give a higher-quality decision--a very 'expert system'-like capability. When various sources of incoming data are represented by C++ classes, it becomes possible to automatically backtrack the Bayesian data fusion process, assigning relative weights to the more significant datums and their combinations. A C++ object oriented engine is then able to synthesize 'English'-like textual descriptions of the Bayesian reasoning suitable for generalized presentation. Key concepts and examples are provided based on an actual object recognition problem.
Scaling Bayesian network discovery through incremental recovery
Castelo, J.R.; Siebes, A.P.J.M.
1999-01-01
Bayesian networks are a type of graphical models that, e.g., allow one to analyze the interaction among the variables in a database. A well-known problem with the discovery of such models from a database is the "problem of high-dimensionality". That is, the discovery of a network from a database w
On Bayesian Nonparametric Continuous Time Series Models
Karabatsos, George; Walker, Stephen G.
2013-01-01
This paper is a note on the use of Bayesian nonparametric mixture models for continuous time series. We identify a key requirement for such models, and then establish that there is a single type of model which meets this requirement. As it turns out, the model is well known in multiple change-point problems.
Bayesian semiparametric dynamic Nelson-Siegel model
C. Cakmakli
2011-01-01
This paper proposes the Bayesian semiparametric dynamic Nelson-Siegel model where the density of the yield curve factors and thereby the density of the yields are estimated along with other model parameters. This is accomplished by modeling the error distributions of the factors according to a Diric
A Bayesian Bootstrap for a Finite Population
Lo, Albert Y.
1988-01-01
A Bayesian bootstrap for a finite population is introduced; its small-sample distributional properties are discussed and compared with those of the frequentist bootstrap for a finite population. It is also shown that the two are first-order asymptotically equivalent.
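The Bayesian bootstrap is easy to state in code: instead of resampling with replacement, each replicate draws flat Dirichlet(1, ..., 1) weights over the observed values. The data below are hypothetical:

```python
import random
random.seed(42)

data = [2.0, 3.5, 4.0, 5.5, 10.0]   # hypothetical observations

def dirichlet_weights(n):
    # normalised Exp(1) gaps give flat Dirichlet(1, ..., 1) weights, the
    # Bayesian bootstrap's posterior over the empirical distribution
    g = [random.expovariate(1.0) for _ in range(n)]
    s = sum(g)
    return [x / s for x in g]

# posterior draws of the population mean
draws = sorted(sum(w * x for w, x in zip(dirichlet_weights(len(data)), data))
               for _ in range(5000))

print(round(sum(draws) / len(draws), 2))            # close to the sample mean 5.0
print(round(draws[125], 2), round(draws[4875], 2))  # central 95% credible interval
```

Unlike the frequentist bootstrap, every observation receives strictly positive weight in each replicate, which is one of the small-sample differences the paper discusses.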
Bayesian analysis of Markov point processes
DEFF Research Database (Denmark)
Berthelsen, Kasper Klitgaard; Møller, Jesper
2006-01-01
Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...
Bayesian calibration of car-following models
Van Hinsbergen, C.P.IJ.; Van Lint, H.W.C.; Hoogendoorn, S.P.; Van Zuylen, H.J.
2010-01-01
Recent research has revealed that there exist large inter-driver differences in car-following behavior such that different car-following models may apply to different drivers. This study applies Bayesian techniques to the calibration of car-following models, where prior distributions on each model p
Inverse Problems in a Bayesian Setting
Matthies, Hermann G.
2016-02-13
In a Bayesian setting, inverse problems and uncertainty quantification (UQ)—the propagation of uncertainty through a computational (forward) model—are strongly connected. In the form of conditional expectation the Bayesian update becomes computationally attractive. We give a detailed account of this approach via conditional approximation, various approximations, and the construction of filters. Together with a functional or spectral approach for the forward UQ there is no need for time-consuming and slowly convergent Monte Carlo sampling. The developed sampling-free non-linear Bayesian update in the form of a filter is derived from the variational problem associated with conditional expectation. This formulation in general calls for further discretisation to make the computation possible, and we choose a polynomial approximation. After giving details on the actual computation in the framework of functional or spectral approximations, we demonstrate the workings of the algorithm on a number of examples of increasing complexity. Finally, we compare the linear and nonlinear Bayesian update in the form of a filter on some examples.
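A scalar Kalman-style update illustrates the idea of a sampling-free Bayesian update in filter form, far simpler than the paper's functional-approximation machinery but showing the conditional-expectation update as a deterministic map (the numbers are assumed for the toy example):

```python
# linear Bayesian update in filter form, scalar case:
# state x with Gaussian prior, linear observation y = H*x + noise
m, P = 0.0, 4.0        # prior mean and variance of the state
H, R = 1.0, 1.0        # observation operator and noise variance
y = 2.5                # observed value

S = H * P * H + R      # innovation variance
K = P * H / S          # gain minimising the posterior variance
m_post = m + K * (y - H * m)   # update is a map applied to the prior, no sampling
P_post = (1 - K * H) * P
print(round(m_post, 3), round(P_post, 3))  # → 2.0 0.8
```

In the paper this same conditional-expectation map acts on polynomial-chaos coefficients of the state rather than on a scalar mean and variance.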
Optimized Bayesian dynamic advising theory and algorithms
Karny, Miroslav
2006-01-01
Written by one of the world's leading groups in the area of Bayesian identification, control, and decision making, this book provides the theoretical and algorithmic basis of optimized probabilistic advising. It is accompanied by a CD that contains a specialized Matlab-based Mixtools toolbox, and examples illustrating the important areas.
Bayesian Estimation of Thermonuclear Reaction Rates
Iliadis, Christian; Coc, Alain; Timmes, Frank; Starrfield, Sumner
2016-01-01
The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied in the past to this problem, all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extra-solar planets, gravitational waves, and type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present the first astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the d(p,γ)3He, 3He(3He,2p)4He, and 3He(α,γ)7Be reactions,...
An Approximate Bayesian Fundamental Frequency Estimator
DEFF Research Database (Denmark)
Nielsen, Jesper Kjær; Christensen, Mads Græsbøll; Jensen, Søren Holdt
Joint fundamental frequency and model order estimation is an important problem in several applications such as speech and music processing. In this paper, we develop an approximate algorithm for estimating these quantities using Bayesian inference. The inference about the fundamental frequency and...
Basics of Bayesian Learning - Basically Bayes
DEFF Research Database (Denmark)
Larsen, Jan
Tutorial presented at the IEEE Machine Learning for Signal Processing Workshop 2006, Maynooth, Ireland, September 8, 2006. The tutorial focuses on the basic elements of Bayesian learning and its relation to classical learning paradigms. This includes a critical discussion of the pros and cons. The...
Sensitivity to Sampling in Bayesian Word Learning
Xu, Fei; Tenenbaum, Joshua B.
2007-01-01
We report a new study testing our proposal that word learning may be best explained as an approximate form of Bayesian inference (Xu & Tenenbaum, in press). Children are capable of learning word meanings across a wide range of communicative contexts. In different contexts, learners may encounter different sampling processes generating the examples…
Aerosols behavior inside a PWR during an accident
International Nuclear Information System (INIS)
During very hypothetical accidents occurring in a pressurized water reactor, radioactive aerosols can be released, during core-melt, inside the reactor containment building. A good knowledge of their behavior in the humid containment atmosphere (mass concentration and size distribution) is essential in order to evaluate their harmfulness in case of environmental contamination and to design possible filtration devices. Accordingly, the Safety Analysis Department of the Atomic Energy Commission uses several computer models, describing the particle formation (BOIL/MARCH), their behavior in the primary circuits (TRAP-MELT), and in the reactor containment building (AEROSOLS-PARDISEKO-III B). On the one hand, these models have been improved, in particular the one related to aerosol formation (nature and mass of released particles), using recent experimental results. On the other hand, sensitivity analyses have been performed with the AEROSOLS code which emphasize the particle coagulation parameters: agglomerate shape factors and collision efficiency. Finally, the different computer models have been applied to the study of aerosol behavior during a 900 MWe PWR accident: loss-of-coolant accident (small break with failure of all safety systems)
Formation of halogen-induced secondary organic aerosol (XOA)
Kamilli, Katharina; Ofner, Johannes; Zetzsch, Cornelius; Held, Andreas
2013-04-01
bromine with α-pinene. This work was funded by the German Research Foundation (DFG) under grants HE 5214/5-1 and ZE792/5-2. References: Cai, X., and Griffin, R. J.: Secondary aerosol formation from the oxidation of biogenic hydrocarbons by chlorine atoms, J. Geophys. Res., 111, D14206/14201-D14206/14214, 2006. Ofner, J., Balzer, N., Buxmann, J., Grothe, H., Schmitt-Kopplin, Ph., Platt, U., and Zetzsch, C.: Halogenation processes of secondary organic aerosol and implications on halogen release mechanisms, Atmos. Chem. Phys. Discuss., 12, 2975-3017, 2012.
DEFF Research Database (Denmark)
Schweda, Frank; Friis, Ulla; Wagner, Charlotte;
2007-01-01
The aspartyl-protease renin is the key regulator of the renin-angiotensin-aldosterone system, which is critically involved in salt, volume, and blood pressure homeostasis of the body. Renin is mainly produced and released into circulation by the so-called juxtaglomerular epithelioid cells, located...
A tutorial on Bayesian Normal linear regression
Klauenberg, Katy; Wübbeler, Gerd; Mickan, Bodo; Harris, Peter; Elster, Clemens
2015-12-01
Regression is a common task in metrology and often applied to calibrate instruments, evaluate inter-laboratory comparisons or determine fundamental constants, for example. Yet, a regression model cannot be uniquely formulated as a measurement function, and consequently the Guide to the Expression of Uncertainty in Measurement (GUM) and its supplements are not applicable directly. Bayesian inference, however, is well suited to regression tasks, and has the advantage of accounting for additional a priori information, which typically robustifies analyses. Furthermore, it is anticipated that future revisions of the GUM shall also embrace the Bayesian view. Guidance on Bayesian inference for regression tasks is largely lacking in metrology. For linear regression models with Gaussian measurement errors this tutorial gives explicit guidance. Divided into three steps, the tutorial first illustrates how a priori knowledge, which is available from previous experiments, can be translated into prior distributions from a specific class. These prior distributions have the advantage of yielding analytical, closed form results, thus avoiding the need to apply numerical methods such as Markov Chain Monte Carlo. Secondly, formulas for the posterior results are given, explained and illustrated, and software implementations are provided. In the third step, Bayesian tools are used to assess the assumptions behind the suggested approach. These three steps (prior elicitation, posterior calculation, and robustness to prior uncertainty and model adequacy) are critical to Bayesian inference. The general guidance given here for Normal linear regression tasks is accompanied by a simple, but real-world, metrological example. The calibration of a flow device serves as a running example and illustrates the three steps. It is shown that prior knowledge from previous calibrations of the same sonic nozzle enables robust predictions even for extrapolations.
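For the simplest case, a single-coefficient model with known noise variance and a Gaussian prior, the posterior is available in closed form, avoiding Markov Chain Monte Carlo entirely. The calibration-style numbers below are invented; the tutorial itself treats the general Normal linear model:

```python
import math

# conjugate Bayesian fit of y = b*x + noise, noise ~ N(0, sigma^2),
# with a Gaussian prior b ~ N(b0, tau^2) (illustrative numbers)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]          # roughly b = 2
sigma, b0, tau = 0.5, 0.0, 10.0

# closed-form posterior: precisions add, the data pull the mean away from b0
prec = 1 / tau ** 2 + sum(x * x for x in xs) / sigma ** 2
mean = (b0 / tau ** 2 + sum(x * y for x, y in zip(xs, ys)) / sigma ** 2) / prec
sd = math.sqrt(1 / prec)
print(round(mean, 3), round(sd, 3))  # → 1.99 0.091

# posterior predictive at a new point x*: N(mean*x*, sigma^2 + sd^2 * x*^2)
x_new = 5.0
pred_mean = mean * x_new
pred_sd = math.sqrt(sigma ** 2 + (sd * x_new) ** 2)
print(round(pred_mean, 2), round(pred_sd, 2))  # → 9.95 0.68
```

A previous calibration enters through `b0` and `tau`; shrinking `tau` strengthens the prior's pull, which is how the tutorial's "robust extrapolation" behaviour arises.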
Aerosol studies with Listeria innocua and Listeria monocytogenes.
Zhang, Guodong; Ma, Li; Oyarzabal, Omar A; Doyle, Michael P
2007-08-01
Aerosol studies of Listeria monocytogenes in food processing plants have been limited by lack of a suitable surrogate microorganism. The objective of this study was to investigate the potential of using green fluorescent protein-labeled strains of Listeria innocua as a surrogate for L. monocytogenes for aerosol studies. These studies were conducted in a laboratory bioaerosol chamber and a pilot food-processing facility. Four strains of L. innocua and five strains of L. monocytogenes were used. In the laboratory chamber study, Listeria cells were released into the environment at two different cell numbers and under two airflow conditions. Trypticase soy agar (TSA) plates and oven-roasted breasts of chicken and turkey were placed in the chamber to monitor Listeria cell numbers deposited from aerosols. A similar experimental design was used in the pilot plant study; however, only L. innocua was used. Results showed that L. monocytogenes and L. innocua survived equally well on chicken and turkey breast meats and TSA plates. No-fan and continuous-fan applications, which affected airflow, had no significant effect on settling rates of aerosolized L. monocytogenes and L. innocua in the bioaerosol chamber or L. innocua in the pilot plant study. Listeria cell numbers in the air decreased rapidly during the first 1.5 h following release, with few to no listeriae detected in the air at 3 h. The numbers of aerosol particles 1 and 2 µm in diameter correlated directly with the number of Listeria cells in the aerosol, whereas particles 0.3, 0.5, and 5 µm in diameter did not. Results indicate that L. innocua can be used as a surrogate for L. monocytogenes in an aerosol study. PMID:17803142
Aerosol sample inhomogeneity with debris from the Fukushima Daiichi accident
International Nuclear Information System (INIS)
Radionuclide aerosol sampling is a vital component in the detection of nuclear explosions, nuclear accidents, and other radiation releases. This was proven by the detection and tracking of emissions from the Fukushima Daiichi incident across the globe by International Monitoring System (IMS) stations. Two separate aerosol samplers were operated in Richland, WA following the event, and debris from the accident was measured at levels well above detection limits. While the atmospheric activity concentrations of radionuclides generally compared well between the two stations, they did not agree within uncertainties. This paper includes a detailed study of the aerosol sample homogeneity of 134Cs and 137Cs, then relates it to the overall uncertainty of the original measurement. Our results show that sample inhomogeneity adds an additional 5−10% uncertainty to each aerosol measurement and that this uncertainty is in the same range as the discrepancies between the two aerosol sample measurements from Richland, WA. - Highlights: • Statistical discrepancies arise when comparing HVAS and RASA measurements. • A beta statistic was employed to quantify statistical discrepancies. • Aerosol sample inhomogeneity determined to be 5–10%. • Statistical discrepancies attributed to sample inhomogeneity
Universal Darwinism as a process of Bayesian inference
Campbell, John O
2016-01-01
Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus natural selection serves as a counterexample to a widely held interpretation that restricts Bayesian inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment". Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description clo...
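The formal equivalence claimed above can be made concrete in a few lines. The sketch below (not from the paper; all numbers are invented for illustration) treats hypotheses as "types", the prior as population frequencies, and the likelihood as fitness; a discrete Bayesian update is then exactly a replicator step with the evidence playing the role of mean population fitness.

```python
# Illustrative sketch: Bayes updating as a discrete replicator dynamic.
def bayes_update(prior, likelihood):
    """Posterior p(i|data) = p(i) * L(i) / sum_j p(j) * L(j)."""
    joint = [p * l for p, l in zip(prior, likelihood)]
    z = sum(joint)  # model evidence, i.e. mean population "fitness"
    return [j / z for j in joint]

prior = [0.5, 0.3, 0.2]    # hypothetical frequencies of three types
fitness = [0.9, 0.5, 0.1]  # likelihood of surviving the "experiment"
posterior = bayes_update(prior, fitness)  # updated frequencies
```

The fittest type's share grows and the least fit shrinks, just as the highest-likelihood hypothesis gains posterior mass.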
Gao, R. S.; Elkins, J. W.; Frost, G. J.; McComiskey, A. C.; Murphy, D. M.; Ogren, J. A.; Petropavlovskikh, I. V.; Rosenlof, K. H.
2014-12-01
Inverse modeling using measurements of ozone (O3) and aerosol is a powerful tool for deriving pollutant emissions. Because they have relatively long lifetimes, O3 and aerosol are transported over large distances. Frequent and globally spaced vertical profiles rather than ground-based measurements alone are therefore highly desired. Three requirements necessary for a successful global monitoring program are: Low equipment cost, low operation cost, and reliable measurements of known uncertainty. Conventional profiling using aircraft provides excellent data, but is cost prohibitive on a large scale. Here we describe a new platform and instruments meeting all three global monitoring requirements. The platform consists of a small balloon and an auto-homing glider. The glider is released from the balloon at about 5 km altitude, returning the light instrument package to the launch location, and allowing for consistent recovery of the payload. Atmospheric profiling can be performed either during ascent or descent (or both) depending on measurement requirements. We will present the specifications for two instrument packages currently under development. The first measures O3, RH, p, T, dry aerosol particle number and size distribution, and aerosol optical depth. The second measures dry aerosol particle number and size distribution, and aerosol absorption coefficient. Other potential instrument packages and the desired spatial/temporal resolution for the GOA2HEAD monitoring initiative will also be discussed.
Advancing Models and Evaluation of Cumulus, Climate and Aerosol Interactions
Energy Technology Data Exchange (ETDEWEB)
Gettelman, Andrew [University Corporation for Atmospheric Research (NCAR), Boulder, CO (United States)
2015-10-27
This project was successfully able to meet its goals, but faced some serious challenges due to personnel issues. Nonetheless, it was largely successful. The Project Objectives were as follows: 1. Develop a unified representation of stratiform and cumulus cloud microphysics for NCAR/DOE global community models. 2. Examine the effects of aerosols on clouds and their impact on precipitation in stratiform and cumulus clouds. We will also explore the effects of clouds and precipitation on aerosols. 3. Test these new formulations using advanced evaluation techniques and observations and release
Bayesian network learning for natural hazard assessments
Vogel, Kristin
2016-04-01
Even though quite different in occurrence and consequences, from a modelling perspective many natural hazards share similar properties and challenges. Their complex nature as well as lacking knowledge about their driving forces and potential effects make their analysis demanding. On top of the uncertainty about the modelling framework, inaccurate or incomplete event observations and the intrinsic randomness of the natural phenomenon add up to different interacting layers of uncertainty, which require careful handling. Thus, for reliable natural hazard assessments it is crucial not only to capture and quantify the involved uncertainties, but also to express and communicate them in an intuitive way. Decision-makers, who often find it difficult to deal with uncertainties, might otherwise fall back on familiar (mostly deterministic) procedures. Within the scope of the DFG research training group "NatRiskChange" we apply the probabilistic framework of Bayesian networks to diverse natural hazard and vulnerability studies. The great potential of Bayesian networks has already been shown in previous natural hazard assessments. Treating each model component as a random variable, Bayesian networks aim at capturing the joint distribution of all considered variables. Hence, any conditional distribution of interest (e.g. the effect of precautionary measures on damage reduction) can be inferred. The (in-)dependencies between the considered variables can be learned purely data-driven or be given by experts; even a combination of both is possible. By translating the (in-)dependencies into a graph structure, Bayesian networks provide direct insights into the workings of the system and allow us to learn about the underlying processes. Despite numerous studies on the topic, learning Bayesian networks from real-world data remains challenging. In previous studies, e.g. on earthquake-induced ground motion and flood damage assessments, we tackled the problems arising with continuous variables
Aerosol and melt chemistry in the ACE molten core-concrete interaction experiments
International Nuclear Information System (INIS)
Experimental results are discussed from the internationally sponsored Advanced Containment Experiments (ACE) Program on the melt behavior and aerosols released during the interaction of molten reactor core material with concrete. A broad range of parameters was addressed in the experimental program: seven large-scale tests were performed using four types of concrete (siliceous, limestone/sand, serpentine, and limestone) and a range of metal oxidations for both boiling water and pressurized water reactor core debris. The released aerosols contained mainly constituents of the concrete. In the tests with metal and limestone/sand or siliceous concrete, silicon compounds comprised 50% or more of the aerosol mass. Releases of uranium and low-volatility fission-product elements were small in all tests. Releases of tellurium and neutron absorber materials (silver, indium, and boron from boron carbide) were high
Sampling and characterization of aerosols formed in the atmospheric hydrolysis of UF6
International Nuclear Information System (INIS)
When gaseous UF6 is released into the atmosphere, it rapidly reacts with ambient moisture to form an aerosol of uranyl fluoride and HF. As part of our Safety Analysis program, we have performed several experimental releases of UF6 (from natural uranium) in contained volumes in order to investigate techniques for sampling and characterizing the aerosol materials. The aggregate particle morphology and size distribution have been found to depend upon several conditions, including the relative humidity at the time of the release and the elapsed time after the release. Aerosol composition and settling rate have been investigated using isokinetic samplers for the separate collection of UO2F2 and HF, and via laser spectroscopic remote sensing (Mie scatter and infrared spectroscopy). 8 references
Importance of core/concrete aerosol production and some containment heat sources to the source term
International Nuclear Information System (INIS)
Production of aerosols by core/concrete interaction in a large break PWR severe accident is discussed, and both vaporization and mechanical production processes are examined. In the case of the former, equilibrium chemical thermodynamic studies are used to decide which chemical species should be considered, recognizing the uncertainty in the likely configuration of the core/concrete melt. Lanthanide release is found to be particularly sensitive to this configuration. It is found that kinetic effects are not important in preventing the attainment of chemical equilibrium in the gas bubbling through the melt. At early times aerosol production by bubble bursting at the melt surface is found to be less important than that due to vaporization, except for those materials released in small quantities, e.g. Mo. The bubble bursting mechanism becomes relatively more important at later times. Calculations for a large modern PWR show that environmental release from the core/concrete aerosol is likely to be of comparable or greater importance (in terms of released decay heat) than that from the in-vessel core-melt aerosol for all but very early containment failure or failure to isolate, neglecting attenuation of the core/concrete aerosol during its flow from the cavity to the main containment volume. The importance of performing linked thermal-hydraulic and aerosol physics calculations is highlighted by the blowdown aerosol in a large break accident. Treatment of the decay heat arising from the aerosol material released to the containment is discussed. It is shown that it is very important to consider this heat source in containment pressure calculations, but it was not found to be important to treat its spatial dependence accurately in the large break accident considered here. Some scoping calculations for material resuspension on containment overpressure failure, due to a hydrogen burn, are presented
Deposition of CsI aerosol in horizontal straight pipe under inert and superheated steam environment
International Nuclear Information System (INIS)
In a severe accident of an LWR, fission products (FPs) aerosol released from a reactor core region will be deposited on the inner surface of the reactor coolant piping. In such conditions, the piping might be subjected to a thermal load due to decay heat from the deposited FPs. It is very important to quantify the FP aerosol deposition on the piping surfaces. Therefore the FP aerosol behavior in piping is being investigated in the WIND (Wide Range Piping Integrity Demonstration) project at Japan Atomic Energy Research Institute. The objectives of present study are to characterize the aerosol deposition on piping surfaces under various thermal-hydraulic conditions and to obtain insights for the validation of analytical models. A chemical analysis of the deposited aerosol showed that no evidence was found for the decomposition of CsI under inert and superheated steam environments. The major deposition mechanisms are identified to be the condensation of CsI vapor and the thermophoretic aerosol transportation from the carrier gas to the colder piping surfaces. Thermo-fluiddynamic analyses of the carrier gas with WINDFLOW code implied that a precise prediction is required for the evaluation of the amount and the spatial distribution of the aerosol deposition. Remarkable aerosol deposition onto the floor area and enlargement of the deposited aerosol were observed in the test with a superheated steam environment. An additional test will be shortly performed in order to reconfirm the findings obtained under a superheated steam environment. (J.P.N.)
Bayesianism and inference to the best explanation
Directory of Open Access Journals (Sweden)
Valeriano IRANZO
2008-01-01
Bayesianism and Inference to the Best Explanation (IBE) are two different models of inference. Recently there has been some debate about the possibility of "Bayesianizing" IBE. Firstly I explore several alternatives for including explanatory considerations in Bayes' Theorem. Then I distinguish two different interpretations of prior probabilities: "IBE-Bayesianism" (IBE-Bay) and "frequentist Bayesianism" (Freq-Bay). After detailing the content of the latter, I propose a rule for assessing the priors. I also argue that Freq-Bay: (i) endorses a role for explanatory value in the assessment of scientific hypotheses; (ii) avoids a purely subjectivist reading of prior probabilities; and (iii) fits better than IBE-Bayesianism with two basic facts about science, i.e., the prominent role played by empirical testing and the existence of many scientific theories in the past that failed to fulfil their promises and were subsequently abandoned.
The NIFTY way of Bayesian signal inference
International Nuclear Information System (INIS)
We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D3PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy
Learning Bayesian networks using genetic algorithm
Institute of Scientific and Technical Information of China (English)
Chen Fei; Wang Xiufeng; Rao Yimei
2007-01-01
A new method to evaluate the fitness of Bayesian networks against observed data is provided. The main advantage of this criterion is that it is suitable for both complete and incomplete cases, while the others are not. Moreover, it greatly facilitates the computation. To reduce the search space, the notion of equivalence class proposed by David Chickering is adopted. Instead of using that method directly, the novel criterion, variable ordering, and equivalence classes are combined; moreover, the proposed method avoids some problems caused by the previous one. A genetic algorithm, which offers the global convergence that most Bayesian network search methods lack, is then applied to search for a good model in this space. To speed up convergence, the genetic algorithm is combined with a greedy algorithm. Finally, simulations show the validity of the proposed approach.
QBism, the Perimeter of Quantum Bayesianism
Fuchs, Christopher A
2010-01-01
This article summarizes the Quantum Bayesian point of view of quantum mechanics, with special emphasis on the view's outer edges---dubbed QBism. QBism has its roots in personalist Bayesian probability theory, is crucially dependent upon the tools of quantum information theory, and most recently, has set out to investigate whether the physical world might be of a type sketched by some false-started philosophies of 100 years ago (pragmatism, pluralism, nonreductionism, and meliorism). Beyond conceptual issues, work at Perimeter Institute is focused on the hard technical problem of finding a good representation of quantum mechanics purely in terms of probabilities, without amplitudes or Hilbert-space operators. The best candidate representation involves a mysterious entity called a symmetric informationally complete quantum measurement. Contemplation of it gives a way of thinking of the Born Rule as an addition to the rules of probability theory, applicable when an agent considers gambling on the consequences of...
A Bayesian Probabilistic Framework for Rain Detection
Directory of Open Access Journals (Sweden)
Chen Yao
2014-06-01
Heavy rain deteriorates the video quality of outdoor imaging equipment. In order to improve video clarity, image-based and sensor-based methods are adopted for rain detection. In the earlier literature, image-based detection methods fall into spatial and temporal categories. In this paper, we propose a new image-based method that exploits joint spatio-temporal constraints in a Bayesian framework. In our framework, the temporal motion of rain is assumed to be Pathological Motion (PM), which better suits the time-varying character of rain streaks. Temporal displaced-frame discontinuity and a spatial Gaussian mixture model are utilized in the framework. An iterated expectation-maximization method is used for Gaussian parameter estimation. Pixel state estimation is performed by an iterated optimization method in the Bayesian probability formulation. The experimental results highlight the advantage of our method in rain detection.
Bayesian networks for enterprise risk assessment
Bonafede, C E
2006-01-01
According to different typologies of activity and priority, risks can assume diverse meanings and can be assessed in different ways. In general, risk is measured as a combination of the probability of an event (frequency) and its consequence (impact). To estimate the frequency and the impact (severity), historical data or expert opinions (either qualitative or quantitative) are used. Moreover, qualitative data must be converted into numerical values to be used in the model. In enterprise risk assessment the considered risks are, for instance, strategic, operational, legal, and reputational, which are often difficult to quantify. So in most cases only expert data, gathered by scorecard approaches, are available for risk analysis. A Bayesian network is a useful tool for integrating different sources of information and, in particular, for studying a risk's joint distribution using data collected from experts. In this paper we want to show a possible approach for building a Bayesian networks in the parti...
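As a toy illustration of the kind of expert-elicited frequency/impact model the abstract describes (the network, states, and probabilities below are hypothetical, not taken from the paper), a two-node Frequency → Impact network can be queried forward for the marginal loss probability and diagnostically via Bayes' rule:

```python
# Hypothetical two-node Bayesian network: Frequency -> Impact.
p_freq = {"low": 0.7, "high": 0.3}          # expert-elicited prior
p_impact_given = {                           # expert-elicited CPT
    "low":  {"minor": 0.9, "severe": 0.1},
    "high": {"minor": 0.4, "severe": 0.6},
}

# Forward query: marginal probability of a severe impact (sum over states).
p_severe = sum(p_freq[f] * p_impact_given[f]["severe"] for f in p_freq)

# Diagnostic query via Bayes' rule: P(frequency = high | impact = severe).
p_high_given_severe = (p_freq["high"] * p_impact_given["high"]["severe"]
                       / p_severe)
```

Observing a severe impact raises the belief that the high-frequency regime is active, which is exactly the kind of inference scorecard data alone cannot provide.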
Machine learning a Bayesian and optimization perspective
Theodoridis, Sergios
2015-01-01
This tutorial text gives a unifying perspective on machine learning by covering both probabilistic and deterministic approaches, which rely on optimization techniques, as well as Bayesian inference, which is based on a hierarchy of probabilistic models. The book presents the major machine learning methods as they have been developed in different disciplines, such as statistics, statistical and adaptive signal processing and computer science. Focusing on the physical reasoning behind the mathematics, all the various methods and techniques are explained in depth, supported by examples and problems, giving an invaluable resource to the student and researcher for understanding and applying machine learning concepts. The book builds carefully from the basic classical methods to the most recent trends, with chapters written to be as self-contained as possible, making the text suitable for different courses: pattern recognition, statistical/adaptive signal processing, statistical/Bayesian learning, as well as shor...
Bayesian Peak Picking for NMR Spectra
Cheng, Yichen
2014-02-01
Protein structure determination is a very important topic in structural genomics, which helps people to understand a variety of biological functions such as protein-protein interactions, protein-DNA interactions, and so on. Nowadays, nuclear magnetic resonance (NMR) is often used to determine the three-dimensional structures of proteins in vivo. This study aims to automate the peak picking step, the most important and tricky step in NMR structure determination. We propose to model the NMR spectrum by a mixture of bivariate Gaussian densities and to use the stochastic approximation Monte Carlo algorithm as the computational tool to solve the problem. Under the Bayesian framework, the peak picking problem is cast as a variable selection problem. The proposed method can automatically distinguish true peaks from false ones without preprocessing the data. To the best of our knowledge, this is the first effort in the literature that tackles the peak picking problem for NMR spectrum data using a Bayesian method.
Approximate Bayesian Computation: a nonparametric perspective
Blum, Michael
2010-01-01
Approximate Bayesian Computation is a family of likelihood-free inference techniques that are well-suited to models defined in terms of a stochastic generating mechanism. In a nutshell, Approximate Bayesian Computation proceeds by computing summary statistics s_obs from the data and simulating summary statistics for different values of the parameter theta. The posterior distribution is then approximated by an estimator of the conditional density g(theta|s_obs). In this paper, we derive the asymptotic bias and variance of the standard estimators of the posterior distribution which are based on rejection sampling and linear adjustment. Additionally, we introduce an original estimator of the posterior distribution based on quadratic adjustment and we show that its bias contains fewer terms than that of the estimator with linear adjustment. Although we find that the estimators with adjustment are not universally superior to the estimator based on rejection sampling, we find that they can achieve better perfor...
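The rejection-sampling estimator that the abstract takes as its baseline can be sketched in a few lines. This is an illustrative toy (uniform prior, Normal(theta, 1) model, sample-mean summary statistic, and tolerance all chosen here for simplicity), not the paper's estimator:

```python
import random

# Toy ABC rejection sampler: infer the mean theta of a Normal(theta, 1)
# model from the sample-mean summary statistic s_obs.
def abc_rejection(s_obs, n_draws=20000, eps=0.05, n=50, seed=1):
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = rng.uniform(-5.0, 5.0)  # draw theta from a flat prior
        # Simulate a dataset under theta and compute its summary statistic.
        sim = sum(rng.gauss(theta, 1.0) for _ in range(n)) / n
        if abs(sim - s_obs) < eps:      # keep theta if summaries are close
            accepted.append(theta)
    return accepted

samples = abc_rejection(s_obs=1.0)
post_mean = sum(samples) / len(samples)  # crude posterior-mean estimate
```

Shrinking eps sharpens the approximation at the cost of more rejections; the linear and quadratic adjustments studied in the paper correct the accepted draws instead of tightening the tolerance.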
Probabilistic forecasting and Bayesian data assimilation
Reich, Sebastian
2015-01-01
In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...
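The classical data assimilation cycle the book starts from (forecast, then Bayesian analysis) is easiest to see in one dimension. The sketch below assumes a random-walk state model and Gaussian observation noise; the variances q and r and the observation sequence are invented for illustration:

```python
# One-dimensional Kalman filter cycle: x_k = x_{k-1} + w,  y_k = x_k + v.
def kalman_step(mean, var, y, q=0.1, r=0.5):
    # Forecast: propagate the Gaussian through the random-walk model.
    mean_f, var_f = mean, var + q
    # Analysis: Bayesian update with the Gaussian observation likelihood.
    k = var_f / (var_f + r)             # Kalman gain
    mean_a = mean_f + k * (y - mean_f)  # innovation-weighted correction
    var_a = (1.0 - k) * var_f           # posterior variance shrinks
    return mean_a, var_a

mean, var = 0.0, 1.0                    # diffuse initial belief
for y in [1.2, 0.9, 1.1, 1.0]:          # made-up observations near 1.0
    mean, var = kalman_step(mean, var, y)
```

Each cycle is a conjugate Gaussian Bayes update, which is why the ensemble methods covered in Part II can be read as Monte Carlo approximations of the same two-step recursion.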
Bayesian Magnetohydrodynamic Seismology of Coronal Loops
Arregui, Inigo
2011-01-01
We perform a Bayesian parameter inference in the context of resonantly damped transverse coronal loop oscillations. The forward problem is solved in terms of parametric results for kink waves in one-dimensional flux tubes in the thin tube and thin boundary approximations. For the inverse problem, we adopt a Bayesian approach to infer the most probable values of the relevant parameters, for given observed periods and damping times, and to extract their confidence levels. The posterior probability distribution functions are obtained by means of Markov Chain Monte Carlo simulations, incorporating observed uncertainties in a consistent manner. We find well localized solutions in the posterior probability distribution functions for two of the three parameters of interest, namely the Alfven travel time and the transverse inhomogeneity length-scale. The obtained estimates for the Alfven travel time are consistent with previous inversion results, but the method enables us to additionally constrain the transverse inho...
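A Markov Chain Monte Carlo scheme of the kind described above can be sketched with a plain Metropolis sampler. The log-posterior below is a made-up Gaussian surrogate standing in for the paper's forward model of kink-wave periods and damping times, so only the sampling mechanics are illustrative:

```python
import math
import random

# Hypothetical unnormalized log-posterior for a positive parameter tau
# (e.g. an Alfven travel time), peaked at tau = 2.0.
def log_post(tau):
    return -0.5 * ((tau - 2.0) / 0.3) ** 2 if tau > 0 else -math.inf

def metropolis(n=20000, step=0.5, seed=7):
    rng = random.Random(seed)
    tau, chain = 2.0, []
    for _ in range(n):
        prop = tau + rng.gauss(0.0, step)   # symmetric random-walk proposal
        # Accept with probability min(1, post(prop)/post(tau)).
        if math.log(rng.random()) < log_post(prop) - log_post(tau):
            tau = prop
        chain.append(tau)
    return chain

chain = metropolis()
est = sum(chain) / len(chain)  # posterior-mean estimate from the chain
```

Histogramming the chain recovers the posterior probability distribution function, and credible intervals read off from it give the confidence levels mentioned in the abstract.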
Bayesian parameter estimation for effective field theories
Wesolowski, S; Furnstahl, R J; Phillips, D R; Thapaliya, A
2015-01-01
We present procedures based on Bayesian statistics for effective field theory (EFT) parameter estimation from data. The extraction of low-energy constants (LECs) is guided by theoretical expectations that supplement such information in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools are developed that analyze the fit and ensure that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems and the extraction of LECs for the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Bayesian image reconstruction: Application to emission tomography
Energy Technology Data Exchange (ETDEWEB)
Nunez, J.; Llacer, J.
1989-02-01
In this paper we propose a Maximum a Posteriori (MAP) method of image reconstruction in the Bayesian framework for the Poisson noise case. We use entropy to define the prior probability and likelihood to define the conditional probability. The method uses sharpness parameters which can be theoretically computed or adjusted, allowing us to obtain MAP reconstructions without the problem of the "grey" reconstructions associated with pre-Bayesian reconstructions. We have developed several ways to solve the reconstruction problem and propose a new iterative algorithm which is stable, maintains positivity, and converges to feasible images faster than the Maximum Likelihood Estimate method. We have successfully applied the new method to the case of Emission Tomography, both with simulated and real data. 41 refs., 4 figs., 1 tab.
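The maximum-likelihood baseline the abstract compares against is the multiplicative MLEM iteration for Poisson data. The sketch below uses a made-up 4-detector, 3-pixel system matrix (not from the paper) to show the update; the MAP method adds an entropy prior on top of this recursion:

```python
# Toy MLEM iteration for Poisson emission data: x <- x * A^T(y/Ax) / A^T 1.
A = [[0.5, 0.3, 0.2],
     [0.2, 0.5, 0.3],
     [0.3, 0.2, 0.5],
     [0.4, 0.4, 0.2]]          # hypothetical detection probabilities a_ij
y = [10.0, 12.0, 11.0, 13.0]   # hypothetical measured counts per detector

x = [1.0, 1.0, 1.0]            # strictly positive initial image
sens = [sum(row[j] for row in A) for j in range(3)]  # sensitivities A^T 1
for _ in range(200):
    proj = [sum(a * xj for a, xj in zip(row, x)) for row in A]     # A x
    ratio = [yi / pi for yi, pi in zip(y, proj)]                   # y / Ax
    back = [sum(A[i][j] * ratio[i] for i in range(4)) for j in range(3)]
    x = [x[j] * back[j] / sens[j] for j in range(3)]  # multiplicative step
```

The multiplicative form preserves positivity automatically and conserves total counts at every iteration, two of the properties the paper's new algorithm also maintains.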
Software Health Management with Bayesian Networks
Mengshoel, Ole; Schumann, JOhann
2011-01-01
Most modern aircraft, as well as other complex machinery, are equipped with diagnostic systems for their major subsystems. During operation, sensors provide important information about a subsystem (e.g., the engine), and that information is used to detect and diagnose faults. Most of these systems focus on monitoring a mechanical, hydraulic, or electromechanical subsystem of the vehicle or machinery. Only recently have health management systems that monitor software been developed. In this paper, we discuss our approach of using Bayesian networks for Software Health Management (SWHM). We discuss SWHM requirements, which make advanced reasoning capabilities important for detection and diagnosis. We then present our approach to using Bayesian networks for the construction of health models that dynamically monitor a software system and are capable of detecting and diagnosing faults.
The Bayesian Who Knew Too Much
Benétreau-Dupin, Yann
2014-01-01
In several papers, John Norton has argued that Bayesianism cannot handle ignorance adequately due to its inability to distinguish between neutral and disconfirming evidence. He argued that this inability sows confusion in, e.g., anthropic reasoning in cosmology or the Doomsday argument, by allowing one to draw unwarranted conclusions from a lack of knowledge. Norton has suggested criteria for a candidate for representation of neutral support. Imprecise credences (families of credal probability functions) constitute a Bayesian-friendly framework that allows us to avoid inadequate neutral priors and better handle ignorance. The imprecise model generally agrees with Norton's representation of ignorance but requires that his criterion of self-duality be reformulated or abandoned
Social optimality in quantum Bayesian games
Iqbal, Azhar; Chappell, James M.; Abbott, Derek
2015-10-01
A significant aspect of the study of quantum strategies is the exploration of the game-theoretic solution concept of the Nash equilibrium in relation to the quantization of a game. Pareto optimality is a refinement on the set of Nash equilibria. A refinement on the set of Pareto optimal outcomes is known as social optimality in which the sum of players' payoffs is maximized. This paper analyzes social optimality in a Bayesian game that uses the setting of generalized Einstein-Podolsky-Rosen experiments for its physical implementation. We show that for the quantum Bayesian game a direct connection appears between the violation of Bell's inequality and the social optimal outcome of the game and that it attains a superior socially optimal outcome.
Distributed Bayesian Networks for User Modeling
DEFF Research Database (Denmark)
Tedesco, Roberto; Dolog, Peter; Nejdl, Wolfgang;
2006-01-01
The World Wide Web is a popular platform for providing eLearning applications to a wide spectrum of users. However – as users differ in their preferences, background, requirements, and goals – applications should provide personalization mechanisms. In the Web context, user models used by such… adaptive applications are often partial fragments of an overall user model. The fragments then have to be collected and merged into a global user profile. In this paper we investigate and present algorithms able to cope with distributed, fragmented user models – based on Bayesian networks – in the context… mechanism efficiently combines distributed learner models without the need to exchange the internal structure of local Bayesian networks, nor local evidence between the involved platforms…
Bayesian parameter estimation for effective field theories
Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.
2016-07-01
We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
Applications of Bayesian spectrum representation in acoustics
Botts, Jonathan M.
This dissertation utilizes a Bayesian inference framework to enhance the solution of inverse problems in which the forward model maps to acoustic spectra. A Bayesian solution to filter design inverts acoustic spectra to pole-zero locations of a discrete-time filter model. Spatial sound field analysis with a spherical microphone array is a data analysis problem that requires inversion of spatio-temporal spectra to directions of arrival. As with many inverse problems, a probabilistic analysis yields richer solutions than can be achieved with ad-hoc methods. In the filter design problem, the Bayesian inversion yields globally optimal coefficient estimates as well as an estimate of the most concise filter capable of representing the given spectrum, within a single framework. This approach is demonstrated on synthetic spectra, head-related transfer function spectra, and measured acoustic reflection spectra. The Bayesian model-based analysis of spatial room impulse responses is presented as an analogous problem with an equally rich solution. The model selection mechanism provides an estimate of the number of arrivals, which is necessary to properly infer the directions of simultaneous arrivals. Although spectrum inversion problems are fairly ubiquitous, the scope of this dissertation has been limited to these two and derivative problems. The Bayesian approach to filter design is demonstrated on an artificial spectrum to illustrate the model comparison mechanism, and then on measured head-related transfer functions to show the potential range of application. Coupled with sampling methods, the Bayesian approach is shown to outperform least-squares filter design methods commonly used in commercial software, confirming the need for a global search of the parameter space. The resulting designs are shown to be comparable to those that result from global optimization methods, but the Bayesian approach has the added advantage of a filter length estimate within the same unified
Quantum-like Representation of Bayesian Updating
Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Khrennikov, Andrei; Basieva, Irina
2011-03-01
Recently, applications of quantum mechanics to cognitive psychology have been discussed, see [1]-[11]. It is known that statistical data obtained in some experiments of cognitive psychology cannot be described by the classical probability model (Kolmogorov's model) [12]-[15]. Quantum probability is one of the most advanced mathematical models for non-classical probability. In [11], we proposed a quantum-like model describing the decision-making process in a two-player game, where we used the generalized quantum formalism based on lifting of density operators [16]. In this paper, we discuss the quantum-like representation of Bayesian inference, which has been used to calculate probabilities for decision making under uncertainty. The uncertainty is described in the form of quantum superposition, and Bayesian updating is explained as a reduction of the state by quantum measurement.
Distributed Detection via Bayesian Updates and Consensus
Liu, Qipeng; Wang, Xiaofan
2014-01-01
In this paper, we discuss a class of distributed detection algorithms which can be viewed as implementations of Bayes' law in distributed settings. Some of these algorithms were proposed very recently in the literature, and others are first developed in this paper. The common feature of these algorithms is that they all combine (i) certain kinds of consensus protocols with (ii) Bayesian updates. They differ mainly in the type of consensus protocol and the order of the two operations. After discussing their similarities and differences, we compare these distributed algorithms by numerical examples. We focus on the rate at which these algorithms detect the underlying true state of an object. We find that (a) algorithms with consensus via geometric averaging are more efficient than those via arithmetic averaging; (b) the order of consensus aggregation and Bayesian update does not appreciably influence the performance of the algorithms; (c) the existence of communication delay dramatically slows do...
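The combination the abstract describes, a consensus protocol paired with a Bayesian update, can be sketched for a two-state detection problem. The agents' beliefs, the likelihoods, and all function names below are illustrative assumptions, not the paper's specific algorithms:

```python
import numpy as np

def bayes_update(belief, likelihood):
    """Bayes' law: reweight a categorical belief by per-state likelihoods."""
    post = belief * likelihood
    return post / post.sum()

def arithmetic_consensus(beliefs):
    """Combine neighbors' beliefs by a linear (arithmetic) average."""
    mixed = np.mean(beliefs, axis=0)
    return mixed / mixed.sum()

def geometric_consensus(beliefs):
    """Combine neighbors' beliefs by a geometric average (log-linear pooling)."""
    mixed = np.exp(np.mean(np.log(beliefs), axis=0))
    return mixed / mixed.sum()

# Two agents holding different local evidence about states {0, 1}
beliefs = np.array([[0.9, 0.1],
                    [0.6, 0.4]])
geo = geometric_consensus(beliefs)
ari = arithmetic_consensus(beliefs)
```

On this toy input the geometric average places more mass on the dominant state than the arithmetic average does, which gives some intuition for finding (a): log-linear pooling composes naturally with Bayesian updates, since both multiply probabilities.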
Advanced Bayesian Method for Planetary Surface Navigation
Center, Julian
2015-01-01
Autonomous Exploration, Inc., has developed an advanced Bayesian statistical inference method that leverages current computing technology to produce a highly accurate surface navigation system. The method combines dense stereo vision and high-speed optical flow to implement visual odometry (VO) to track faster rover movements. The Bayesian VO technique improves performance by using all image information rather than corner features only. The method determines what can be learned from each image pixel and weighs the information accordingly. This capability improves performance in shadowed areas that yield only low-contrast images. The error characteristics of the visual processing are complementary to those of a low-cost inertial measurement unit (IMU), so the combination of the two capabilities provides highly accurate navigation. The method increases NASA mission productivity by enabling faster rover speed and accuracy. On Earth, the technology will permit operation of robots and autonomous vehicles in areas where the Global Positioning System (GPS) is degraded or unavailable.
Modification of combustion aerosols in the atmosphere
Energy Technology Data Exchange (ETDEWEB)
Weingartner, E. [Paul Scherrer Inst. (PSI), Villigen (Switzerland)
1996-07-01
Combustion aerosol particles are released on a large scale into the atmosphere in industrialized regions as well as in the tropics (by wood fires). The particles are subjected to various aging processes which depend on their size, morphology, and chemical composition. The interaction of combustion particles with sunlight and humidity, as well as adsorption and desorption of volatile material to or from the particles, considerably changes their physical and chemical properties and thus their residence time in the atmosphere. This is of importance because combustion particles are known to have a variety of health effects on people. Moreover, atmospheric aerosol particles have an influence on climate, directly through the reflection and absorption of solar radiation and indirectly through modifying the optical properties and lifetime of clouds. In a first step, a field experiment was carried out to study the sources and characteristics of combustion aerosols that are emitted from vehicles in a road tunnel. It was found that most of the fine particles were tail pipe emissions of diesel powered vehicles. The calculation shows that on average these vehicles emit about 300 mg fine particulate matter per driven kilometer. This emission factor is at least 100 times higher than the mean emission factor estimated for gasoline powered vehicles. Furthermore, it is found that during their residence time in the tunnel the particles undergo significant changes, evolving towards a more compact structure. The conclusion is reached that this is mainly due to adsorption of volatile material from the gas phase to the particle surface. In the atmosphere, the life cycle as well as the radiative and chemical properties of an aerosol particle are strongly dependent on its response to humidity. Therefore the hygroscopic behavior of combustion particles emitted from single sources (i.e. from a gasoline and a diesel engine) was studied in laboratory experiments.
Bayesian nonparametric regression with varying residual density
Pati, Debdeep; Dunson, David B.
2013-01-01
We consider the problem of robust Bayesian inference on the mean regression function allowing the residual density to change flexibly with predictors. The proposed class of models is based on a Gaussian process prior for the mean regression function and mixtures of Gaussians for the collection of residual densities indexed by predictors. Initially considering the homoscedastic case, we propose priors for the residual density based on probit stick-breaking (PSB) scale mixtures and symmetrized ...
Informed Source Separation: A Bayesian Tutorial
Knuth, Kevin
2013-01-01
Source separation problems are ubiquitous in the physical sciences; any situation where signals are superimposed calls for source separation to estimate the original signals. In this tutorial I will discuss the Bayesian approach to the source separation problem. This approach has a specific advantage in that it requires the designer to explicitly describe the signal model in addition to any other information or assumptions that go into the problem description. This leads naturally to the idea...
A Bayesian Modelling of Wildfires in Portugal
Silva, Giovani L.; Soares, Paulo; Marques, Susete; Dias, Inês M.; Oliveira, Manuela M.; Borges, Guilherme J.
2015-01-01
In the last decade, wildfires have become a serious problem in Portugal due to factors such as climatic characteristics and the nature of the Portuguese forest. In order to analyse wildfire data, we employ beta regression for modelling the proportion of burned forest area, under a Bayesian perspective. Our main goal is to find out which fire risk factors influence the proportion of area burned and what may make a forest type susceptible or resistant to fire. Then, we analyse wildfire...
Market Segmentation Using Bayesian Model Based Clustering
Van Hattum, P.
2009-01-01
This dissertation deals with two basic problems in marketing: market segmentation, which is the grouping of persons who share common aspects, and market targeting, which is focusing your marketing efforts on one or more attractive market segments. For the grouping of persons who share common aspects, a Bayesian model-based clustering approach is proposed such that it can be applied to data sets that are specifically used for market segmentation. The cluster algorithm can handle very l...
Centralized Bayesian reliability modelling with sensor networks
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil; Sečkárová, Vladimíra
2013-01-01
Roč. 19, č. 5 (2013), s. 471-482. ISSN 1387-3954 R&D Projects: GA MŠk 7D12004 Grant ostatní: GA MŠk(CZ) SVV-265315 Keywords : Bayesian modelling * Sensor network * Reliability Subject RIV: BD - Theory of Information Impact factor: 0.984, year: 2013 http://library.utia.cas.cz/separaty/2013/AS/dedecius-0392551.pdf
Characteristic imsets for learning Bayesian network structure
Czech Academy of Sciences Publication Activity Database
Hemmecke, R.; Lindner, S.; Studený, Milan
2012-01-01
Roč. 53, č. 9 (2012), s. 1336-1349. ISSN 0888-613X R&D Projects: GA MŠk(CZ) 1M0572; GA ČR GA201/08/0539 Institutional support: RVO:67985556 Keywords : learning Bayesian network structure * essential graph * standard imset * characteristic imset * LP relaxation of a polytope Subject RIV: BA - General Mathematics Impact factor: 1.729, year: 2012 http://library.utia.cas.cz/separaty/2012/MTR/studeny-0382596.pdf
Approximate Bayesian computation in population genetics.
Beaumont, Mark A; Zhang, Wenyang; Balding, David J.
2002-01-01
We propose a new method for approximate Bayesian statistical inference on the basis of summary statistics. The method is suited to complex problems that arise in population genetics, extending ideas developed in this setting by earlier authors. Properties of the posterior distribution of a parameter, such as its mean or density curve, are approximated without explicit likelihood calculations. This is achieved by fitting a local-linear regression of simulated parameter values on simulated summ...
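The rejection-sampling core of ABC that this record describes can be sketched on a toy problem. The Gaussian model, the sample-mean summary statistic, the uniform prior, and the tolerance of 0.1 below are illustrative assumptions, not the paper's population-genetic setting:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=50):
    """Forward model: n observations from N(theta, 1); stands in for a simulator."""
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    """Low-dimensional summary statistic of a data set (here, the sample mean)."""
    return x.mean()

observed = simulate(2.0)          # pretend data with true theta = 2.0
s_obs = summary(observed)

# ABC rejection: keep prior draws whose simulated summary lands near the data's
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
accepted = [t for t in prior_draws
            if abs(summary(simulate(t)) - s_obs) < 0.1]
posterior_mean = np.mean(accepted)
```

The accepted draws approximate the posterior without any likelihood evaluation. The paper's refinement, a local-linear regression of accepted parameter values on their summaries, would then adjust these draws toward `s_obs` to reduce the bias introduced by the tolerance.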
Nonparametric Bayesian Storyline Detection from Microtexts
Krishnan, Vinodh; Eisenstein, Jacob
2016-01-01
News events and social media are composed of evolving storylines, which capture public attention for a limited period of time. Identifying these storylines would enable many high-impact applications, such as tracking public interest and opinion in ongoing crisis events. However, this requires integrating temporal and linguistic information, and prior work takes a largely heuristic approach. We present a novel online non-parametric Bayesian framework for storyline detection, using the distance...
A Bayesian Concept Learning Approach to Crowdsourcing
DEFF Research Database (Denmark)
Viappiani, Paolo Renato; Zilles, Sandra; Hamilton, Howard J.;
2011-01-01
We develop a Bayesian approach to concept learning for crowdsourcing applications. A probabilistic belief over possible concept definitions is maintained and updated according to (noisy) observations from experts, whose behaviors are modeled using discrete types. We propose recommendation...... techniques, inference methods, and query selection strategies to assist a user charged with choosing a configuration that satisfies some (partially known) concept. Our model is able to simultaneously learn the concept definition and the types of the experts. We evaluate our model with simulations, showing...
Constrained bayesian inference of project performance models
Sunmola, Funlade
2013-01-01
Project performance models play an important role in the management of project success. When used for monitoring projects, they can offer predictive ability such as indications of possible delivery problems. Approaches for monitoring project performance rely on available project information including restrictions imposed on the project, particularly the constraints of cost, quality, scope and time. We study in this paper a Bayesian inference methodology for project performance modelling in ...
Dual Control for Approximate Bayesian Reinforcement Learning
Klenske, Edgar D.; Hennig, Philipp
2015-01-01
Control of non-episodic, finite-horizon dynamical systems with uncertain dynamics poses a tough and elementary case of the exploration-exploitation trade-off. Bayesian reinforcement learning, reasoning about the effect of actions and future observations, offers a principled solution, but is intractable. We review, then extend an old approximate approach from control theory---where the problem is known as dual control---in the context of modern regression methods, specifically generalized line...
Bayesian biclustering of gene expression data
Liu Jun S; Gu Jiajun
2008-01-01
Abstract Background Biclustering of gene expression data searches for local patterns of gene expression. A bicluster (or a two-way cluster) is defined as a set of genes whose expression profiles are mutually similar within a subset of experimental conditions/samples. Although several biclustering algorithms have been studied, few are based on rigorous statistical models. Results We developed a Bayesian biclustering model (BBC), and implemented a Gibbs sampling procedure for its statistical in...
A Theory of Bayesian Decision Making
Karni, Edi
2009-01-01
This paper presents a complete, choice-based, axiomatic Bayesian decision theory. It introduces a new choice set consisting of information-contingent plans for choosing actions and bets, and a subjective expected utility model with effect-dependent utility functions and action-dependent subjective probabilities which, in conjunction with the updating of the probabilities using Bayes' rule, gives rise to a unique prior and a set of action-dependent posterior probabilities representing the decisio...
A Bayesian framework for robotic programming
Lebeltel, Olivier; Diard, Julien; Bessiere, Pierre; Mazer, Emmanuel
2000-01-01
We propose an original method for programming robots based on Bayesian inference and learning. This method formally deals with problems of uncertainty and incomplete information that are inherent to the field. Indeed, the principal difficulties of robot programming come from the unavoidable incompleteness of the models used. We present the formalism for describing a robotic task as well as the resolution methods. This formalism is inspired by the theory of probability, suggested by the physi...
Forming Object Concept Using Bayesian Network
Nakamura, Tomoaki; Nagai, Takayuki
2010-01-01
This chapter has discussed a novel framework for object understanding. An implementation of the proposed framework using a Bayesian network has been presented. Although the result given in this paper is a preliminary one, we have shown that the system can form object concepts by observing performance by human hands. On-line learning is left for future work. Moreover, the model should be extended so that it can represent object usage and work objects.
Approximate Bayesian inference for complex ecosystems
Michael P H Stumpf
2014-01-01
Mathematical models have been central to ecology for nearly a century. Simple models of population dynamics have allowed us to understand fundamental aspects underlying the dynamics and stability of ecological systems. What has remained a challenge, however, is to meaningfully interpret experimental or observational data in light of mathematical models. Here, we review recent developments, notably in the growing field of approximate Bayesian computation (ABC), that allow us to calibrate mathe...
Bayesian modeling and classification of neural signals
Lewicki, Michael S.
1994-01-01
Signal processing and classification algorithms often have limited applicability resulting from an inaccurate model of the signal's underlying structure. We present here an efficient, Bayesian algorithm for modeling a signal composed of the superposition of brief, Poisson-distributed functions. This methodology is applied to the specific problem of modeling and classifying extracellular neural waveforms, which are composed of a superposition of an unknown number of action potentials (APs). ...
Summary Statistics in Approximate Bayesian Computation
Prangle, Dennis
2015-01-01
This document is due to appear as a chapter of the forthcoming Handbook of Approximate Bayesian Computation (ABC) edited by S. Sisson, Y. Fan, and M. Beaumont. Since the earliest work on ABC, it has been recognised that using summary statistics is essential to produce useful inference results. This is because ABC suffers from a curse of dimensionality effect, whereby using high dimensional inputs causes large approximation errors in the output. It is therefore crucial to find low dimensional ...
Bayesian Semiparametric Modeling of Realized Covariance Matrices
Jin, Xin; John M Maheu
2014-01-01
This paper introduces several new Bayesian nonparametric models suitable for capturing the unknown conditional distribution of realized covariance (RCOV) matrices. Existing dynamic Wishart models are extended to countably infinite mixture models of Wishart and inverse-Wishart distributions. In addition to mixture models with constant weights we propose models with time-varying weights to capture time dependence in the unknown distribution. Each of our models can be combined with returns...
BEAST: Bayesian evolutionary analysis by sampling trees
Drummond Alexei J; Rambaut Andrew
2007-01-01
Abstract Background The evolutionary analysis of molecular sequence variation is a statistical enterprise. This is reflected in the increased use of probabilistic models for phylogenetic inference, multiple sequence alignment, and molecular population genetics. Here we present BEAST: a fast, flexible software architecture for Bayesian analysis of molecular sequences related by an evolutionary tree. A large number of popular stochastic models of sequence evolution are provided and tree-based m...
Benchmarking dynamic Bayesian network structure learning algorithms
Trabelsi, Ghada; Leray, Philippe; Ben Ayed, Mounir; Alimi, Adel
2012-01-01
Dynamic Bayesian Networks (DBNs) are probabilistic graphical models dedicated to modeling multivariate time series. Two-time slice BNs (2-TBNs) are the most current type of these models. Static BN structure learning is a well-studied domain. Many approaches have been proposed and the quality of these algorithms has been studied over a range of different standard networks and methods of evaluation. To the best of our knowledge, all studies about DBN structure learning use their own benchmarks a...
Bayesian Multi-Scale Optimistic Optimization
Wang, Ziyu; Shakibi, Babak; Jin, Lin; De Freitas, Nando
2014-01-01
Bayesian optimization is a powerful global optimization technique for expensive black-box functions. One of its shortcomings is that it requires auxiliary optimization of an acquisition function at each iteration. This auxiliary optimization can be costly and very hard to carry out in practice. Moreover, it creates serious theoretical concerns, as most of the convergence results assume that the exact optimum of the acquisition function can be found. In this paper, we introduce a new technique...
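The shortcoming the abstract points at, the auxiliary optimization of an acquisition function at every iteration, is visible even in a minimal 1-D sketch. Everything below (the RBF kernel, the toy objective, the brute-force grid standing in for the acquisition optimizer) is an illustrative assumption, not the paper's method:

```python
import numpy as np
from math import erf

def rbf(a, b, ls=0.25):
    """Squared-exponential kernel between two 1-D point sets."""
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ls ** 2)

def gp_posterior(X, y, Xs, jitter=1e-6):
    """GP posterior mean and stddev at test points Xs, given data (X, y)."""
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum('ij,ij->j', Ks, np.linalg.solve(K, Ks))
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sd, best):
    """EI acquisition for minimization: expected gain over the best y so far."""
    z = (best - mu) / sd
    Phi = 0.5 * (1.0 + np.array([erf(v / np.sqrt(2.0)) for v in z]))
    phi = np.exp(-0.5 * z ** 2) / np.sqrt(2.0 * np.pi)
    return (best - mu) * Phi + sd * phi

f = lambda x: (x - 0.6) ** 2            # "expensive" black-box objective
X = np.array([0.0, 0.5, 1.0])           # initial design
y = f(X)
grid = np.linspace(0.0, 1.0, 201)       # auxiliary optimization: brute-force grid
for _ in range(10):
    mu, sd = gp_posterior(X, y, grid)
    x_next = grid[np.argmax(expected_improvement(mu, sd, y.min()))]
    X, y = np.append(X, x_next), np.append(y, f(x_next))
best_x = X[np.argmin(y)]
```

Each of the ten iterations evaluates the acquisition on a full grid; in higher dimensions this inner search itself becomes a hard global optimization, which is exactly the cost and convergence concern the paper addresses.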
Bayesian mixture models for Poisson astronomical images
Guglielmetti, Fabrizia; Fischer, Rainer; Dose, Volker
2012-01-01
Astronomical images in the Poisson regime are typically characterized by a spatially varying cosmic background, large variety of source morphologies and intensities, data incompleteness, steep gradients in the data, and few photon counts per pixel. The Background-Source separation technique is developed with the aim to detect faint and extended sources in astronomical images characterized by Poisson statistics. The technique employs Bayesian mixture models to reliably detect the background as...
Complex Bayesian models: construction, and sampling strategies
Huston, Carolyn Marie
2011-01-01
Bayesian models are useful tools for realistically modeling processes occurring in the real world. In particular, we consider models for spatio-temporal data where the response vector is compositional, i.e., has components that sum to one. A unique multivariate conditional hierarchical model (MVCAR) is proposed. Statistical methods for MVCAR models are well developed and we extend these tools for use with a discrete compositional response. We harness the advantages of an MVCAR model when the re...
The variational Bayes approximation in Bayesian filtering
Czech Academy of Sciences Publication Activity Database
Šmídl, Václav; Quinn, A.
Bryan : IEEE, 2006, s. 1-4. ISBN 1-4244-0469-X. [IEEE International Conference on Acoustics , Speech and Signal Processing. Toulouse (FR), 14.05.2006-19.05.2006] R&D Projects: GA AV ČR 1ET100750401; GA MŠk 1M0572 Institutional research plan: CEZ:AV0Z10750506 Keywords : variational Bayes * Bayesian filtering Subject RIV: BD - Theory of Information
Towards Bayesian Deep Learning: A Survey
Wang, Hao; Yeung, Dit-Yan
2016-01-01
While perception tasks such as visual object recognition and text understanding play an important role in human intelligence, the subsequent tasks that involve inference, reasoning and planning require an even higher level of intelligence. The past few years have seen major advances in many perception tasks using deep learning models. For higher-level inference, however, probabilistic graphical models with their Bayesian nature are still more powerful and flexible. To achieve integrated intel...
Improving Environmental Scanning Systems Using Bayesian Networks
Simon Welter; Jörg H. Mayer; Reiner Quick
2013-01-01
As companies’ environments become increasingly volatile, scanning systems gain in importance. We propose a hybrid process model for such systems' information gathering and interpretation tasks that combines quantitative information derived from regression analyses and qualitative knowledge from expert interviews. For the latter, we apply Bayesian networks. We derive the need for such a hybrid process model from a literature review. We lay out our model to find a suitable set of business e...
On-line Bayesian System Identification
Romeres, Diego; Prando, Giulia; Pillonetto, Gianluigi; Chiuso, Alessandro
2016-01-01
We consider an on-line system identification setting, in which new data become available at given time steps. In order to meet real-time estimation requirements, we propose a tailored Bayesian system identification procedure, in which the hyper-parameters are still updated through Marginal Likelihood maximization, but after only one iteration of a suitable iterative optimization algorithm. Both gradient methods and the EM algorithm are considered for the Marginal Likelihood optimization. We c...
Dynamic Bayesian Networks for Cue Integration
Paul Maier; Frederike Petzschner
2012-01-01
If we want to understand how humans use contextual cues to solve tasks such as estimating distances from optic flow during path integration, our models need to represent the available information and formally describe how these representations are processed. In particular the temporal dynamics need to be incorporated, since it has been shown that humans exploit short-term experience gained in previous trials (Petzschner and Glasauer, 2011). Existing studies often use a Bayesian approach to mo...
The Bayesian Second Law of Thermodynamics
Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason
2015-01-01
We derive a generalization of the Second Law of Thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically-evolving system degrades over...
Bayesian analysis of matrix data with rstiefel
Hoff, Peter D.
2013-01-01
We illustrate the use of the R-package "rstiefel" for matrix-variate data analysis in the context of two examples. The first example considers estimation of a reduced-rank mean matrix in the presence of normally distributed noise. The second example considers the modeling of a social network of friendships among teenagers. Bayesian estimation for these models requires the ability to simulate from the matrix-variate von Mises-Fisher distributions and the matrix-variate Bingham distributions on...
An Explanation Mechanism for Bayesian Inferencing Systems
Norton, Steven W.
2013-01-01
Explanation facilities are a particularly important feature of expert system frameworks. It is an area in which traditional rule-based expert system frameworks have had mixed results. While explanations about control are well handled, facilities are needed for generating better explanations concerning knowledge base content. This paper approaches the explanation problem by examining the effect an event has on a variable of interest within a symmetric Bayesian inferencing system. We argue that...
Knowledge Engineering Within A Generalized Bayesian Framework
Barth, Stephen W.; Norton, Steven W.
2013-01-01
During the ongoing debate over the representation of uncertainty in Artificial Intelligence, Cheeseman, Lemmer, Pearl, and others have argued that probability theory, and in particular the Bayesian theory, should be used as the basis for the inference mechanisms of Expert Systems dealing with uncertainty. In order to pursue the issue in a practical setting, sophisticated tools for knowledge engineering are needed that allow flexible and understandable interaction with the underlying knowledge...
Towards Bayesian filtering on restricted support
Czech Academy of Sciences Publication Activity Database
Pavelková, Lenka; Kárný, Miroslav; Šmídl, Václav
Cambridge : University of Cambridge, 2006, s. 1-4. ISBN 978-1-4244-0579-4. [Nonlinear Statistical Siganl Processing Workshop 2006. Cambridge (GB), 13.09.2006-15.09.2006] R&D Projects: GA MŠk 1M0572; GA AV ČR 1ET100750401; GA MŠk 2C06001 Institutional research plan: CEZ:AV0Z10750506 Keywords : bayesian estimation * state model * restricted support Subject RIV: BC - Control Systems Theory
Bayesian Estimation and Inference Using Stochastic Electronics.
Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André
2016-01-01
In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes on the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream. PMID:27047326
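The Bayesian recursive equation the BEAST tracker solves, predict with the transition model, then reweight by the observation likelihood, can be sketched in software. The random-walk transition model and the 80%-correct sensor below are illustrative assumptions; the paper implements this recursion with stochastic circuits, not floating-point arithmetic:

```python
import numpy as np

N = 10                                    # discrete positions on a 1-D line

# Transition model: the target stays put or moves one step left/right.
T = np.zeros((N, N))
for i in range(N):
    for j in (i - 1, i, i + 1):
        if 0 <= j < N:
            T[i, j] = 1.0
T /= T.sum(axis=1, keepdims=True)

def observation_likelihood(z, p_correct=0.8):
    """Sensor reports the true cell with prob. p_correct, else errs uniformly."""
    lik = np.full(N, (1.0 - p_correct) / (N - 1))
    lik[z] = p_correct
    return lik

def filter_step(belief, z):
    """One step of the Bayesian recursive equation: predict, then update."""
    predicted = belief @ T                          # propagate through T
    posterior = predicted * observation_likelihood(z)  # weight by evidence
    return posterior / posterior.sum()

belief = np.full(N, 1.0 / N)              # uniform prior over positions
for z in [3, 3, 4, 4, 4]:                 # a stream of noisy sensor readings
    belief = filter_step(belief, z)
```

After a few observations the belief concentrates on the reported neighborhood; the position estimate is simply the argmax (or mean) of `belief` at each step.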
Bayesian Spatial Modelling with R-INLA
Finn Lindgren; Håvard Rue
2015-01-01
The principles behind the interface to continuous domain spatial models in the R- INLA software package for R are described. The integrated nested Laplace approximation (INLA) approach proposed by Rue, Martino, and Chopin (2009) is a computationally effective alternative to MCMC for Bayesian inference. INLA is designed for latent Gaussian models, a very wide and flexible class of models ranging from (generalized) linear mixed to spatial and spatio-temporal models. Combined with the stochastic...
Bayesian Analysis of Individual Level Personality Dynamics
Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann
2016-01-01
A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415
Particle identification in ALICE: a Bayesian approach
Adam, Jaroslav; Aggarwal, Madan Mohan; Aglieri Rinella, Gianluca; Agnello, Michelangelo; Agrawal, Neelima; Ahammed, Zubayer; Ahmad, Shakeel; Ahn, Sang Un; Aiola, Salvatore; Akindinov, Alexander; Alam, Sk Noor; Silva De Albuquerque, Danilo; Aleksandrov, Dmitry; Alessandro, Bruno; Alexandre, Didier; Alfaro Molina, Jose Ruben; Alici, Andrea; Alkin, Anton; Millan Almaraz, Jesus Roberto; Alme, Johan; Alt, Torsten; Altinpinar, Sedat; Altsybeev, Igor; Alves Garcia Prado, Caio; Andrei, Cristian; Andronic, Anton; Anguelov, Venelin; Anticic, Tome; Antinori, Federico; Antonioli, Pietro; Aphecetche, Laurent Bernard; Appelshaeuser, Harald; Arcelli, Silvia; Arnaldi, Roberta; Arnold, Oliver Werner; Arsene, Ionut Cristian; Arslandok, Mesut; Audurier, Benjamin; Augustinus, Andre; Averbeck, Ralf Peter; Azmi, Mohd Danish; Badala, Angela; Baek, Yong Wook; Bagnasco, Stefano; Bailhache, Raphaelle Marie; Bala, Renu; Balasubramanian, Supraja; Baldisseri, Alberto; Baral, Rama Chandra; Barbano, Anastasia Maria; Barbera, Roberto; Barile, Francesco; Barnafoldi, Gergely Gabor; Barnby, Lee Stuart; Ramillien Barret, Valerie; Bartalini, Paolo; Barth, Klaus; Bartke, Jerzy Gustaw; Bartsch, Esther; Basile, Maurizio; Bastid, Nicole; Basu, Sumit; Bathen, Bastian; Batigne, Guillaume; Batista Camejo, Arianna; Batyunya, Boris; Batzing, Paul Christoph; Bearden, Ian Gardner; Beck, Hans; Bedda, Cristina; Behera, Nirbhay Kumar; Belikov, Iouri; Bellini, Francesca; Bello Martinez, Hector; Bellwied, Rene; Belmont Iii, Ronald John; Belmont Moreno, Ernesto; Belyaev, Vladimir; Benacek, Pavel; Bencedi, Gyula; Beole, Stefania; Berceanu, Ionela; Bercuci, Alexandru; Berdnikov, Yaroslav; Berenyi, Daniel; Bertens, Redmer Alexander; Berzano, Dario; Betev, Latchezar; Bhasin, Anju; Bhat, Inayat Rasool; Bhati, Ashok Kumar; Bhattacharjee, Buddhadeb; Bhom, Jihyun; Bianchi, Livio; Bianchi, Nicola; Bianchin, Chiara; Bielcik, Jaroslav; Bielcikova, Jana; Bilandzic, Ante; Biro, Gabor; Biswas, Rathijit; Biswas, Saikat; Bjelogrlic, 
Sandro; Blair, Justin Thomas; Blau, Dmitry; Blume, Christoph; Bock, Friederike; Bogdanov, Alexey; Boggild, Hans; Boldizsar, Laszlo; Bombara, Marek; Book, Julian Heinz; Borel, Herve; Borissov, Alexander; Borri, Marcello; Bossu, Francesco; Botta, Elena; Bourjau, Christian; Braun-Munzinger, Peter; Bregant, Marco; Breitner, Timo Gunther; Broker, Theo Alexander; Browning, Tyler Allen; Broz, Michal; Brucken, Erik Jens; Bruna, Elena; Bruno, Giuseppe Eugenio; Budnikov, Dmitry; Buesching, Henner; Bufalino, Stefania; Buncic, Predrag; Busch, Oliver; Buthelezi, Edith Zinhle; Bashir Butt, Jamila; Buxton, Jesse Thomas; Cabala, Jan; Caffarri, Davide; Cai, Xu; Caines, Helen Louise; Calero Diaz, Liliet; Caliva, Alberto; Calvo Villar, Ernesto; Camerini, Paolo; Carena, Francesco; Carena, Wisla; Carnesecchi, Francesca; Castillo Castellanos, Javier Ernesto; Castro, Andrew John; Casula, Ester Anna Rita; Ceballos Sanchez, Cesar; Cepila, Jan; Cerello, Piergiorgio; Cerkala, Jakub; Chang, Beomsu; Chapeland, Sylvain; Chartier, Marielle; Charvet, Jean-Luc Fernand; Chattopadhyay, Subhasis; Chattopadhyay, Sukalyan; Chauvin, Alex; Chelnokov, Volodymyr; Cherney, Michael Gerard; Cheshkov, Cvetan Valeriev; Cheynis, Brigitte; Chibante Barroso, Vasco Miguel; Dobrigkeit Chinellato, David; Cho, Soyeon; Chochula, Peter; Choi, Kyungeon; Chojnacki, Marek; Choudhury, Subikash; Christakoglou, Panagiotis; Christensen, Christian Holm; Christiansen, Peter; Chujo, Tatsuya; Chung, Suh-Urk; Cicalo, Corrado; Cifarelli, Luisa; Cindolo, Federico; Cleymans, Jean Willy Andre; Colamaria, Fabio Filippo; Colella, Domenico; Collu, Alberto; Colocci, Manuel; Conesa Balbastre, Gustavo; Conesa Del Valle, Zaida; Connors, Megan Elizabeth; Contreras Nuno, Jesus Guillermo; Cormier, Thomas Michael; Corrales Morales, Yasser; Cortes Maldonado, Ismael; Cortese, Pietro; Cosentino, Mauro Rogerio; Costa, Filippo; Crochet, Philippe; Cruz Albino, Rigoberto; Cuautle Flores, Eleazar; Cunqueiro Mendez, Leticia; Dahms, Torsten; Dainese, 
Andrea; Danisch, Meike Charlotte; Danu, Andrea; Das, Debasish; Das, Indranil; Das, Supriya; Dash, Ajay Kumar; Dash, Sadhana; De, Sudipan; De Caro, Annalisa; De Cataldo, Giacinto; De Conti, Camila; De Cuveland, Jan; De Falco, Alessandro; De Gruttola, Daniele; De Marco, Nora; De Pasquale, Salvatore; Deisting, Alexander; Deloff, Andrzej; Denes, Ervin Sandor; Deplano, Caterina; Dhankher, Preeti; Di Bari, Domenico; Di Mauro, Antonio; Di Nezza, Pasquale; Diaz Corchero, Miguel Angel; Dietel, Thomas; Dillenseger, Pascal; Divia, Roberto; Djuvsland, Oeystein; Dobrin, Alexandru Florin; Domenicis Gimenez, Diogenes; Donigus, Benjamin; Dordic, Olja; Drozhzhova, Tatiana; Dubey, Anand Kumar; Dubla, Andrea; Ducroux, Laurent; Dupieux, Pascal; Ehlers Iii, Raymond James; Elia, Domenico; Endress, Eric; Engel, Heiko; Epple, Eliane; Erazmus, Barbara Ewa; Erdemir, Irem; Erhardt, Filip; Espagnon, Bruno; Estienne, Magali Danielle; Esumi, Shinichi; Eum, Jongsik; Evans, David; Evdokimov, Sergey; Eyyubova, Gyulnara; Fabbietti, Laura; Fabris, Daniela; Faivre, Julien; Fantoni, Alessandra; Fasel, Markus; Feldkamp, Linus; Feliciello, Alessandro; Feofilov, Grigorii; Ferencei, Jozef; Fernandez Tellez, Arturo; Gonzalez Ferreiro, Elena; Ferretti, Alessandro; Festanti, Andrea; Feuillard, Victor Jose Gaston; Figiel, Jan; Araujo Silva Figueredo, Marcel; Filchagin, Sergey; Finogeev, Dmitry; Fionda, Fiorella; Fiore, Enrichetta Maria; Fleck, Martin Gabriel; Floris, Michele; Foertsch, Siegfried Valentin; Foka, Panagiota; Fokin, Sergey; Fragiacomo, Enrico; Francescon, Andrea; Frankenfeld, Ulrich Michael; Fronze, Gabriele Gaetano; Fuchs, Ulrich; Furget, Christophe; Furs, Artur; Fusco Girard, Mario; Gaardhoeje, Jens Joergen; Gagliardi, Martino; Gago Medina, Alberto Martin; Gallio, Mauro; Gangadharan, Dhevan Raja; Ganoti, Paraskevi; Gao, Chaosong; Garabatos Cuadrado, Jose; Garcia-Solis, Edmundo Javier; Gargiulo, Corrado; Gasik, Piotr Jan; Gauger, Erin Frances; Germain, Marie; Gheata, Andrei George; Gheata, 
Mihaela; Ghosh, Premomoy; Ghosh, Sanjay Kumar; Gianotti, Paola; Giubellino, Paolo; Giubilato, Piero; Gladysz-Dziadus, Ewa; Glassel, Peter; Gomez Coral, Diego Mauricio; Gomez Ramirez, Andres; Sanchez Gonzalez, Andres; Gonzalez, Victor; Gonzalez Zamora, Pedro; Gorbunov, Sergey; Gorlich, Lidia Maria; Gotovac, Sven; Grabski, Varlen; Grachov, Oleg Anatolievich; Graczykowski, Lukasz Kamil; Graham, Katie Leanne; Grelli, Alessandro; Grigoras, Alina Gabriela; Grigoras, Costin; Grigoryev, Vladislav; Grigoryan, Ara; Grigoryan, Smbat; Grynyov, Borys; Grion, Nevio; Gronefeld, Julius Maximilian; Grosse-Oetringhaus, Jan Fiete; Grosso, Raffaele; Guber, Fedor; Guernane, Rachid; Guerzoni, Barbara; Gulbrandsen, Kristjan Herlache; Gunji, Taku; Gupta, Anik; Gupta, Ramni; Haake, Rudiger; Haaland, Oystein Senneset; Hadjidakis, Cynthia Marie; Haiduc, Maria; Hamagaki, Hideki; Hamar, Gergoe; Hamon, Julien Charles; Harris, John William; Harton, Austin Vincent; Hatzifotiadou, Despina; Hayashi, Shinichi; Heckel, Stefan Thomas; Hellbar, Ernst; Helstrup, Haavard; Herghelegiu, Andrei Ionut; Herrera Corral, Gerardo Antonio; Hess, Benjamin Andreas; Hetland, Kristin Fanebust; Hillemanns, Hartmut; Hippolyte, Boris; Horak, David; Hosokawa, Ritsuya; Hristov, Peter Zahariev; Humanic, Thomas; Hussain, Nur; Hussain, Tahir; Hutter, Dirk; Hwang, Dae Sung; Ilkaev, Radiy; Inaba, Motoi; Incani, Elisa; Ippolitov, Mikhail; Irfan, Muhammad; Ivanov, Marian; Ivanov, Vladimir; Izucheev, Vladimir; Jacazio, Nicolo; Jacobs, Peter Martin; Jadhav, Manoj Bhanudas; Jadlovska, Slavka; Jadlovsky, Jan; Jahnke, Cristiane; Jakubowska, Monika Joanna; Jang, Haeng Jin; Janik, Malgorzata Anna; Pahula Hewage, Sandun; Jena, Chitrasen; Jena, Satyajit; Jimenez Bustamante, Raul Tonatiuh; Jones, Peter Graham; Jusko, Anton; Kalinak, Peter; Kalweit, Alexander Philipp; Kamin, Jason Adrian; Kang, Ju Hwan; Kaplin, Vladimir; Kar, Somnath; Karasu Uysal, Ayben; Karavichev, Oleg; Karavicheva, Tatiana; Karayan, Lilit; Karpechev, Evgeny; Kebschull, 
Udo Wolfgang; Keidel, Ralf; Keijdener, Darius Laurens; Keil, Markus; Khan, Mohammed Mohisin; Khan, Palash; Khan, Shuaib Ahmad; Khanzadeev, Alexei; Kharlov, Yury; Kileng, Bjarte; Kim, Do Won; Kim, Dong Jo; Kim, Daehyeok; Kim, Hyeonjoong; Kim, Jinsook; Kim, Minwoo; Kim, Se Yong; Kim, Taesoo; Kirsch, Stefan; Kisel, Ivan; Kiselev, Sergey; Kisiel, Adam Ryszard; Kiss, Gabor; Klay, Jennifer Lynn; Klein, Carsten; Klein, Jochen; Klein-Boesing, Christian; Klewin, Sebastian; Kluge, Alexander; Knichel, Michael Linus; Knospe, Anders Garritt; Kobdaj, Chinorat; Kofarago, Monika; Kollegger, Thorsten; Kolozhvari, Anatoly; Kondratev, Valerii; Kondratyeva, Natalia; Kondratyuk, Evgeny; Konevskikh, Artem; Kopcik, Michal; Kostarakis, Panagiotis; Kour, Mandeep; Kouzinopoulos, Charalampos; Kovalenko, Oleksandr; Kovalenko, Vladimir; Kowalski, Marek; Koyithatta Meethaleveedu, Greeshma; Kralik, Ivan; Kravcakova, Adela; Krivda, Marian; Krizek, Filip; Kryshen, Evgeny; Krzewicki, Mikolaj; Kubera, Andrew Michael; Kucera, Vit; Kuhn, Christian Claude; Kuijer, Paulus Gerardus; Kumar, Ajay; Kumar, Jitendra; Kumar, Lokesh; Kumar, Shyam; Kurashvili, Podist; Kurepin, Alexander; Kurepin, Alexey; Kuryakin, Alexey; Kweon, Min Jung; Kwon, Youngil; La Pointe, Sarah Louise; La Rocca, Paola; Ladron De Guevara, Pedro; Lagana Fernandes, Caio; Lakomov, Igor; Langoy, Rune; Lara Martinez, Camilo Ernesto; Lardeux, Antoine Xavier; Lattuca, Alessandra; Laudi, Elisa; Lea, Ramona; Leardini, Lucia; Lee, Graham Richard; Lee, Seongjoo; Lehas, Fatiha; Lemmon, Roy Crawford; Lenti, Vito; Leogrande, Emilia; Leon Monzon, Ildefonso; Leon Vargas, Hermes; Leoncino, Marco; Levai, Peter; Li, Shuang; Li, Xiaomei; Lien, Jorgen Andre; Lietava, Roman; Lindal, Svein; Lindenstruth, Volker; Lippmann, Christian; Lisa, Michael Annan; Ljunggren, Hans Martin; Lodato, Davide Francesco; Lonne, Per-Ivar; Loginov, Vitaly; Loizides, Constantinos; Lopez, Xavier Bernard; Lopez Torres, Ernesto; Lowe, Andrew John; Luettig, Philipp Johannes; Lunardon, 
Marcello; Luparello, Grazia; Lutz, Tyler Harrison; Maevskaya, Alla; Mager, Magnus; Mahajan, Sanjay; Mahmood, Sohail Musa; Maire, Antonin; Majka, Richard Daniel; Malaev, Mikhail; Maldonado Cervantes, Ivonne Alicia; Malinina, Liudmila; Mal'Kevich, Dmitry; Malzacher, Peter; Mamonov, Alexander; Manko, Vladislav; Manso, Franck; Manzari, Vito; Marchisone, Massimiliano; Mares, Jiri; Margagliotti, Giacomo Vito; Margotti, Anselmo; Margutti, Jacopo; Marin, Ana Maria; Markert, Christina; Marquard, Marco; Martin, Nicole Alice; Martin Blanco, Javier; Martinengo, Paolo; Martinez Hernandez, Mario Ivan; Martinez-Garcia, Gines; Martinez Pedreira, Miguel; Mas, Alexis Jean-Michel; Masciocchi, Silvia; Masera, Massimo; Masoni, Alberto; Mastroserio, Annalisa; Matyja, Adam Tomasz; Mayer, Christoph; Mazer, Joel Anthony; Mazzoni, Alessandra Maria; Mcdonald, Daniel; Meddi, Franco; Melikyan, Yuri; Menchaca-Rocha, Arturo Alejandro; Meninno, Elisa; Mercado-Perez, Jorge; Meres, Michal; Miake, Yasuo; Mieskolainen, Matti Mikael; Mikhaylov, Konstantin; Milano, Leonardo; Milosevic, Jovan; Mischke, Andre; Mishra, Aditya Nath; Miskowiec, Dariusz Czeslaw; Mitra, Jubin; Mitu, Ciprian Mihai; Mohammadi, Naghmeh; Mohanty, Bedangadas; Molnar, Levente; Montano Zetina, Luis Manuel; Montes Prado, Esther; Moreira De Godoy, Denise Aparecida; Perez Moreno, Luis Alberto; Moretto, Sandra; Morreale, Astrid; Morsch, Andreas; Muccifora, Valeria; Mudnic, Eugen; Muhlheim, Daniel Michael; Muhuri, Sanjib; Mukherjee, Maitreyee; Mulligan, James Declan; Gameiro Munhoz, Marcelo; Munzer, Robert Helmut; Murakami, Hikari; Murray, Sean; Musa, Luciano; Musinsky, Jan; Naik, Bharati; Nair, Rahul; Nandi, Basanta Kumar; Nania, Rosario; Nappi, Eugenio; Naru, Muhammad Umair; Ferreira Natal Da Luz, Pedro Hugo; Nattrass, Christine; Rosado Navarro, Sebastian; Nayak, Kishora; Nayak, Ranjit; Nayak, Tapan Kumar; Nazarenko, Sergey; Nedosekin, Alexander; Nellen, Lukas; Ng, Fabian; Nicassio, Maria; Niculescu, Mihai; Niedziela, Jeremi; Nielsen, 
Borge Svane; Nikolaev, Sergey; Nikulin, Sergey; Nikulin, Vladimir; Noferini, Francesco; Nomokonov, Petr; Nooren, Gerardus; Cabanillas Noris, Juan Carlos; Norman, Jaime; Nyanin, Alexander; Nystrand, Joakim Ingemar; Oeschler, Helmut Oskar; Oh, Saehanseul; Oh, Sun Kun; Ohlson, Alice Elisabeth; Okatan, Ali; Okubo, Tsubasa; Olah, Laszlo; Oleniacz, Janusz; Oliveira Da Silva, Antonio Carlos; Oliver, Michael Henry; Onderwaater, Jacobus; Oppedisano, Chiara; Orava, Risto; Oravec, Matej; Ortiz Velasquez, Antonio; Oskarsson, Anders Nils Erik; Otwinowski, Jacek Tomasz; Oyama, Ken; Ozdemir, Mahmut; Pachmayer, Yvonne Chiara; Pagano, Davide; Pagano, Paola; Paic, Guy; Pal, Susanta Kumar; Pan, Jinjin; Pandey, Ashutosh Kumar; Papikyan, Vardanush; Pappalardo, Giuseppe; Pareek, Pooja; Park, Woojin; Parmar, Sonia; Passfeld, Annika; Paticchio, Vincenzo; Patra, Rajendra Nath; Paul, Biswarup; Pei, Hua; Peitzmann, Thomas; Pereira Da Costa, Hugo Denis Antonio; Peresunko, Dmitry Yurevich; Perez Lara, Carlos Eugenio; Perez Lezama, Edgar; Peskov, Vladimir; Pestov, Yury; Petracek, Vojtech; Petrov, Viacheslav; Petrovici, Mihai; Petta, Catia; Piano, Stefano; Pikna, Miroslav; Pillot, Philippe; Ozelin De Lima Pimentel, Lais; Pinazza, Ombretta; Pinsky, Lawrence; Piyarathna, Danthasinghe; Ploskon, Mateusz Andrzej; Planinic, Mirko; Pluta, Jan Marian; Pochybova, Sona; Podesta Lerma, Pedro Luis Manuel; Poghosyan, Martin; Polishchuk, Boris; Poljak, Nikola; Poonsawat, Wanchaloem; Pop, Amalia; Porteboeuf, Sarah Julie; Porter, R Jefferson; Pospisil, Jan; Prasad, Sidharth Kumar; Preghenella, Roberto; Prino, Francesco; Pruneau, Claude Andre; Pshenichnov, Igor; Puccio, Maximiliano; Puddu, Giovanna; Pujahari, Prabhat Ranjan; Punin, Valery; Putschke, Jorn Henning; Qvigstad, Henrik; Rachevski, Alexandre; Raha, Sibaji; Rajput, Sonia; Rak, Jan; Rakotozafindrabe, Andry Malala; Ramello, Luciano; Rami, Fouad; Raniwala, Rashmi; Raniwala, Sudhir; Rasanen, Sami Sakari; Rascanu, Bogdan Theodor; Rathee, Deepika; Read, 
Kenneth Francis; Redlich, Krzysztof; Reed, Rosi Jan; Rehman, Attiq Ur; Reichelt, Patrick Simon; Reidt, Felix; Ren, Xiaowen; Renfordt, Rainer Arno Ernst; Reolon, Anna Rita; Reshetin, Andrey; Reygers, Klaus Johannes; Riabov, Viktor; Ricci, Renato Angelo; Richert, Tuva Ora Herenui; Richter, Matthias Rudolph; Riedler, Petra; Riegler, Werner; Riggi, Francesco; Ristea, Catalin-Lucian; Rocco, Elena; Rodriguez Cahuantzi, Mario; Rodriguez Manso, Alis; Roeed, Ketil; Rogochaya, Elena; Rohr, David Michael; Roehrich, Dieter; Ronchetti, Federico; Ronflette, Lucile; Rosnet, Philippe; Rossi, Andrea; Roukoutakis, Filimon; Roy, Ankhi; Roy, Christelle Sophie; Roy, Pradip Kumar; Rubio Montero, Antonio Juan; Rui, Rinaldo; Russo, Riccardo; Ryabinkin, Evgeny; Ryabov, Yury; Rybicki, Andrzej; Saarinen, Sampo; Sadhu, Samrangy; Sadovskiy, Sergey; Safarik, Karel; Sahlmuller, Baldo; Sahoo, Pragati; Sahoo, Raghunath; Sahoo, Sarita; Sahu, Pradip Kumar; Saini, Jogender; Sakai, Shingo; Saleh, Mohammad Ahmad; Salzwedel, Jai Samuel Nielsen; Sambyal, Sanjeev Singh; Samsonov, Vladimir; Sandor, Ladislav; Sandoval, Andres; Sano, Masato; Sarkar, Debojit; Sarkar, Nachiketa; Sarma, Pranjal; Scapparone, Eugenio; Scarlassara, Fernando; Schiaua, Claudiu Cornel; Schicker, Rainer Martin; Schmidt, Christian Joachim; Schmidt, Hans Rudolf; Schuchmann, Simone; Schukraft, Jurgen; Schulc, Martin; Schutz, Yves Roland; Schwarz, Kilian Eberhard; Schweda, Kai Oliver; Scioli, Gilda; Scomparin, Enrico; Scott, Rebecca Michelle; Sefcik, Michal; Seger, Janet Elizabeth; Sekiguchi, Yuko; Sekihata, Daiki; Selyuzhenkov, Ilya; Senosi, Kgotlaesele; Senyukov, Serhiy; Serradilla Rodriguez, Eulogio; Sevcenco, Adrian; Shabanov, Arseniy; Shabetai, Alexandre; Shadura, Oksana; Shahoyan, Ruben; Shahzad, Muhammed Ikram; Shangaraev, Artem; Sharma, Ankita; Sharma, Mona; Sharma, Monika; Sharma, Natasha; Sheikh, Ashik Ikbal; Shigaki, Kenta; Shou, Qiye; Shtejer Diaz, Katherin; Sibiryak, Yury; Siddhanta, Sabyasachi; Sielewicz, Krzysztof Marek; 
Siemiarczuk, Teodor; Silvermyr, David Olle Rickard; Silvestre, Catherine Micaela; Simatovic, Goran; Simonetti, Giuseppe; Singaraju, Rama Narayana; Singh, Ranbir; Singha, Subhash; Singhal, Vikas; Sinha, Bikash; Sarkar - Sinha, Tinku; Sitar, Branislav; Sitta, Mario; Skaali, Bernhard; Slupecki, Maciej; Smirnov, Nikolai; Snellings, Raimond; Snellman, Tomas Wilhelm; Song, Jihye; Song, Myunggeun; Song, Zixuan; Soramel, Francesca; Sorensen, Soren Pontoppidan; Derradi De Souza, Rafael; Sozzi, Federica; Spacek, Michal; Spiriti, Eleuterio; Sputowska, Iwona Anna; Spyropoulou-Stassinaki, Martha; Stachel, Johanna; Stan, Ionel; Stankus, Paul; Stenlund, Evert Anders; Steyn, Gideon Francois; Stiller, Johannes Hendrik; Stocco, Diego; Strmen, Peter; Alarcon Do Passo Suaide, Alexandre; Sugitate, Toru; Suire, Christophe Pierre; Suleymanov, Mais Kazim Oglu; Suljic, Miljenko; Sultanov, Rishat; Sumbera, Michal; Sumowidagdo, Suharyo; Szabo, Alexander; Szanto De Toledo, Alejandro; Szarka, Imrich; Szczepankiewicz, Adam; Szymanski, Maciej Pawel; Tabassam, Uzma; Takahashi, Jun; Tambave, Ganesh Jagannath; Tanaka, Naoto; Tarhini, Mohamad; Tariq, Mohammad; Tarzila, Madalina-Gabriela; Tauro, Arturo; Tejeda Munoz, Guillermo; Telesca, Adriana; Terasaki, Kohei; Terrevoli, Cristina; Teyssier, Boris; Thaeder, Jochen Mathias; Thakur, Dhananjaya; Thomas, Deepa; Tieulent, Raphael Noel; Timmins, Anthony Robert; Toia, Alberica; Trogolo, Stefano; Trombetta, Giuseppe; Trubnikov, Victor; Trzaska, Wladyslaw Henryk; Tsuji, Tomoya; Tumkin, Alexandr; Turrisi, Rosario; Tveter, Trine Spedstad; Ullaland, Kjetil; Uras, Antonio; Usai, Gianluca; Utrobicic, Antonija; Vala, Martin; Valencia Palomo, Lizardo; Vallero, Sara; Van Der Maarel, Jasper; Van Hoorne, Jacobus Willem; Van Leeuwen, Marco; Vanat, Tomas; Vande Vyvre, Pierre; Varga, Dezso; Vargas Trevino, Aurora Diozcora; Vargyas, Marton; Varma, Raghava; Vasileiou, Maria; Vasiliev, Andrey; Vauthier, Astrid; Vechernin, Vladimir; Veen, Annelies Marianne; Veldhoen, Misha; 
Velure, Arild; Vercellin, Ermanno; Vergara Limon, Sergio; Vernet, Renaud; Verweij, Marta; Vickovic, Linda; Viesti, Giuseppe; Viinikainen, Jussi Samuli; Vilakazi, Zabulon; Villalobos Baillie, Orlando; Villatoro Tello, Abraham; Vinogradov, Alexander; Vinogradov, Leonid; Vinogradov, Yury; Virgili, Tiziano; Vislavicius, Vytautas; Viyogi, Yogendra; Vodopyanov, Alexander; Volkl, Martin Andreas; Voloshin, Kirill; Voloshin, Sergey; Volpe, Giacomo; Von Haller, Barthelemy; Vorobyev, Ivan; Vranic, Danilo; Vrlakova, Janka; Vulpescu, Bogdan; Wagner, Boris; Wagner, Jan; Wang, Hongkai; Wang, Mengliang; Watanabe, Daisuke; Watanabe, Yosuke; Weber, Michael; Weber, Steffen Georg; Weiser, Dennis Franz; Wessels, Johannes Peter; Westerhoff, Uwe; Whitehead, Andile Mothegi; Wiechula, Jens; Wikne, Jon; Wilk, Grzegorz Andrzej; Wilkinson, Jeremy John; Williams, Crispin; Windelband, Bernd Stefan; Winn, Michael Andreas; Yang, Hongyan; Yang, Ping; Yano, Satoshi; Yasin, Zafar; Yin, Zhongbao; Yokoyama, Hiroki; Yoo, In-Kwon; Yoon, Jin Hee; Yurchenko, Volodymyr; Yushmanov, Igor; Zaborowska, Anna; Zaccolo, Valentina; Zaman, Ali; Zampolli, Chiara; Correia Zanoli, Henrique Jose; Zaporozhets, Sergey; Zardoshti, Nima; Zarochentsev, Andrey; Zavada, Petr; Zavyalov, Nikolay; Zbroszczyk, Hanna Paulina; Zgura, Sorin Ion; Zhalov, Mikhail; Zhang, Haitao; Zhang, Xiaoming; Zhang, Yonghong; Chunhui, Zhang; Zhang, Zuman; Zhao, Chengxin; Zhigareva, Natalia; Zhou, Daicui; Zhou, You; Zhou, Zhuo; Zhu, Hongsheng; Zhu, Jianhui; Zichichi, Antonino; Zimmermann, Alice; Zimmermann, Markus Bernhard; Zinovjev, Gennady; Zyzak, Maksym
2016-01-01
We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time-of-flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high purity samples of identified particles in the decay channels ${\\rm K}_{\\rm S}^{\\rm 0}\\rightarrow \\pi^+\\pi^-$, $\\phi\\rightarrow {\\rm K}^-{\\rm K}^+$ and $\\Lambda\\rightarrow{\\rm p}\\pi^-$ in p–Pb collisions at $\\sqrt{s_{\\rm NN}} = 5.02$ TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected $p_{\\rm T}$ spectra of pions, kaons, protons, and D$^0$ mesons in pp coll...
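The Bayesian combination of detector responses described above can be sketched numerically: multiply a species prior by independent per-detector likelihoods and normalize. The sketch is illustrative only; the detector names, response tables, resolutions, and priors below are invented, and ALICE's actual PID uses detector-specific response functions rather than plain Gaussians.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bayes_pid(measurements, response, priors):
    """Posterior species probabilities from independent detector signals.

    measurements: {detector: observed signal}
    response:     {species: {detector: (expected mean, resolution)}}
    priors:       {species: prior probability}
    """
    unnorm = {}
    for species, prior in priors.items():
        likelihood = 1.0
        for detector, signal in measurements.items():
            mu, sigma = response[species][detector]
            likelihood *= gaussian(signal, mu, sigma)
        unnorm[species] = prior * likelihood
    total = sum(unnorm.values())
    return {s: p / total for s, p in unnorm.items()}

# Invented response tables: dE/dx and time-of-flight in arbitrary units.
response = {
    "pion":   {"dEdx": (50.0, 5.0), "tof": (10.0, 0.3)},
    "kaon":   {"dEdx": (65.0, 5.0), "tof": (12.5, 0.3)},
    "proton": {"dEdx": (90.0, 5.0), "tof": (14.0, 0.3)},
}
priors = {"pion": 0.7, "kaon": 0.2, "proton": 0.1}

# A track whose signals sit near the kaon hypothesis:
post = bayes_pid({"dEdx": 63.0, "tof": 12.4}, response, priors)
```

Despite the pion-heavy prior, the combined likelihoods dominate and the kaon hypothesis receives the largest posterior probability, which is the point of fusing several detectors.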
Bayesian Inference of a Multivariate Regression Model
Directory of Open Access Journals (Sweden)
Marick S. Sinay
2014-01-01
We explore Bayesian inference of a multivariate linear regression model with use of a flexible prior for the covariance structure. The commonly adopted Bayesian setup involves the conjugate prior: a multivariate normal distribution for the regression coefficients and an inverse Wishart specification for the covariance matrix. Here we depart from this approach and propose a novel Bayesian estimator for the covariance. A multivariate normal prior for the unique elements of the matrix logarithm of the covariance matrix is considered. Such a structure allows for a richer class of prior distributions for the covariance, with respect to strength of beliefs in prior location hyperparameters, as well as the added ability to model potential correlation amongst the elements of the covariance structure. The posterior moments of all relevant parameters of interest are calculated based upon numerical results via a Markov chain Monte Carlo procedure. The Metropolis-Hastings-within-Gibbs algorithm is invoked to construct a proposal density that closely matches the shape of the target posterior distribution. As an application of the proposed technique, we investigate a multiple regression based upon the 1980 High School and Beyond Survey.
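The matrix-logarithm parameterization at the heart of this prior can be illustrated directly: placing a normal prior on the unique elements of log Σ guarantees a valid covariance matrix after exponentiation, with no positive-definiteness constraint to enforce. A minimal sketch under arbitrary assumptions (the dimension, seed, and unit prior scale are invented, not the paper's):

```python
import numpy as np

def sym_from_vec(v, d):
    """Assemble a symmetric d x d matrix from its d*(d+1)/2 unique elements."""
    A = np.zeros((d, d))
    iu = np.triu_indices(d)
    A[iu] = v
    A.T[iu] = v  # mirror the upper triangle into the lower
    return A

def expm_sym(A):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, Q = np.linalg.eigh(A)
    return (Q * np.exp(w)) @ Q.T

# A draw from an (assumed) standard normal prior on the unique elements
# of log-Sigma, mapped to a covariance matrix.
rng = np.random.default_rng(0)
d = 3
v = rng.normal(size=d * (d + 1) // 2)
Sigma = expm_sym(sym_from_vec(v, d))
# Sigma is symmetric positive definite by construction, whatever v is.
```

This unconstrained parameterization is what lets a Metropolis-Hastings-within-Gibbs sampler propose freely in the log-covariance space.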
Bayesian Methods for Radiation Detection and Dosimetry
International Nuclear Information System (INIS)
We performed work in three areas: radiation detection, and external and internal radiation dosimetry. In radiation detection we developed Bayesian techniques to estimate the net activity of high- and low-activity radioactive samples. These techniques have the advantage that the remaining uncertainty about the net activity is described by probability densities; graphs of the densities show the uncertainty in pictorial form. Figure 1 below demonstrates this point. We applied stochastic process methods to obtain Bayesian estimates of 222Rn-daughter products from observed counting rates. In external radiation dosimetry we studied and developed Bayesian methods to estimate radiation doses to an individual from radiation-induced chromosome aberrations. We analyzed chromosome aberrations after exposure to gammas and neutrons and developed a method for dose estimation after criticality accidents. The research in internal radiation dosimetry focused on parameter estimation for compartmental models from observed compartmental activities. From the estimated probability densities of the model parameters we were able to derive the densities of compartmental activities at different times for a two-compartment catenary model. We also calculated the average activities and their standard deviations for a simple two-compartment model.
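The net-activity estimation problem has a compact illustration: with a Poisson likelihood for the counts, a known mean background, and a flat prior on the net source contribution, the posterior over net counts follows directly from Bayes' theorem and can be evaluated on a grid. The counts, background level, and grid below are invented for the sketch, not taken from the report.

```python
import math

def net_posterior(n_obs, background, s_grid):
    """Posterior probabilities over a grid of net source counts s, given
    n_obs total observed counts, a known mean background, a Poisson
    likelihood, and a flat prior on s >= 0."""
    log_post = [n_obs * math.log(s + background) - (s + background) for s in s_grid]
    peak = max(log_post)  # subtract the peak for numerical stability
    weights = [math.exp(lp - peak) for lp in log_post]
    total = sum(weights)
    return [w / total for w in weights]

# Invented data: 12 total counts observed over a known mean background of 5.
s_grid = [0.01 * i for i in range(1, 3001)]
post = net_posterior(n_obs=12, background=5.0, s_grid=s_grid)
mean_net = sum(s * p for s, p in zip(s_grid, post))
```

The resulting density is exactly the kind of pictorial uncertainty summary the abstract describes: it remains proper even when the observed counts barely exceed background, where a naive subtraction could go negative.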
Bayesian and Dempster–Shafer fusion
Indian Academy of Sciences (India)
Subhash Challa; Don Koks
2004-04-01
The Kalman Filter is traditionally viewed as a prediction–correction filtering algorithm. In this work we show that it can be viewed as a Bayesian fusion algorithm and derive it using Bayesian arguments. We begin with an outline of Bayes theory, using it to discuss well-known quantities such as priors, likelihood and posteriors, and we provide the basic Bayesian fusion equation. We derive the Kalman Filter from this equation using a novel method to evaluate the Chapman–Kolmogorov prediction integral. We then use the theory to fuse data from multiple sensors. Vying with this approach is the Dempster–Shafer theory, which deals with measures of “belief”, and is based on the nonclassical idea of “mass” as opposed to probability. Although these two measures look very similar, there are some differences. We point them out through outlining the ideas of the Dempster–Shafer theory and presenting the basic Dempster–Shafer fusion equation. Finally we compare the two methods, and discuss the relative merits and demerits using an illustrative example.
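The claim that the Kalman measurement update is a Bayesian fusion step can be checked in one dimension: multiplying the prior Gaussian by the measurement likelihood (precision-weighted fusion) gives exactly the familiar gain-form update. The numbers below are invented for the check.

```python
def fuse(mu1, var1, mu2, var2):
    """Bayesian fusion of two Gaussian beliefs: the product density has a
    precision-weighted mean and summed precisions."""
    var = 1.0 / (1.0 / var1 + 1.0 / var2)
    return var * (mu1 / var1 + mu2 / var2), var

def kalman_update(prior_mu, prior_var, z, meas_var):
    """Scalar Kalman measurement update in the usual gain form."""
    gain = prior_var / (prior_var + meas_var)
    return prior_mu + gain * (z - prior_mu), (1.0 - gain) * prior_var

# Prior N(0, 4) fused with a measurement z = 2 of variance 1:
mu_f, var_f = fuse(0.0, 4.0, 2.0, 1.0)
mu_k, var_k = kalman_update(0.0, 4.0, 2.0, 1.0)
# Both routes give mean 1.6 and variance 0.8.
```

The algebraic identity of the two routes is the one-dimensional core of the paper's argument; the multivariate case replaces the scalar precisions with information matrices.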
Sparse Bayesian learning in ISAR tomography imaging
Institute of Scientific and Technical Information of China (English)
SU Wu-ge; WANG Hong-qiang; DENG Bin; WANG Rui-jun; QIN Yu-liang
2015-01-01
Inverse synthetic aperture radar (ISAR) imaging can be regarded as a narrow-band version of computer aided tomography (CT). The traditional CT imaging algorithms for ISAR, including the polar format algorithm (PFA) and the convolution back projection algorithm (CBP), usually suffer from high sidelobes and low resolution. Here, ISAR tomography image reconstruction within a sparse Bayesian framework is considered. First, the sparse ISAR tomography imaging model is established in light of CT imaging theory. Then, by using the compressed sensing (CS) principle, a high resolution ISAR image can be achieved with a limited number of pulses. The performance of existing CS-based ISAR imaging algorithms is sensitive to user-set parameters, which makes them inconvenient to use in practice. It is well known that the Bayesian recovery formalism known as sparse Bayesian learning (SBL) is an effective tool in regression and classification: it uses an efficient expectation maximization procedure to estimate the necessary parameters and retains a preferable property of the l0-norm diversity measure. Motivated by this, a fully automated ISAR tomography imaging algorithm based on SBL is proposed. Experimental results based on simulated and electromagnetic (EM) data illustrate the effectiveness and superiority of the proposed algorithm over existing algorithms.
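The "fully automated" property of SBL comes from learning one prior precision per coefficient instead of tuning a user parameter. A minimal fixed-point iteration of the standard relevance-determination form can be sketched on a toy sparse regression; the data, sizes, and fixed noise level below are all synthetic assumptions, not the paper's radar setup.

```python
import numpy as np

def sbl(Phi, y, noise_var=0.01, n_iter=50):
    """Minimal sparse Bayesian learning fixed point: each weight gets its
    own prior precision alpha_i; precisions of irrelevant weights grow,
    driving those weights to zero."""
    d = Phi.shape[1]
    alpha = np.ones(d)
    for _ in range(n_iter):
        # Posterior covariance and mean of the weights
        S = np.linalg.inv(np.diag(alpha) + Phi.T @ Phi / noise_var)
        mu = S @ Phi.T @ y / noise_var
        # Relevance-determination updates for the precisions
        gamma = 1.0 - alpha * np.diag(S)
        alpha = gamma / (mu ** 2 + 1e-12)
    return mu, alpha

# Synthetic data: only the third of eight basis columns is active.
rng = np.random.default_rng(1)
Phi = rng.normal(size=(40, 8))
w_true = np.zeros(8)
w_true[2] = 1.5
y = Phi @ w_true + 0.01 * rng.normal(size=40)

mu, alpha = sbl(Phi, y)  # mu[2] near 1.5, the rest pruned toward zero
```

In the imaging setting the columns of Phi would be the tomographic projection operators and mu the sparse scene reflectivity; the point of the sketch is that no sparsity-weight parameter had to be chosen by hand.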
Aerosol behavior during SIC control rod failure in QUENCH-13 test
International Nuclear Information System (INIS)
In a nuclear reactor severe accident, radioactive fission products as well as structural materials are released from the core by evaporation, and the released gases form particles by nucleation and condensation. In addition, aerosol particles may be generated by droplet formation and fragmentation of the core. In pressurized water reactors (PWRs), a commonly used control rod material is silver-indium-cadmium (SIC) covered with stainless steel cladding. The control rod elements Cd, In and Ag have relatively low melting temperatures, and Cd in particular has a very low boiling point. Control rods are likely to fail early in an accident due to melting of the stainless steel cladding, which can be accelerated by eutectic interaction between stainless steel and the surrounding Zircaloy guide tube. The release of the control rod materials would follow the cladding failure, thus affecting the aerosol source term as well as fuel rod degradation. The QUENCH experimental program at Forschungszentrum Karlsruhe investigates phenomena associated with reflood of a degrading core under postulated severe accident conditions. The QUENCH-13 test was the first in this program to include a silver-indium-cadmium control rod of prototypic PWR design. To characterize the extent of aerosol release during control rod failure, aerosol particle size distribution and concentration measurements were carried out in the off-gas pipe of the QUENCH facility. For the first time, it was possible to determine on-line the concentration and size distribution of the aerosol released from the core. These results are of prime importance for developing models for the proper calculation of the source term resulting from control rod failure. The on-line measurement showed that the main aerosol release started at a maximum bundle temperature of T ∼ 1570 K at the hottest bundle elevation. A very large burst of aerosols was detected 660 s later at a maximum bundle temperature of T ∼ 1650 K, followed by a
International Nuclear Information System (INIS)
The Swiss Gas Industry has carried out a systematic technical estimate of methane release from the complete supply chain, from production to consumption, for the years 1992/1993. This survey provided a conservative value amounting to 0.9% of the Swiss domestic output. A continuation of the study, taking into account new findings on emission factors and climate effects, is now available; it gives a value of 0.8% for the target year 1996. These results show that renovation of the network, particularly of the grey cast iron pipelines, has brought about lower losses in the local gas supplies. (author)
Converse, Sarah J.; Royle, J. Andrew; Urbanek, Richard P.
2012-01-01
Inbreeding depression is frequently a concern of managers interested in restoring endangered species. Decisions to reduce the potential for inbreeding depression by balancing genotypic contributions to reintroduced populations may exact a cost on long-term demographic performance of the population if those decisions result in reduced numbers of animals released and/or restriction of particularly successful genotypes (i.e., heritable traits of particular family lines). As part of an effort to restore a migratory flock of Whooping Cranes (Grus americana) to eastern North America using the offspring of captive breeders, we obtained a unique dataset which includes post-release mark-recapture data, as well as the pedigree of each released individual. We developed a Bayesian formulation of a multi-state model to analyze radio-telemetry, band-resight, and dead recovery data on reintroduced individuals, in order to track survival and breeding state transitions. We used studbook-based individual covariates to examine the comparative evidence for and degree of effects of inbreeding, genotype, and genotype quality on post-release survival of reintroduced individuals. We demonstrate implementation of the Bayesian multi-state model, which allows for the integration of imperfect detection, multiple data types, random effects, and individual- and time-dependent covariates. Our results provide only weak evidence for an effect of the quality of an individual's genotype in captivity on post-release survival as well as for an effect of inbreeding on post-release survival. We plan to integrate our results into a decision-analytic modeling framework that can explicitly examine tradeoffs between the effects of inbreeding and the effects of genotype and demographic stochasticity on population establishment.
Evaporation of droplets in a Champagne wine aerosol
Ghabache, Elisabeth; Liger-Belair, Gérard; Antkowiak, Arnaud; Séon, Thomas
2016-01-01
In a single glass of champagne about a million bubbles nucleate on the wall and rise towards the surface. When these bubbles reach the surface and rupture, they project a multitude of tiny droplets in the form of a particular aerosol holding a concentrate of wine aromas. Based on the model experiment of a single bubble bursting in idealized champagnes, the key features of the champagne aerosol are identified. In particular, we show that film drops, critical in sea spray for example, are here nonexistent. We then demonstrate that compared to a still wine, champagne fizz drastically enhances the transfer of liquid into the atmosphere. Further, conditions on bubble radius and wine viscosity that optimize aerosol evaporation are provided. These results pave the way towards the fine tuning of flavor release during sparkling wine tasting, a major issue for the sparkling wine industry. PMID:27125240
Learning Local Components to Understand Large Bayesian Networks
DEFF Research Database (Denmark)
Zeng, Yifeng; Xiang, Yanping; Cordero, Jorge;
2009-01-01
Bayesian networks are known for providing an intuitive and compact representation of probabilistic information and allowing the creation of models over a large and complex domain. Bayesian learning and reasoning are nontrivial for a large Bayesian network. In parallel, it is a tough job for users (domain experts) to extract accurate information from a large Bayesian network due to dimensional difficulty. We define a formulation of local components and propose a clustering algorithm to learn such local components given complete data. The algorithm groups together most inter-relevant attributes in a domain. We evaluate its performance on three benchmark Bayesian networks and provide results in support. We further show that the learned components may represent local knowledge more precisely in comparison to the full Bayesian networks when working with a small amount of data.
Fuzzy Naive Bayesian for constructing regulated network with weights.
Zhou, Xi Y; Tian, Xue W; Lim, Joon S
2015-01-01
In the data mining field, classification is a very crucial technology, and the Bayesian classifier has been one of the hotspots in classification research area. However, assumptions of Naive Bayesian and Tree Augmented Naive Bayesian (TAN) are unfair to attribute relations. Therefore, this paper proposes a new algorithm named Fuzzy Naive Bayesian (FNB) using neural network with weighted membership function (NEWFM) to extract regulated relations and weights. Then, we can use regulated relations and weights to construct a regulated network. Finally, we will classify the heart and Haberman datasets by the FNB network to compare with experiments of Naive Bayesian and TAN. The experiment results show that the FNB has a higher classification rate than Naive Bayesian and TAN. PMID:26405944
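The Naive Bayesian baseline that FNB is compared against in this record can be sketched in a few lines. The sketch below is a generic Gaussian Naive Bayes classifier, not the NEWFM-based FNB algorithm itself, and the toy data are placeholders rather than the heart or Haberman datasets:

```python
import math

def train_gaussian_nb(X, y):
    """Fit per-class feature means/variances plus class priors."""
    model = {}
    for c in set(y):
        rows = [x for x, label in zip(X, y) if label == c]
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / n + 1e-9  # smoothing avoids zero variance
                     for col, m in zip(zip(*rows), means)]
        model[c] = (n / len(y), means, variances)
    return model

def predict(model, x):
    """Return the class maximizing log P(c) + sum_i log N(x_i | mu_i, var_i)."""
    best, best_lp = None, -math.inf
    for c, (prior, means, variances) in model.items():
        lp = math.log(prior)
        for xi, m, v in zip(x, means, variances):
            lp += -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
        if lp > best_lp:
            best, best_lp = c, lp
    return best
```

The "naive" assumption the paper criticizes is visible in `predict`: each feature contributes an independent log-likelihood term, with no modelling of attribute relations.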
Prediction of fission product and aerosol behaviour during a postulated severe accident in a LWR
International Nuclear Information System (INIS)
Lack of appropriate energy removal causes fuel elements in a reactor core to overheat and may eventually cause the core to degrade. Fission products will be emitted from a degraded reactor core. Aerosols are generated when the vapours of various fuel and structural materials reach a cold environment and nucleate. In addition to the fission product release and aerosol generation taking place in the reactor vessel, further fission product release and aerosol generation will occur when the molten core debris leaves the pressure vessel bottom head and comes into contact with the pedestal concrete floor. Fission products, if released to the environment across the containment boundary, pose a great danger to public health. A source term is defined as the quantity, timing, and characteristics of the release of radionuclide material to the environment following a postulated severe accident. At PSI considerable effort has been spent investigating and establishing a source term assessment methodology in order to predict the source term for a given Light Water Reactor (LWR) accident scenario. This report introduces the computer programs and methods used at PSI for source term assessment analysis, covering the release of fission products, the generation of aerosols, and the behaviour of aerosols in LWR compartments. (author) 4 figs., 5 tabs., 28 refs
Fission product vapour - aerosol interactions in the containment: simulant fuel studies
International Nuclear Information System (INIS)
Experiments have been conducted in the Falcon facility to study the interaction of fission product vapours released from simulant fuel samples with control rod aerosols. The aerosols generated from both the control rod and fuel sample were chemically distinct and had different deposition characteristics. Extensive interaction was observed between the fission product vapours and the control rod aerosol. The two dominant mechanisms were condensation of the vapours onto the aerosol, and chemical reactions between the two components; sorption phenomena were believed to be only of secondary importance. The interaction of fission product vapours and reactor materials aerosols could have a major impact on the transport characteristics of the radioactive emission from a degrading core. (author)
Graphical aerosol classification method using aerosol relative optical depth
Chen, Qi-Xiang; Yuan, Yuan; Shuai, Yong; Tan, He-Ping
2016-06-01
A simple graphical method is presented to classify aerosol types based on a combination of aerosol optical thickness (AOT) and aerosol relative optical thickness (AROT). Six aerosol types, including maritime (MA), desert dust (DD), continental (CO), sub-continental (SC), urban industry (UI) and biomass burning (BB), are discriminated in a two-dimensional space of AOT440 and AROT1020/440. Numerical calculations are performed using Mie theory based on a multi-lognormal particle size distribution, and the AROT ranges for each aerosol type are determined. More than 5 years of daily observations from 8 representative aerosol sites are applied to the method to confirm its spatial applicability. Finally, 3 individual cases are analyzed according to their specific aerosol status. The outcomes indicate that the new graphical method coordinates well with regional characteristics and is also able to distinguish aerosol variations in individual situations. This technique demonstrates a novel way to estimate different aerosol types and provides information for radiative forcing calculations and satellite data corrections.
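A two-dimensional lookup of this kind reduces to threshold tests in the (AOT440, AROT1020/440) plane. The sketch below illustrates the idea only: the numeric thresholds are invented placeholders, not the AROT ranges derived in the paper, and for brevity it distinguishes five of the six types (SC is omitted):

```python
def classify_aerosol(aot_440, arot_1020_440):
    """Toy classifier in the (AOT440, AROT1020/440) plane.

    A high 1020/440 relative optical thickness indicates coarse particles
    (sea salt, dust); a low ratio indicates fine particles (pollution,
    smoke). All threshold values here are illustrative assumptions."""
    if aot_440 < 0.15:                        # low loading: clean background
        return "MA" if arot_1020_440 > 0.6 else "CO"
    if arot_1020_440 > 0.5:                   # coarse-dominated, high loading
        return "DD"
    return "UI" if aot_440 < 0.6 else "BB"    # fine-dominated
```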
Being Bayesian in a quantum world
International Nuclear Information System (INIS)
To be a Bayesian about probability theory is to accept that probabilities represent subjective degrees of belief and nothing more. This is in distinction to the idea that probabilities represent long-term frequencies or objective propensities. But how can a subjective account of probabilities coexist with the existence of quantum mechanics? To accept quantum mechanics is to accept the calculational apparatus of quantum states and the Born rule for determining probabilities in a quantum measurement. If there ever were a place for probabilities to be objective, it ought to be here. This raises the question of whether Bayesianism and quantum mechanics are compatible at all. For the Bayesian, it only suggests that we should rethink what quantum mechanics is about. Is it 'law of nature' or really more 'law of thought'? From transistors to lasers, the evidence is in that we live in a quantum world. One could infer from this that all the elements in the quantum formalism necessarily mirror nature itself: wave functions are so successful as calculational tools precisely because they represent elements of reality. A more Bayesian-like perspective is that if wave functions generate probabilities, then they too must be Bayesian degrees of belief, with all that such a radical idea entails. In particular, quantum probabilities have no firmer hold on reality than the word 'belief' in 'degrees of belief' already indicates. From this perspective, the only sense in which the quantum formalism mirrors nature is through the constraints it places on gambling agents who would like to better navigate through the world. One might think that this is thin information, but it is not insubstantial. That an agent should use quantum mechanics for his uncertainty accounting, rather than some other theory, tells us something about the world itself - i.e., the world independent of the agent and his particular beliefs at any moment. In this talk, I will try to shore up these
The Bayesian Modelling Of Inflation Rate In Romania
Mihaela Simionescu
2014-01-01
Bayesian econometrics has seen a considerable increase in popularity in recent years, attracting the interest of various groups of researchers in the economic sciences as well as specialists in econometrics, commerce, industry, marketing, finance, microeconomics, macroeconomics and other domains. The purpose of this research is to provide an introduction to the Bayesian approach as applied in economics, starting with Bayes' theorem. For Bayesian linear regression models the methodology of estim...
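Since the record above starts from Bayes' theorem, a minimal discrete version is easy to state. The two-regime example and its numbers below are invented for illustration (e.g., "inflation rising" vs. "stable"), not taken from the paper:

```python
def posterior(priors, likelihoods):
    """Bayes' theorem over discrete hypotheses:
    P(H_i | D) = P(D | H_i) * P(H_i) / sum_j P(D | H_j) * P(H_j)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    evidence = sum(joint)
    return [j / evidence for j in joint]

# Hypothetical example: prior P(rising) = 0.3, and an indicator is observed
# that is 4x more likely under "rising" (0.8) than under "stable" (0.2).
post = posterior([0.3, 0.7], [0.8, 0.2])
```

With these numbers the posterior probability of "rising" is 0.24 / (0.24 + 0.14) ≈ 0.63: the data shift belief toward the regime under which they were more likely.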
On the Relation between Robust and Bayesian Decision Making
Adam, Klaus
2003-01-01
This paper compares Bayesian decision theory with robust decision theory where the decision maker optimizes with respect to the worst state realization. For a class of robust decision problems there exists a sequence of Bayesian decision problems whose solution converges towards the robust solution. It is shown that the limiting Bayesian problem displays infinite risk aversion and that its solution is insensitive (robust) to the precise assignment of prior probabilities. Moreover, the limitin...
BAYESIAN ESTIMATION OF RELIABILITY IN TWO-PARAMETER GEOMETRIC DISTRIBUTION
Directory of Open Access Journals (Sweden)
Sudhansu S. Maiti
2015-12-01
Bayesian estimation of the reliability of a component, R(t) = P(X ≥ t), when X follows a two-parameter geometric distribution, has been considered. The Maximum Likelihood Estimator (MLE), an Unbiased Estimator and a Bayesian Estimator have been compared. Bayesian estimation of the component reliability R = P(X ≤ Y), arising under the stress-strength setup, when Y is assumed to follow an independent two-parameter geometric distribution, has also been discussed, assuming independent priors for the parameters under different loss functions.
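For intuition, the reliability function of a one-parameter geometric distribution (a simplification of the paper's two-parameter setting) and its Bayesian posterior mean under a Beta prior can be sketched with a grid approximation. The Beta(1, 1) default prior and the grid size are assumptions of this sketch:

```python
def reliability(p, t):
    """R(t) = P(X >= t) = (1 - p)**t for X ~ Geometric(p) on {0, 1, 2, ...}."""
    return (1.0 - p) ** t

def bayes_reliability(data, t, a=1.0, b=1.0, grid=2000):
    """Posterior mean of R(t) under a Beta(a, b) prior on p.

    With n observations summing to s, the posterior density is
    proportional to p**(n + a - 1) * (1 - p)**(s + b - 1); we average
    (1 - p)**t over a grid approximation of that posterior."""
    n, s = len(data), sum(data)
    ps = [(i + 0.5) / grid for i in range(grid)]          # midpoint grid on (0, 1)
    w = [p ** (n + a - 1) * (1 - p) ** (s + b - 1) for p in ps]
    z = sum(w)
    return sum(wi * (1 - p) ** t for wi, p in zip(w, ps)) / z
```

Observing only zeros pushes the posterior toward large p and hence small reliability; large observed values do the opposite.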
Chain ladder method: Bayesian bootstrap versus classical bootstrap
Peters, Gareth W.; Mario V. Wüthrich; Shevchenko, Pavel V.
2010-01-01
The intention of this paper is to estimate a Bayesian distribution-free chain ladder (DFCL) model using approximate Bayesian computation (ABC) methodology. We demonstrate how to estimate quantities of interest in claims reserving and compare the estimates to those obtained from classical and credibility approaches. In this context, a novel numerical procedure utilising Markov chain Monte Carlo (MCMC), ABC and a Bayesian bootstrap procedure was developed in a truly distribution-free setting. T...
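The ABC ingredient mentioned above can be illustrated outside the chain-ladder context with a plain rejection sampler. The normal-mean example, the uniform prior, and the tolerance below are all invented for illustration; this is not the paper's DFCL model:

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps, n=10000):
    """Approximate Bayesian computation by rejection: draw parameters from
    the prior, simulate data, and keep draws whose simulated summary lies
    within eps of the observed summary."""
    accepted = []
    for _ in range(n):
        theta = prior_sample()
        if distance(simulate(theta), observed) <= eps:
            accepted.append(theta)
    return accepted
```

A toy run: infer the mean of a unit-variance normal from an observed sample mean of 5.0, with a Uniform(0, 10) prior. The accepted draws approximate the posterior, at the cost of discarding most simulations.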
Bayesian just-so stories in psychology and neuroscience
Bowers, J.S.; Davis, Colin J
2012-01-01
According to Bayesian theories in psychology and neuroscience, minds and brains are (near) optimal in solving a wide range of tasks. We challenge this view and argue that more traditional, non-Bayesian approaches are more promising. We make three main arguments. First, we show that the empirical evidence for Bayesian theories in psychology is weak at best. This weakness relates to the many arbitrary ways that priors, likelihoods, and utility functions can be altered in order to account fo...
A tutorial introduction to Bayesian models of cognitive development
Perfors, Amy; Tenenbaum, Joshua B.; Griffiths, Thomas L.; Xu, Fei
2010-01-01
We present an introduction to Bayesian inference as it is used in probabilistic models of cognitive development. Our goal is to provide an intuitive and accessible guide to the what, the how, and the why of the Bayesian approach: what sorts of problems and data the framework is most relevant for, and how and why it may be useful for developmentalists. We emphasize a qualitative understanding of Bayesian inference, but also include information about additional resources for those interested in...
Parameterized Complexity Results for Exact Bayesian Network Structure Learning
Sebastian Ordyniak; Stefan Szeider
2014-01-01
Bayesian network structure learning is the notoriously difficult problem of discovering a Bayesian network that optimally represents a given set of training data. In this paper we study the computational worst-case complexity of exact Bayesian network structure learning under graph theoretic restrictions on the (directed) super-structure. The super-structure is an undirected graph that contains as subgraphs the skeletons of solution networks. We introduce the directed super-structure as a nat...
Algorithms and Complexity Results for Exact Bayesian Structure Learning
Sebastian Ordyniak; Stefan Szeider
2012-01-01
Bayesian structure learning is the NP-hard problem of discovering a Bayesian network that optimally represents a given set of training data. In this paper we study the computational worst-case complexity of exact Bayesian structure learning under graph theoretic restrictions on the super-structure. The super-structure (a concept introduced by Perrier, Imoto, and Miyano, JMLR 2008) is an undirected graph that contains as subgraphs the skeletons of solution networks. Our results apply to severa...
Bayesian non- and semi-parametric methods and applications
Rossi, Peter
2014-01-01
This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number
Computational Enhancements to Bayesian Design of Experiments Using Gaussian Processes
Weaver, Brian P.; Williams, Brian J.; Anderson-Cook, Christine M.; Higdon, David M.
2016-01-01
Bayesian design of experiments is a methodology for incorporating prior information into the design phase of an experiment. Unfortunately, the typical Bayesian approach to designing experiments is both numerically and analytically intractable without additional assumptions or approximations. In this paper, we discuss how Gaussian processes can be used to help alleviate the numerical issues associated with Bayesian design of experiments. We provide an example based on accelerated life tests an...
Characterizing the Aperiodic Variability of 3XMM Sources using Bayesian Blocks
Salvetti, D.; De Luca, A.; Belfiore, A.; Marelli, M.
2016-06-01
I will present the Bayesian blocks algorithm and its application to XMM sources, the statistical properties of the entire 3XMM sample, and a few interesting cases. While XMM-Newton is the instrument best suited to the characterization of X-ray source variability, its most recent catalogue (3XMM) reports light curves only for the brightest sources and excludes periods of background flares from its analysis. One aim of the EXTraS ("Exploring the X-ray Transient and variable Sky") project is the characterization of the aperiodic variability of as many 3XMM sources as possible on time scales shorter than the XMM observation. We adapted the original Bayesian blocks algorithm to account for background contamination, including soft proton flares. In addition, we characterized the short-term aperiodic variability by performing a number of statistical tests on all the Bayesian blocks light curves. The EXTraS catalogue and products will be released to the community in 2017, together with tools that will allow users to replicate EXTraS results and extend them through the next decade.
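The core of a Scargle-style Bayesian blocks segmentation is a dynamic program over candidate change points. This sketch handles plain, distinct event times only, without the background-contamination extension the project developed, and the `ncp_prior` penalty value is an arbitrary choice:

```python
import math

def bayesian_blocks(times, ncp_prior=4.0):
    """Optimal segmentation of sorted, distinct event times into blocks.

    Block fitness is N * (log N - log T) for N events spanning duration T;
    ncp_prior is a per-block penalty controlling the number of blocks.
    Returns the block edges of the best partition."""
    t = sorted(times)
    n = len(t)
    # candidate block edges: the data ends plus midpoints between events
    edges = [t[0]] + [0.5 * (t[i] + t[i + 1]) for i in range(n - 1)] + [t[-1]]
    best = [0.0] * n   # best[r]: max total fitness of a partition of cells 0..r
    last = [0] * n     # last[r]: start cell of the final block in that partition
    for r in range(n):
        candidates = []
        for i in range(r + 1):
            width = edges[r + 1] - edges[i]
            count = r - i + 1
            fit = count * (math.log(count) - math.log(width)) - ncp_prior
            if i > 0:
                fit += best[i - 1]
            candidates.append(fit)
        best[r] = max(candidates)
        last[r] = candidates.index(best[r])
    # backtrack from the final cell to recover the change points
    change_points, ind = [], n
    while True:
        change_points.append(ind)
        if ind == 0:
            break
        ind = last[ind - 1]
    return [edges[i] for i in reversed(change_points)]
```

Events at a constant rate collapse to a single block, while a rate change (e.g., a dense burst followed by sparse quiescence) introduces extra edges.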
Directory of Open Access Journals (Sweden)
David Lunn
The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
Aerosol samplers innovation possibilities
International Nuclear Information System (INIS)
The growing demand for early detection of increased levels of artificial radionuclides in the atmosphere resulted in the design and fabrication of an aerosol sampler with an automated spectrometric unit providing online gamma spectrometry above the aerosol filter. The study was performed with two types of high-volume samplers: the SENYA JL-900 SnowWhite (900 m³/h) and the SENYA JL-150 Hunter (150 m³/h). This work gives results of the design optimization with respect to detector type, measurement geometry, remote control and spectrometric evaluation. 222Rn and 220Rn concentration fluctuations in the outdoor air are discussed with regard to the detection limit for the radionuclides expected after an NPP accident. (authors)
International Nuclear Information System (INIS)
That device is characterized in that, within a closed casing provided with a sealable opening, it comprises a storage rack for a plurality of stacked saucers, a sheath having a transverse slot for extracting said saucers separately from, or re-introducing same into, their respective sockets, a transfer gripping-member with its control mechanism for taking saucers from their respective sockets through said slot and moving them transversally, a sealing-plate mounted below the casing-opening, said plate being associated with a pushing mechanism, and a laterally retractable cover controlled by a separate mechanism for unsealing said casing opening and collecting aerosols on a saucer. That device can be used for taking samples of sodium aerosols
Aerosol Observing System (AOS) Handbook
Energy Technology Data Exchange (ETDEWEB)
Jefferson, A
2011-01-17
The Aerosol Observing System (AOS) is a suite of in situ surface measurements of aerosol optical and cloud-forming properties. The instruments measure aerosol properties that influence the earth’s radiative balance. The primary optical measurements are those of the aerosol scattering and absorption coefficients as a function of particle size and radiation wavelength and cloud condensation nuclei (CCN) measurements as a function of percent supersaturation. Additional measurements include those of the particle number concentration and scattering hygroscopic growth. Aerosol optical measurements are useful for calculating parameters used in radiative forcing calculations such as the aerosol single-scattering albedo, asymmetry parameter, mass scattering efficiency, and hygroscopic growth. CCN measurements are important in cloud microphysical models to predict droplet formation.
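The derived quantities mentioned above (single-scattering albedo and mass scattering efficiency) follow directly from the measured coefficients. A minimal sketch, with units assumed to be Mm⁻¹ for the optical coefficients and µg/m³ for mass concentration:

```python
def single_scattering_albedo(scattering, absorption):
    """SSA = sigma_sp / (sigma_sp + sigma_ap): the fraction of light
    extinction due to scattering rather than absorption."""
    return scattering / (scattering + absorption)

def mass_scattering_efficiency(scattering, mass_concentration):
    """Scattering coefficient per unit aerosol mass concentration."""
    return scattering / mass_concentration
```

For example, a scattering coefficient of 9 Mm⁻¹ alongside an absorption coefficient of 1 Mm⁻¹ gives SSA = 0.9, i.e., a predominantly scattering (cooling-tending) aerosol.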
Aerosol characterization during project POLINAT
Energy Technology Data Exchange (ETDEWEB)
Hagen, D.E.; Hopkins, A.R.; Paladino, J.D.; Whitefield, P.D. [Missouri Univ., Rolla, MO (United States). Cloud and Aerosol Sciences Lab.; Lilenfeld, H.V. [McDonnell Douglas Aerospace-East, St. Louis, MO (United States)
1997-12-31
The objectives of the aerosol/particulate characterization measurements of project POLINAT (POLlution from aircraft emissions In the North ATlantic flight corridor) are: to search for aerosol/particulate signatures of air traffic emissions in the region of the North Atlantic Flight Corridor; to search for the aerosol/particulate component of large scale enhancement ('corridor effects') of air traffic related species in the North Atlantic region; to determine the effective emission indices for the aerosol/particulate component of engine exhaust in both the near and far field of aircraft exhaust plumes; to measure the dispersion and transformation of the aerosol/particulate component of aircraft emissions as a function of ambient condition; to characterize background levels of aerosol/particulate concentrations in the North Atlantic Region; and to determine effective emission indices for engine exhaust particulates for regimes beyond the jet phase of plume expansion. (author) 10 refs.
International Nuclear Information System (INIS)
The paper describes the types and characteristics of the various aerosol particles (by size effects, and origin) and goes on to discuss the composition of particulates and their variation in different places in Asia, and the origin of global particulate emissions from natural and anthropogenic sources. The effects of particulate matter on human health, visibility and climate are summarised. Techniques for control and abatement of particulate emissions are outlined. 10 refs., 4 figs., 11 tabs
International Nuclear Information System (INIS)
The AMY project concentrates on understanding and modelling the deposition-resuspension phenomena of aerosols in pipe flow. The aim is to develop a calculation model that could resolve the current deficiencies in modelling aerosol deposition in turbulent flows, and to implement the models in the tools used for calculating fission product behaviour and release in severe reactor accidents. These tools are APROS SA, which is used for simulating severe accident phenomena and the progression of the accident, and SaTu (a support system for radiation experts), which was originally designed to estimate radiation levels and radioactive releases during an accident situation. In addition to the deposition-resuspension model, other important models are to be implemented in the tools mentioned above. Revaporisation of deposited fission products from primary circuit surfaces may increase the releases into the reactor containment and further into the environment, so the phenomenon should be taken into account. Models for estimating environmental consequences will also be implemented in the SaTu system, and the system will be modified to describe nuclear power plants other than the Loviisa plant. Another important feature for source term calculations in PSA level 2 analyses is the implementation of an uncertainty calculation environment in SaTu. (orig.)
Doing bayesian data analysis a tutorial with R and BUGS
Kruschke, John K
2011-01-01
There is an explosion of interest in Bayesian statistics, primarily because recently created computational methods have finally made Bayesian analysis attainable for a wide audience. Doing Bayesian Data Analysis: A Tutorial Introduction with R and BUGS provides an accessible approach to Bayesian data analysis, as material is explained clearly with concrete examples. The book begins with the basics, including essential concepts of probability and random sampling, and gradually progresses to advanced hierarchical modeling methods for realistic data. The text delivers comprehensive coverage of all
Bayesian integer frequency offset estimator for MIMO-OFDM systems
Institute of Scientific and Technical Information of China (English)
(no author listed)
2008-01-01
Carrier frequency offset (CFO) in MIMO-OFDM systems can be decoupled into two parts: the fractional frequency offset (FFO) and the integer frequency offset (IFO). The problem of IFO estimation is addressed and a new IFO estimator based on the Bayesian philosophy is proposed. It is shown that the Bayesian IFO estimator is optimal among all IFO estimators. Furthermore, the Bayesian estimator can take advantage of oversampling, so that better performance can be obtained. Finally, numerical results show the optimality of the Bayesian estimator and validate the theoretical analysis.
Photothermal spectroscopy of aerosols
International Nuclear Information System (INIS)
In situ aerosol absorption spectroscopy was performed using two novel photothermal detection schemes. The first, based on a photorefractive effect and coherent detection, called phase fluctuation optical heterodyne (PFLOH) spectroscopy, could, depending on the geometry employed, yield particle-specific or combined particle and gas absorption data. Single particles of graphite as small as 1 μm were detected in the particle-specific mode. In another geometrical configuration, the total absorption (both gas and particle) of submicron-sized aerosols of ammonium sulfate particles in equilibrium with gaseous ammonia and water vapor was measured at varying CO₂ laser frequencies. The specific absorption coefficient for the sulfate ion was measured to be 0.5 m²/g at 1087 cm⁻¹. The absorption coefficient sensitivity of this scheme was less than or equal to 10⁻⁸ cm⁻¹. The second scheme is a hybrid visible Mie scattering scheme incorporating photothermal modulation. Particle-specific data on ammonium sulfate droplets were obtained. For chemically identical species, the relative absorption spectrum versus laser frequency can be obtained for polydisperse aerosol distributions directly from the data without the need for complex inverse scattering calculations
International Nuclear Information System (INIS)
In order to study the natural release of aerosol particles by the Amazon Basin tropical rain forest, the composition and size distribution of biogenic aerosol particles were analyzed. The role of the atmospheric emissions from the Amazon Basin rain forest in the global atmosphere will be investigated. The atmosphere was studied in long-term sampling stations in three different locations. The elemental composition of aerosol particles released during biomass burning was also measured in several different ecosystems, from primary forest to Savannah. One of the main focuses was to identify and quantify important physical and chemical processes in the generation, transformation and deposition of aerosol particles. Also important was to obtain a better understanding of natural aerosol sources concerning identification, their characteristics and strength, to be able to understand the natural chemistry in the atmosphere on a global scale. 36 refs, 3 figs, 3 tabs
Aerosol influence on radiative cooling
Grassl, Hartmut
2011-01-01
Aerosol particles have a complex index of refraction and therefore contribute to atmospheric emission and radiative cooling rates. In this paper calculations of the longwave flux divergence within the atmosphere at different heights are presented including water vapour and aerosol particles as emitters and absorbers. The spectral region covered is 5 to 100 microns divided into 23 spectral intervals. The relevant properties of the aerosol particles, the single scattering albedo and the extinct...
An Emerging Global Aerosol Climatology from the MODIS Satellite Sensors
Remer, Lorraine A.; Kleidman, Richard G.; Levy, Robert C.; Kaufman, Yoram J.; Tanre, Didier; Mattoo, Shana; Martins, J. Vandelei; Ichoku, Charles; Koren, Ilan; Hongbin, Yu; Holben, Brent N.
2008-01-01
The recently released Collection 5 MODIS aerosol products provide a consistent record of the Earth's aerosol system. Comparing with ground-based AERONET observations of aerosol optical depth (AOD), we find that Collection 5 MODIS aerosol products estimate AOD to within expected accuracy more than 60% of the time over ocean and more than 72% of the time over land. This is similar to previous results for ocean, and better than previous results for land. However, the new Collection introduces a 0.015 offset between the Terra and Aqua global mean AOD over ocean, where none existed previously. Aqua conforms to previous values and expectations while Terra is high. The cause of the offset is unknown, but changes to calibration are a possible explanation. We focus the climatological analysis on the better understood Aqua retrievals. We find that the global mean AOD at 550 nm is 0.13 over oceans and 0.19 over land. AOD values in situations with 80% cloud fraction are twice the global mean values, although such situations occur only 2% of the time over ocean and less than 1% of the time over land. There is no drastic change in aerosol particle size associated with these very cloudy situations. Regionally, aerosol amounts vary from polluted areas such as East Asia and India to the cleanest regions such as Australia and the northern continents. In almost all oceans fine-mode aerosol dominates over dust, except in the tropical Atlantic downwind of the Sahara and, in some months, the Arabian Sea.
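The "within expected accuracy" statistic quoted above can be reproduced from collocated MODIS/AERONET pairs by counting retrievals inside an error envelope. The default envelope parameters below, ±(0.03 + 0.05·AOD), are the commonly quoted over-ocean MODIS uncertainty and should be treated as an assumption of this sketch:

```python
def within_expected_error(modis_aod, aeronet_aod, a=0.03, b=0.05):
    """Fraction of MODIS AOD retrievals falling inside the envelope
    aeronet ± (a + b * aeronet). Defaults correspond to the over-ocean
    envelope; over land a wider envelope would be appropriate."""
    hits = sum(abs(m - t) <= a + b * t
               for m, t in zip(modis_aod, aeronet_aod))
    return hits / len(modis_aod)
```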
Topics in current aerosol research
Hidy, G M
1971-01-01
Topics in Current Aerosol Research deals with the fundamental aspects of aerosol science, with emphasis on experiment and theory describing highly dispersed aerosols (HDAs) as well as the dynamics of charged suspensions. Topics covered range from the basic properties of HDAs to their formation and methods of generation; sources of electric charges; interactions between fluid and aerosol particles; and one-dimensional motion of charged cloud of particles. This volume is comprised of 13 chapters and begins with an introduction to the basic properties of HDAs, followed by a discussion on the form
Source strength of fungal spore aerosolization from moldy building material
Górny, Rafał L.; Reponen, Tiina; Grinshpun, Sergey A.; Willeke, Klaus
The release of Aspergillus versicolor, Cladosporium cladosporioides, and Penicillium melinii spores from agar and ceiling tile surfaces was tested under different controlled environmental conditions using a newly designed and constructed aerosolization chamber. This study revealed that all the investigated parameters, such as fungal species, air velocity above the surface, texture of the surface, and vibration of the contaminated material, affected the fungal spore release. It was found that typical indoor air currents can release up to 200 spores cm⁻² from contaminated surfaces during 30-min experiments. The release of fungal spores from smooth agar surfaces was found to be inadequate for accurately predicting the emission from rough ceiling tile surfaces because air turbulence increases the spore release from a rough surface. Vibration at a frequency of 1 Hz at a power level of 14 W resulted in a significant increase in the spore release rate. The release appears to depend on the morphology of the fungal colonies grown on ceiling tile surfaces, including the thickness of conidiophores, the length of spore chains, and the shape of spores. The spores were found to be released continuously during each 30-min experiment. However, the release rate was usually highest during the first few minutes of exposure to air currents and mechanical vibration. About 71-88% of the spores released during a 30-min interval became airborne during the first 10 min.
International Nuclear Information System (INIS)
The effectiveness of continuous air monitors (CAMs) in protecting plutonium workers depends on the efficiency of aerosol transport from the release point to the CAM. The main processes for aerosol transport are diffusion, forced convection, and gravitational settling. The transport of particles relative to each of these processes depends on particle size. Studies have shown that activity median aerodynamic diameters for plutonium aerosols can range from less than 0.1 μm to greater than 10 μm. The purpose of this study was to characterize the influence of particle size on aerosol transport and CAM response in a plutonium laboratory. Polydisperse dioctyl sebacate oil aerosols were released from multiple locations within a plutonium laboratory at Los Alamos National Laboratory. An array of laser particle counters (LPCs) positioned in the laboratory measured time-resolved aerosol dispersion. Aerosol concentrations were binned into two size ranges: (1) 0.5 μm to 5.0 μm, and (2) those greater than 5.0 μm. Statistical comparisons were done, and the results suggested that transport efficiency was greater for smaller particles than for larger particles in this laboratory. This result suggested the importance of using particles of similar physical characteristics to those of the source when doing tests to decide optimal placement of CAMs.
CATHENA/PACE calculations of aerosol transport in a simulated primary circuit
International Nuclear Information System (INIS)
Severe fuel damage in postulated loss-of-coolant accidents (LOCAs) in CANDU (Canada Deuterium Uranium) reactors may result in the release and transport of aerosol fission product materials from the fuel, through the primary circuit to containment and possibly into the atmosphere. In order to obtain a greater understanding of the mechanisms underlying fission product release, transport and deposition resulting from fuel damage in such accidents, a series of COG-funded (Candu Owners Group) severe fuel damage experiments have been planned in the Blowdown Test Facility (BTF) at AECL Research, Chalk River. One of the planned experiments, BTF-104, has been simulated with the thermalhydraulics/aerosol transport code CATHENA/PACE, as a pretest investigation of fission product transport behaviour. Results of a number of CATHENA/PACE pretest simulations of the BTF-104 experiment are reported in this paper. The CATHENA code, developed at Whiteshell Laboratories, has been used to model the thermalhydraulic phenomena in a section of primary circuit piping representing the BTF. The PACE code (a Program for Aerosol Code Evaluation), also developed at Whiteshell, is coupled to the thermalhydraulic calculations of CATHENA to model the transport of the aerosol fission products generated from the fuel. The PACE code contains aerosol physics models from a number of aerosol codes (e.g., HAA4, REMOVAL, NAUA, AEROSIM, etc.) and can therefore emulate any one of these by user selection. A number of codes (including VICTORIA, AEROSIM, REMOVAL, and HAARM) were emulated with CATHENA/PACE for aerosol transport in this circuit.
International Nuclear Information System (INIS)
A program for Aerosol Behavior Code Validation and Evaluation (ABCOVE) has been developed in accordance with the LMFBR Safety Program Plan. The ABCOVE program is a cooperative effort between the USDOE, the USNRC, and their contractor organizations currently involved in aerosol code development, testing or application. The third large-scale test in the ABCOVE program, AB7, was performed in the 850-m3 CSTF vessel with a two-species test aerosol. The test conditions involved the release of a simulated fission product aerosol, NaI, into the containment atmosphere after the end of a small sodium pool fire. Four organizations made pretest predictions of aerosol behavior using five computer codes. Two of the codes (QUICKM and CONTAIN) were discrete, multiple-species codes, while three (HAA-3, HAA-4, and HAARM-3) were log-normal codes which assume uniform coagglomeration of different aerosol species. Detailed test results are presented and compared with the code predictions for eight key aerosol behavior parameters. 11 refs., 44 figs., 35 tabs.
Kahn, R. A.
2009-12-01
As expected, the aerosol data products from the NASA Earth Observing System’s MISR and MODIS instruments provide significant advances in regional and global aerosol optical depth (AOD) mapping, aerosol type measurement, and source plume characterization from space. Although these products have been and are being used for many applications, ranging from regional air quality assessment, to aerosol air mass type evolution, to aerosol injection height and aerosol transport model validation, uncertainties still limit the quantitative constraints these satellite data place on global-scale direct aerosol radiative forcing. Some further refinement of the current aerosol products is possible, but a major advance in this area seems to require a different paradigm, involving the integration of satellite and suborbital data with models. This presentation will briefly summarize where we stand, and what incremental advances we can expect, with the current aerosol products, and will then elaborate on some initial steps aimed at the necessary integration. Many other AGU presentations, covering parts of the community’s emerging efforts in this direction, will be referenced, and key points from the recently released CCSP-SAP (US Climate Change Program - Synthesis and Assessment Product) 2.3 - Atmospheric aerosols: Properties and Climate Impacts, will be included in the discussion.
Sulphur-rich volcanic eruptions and stratospheric aerosols
Rampino, M. R.; Self, S.
1984-01-01
Data from direct measurements of stratospheric optical depth, Greenland ice-core acidity, and volcanological studies are compared, and it is shown that relatively small but sulfur-rich volcanic eruptions can have atmospheric effects equal to or even greater than much larger sulfur-poor eruptions. These small eruptions are probably the most frequent cause of increased stratospheric aerosols. The possible sources of the excess sulfur released in these eruptions are discussed.
E-7 analysis of aerosol behavior under secondary sodium leak accident
International Nuclear Information System (INIS)
Analysis of aerosol behavior inside and outside of the building under conditions of a secondary sodium leak accident was performed by simulation, and the calculated results were compared with the sampled values. For the aerosol behavior analysis in the building, the aerosol generation rate was assumed to be 25 g/sec (conversion Na) and the chemical composition to be Na2O. The lumped-parameter code was used for simulation of aerosol diffusion in the building corresponding to air ventilating conditions. Based on the results of a 3-dimensional calculation for the roof space of the building, 20% of the aerosol released from the sodium leak cell to the outside was assumed to re-circulate to the air supply system. Based on the comparison of density values between calculation and sampling, the simulation was largely successful, although density values in cells far from the sodium leak cell were underestimated, and the values in cells near the leak were overestimated. For aerosol diffusion out of the building and off-site, the 3-dimensional calculation was performed assuming that the aerosol exhaust rate was 25 g/sec (conversion Na), the chemical composition Na2CO3, the aerosol mass density 0.3 g/cm3, the wind velocity 11 m/sec, and the wind direction north-northwest. The aerosol concentration at the site boundary was calculated at 0.03 mg/m3 (conversion Na) considering aerosol settling. This value is sufficiently smaller than the aerosol concentration criterion of 2 mg/m3 (NaOH). (author)
Bayesian inference tools for inverse problems
Mohammad-Djafari, Ali
2013-08-01
In this paper, the basics of Bayesian inference with a parametric model of the data are first presented. The extensions needed when dealing with inverse problems are then given, in particular for linear models such as deconvolution or image reconstruction in computed tomography (CT). The main point discussed is the prior modeling of signals and images. A classification of these priors is presented: first into separable and Markovian models, and then into simple or hierarchical models with hidden variables. For practical applications, we also need to consider the estimation of the hyperparameters. Finally, we see that we have to infer simultaneously the unknowns, the hidden variables, and the hyperparameters. Very often, the expression of this joint posterior law is too complex to be handled directly; indeed, we can rarely obtain analytical solutions for point estimators such as the maximum a posteriori (MAP) or posterior mean (PM). Three main tools can then be used: Laplace approximation (LAP), Markov chain Monte Carlo (MCMC), and Bayesian variational approximations (BVA). To illustrate all these aspects, we consider a deconvolution problem where we know that the input signal is sparse, and propose a Student-t prior for it. To handle the Bayesian computations with this model, we use the property that the Student-t distribution can be modeled as an infinite mixture of Gaussians, thus introducing hidden variables, which are the variances. The expression of the joint posterior of the input signal samples, the hidden variables (here the inverse variances of those samples), and the hyperparameters of the problem (for example the variance of the noise) is then given. From this point, we present the joint maximization by alternate optimization and the three possible approximation methods. Finally, the proposed methodology is applied in different applications such as mass spectrometry, spectrum estimation of quasi-periodic biological signals and...
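The alternate-optimization scheme the abstract describes, a Gaussian step in the signal followed by a conditional-mode step in the hidden variances of the Student-t scale mixture, can be sketched for a toy sparse deconvolution problem. All sizes, the kernel, and the hyperparameter values (nu, s2, sigma2) here are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse deconvolution problem (sizes and values are illustrative)
n = 100
h = np.array([0.3, 1.0, 0.3])          # known blur kernel
x_true = np.zeros(n)
x_true[rng.choice(n, 5, replace=False)] = rng.normal(0.0, 3.0, 5)

# Build the convolution matrix so the ridge-type update is a plain linear solve
H = np.zeros((n, n))
for i in range(n):
    for j, hk in enumerate(h):
        k = i + j - 1
        if 0 <= k < n:
            H[i, k] = hk

sigma2 = 0.01                           # noise variance (assumed known here)
y = H @ x_true + rng.normal(0.0, np.sqrt(sigma2), n)

# Joint MAP by alternate optimization. Student-t prior on x_i as a scale
# mixture of Gaussians: x_i | z_i ~ N(0, z_i), z_i ~ Inv-Gamma(nu/2, nu*s2/2).
nu, s2 = 2.0, 1.0
z = np.ones(n)
for _ in range(30):
    # x-step: with z fixed, the posterior in x is Gaussian (a ridge solve)
    A = H.T @ H / sigma2 + np.diag(1.0 / z)
    x = np.linalg.solve(A, H.T @ y / sigma2)
    # z-step: conditional mode of the inverse-gamma posterior over each z_i
    z = (nu * s2 + x**2) / (nu + 3.0)

print(np.linalg.norm(y - H @ x), np.linalg.norm(y))
```

Entries of `z` shrink toward `nu*s2/(nu+3)` where `x` is near zero, which is what enforces sparsity in the x-step.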
On local optima in learning bayesian networks
DEFF Research Database (Denmark)
Dalgaard, Jens; Kocka, Tomas; Pena, Jose
2003-01-01
This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness is...... set at maximum, KES corresponds to the greedy equivalence search algorithm (GES). When greediness is kept at minimum, we prove that under mild assumptions KES asymptotically returns any inclusion optimal BN with nonzero probability. Experimental results for both synthetic and real data are reported...
Bayesian regularization of diffusion tensor images
DEFF Research Database (Denmark)
Frandsen, Jesper; Hobolth, Asger; Østergaard, Leif
2007-01-01
Diffusion tensor imaging (DTI) is a powerful tool in the study of the course of nerve fibre bundles in the human brain. Using DTI, the local fibre orientation in each image voxel can be described by a diffusion tensor which is constructed from local measurements of diffusion coefficients along...... several directions. The measured diffusion coefficients and thereby the diffusion tensors are subject to noise, leading to possibly flawed representations of the three dimensional fibre bundles. In this paper we develop a Bayesian procedure for regularizing the diffusion tensor field, fully utilizing the...
Sensor fault diagnosis using Bayesian belief networks
International Nuclear Information System (INIS)
This paper describes a method based on Bayesian belief networks (BBNs) for sensor fault detection, isolation, classification, and accommodation (SFDIA). For this purpose, a BBN uses three basic types of nodes to represent the information associated with each sensor: (1) sensor-reading nodes, which represent the mechanisms by which the information is communicated to the BBN; (2) sensor-status nodes, which convey the status of the corresponding sensors at any given time; and (3) process-variable nodes, which are a conceptual representation of the actual values of the process variables, which are unknown.
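The three node types can be illustrated with a minimal discrete example: one process-variable node, two sensor-status nodes, and two sensor-reading nodes, with the posterior over one sensor's status obtained by enumerating the joint distribution. All probabilities below are hypothetical; the networks in the paper are of course richer than this sketch.

```python
from itertools import product

# Discrete toy network: one process variable X, two redundant sensors.
P_X = {"low": 0.5, "high": 0.5}            # process-variable node
P_status = {"ok": 0.99, "faulty": 0.01}    # sensor-status node (per sensor)

def p_reading(reading, x, status):
    """Sensor-reading node: an OK sensor tracks X; a faulty one is random."""
    if status == "ok":
        return 0.95 if reading == x else 0.05
    return 0.5

# Observed, conflicting readings from the two sensors
r1, r2 = "high", "low"

# Posterior P(sensor-1 status | r1, r2) by summing out X and sensor 2's status
post = {"ok": 0.0, "faulty": 0.0}
for x, s1, s2 in product(P_X, P_status, P_status):
    joint = (P_X[x] * P_status[s1] * P_status[s2]
             * p_reading(r1, x, s1) * p_reading(r2, x, s2))
    post[s1] += joint
total = sum(post.values())
post = {k: v / total for k, v in post.items()}
print(post)
```

Because the two readings disagree, the posterior fault probability for each sensor rises above its 1% prior, which is the basic mechanism behind fault isolation in such networks.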
Reasons for (prior) belief in bayesian epistemology
Dietrich, Franz; List, Christian
2012-01-01
Bayesian epistemology tells us with great precision how we should move from prior to posterior beliefs in light of new evidence or information, but says little about where our prior beliefs come from. It offers few resources to describe some prior beliefs as rational or well-justified, and others as irrational or unreasonable. A different strand of epistemology takes the central epistemological question to be not how to change one's beliefs in light of new evidence, but what reasons justify a gi...
Confidence intervals: am I unconsciously Bayesian?
Directory of Open Access Journals (Sweden)
Andrea Onofri
2015-08-01
To most biologists, the exact meaning of confidence intervals is very difficult to grasp, though such intervals are shown in many of our papers as measures of data variability. One of the reasons lies in the fact that the traditional way of teaching confidence intervals suggests much more than they actually deliver. Therefore, when working with biologists, statistics teachers need a convincing way of introducing this topic and, in my experience, Monte Carlo simulation offers some opportunities. However, understanding the crude meaning of frequentist confidence intervals may be disappointing for biologists, who might be seduced by the intuitive appeal of Bayesian credible intervals.
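The Monte Carlo demonstration the abstract alludes to can be sketched in a few lines: repeatedly sample from a known population, build the usual 95% t-interval, and count how often it covers the true mean. The population parameters and replication count below are arbitrary choices; 2.262 is the standard tabulated t quantile for 9 degrees of freedom.

```python
import numpy as np

rng = np.random.default_rng(42)

# Monte Carlo check of frequentist coverage
mu, sigma, n, reps = 10.0, 2.0, 10, 2000
t_crit = 2.262  # two-sided 95% t quantile, df = n - 1 = 9

hits = 0
for _ in range(reps):
    sample = rng.normal(mu, sigma, n)
    m = sample.mean()
    se = sample.std(ddof=1) / np.sqrt(n)
    if m - t_crit * se <= mu <= m + t_crit * se:
        hits += 1
coverage = hits / reps
print(coverage)  # close to 0.95: the interval traps mu in about 95% of samples
```

The point the simulation makes is exactly the "crude meaning" above: 95% refers to the long-run behaviour of the procedure across repeated samples, not to the probability that any one computed interval contains the mean.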
Bayesian Estimation of a Mixture Model
Ilhem Merah; Assia Chadli
2015-01-01
We present the properties of a bathtub-curve reliability model, introduced by Idée and Pierrat (2010), that has both sufficient adaptability and a minimal number of parameters. The model is a mixture of a Gamma distribution G(2, (1/θ)) and a new distribution L(θ). We are interested in the Bayesian estimation of the parameters and the survival function of this model, with a squared-error loss function and a non-informative prior, using the approximations of Lindley (1980) and Tierney and Kadane (1986). Usin...
Diffusion filtration with approximate Bayesian computation
Czech Academy of Sciences Publication Activity Database
Dedecius, Kamil; Djurić, P. M.
Piscataway: IEEE Computer Society, 2015, s. 3207-3211. ISBN 978-1-4673-6997-8. ISSN 1520-6149. [2015 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2015). Brisbane (AU), 19.05.2015-24.05.2015] R&D Projects: GA ČR(CZ) GP14-06678P Institutional support: RVO:67985556 Keywords : Bayesian filtration * diffusion * distributed filtration Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2015/AS/dedecius-0443931.pdf
Bayesian state estimation using generalized coordinates
Balaji, Bhashyam; Friston, Karl
2011-06-01
This paper reviews a simple solution to the continuous-discrete Bayesian nonlinear state estimation problem that has been proposed recently. The key ideas are analytic noise processes, variational Bayes, and the formulation of the problem in terms of generalized coordinates of motion. Some of the algorithms, specifically dynamic expectation maximization and variational filtering, have been shown to outperform existing approaches like extended Kalman filtering and particle filtering. A pedagogical review of the theoretical formulation is presented, with an emphasis on concepts that are not as widely known in the filtering literature. We illustrate the application of these concepts using a numerical example.
Structure-based bayesian sparse reconstruction
Quadeer, Ahmed Abdul
2012-12-01
Sparse signal reconstruction algorithms have attracted research attention due to their wide applications in various fields. In this paper, we present a simple Bayesian approach that utilizes the sparsity constraint and a priori statistical information (Gaussian or otherwise) to obtain near optimal estimates. In addition, we make use of the rich structure of the sensing matrix encountered in many signal processing applications to develop a fast sparse recovery algorithm. The computational complexity of the proposed algorithm is very low compared with the widely used convex relaxation methods as well as greedy matching pursuit techniques, especially at high sparsity. © 1991-2012 IEEE.
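This is not the authors' algorithm, but a minimal sketch of the underlying idea: combining a sparsity constraint with a priori Gaussian statistics to obtain a (here componentwise) MMSE estimate. It assumes an identity sensing matrix and a Bernoulli-Gaussian signal model; the paper itself exploits much richer sensing-matrix structure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Bernoulli-Gaussian signal observed in Gaussian noise (illustrative values)
n, p, sx2, sn2 = 2000, 0.1, 1.0, 0.1
active = rng.random(n) < p
x = np.where(active, rng.normal(0.0, np.sqrt(sx2), n), 0.0)
y = x + rng.normal(0.0, np.sqrt(sn2), n)

def gauss(v, var):
    """Zero-mean Gaussian density evaluated elementwise."""
    return np.exp(-v**2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Posterior probability that each entry is active, then the MMSE shrinkage:
#   E[x_i | y_i] = P(active | y_i) * sx2 / (sx2 + sn2) * y_i
num = p * gauss(y, sx2 + sn2)
den = num + (1 - p) * gauss(y, sn2)
x_mmse = (num / den) * (sx2 / (sx2 + sn2)) * y

mse_obs = np.mean((y - x) ** 2)
mse_est = np.mean((x_mmse - x) ** 2)
print(mse_obs, mse_est)
```

Entries with small |y_i| get a low activity probability and are shrunk toward zero, which is how the prior sparsity information improves on the raw observations.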
Using imsets for learning Bayesian networks
Czech Academy of Sciences Publication Activity Database
Vomlel, Jiří; Studený, Milan
Praha : UTIA AV ČR, 2007 - (Kroupa, T.; Vejnarová, J.), s. 178-189 [Czech-Japan Seminar on Data Analysis and Decision Making under Uncertainty /10./. Liblice (CZ), 15.09.2007-18.09.2007] R&D Projects: GA MŠk(CZ) 1M0572 Grant ostatní: GA MŠk(CZ) 2C06019 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian networks * artificial intelligence * probabilistic graphical models * machine learning Subject RIV: BB - Applied Statistics, Operational Research
A Bayesian approach to earthquake source studies
Minson, Sarah
Bayesian sampling has several advantages over conventional optimization approaches to solving inverse problems. It produces the distribution of all possible models sampled proportionally to how much each model is consistent with the data and the specified prior information, and thus images the entire solution space, revealing the uncertainties and trade-offs in the model. Bayesian sampling is applicable to both linear and non-linear modeling, and the values of the model parameters being sampled can be constrained based on the physics of the process being studied and do not have to be regularized. However, these methods are computationally challenging for high-dimensional problems. Until now the computational expense of Bayesian sampling has been too great for it to be practicable for most geophysical problems. I present a new parallel sampling algorithm called CATMIP for Cascading Adaptive Tempered Metropolis In Parallel. This technique, based on Transitional Markov chain Monte Carlo, makes it possible to sample distributions in many hundreds of dimensions, if the forward model is fast, or to sample computationally expensive forward models in smaller numbers of dimensions. The design of the algorithm is independent of the model being sampled, so CATMIP can be applied to many areas of research. I use CATMIP to produce a finite fault source model for the 2007 Mw 7.7 Tocopilla, Chile earthquake. Surface displacements from the earthquake were recorded by six interferograms and twelve local high-rate GPS stations. Because of the wealth of near-fault data, the source process is well-constrained. I find that the near-field high-rate GPS data have significant resolving power above and beyond the slip distribution determined from static displacements. The location and magnitude of the maximum displacement are resolved. The rupture almost certainly propagated at sub-shear velocities. The full posterior distribution can be used not only to calculate source parameters but also
Bayesian logistic betting strategy against probability forecasting
Kumon, Masayuki; Takemura, Akimichi; Takeuchi, Kei
2012-01-01
We propose a betting strategy based on Bayesian logistic regression modeling for the probability forecasting game in the framework of game-theoretic probability by Shafer and Vovk (2001). We prove some results concerning the strong law of large numbers in the probability forecasting game with side information based on our strategy. We also apply our strategy to assessing the quality of probability forecasting by the Japan Meteorological Agency. We find that our strategy beats the agency by exploiting its tendency to avoid clear-cut forecasts.
A Bayesian Framework for Combining Valuation Estimates
Yee, Kenton K
2007-01-01
Obtaining more accurate equity value estimates is the starting point for stock selection, value-based indexing in a noisy market, and beating benchmark indices through tactical style rotation. Unfortunately, discounted cash flow, method of comparables, and fundamental analysis typically yield discrepant valuation estimates. Moreover, the valuation estimates typically disagree with market price. Can one form a superior valuation estimate by averaging over the individual estimates, including market price? This article suggests a Bayesian framework for combining two or more estimates into a superior valuation estimate. The framework justifies the common practice of averaging over several estimates to arrive at a final point estimate.