WorldWideScience

Sample records for reducing temperature uncertainties

  1. Uncertainty Analysis of the Temperature–Resistance Relationship of Temperature Sensing Fabric

    Directory of Open Access Journals (Sweden)

    Muhammad Dawood Husain

    2016-11-01

    Full Text Available This paper reports the uncertainty analysis of the temperature–resistance (TR) data of the newly developed temperature sensing fabric (TSF), which is a double-layer knitted structure fabricated on an electronic flat-bed knitting machine, made of polyester as a basal yarn and embedded with fine metallic wire as the sensing element. The measurement principle of the TSF is identical to that of a resistance temperature detector (RTD): a change in resistance due to a change in temperature. The regression uncertainty (uncertainty within repeats) and repeatability uncertainty (uncertainty among repeats) were estimated by analysing more than 300 TR experimental repeats of 50 TSF samples. The experiments were performed under dynamic heating and cooling environments on a purpose-built test rig within the temperature range of 20–50 °C. The continuous experimental data were recorded through a LabVIEW-based graphical user interface. The results showed that temperature and resistance values were not only repeatable but also reproducible, with only minor variations. The regression uncertainty was found to be less than ±0.3 °C; the TSF samples made of Ni and W wires showed regression uncertainties of <±0.13 °C, compared with Cu-based TSF samples (>±0.18 °C). The cooling TR data showed considerably lower uncertainty (±0.07 °C) than the heating TR data (±0.24 °C). The repeatability uncertainty was found to be less than ±0.5 °C. By increasing the number of samples and repeats, the uncertainties may be reduced further. The TSF could be used for continuous measurement of the temperature profile on the surface of the human body.
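    The TR relationship of such a sensor is close to linear over a narrow range, so the "regression uncertainty" idea can be illustrated with a short sketch: fit a straight line to one noisy TR repeat and look at the temperature residuals. All constants and noise levels below are hypothetical, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sensor constants (Pt-like RTD behaviour), not the paper's values
R0, alpha = 100.0, 0.00385          # ohm, 1/degC
T = np.linspace(20.0, 50.0, 61)     # experimental range from the abstract, degC
R = R0 * (1 + alpha * T) + rng.normal(0.0, 0.02, T.size)  # noisy resistance readings

# Straight-line regression R = a*T + b, then invert to read temperature back
a, b = np.polyfit(T, R, 1)
T_pred = (R - b) / a

# "Regression uncertainty": spread of the temperature residuals within one repeat
regression_uncertainty = float(np.std(T_pred - T))
print(f"regression uncertainty ~ {regression_uncertainty:.3f} degC")
```

The repeatability uncertainty would then be the spread of fitted parameters across many such repeats rather than within one.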

  2. Quantifying data worth toward reducing predictive uncertainty

    Science.gov (United States)

    Dausman, A.M.; Doherty, J.; Langevin, C.D.; Sukop, M.C.

    2010-01-01

    The present study demonstrates a methodology for optimization of environmental data acquisition. Based on the premise that the worth of data increases in proportion to its ability to reduce the uncertainty of key model predictions, the methodology can be used to compare the worth of different data types, gathered at different locations within study areas of arbitrary complexity. The method is applied to a hypothetical nonlinear, variable density numerical model of salt and heat transport. The relative utilities of temperature and concentration measurements at different locations within the model domain are assessed in terms of their ability to reduce the uncertainty associated with predictions of movement of the salt water interface in response to a decrease in fresh water recharge. In order to test the sensitivity of the method to nonlinear model behavior, analyses were repeated for multiple realizations of system properties. Rankings of observation worth were similar for all realizations, indicating robust performance of the methodology when employed in conjunction with a highly nonlinear model. The analysis showed that while concentration and temperature measurements can both aid in the prediction of interface movement, concentration measurements, especially when taken in proximity to the interface at locations where the interface is expected to move, are of greater worth than temperature measurements. Nevertheless, it was also demonstrated that pairs of temperature measurements, taken in strategic locations with respect to the interface, can also lead to more precise predictions of interface movement. Journal compilation © 2010 National Ground Water Association.
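    The underlying data-worth calculation can be sketched with a toy linear-Bayesian example: an observation's worth is the reduction it produces in the variance of a model prediction. All matrices and numbers below are illustrative assumptions, not values from the study:

```python
import numpy as np

# Prior parameter covariance and prediction sensitivity vector (illustrative)
C = np.diag([1.0, 4.0])
y = np.array([1.0, 0.5])
var_prior = float(y @ C @ y)        # predictive variance before any data

def posterior_pred_var(x, sigma2):
    """Predictive variance after assimilating one linear obs with sensitivity x."""
    gain = C @ x / (x @ C @ x + sigma2)     # rank-one (Kalman-form) update
    C_post = C - np.outer(gain, C @ x)
    return float(y @ C_post @ y)

# An observation whose sensitivity aligns with the prediction's is worth more
v_aligned = posterior_pred_var(np.array([1.0, 0.5]), sigma2=0.1)
v_oblique = posterior_pred_var(np.array([0.5, -1.0]), sigma2=0.1)
print(var_prior, v_aligned, v_oblique)
```

Ranking candidate observations by this variance reduction is the linear analogue of the data-worth comparison described above.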

  3. Crop Model Improvement Reduces the Uncertainty of the Response to Temperature of Multi-Model Ensembles

    Science.gov (United States)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold; Ewert, Frank; Mueller, Christoph; Roetter, Reimund P.; Ruane, Alex C.; Semenov, Mikhail A.; Wallach, Daniel; Wang, Enli

    2016-01-01

    To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of models needed in an MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size, using detailed field experimental data from the USDA Hot Serial Cereal experiment (calibration data set). Simulation results from before and after model improvement were then evaluated with independent field experiments from a CIMMYT worldwide field trial network (evaluation data set). Model improvements decreased the variation (10th to 90th model ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown at mean seasonal temperatures greater than 24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in the MME uncertainty range by 27% increased MME prediction skill by 47%. Results suggest that the mean level of variation observed in field experiments, used as a benchmark, can be reached with half the number of models in the MME. Improving crop models is therefore important to increase the certainty of model-based impact assessments and to allow more practical, i.e., smaller, MMEs to be used effectively.
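    The ensemble-variation metric quoted above, the 10th-to-90th percentile range across models, is straightforward to compute; a minimal sketch with made-up yield values:

```python
import numpy as np

# Made-up grain yields (t/ha) simulated by an 8-model ensemble, before and
# after a hypothetical model improvement
yields_before = np.array([3.1, 4.5, 2.8, 5.2, 3.9, 4.8, 2.5, 5.5])
yields_after = np.array([3.8, 4.3, 3.6, 4.7, 4.0, 4.4, 3.5, 4.8])

def p10_p90_range(y):
    """The MME variation metric: 10th-to-90th percentile range."""
    lo, hi = np.percentile(y, [10, 90])
    return hi - lo

r_before = p10_p90_range(yields_before)
r_after = p10_p90_range(yields_after)
reduction = 1.0 - r_after / r_before
print(f"range {r_before:.2f} -> {r_after:.2f} t/ha ({reduction:.0%} smaller)")
```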

  4. Understanding and reducing statistical uncertainties in nebular abundance determinations

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2012-06-01

    Whenever observations are compared to theories, an estimate of the uncertainties associated with the observations is vital if the comparison is to be meaningful. However, many or even most determinations of temperatures, densities and abundances in photoionized nebulae do not quote the associated uncertainty. Those that do typically propagate the uncertainties using analytical techniques which rely on assumptions that generally do not hold. Motivated by this issue, we have developed the Nebular Empirical Analysis Tool (NEAT), a new code for calculating chemical abundances in photoionized nebulae. The code carries out a standard analysis of lists of emission lines using long-established techniques to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances. We show that, for typical observational data, this approach is superior to analytic estimates of uncertainties. NEAT also accounts for the effect of upward biasing on measurements of lines with low signal-to-noise ratio, allowing us to accurately quantify the effect of this bias on abundance determinations. We find not only that the effect can result in significant overestimates of heavy element abundances derived from weak lines, but also that taking it into account reduces the uncertainty of these abundance determinations. Finally, we investigate the effect of possible uncertainties in R, the ratio of total to selective extinction, on abundance determinations. We find that the uncertainty due to this parameter is negligible compared to the statistical uncertainties due to typical line flux measurement uncertainties.
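    The Monte Carlo propagation NEAT performs can be illustrated in miniature: perturb a measured line flux within its error and track the spread, and asymmetry, of a derived logarithmic quantity. The "abundance" formula below is a toy stand-in, not NEAT's actual atomic physics:

```python
import numpy as np

rng = np.random.default_rng(42)

# A weak emission line (S/N = 10) and a strong reference line (error ignored)
flux, flux_err = 5.0, 0.5
flux_ref = 100.0

# Monte Carlo: perturb the measured flux within its error many times and
# propagate each draw through the (toy) logarithmic abundance formula
samples = rng.normal(flux, flux_err, 10_000)
abundance = 12.0 + np.log10(samples / flux_ref)

lo, med, hi = np.percentile(abundance, [16, 50, 84])
print(f"abundance = {med:.3f} (+{hi - med:.3f} / -{med - lo:.3f})")
# The log transform makes the error bars asymmetric, which symmetric
# analytic propagation would miss
```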

  5. The uncertainty of crop yield projections is reduced by improved temperature response functions

    DEFF Research Database (Denmark)

    Wang, Enli; Martre, Pierre; Zhao, Zhigan

    2017-01-01

    , we show that variations in the mathematical functions currently used to simulate temperature responses of physiological processes in 29 wheat models account for >50% of uncertainty in simulated grain yields for mean growing season temperatures from 14 °C to 33 °C. We derived a set of new temperature......Quality) and analysing their results against the HSC data and an additional global dataset from the International Heat Stress Genotype Experiment (IHSGE) carried out by the International Maize and Wheat Improvement Center (CIMMYT). More importantly, we derive, based on the newest knowledge and data, a set of new...

  6. Thermodynamic Temperatures of High-Temperature Fixed Points: Uncertainties Due to Temperature Drop and Emissivity

    Science.gov (United States)

    Castro, P.; Machin, G.; Bloembergen, P.; Lowe, D.; Whittam, A.

    2014-07-01

    This study forms part of the European Metrology Research Programme project Implementing the New Kelvin, which aims to assign thermodynamic temperatures to a selected set of high-temperature fixed points (HTFPs): Cu, Co-C, Pt-C, and Re-C. A realistic thermal model of these HTFPs, developed in the finite-volume software ANSYS FLUENT, was constructed to quantify the uncertainty associated with the temperature drop across the back wall of the cell. In addition, the widely applied software package STEEP3 was used to investigate the influence of cell emissivity. The temperature drop, ΔT, is the temperature difference, due to the net loss of heat from the aperture of the cavity, between the back wall of the cavity, viewed by the thermometer and defining the radiance temperature, and the solid-liquid interface of the alloy, defining the transition temperature of the HTFP. The actual value of ΔT can be used either as a correction (with associated uncertainty) to thermodynamic temperature evaluations of HTFPs, or as an uncertainty contribution to the overall estimated uncertainty. In addition, the effect of a range of furnace temperature profiles on the temperature drop was calculated and found to be negligible for Cu, Co-C, and Pt-C, and small only for Re-C. The effective isothermal emissivity was calculated over the wavelength range from 450 nm to 850 nm for different assumed values of surface emissivity. Even when furnace temperature profiles are taken into account, the estimated emissivities change only slightly from the effective isothermal emissivity of the bare cell. These emissivity calculations are used to estimate the uncertainty in the temperature assignment due to the uncertainty in the emissivity of the blackbody.

  7. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 3: Temperature uncertainty budget

    Science.gov (United States)

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Haefele, Alexander; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined together at the very last stage of processing to form the temperature combined standard uncertainty. The identified uncertainty sources comprise major components such as signal detection, saturation correction, background noise extraction, temperature tie-on at the top of the profile, and absorption by ozone if working in the visible spectrum, as well as other components such as molecular extinction, the acceleration of gravity, and the molecular mass of air, whose magnitudes depend on the instrument, data processing algorithm, and altitude range of interest. The expression of the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are thoroughly estimated, taking into account the effect of vertical filtering and the merging of multiple channels. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components in the case of their own instrument. 
Using this standardized approach, an example of uncertainty budget is provided for the Jet Propulsion Laboratory (JPL) lidar at Mauna Loa Observatory, Hawai'i, which is
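    The final combination step described above, in which independently propagated components are merged only at the last stage of processing, amounts to a quadrature sum of the components. That sum is valid only for independent components; the vertically correlated terms discussed above require the full covariance treatment. Component magnitudes below are illustrative, not from any particular lidar:

```python
import math

# Illustrative magnitudes (in K) of independently propagated uncertainty
# components at one altitude; real values depend on instrument and altitude
components_K = {
    "detection_noise": 0.50,
    "saturation_correction": 0.10,
    "background_extraction": 0.15,
    "tie_on_at_top": 0.30,
    "ozone_absorption": 0.05,
}

# Quadrature sum: correct only because the components are independent
combined = math.sqrt(sum(u ** 2 for u in components_K.values()))
print(f"combined standard uncertainty = {combined:.2f} K")
```

Note that the quadrature result is dominated by the largest components, which is why the budget above singles out detection noise and the tie-on at the top of the profile.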

  8. Neural Mechanisms of Updating under Reducible and Irreducible Uncertainty.

    Science.gov (United States)

    Kobayashi, Kenji; Hsu, Ming

    2017-07-19

    Adaptive decision making depends on an agent's ability to use environmental signals to reduce uncertainty. However, because there are multiple types of uncertainty, agents must take into account not only the extent to which signals violate prior expectations but also whether uncertainty can be reduced in the first place. Here we studied how human brains of both sexes respond to signals under conditions of reducible and irreducible uncertainty. We show behaviorally that subjects' value updating was sensitive to the reducibility of uncertainty, and could be quantitatively characterized by a Bayesian model in which agents ignore expectancy violations that do not update beliefs or values. Using fMRI, we found that neural processes underlying belief and value updating were separable from responses to expectancy violation, and that reducibility of uncertainty in value modulated connections from belief-updating regions to value-updating regions. Together, these results provide insights into how agents use knowledge about uncertainty to make better decisions while ignoring mere expectancy violation. SIGNIFICANCE STATEMENT To make good decisions, a person must observe the environment carefully and use these observations to reduce uncertainty about the consequences of actions. Importantly, uncertainty should not be reduced purely on the basis of how surprising the observations are, particularly because in some cases uncertainty is not reducible. Here we show that the human brain indeed reduces uncertainty adaptively by taking into account the nature of uncertainty and ignoring mere surprise. Behaviorally, we show that human subjects reduce uncertainty in a quasi-optimal Bayesian manner. Using fMRI, we characterize brain regions that may be involved in uncertainty reduction, as well as the network they constitute, and dissociate them from brain regions that respond to mere surprise. Copyright © 2017 the authors.
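    The core behavioral idea can be illustrated with a conjugate Gaussian sketch: the same expectancy violation drives a large update when the signal is informative (reducible uncertainty) and almost none when the signal is mostly noise (irreducible uncertainty). This is a generic Bayesian observer, not the authors' fitted model:

```python
def bayes_update(mu, s2, signal, noise_var):
    """Conjugate Gaussian update; returns posterior mean and variance."""
    gain = s2 / (s2 + noise_var)    # how much the signal deserves to be trusted
    return mu + gain * (signal - mu), (1.0 - gain) * s2

mu, s2 = 0.0, 1.0                   # prior belief
surprise = 2.0                      # identical expectancy violation in both cases

# Reducible uncertainty: a precise signal -> large belief update
m_red, v_red = bayes_update(mu, s2, surprise, noise_var=0.1)
# Irreducible uncertainty: the signal is almost pure noise -> update ignored
m_irr, v_irr = bayes_update(mu, s2, surprise, noise_var=100.0)
print(m_red, m_irr)
```

The surprise (signal minus prior mean) is identical in both cases; only the gain, which encodes reducibility, differs.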

  9. Uncertainty of Wheat Water Use: Simulated Patterns and Sensitivity to Temperature and CO2

    Science.gov (United States)

    Cammarano, Davide; Roetter, Reimund P.; Asseng, Senthold; Ewert, Frank; Wallach, Daniel; Martre, Pierre; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia E.; Ruane, Alex C.; et al.

    2016-01-01

    Projected global warming and population growth will reduce future water availability for agriculture. Thus, it is essential to increase the efficiency of water use to ensure crop productivity. Quantifying crop water use (WU; i.e., actual evapotranspiration) is a critical step towards this goal. Here, sixteen wheat simulation models were used to quantify sources of model uncertainty and to estimate the relative changes and variability between models for simulated WU, water use efficiency (WUE, WU per unit of grain dry mass produced), transpiration efficiency (Teff, transpiration per unit of grain yield dry mass produced), grain yield, crop transpiration and soil evaporation at increased temperatures and elevated atmospheric carbon dioxide concentrations ([CO2]). The greatest uncertainty in simulating water use, potential evapotranspiration, crop transpiration and soil evaporation was due to differences in how crop transpiration was modelled, which accounted for 50% of the total variability among models. The simulation results for the sensitivity to temperature indicated that crop WU will decline with increasing temperature due to shortened growing seasons. The uncertainties in simulated crop WU, and in particular those due to uncertainties in simulating crop transpiration, were greater under conditions of increased temperature and of high temperature in combination with elevated atmospheric [CO2]. Hence the simulation of crop WU, and in particular of crop transpiration under higher temperature, needs to be improved and evaluated with field measurements before models can be used to simulate climate change impacts on future crop water demand.

  10. CFCl3 (CFC-11): UV Absorption Spectrum Temperature Dependence Measurements and the Impact on Atmospheric Lifetime and Uncertainty

    Science.gov (United States)

    McGillen, Max R.; Fleming, Eric L.; Jackman, Charles H.; Burkholder, James B.

    2014-01-01

    CFCl3 (CFC-11) is both an atmospheric ozone-depleting gas and a potent greenhouse gas that is removed primarily via stratospheric UV photolysis. Uncertainty in the temperature dependence of its UV absorption spectrum is a significant contributing factor to the overall uncertainty in its global lifetime and, thus, in model calculations of stratospheric ozone recovery and climate change. In this work, the CFC-11 UV absorption spectrum was measured over a range of wavelengths (184.95–230 nm) and temperatures (216–296 K). We report a spectrum temperature dependence that is weaker than currently recommended for use in atmospheric models. The impact on the atmospheric lifetime was quantified using a 2-D model and the spectrum parameterization developed in this work. The obtained global annually averaged lifetime was 58.1 ± 0.7 years (2σ uncertainty due solely to the spectrum uncertainty). The lifetime is slightly reduced, and its uncertainty significantly reduced, from that obtained using current spectrum recommendations.

  11. Reducing the top quark mass uncertainty with jet grooming

    Science.gov (United States)

    Andreassen, Anders; Schwartz, Matthew D.

    2017-10-01

    The measurement of the top quark mass has large systematic uncertainties coming from the Monte Carlo simulations that are used to match theory and experiment. We explore how much that uncertainty can be reduced by using jet grooming procedures. Using the ATLAS A14 tunes of Pythia, we estimate the uncertainty from the choice of tuning parameters in what is meant by the Monte Carlo mass to be around 530 MeV without any corrections. This uncertainty can be reduced by 60% to 200 MeV by calibrating to the W mass, and by 70% to 140 MeV by additionally applying soft-drop jet grooming (or to 170 MeV using trimming). At e+e− colliders, the associated uncertainty is around 110 MeV, reducing to 50 MeV after calibrating to the W mass. By analyzing the tuning parameters, we conclude that the importance of jet grooming after calibrating to the W mass is to reduce sensitivity to the underlying event.

  12. Reassessing biases and other uncertainties in sea surface temperature observations measured in situ since 1850: 2. Biases and homogenization

    Science.gov (United States)

    Kennedy, J. J.; Rayner, N. A.; Smith, R. O.; Parker, D. E.; Saunby, M.

    2011-07-01

    Changes in instrumentation and data availability have caused time-varying biases in estimates of global and regional average sea surface temperature. The sizes of the biases arising from these changes are estimated and their uncertainties evaluated. The estimated biases and their associated uncertainties are largest during the period immediately following the Second World War, reflecting the rapid and incompletely documented changes in shipping and data availability at the time. Adjustments have been applied to reduce these effects in gridded data sets of sea surface temperature, and the results are presented as a set of interchangeable realizations. Uncertainties of estimated trends in global and regional average sea surface temperature due to bias adjustments since the Second World War are found to be larger than uncertainties arising from the choice of analysis technique, indicating that this is an important source of uncertainty in analyses of historical sea surface temperatures. Despite this, trends over the twentieth century remain qualitatively consistent.

  13. Temperature response functions introduce high uncertainty in modelled carbon stocks in cold temperature regimes

    Science.gov (United States)

    Portner, H.; Wolf, A.; Bugmann, H.

    2009-04-01

    Many biogeochemical models have been applied to study the response of the carbon cycle to changes in climate, whereby the process of carbon uptake (photosynthesis) has usually gained more attention than the equally important process of carbon release by respiration. The decomposition of soil organic matter is driven by a combination of factors, a prominent one being soil temperature [Berg and Laskowski (2005)]. One uncertainty concerns the response function used to describe the sensitivity of soil organic matter decomposition to temperature. This relationship is often described by one out of a set of similar exponential functions, but it has not been investigated how uncertainties in the choice of the response function influence the long-term predictions of biogeochemical models. We built upon the well-established LPJ-GUESS model [Smith et al. (2001)]. We tested five candidate functions and calibrated them against eight datasets from different Ameriflux and CarboEuropeIP sites [Hibbard et al. (2006)]: a simple Exponential function with a constant Q10, the Arrhenius function, the Gaussian function [Tuomi et al. (2008), O'Connell (1990)], the Van't Hoff function [Van't Hoff (1901)] and the Lloyd & Taylor function [Lloyd and Taylor (1994)]. We assessed the impact of uncertainty in the model formulation of the temperature response on estimates of present and future long-term carbon storage in ecosystems and hence on the CO2 feedback potential to the atmosphere. We specifically investigated the relative importance of model formulation and the error introduced by using different data sets for the parameterization. Our results suggested that the Exponential and Arrhenius functions are inappropriate, as they overestimated the respiration rates at lower temperatures. The Gaussian, Van't Hoff and Lloyd & Taylor functions all fit the observed data better, whereby the Gaussian and Van't Hoff functions underestimated the response at higher temperatures. We suggest that the
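    Three of the candidate response functions named above are simple to write down; a sketch with placeholder parameters chosen so each function is roughly 1 at 10 °C (the study calibrated the parameters per data set):

```python
import numpy as np

def q10_exponential(T, r10=1.0, q10=2.0):
    """Simple exponential response with a constant Q10; T in degC."""
    return r10 * q10 ** ((T - 10.0) / 10.0)

def arrhenius(T, A=1.68e9, Ea=50_000.0):
    """Arrhenius response; A chosen so the value is ~1 at 10 degC."""
    R = 8.314                       # gas constant, J mol^-1 K^-1
    return A * np.exp(-Ea / (R * (T + 273.15)))

def lloyd_taylor(T, r10=1.0, E0=308.56):
    """Lloyd & Taylor (1994) with the conventional fixed T0 = 227.13 K."""
    return r10 * np.exp(E0 * (1.0 / 56.02 - 1.0 / (T + 273.15 - 227.13)))

T = np.array([0.0, 10.0, 20.0])
for f in (q10_exponential, arrhenius, lloyd_taylor):
    print(f.__name__, np.round(f(T), 3))
```

Comparing the printed values at 0 °C shows the point made in the abstract: the Exponential and Arrhenius forms predict higher respiration at low temperatures than the Lloyd & Taylor form.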

  14. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
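    The localized straight-line fit near Isc, and the uncertainty it yields when treated as a statistical regression, can be sketched with ordinary least squares (the paper itself uses an objective Bayesian treatment and evidence-based window selection, which this sketch omits). The I-V data below are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic I-V points near short circuit: locally linear behaviour plus noise
V = np.linspace(0.0, 0.1, 10)                 # volts, near V = 0
I = 5.0 - 0.8 * V + rng.normal(0.0, 0.002, V.size)

# OLS straight-line fit I = b0 + b1*V; Isc is the intercept (current at V = 0)
X = np.column_stack([np.ones_like(V), V])
beta, res, *_ = np.linalg.lstsq(X, I, rcond=None)
sigma2 = float(res[0]) / (V.size - 2)         # residual variance estimate
cov = sigma2 * np.linalg.inv(X.T @ X)         # parameter covariance
isc, isc_u = float(beta[0]), float(np.sqrt(cov[0, 0]))
print(f"Isc = {isc:.4f} +/- {isc_u:.4f} A")
```

Widening the data window would shrink this fit uncertainty, which is exactly the trap the abstract describes: the fit uncertainty can become arbitrarily small while model discrepancy from I-V curvature grows.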

  15. Uncertainties in hydrogen combustion

    International Nuclear Information System (INIS)

    Stamps, D.W.; Wong, C.C.; Nelson, L.S.

    1988-01-01

    Three important areas of hydrogen combustion with uncertainties are identified: high-temperature combustion, flame acceleration and deflagration-to-detonation transition, and aerosol resuspension during hydrogen combustion. The uncertainties associated with high-temperature combustion may affect at least three different accident scenarios: the in-cavity oxidation of combustible gases produced by core-concrete interactions, the direct containment heating hydrogen problem, and the possibility of local detonations. How these uncertainties may affect the sequence of various accident scenarios is discussed and recommendations are made to reduce these uncertainties. 40 references

  16. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.

  17. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    Full Text Available Model evaluation is often performed at few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently of ground truth measurements, these analyses are suitable for testing the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for a combination of all factors, which allows for quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on input factors such as topography or soil type. The analysis shows that model evaluation performed at single locations may not be representative of the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes, since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several

  18. Quantifying Uncertainty in Satellite-Retrieved Land Surface Temperature from Cloud Detection Errors

    Directory of Open Access Journals (Sweden)

    Claire E. Bulgin

    2018-04-01

    Full Text Available Clouds remain one of the largest sources of uncertainty in remote sensing of surface temperature in the infrared, but this uncertainty has not generally been quantified. We present a new approach to do so, applied here to the Advanced Along-Track Scanning Radiometer (AATSR). We use an ensemble of cloud masks based on independent methodologies to investigate the magnitude of cloud detection uncertainties in area-average Land Surface Temperature (LST) retrieval. We find that at a grid resolution of 625 km² (commensurate with a 0.25° grid size at the tropics), cloud detection uncertainties are positively correlated with cloud-cover fraction in the cell and are larger during the day than at night. Daytime cloud detection uncertainties range between 2.5 K for clear-sky fractions of 10–20% and 1.03 K for clear-sky fractions of 90–100%. Corresponding night-time uncertainties are 1.6 K and 0.38 K, respectively. Cloud detection uncertainty shows a weaker positive correlation with the number of biomes present within a grid cell, used as a measure of heterogeneity in the background against which the cloud detection must operate (e.g., surface temperature, emissivity and reflectance). Uncertainty due to cloud detection errors is strongly dependent on the dominant land cover classification. We find cloud detection uncertainties of 1.95 K over permanent snow and ice, 1.2 K over open forest, 0.9–1 K over bare soils and 0.09 K over mosaic cropland, for a standardised clear-sky fraction of 74.2%. As the uncertainties arising from cloud detection errors are of a significant magnitude for many surface types, and spatially heterogeneous where the land classification varies rapidly, LST data producers are encouraged to quantify cloud-related uncertainties in gridded products.

  19. Estimation of sampling error uncertainties in observed surface air temperature change in China

    Science.gov (United States)

    Hua, Wei; Shen, Samuel S. P.; Weithmann, Alexander; Wang, Huijun

    2017-08-01

    This study examines the sampling error uncertainties in the monthly surface air temperature (SAT) change in China over recent decades, focusing on the uncertainties of gridded data, national averages, and linear trends. Results indicate that large sampling error variances appear in the station-sparse areas of northern and western China, with maximum values exceeding 2.0 K², while small sampling error variances are found in the station-dense areas of southern and eastern China, with most grid values being less than 0.05 K². In general, negative temperature anomalies existed in each month prior to the 1980s, and a warming began thereafter, which accelerated in the early and mid-1990s. An increasing trend in the SAT series was observed for each month of the year, with the largest temperature increase and highest uncertainty of 0.51 ± 0.29 K (10 year)⁻¹ occurring in February and the weakest trend and smallest uncertainty of 0.13 ± 0.07 K (10 year)⁻¹ in August. The sampling error uncertainties in the national average annual mean SAT series are not sufficiently large to alter the conclusion of persistent warming in China. In addition, the sampling error uncertainties in the SAT series show a clear variation compared with other uncertainty estimation methods, which is a plausible reason for the inconsistent variations between our estimate and those of other studies during this period.
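    The trend-with-uncertainty figures quoted above have the form of an OLS slope and its standard error scaled to K per decade; a sketch on a synthetic anomaly series (this reproduces only the form of the estimate, not the study's spectral sampling-error method):

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic annual-mean anomaly series: 0.2 K per decade plus noise
years = np.arange(1960, 2010)
anomaly = 0.02 * (years - years[0]) + rng.normal(0.0, 0.2, years.size)

# OLS slope and its standard error, scaled to K per decade
t = (years - years.mean()).astype(float)
X = np.column_stack([np.ones_like(t), t])
beta, res, *_ = np.linalg.lstsq(X, anomaly, rcond=None)
sigma2 = float(res[0]) / (years.size - 2)
slope_se = float(np.sqrt(sigma2 / np.sum(t ** 2)))
trend, trend_u = 10.0 * float(beta[1]), 10.0 * slope_se
print(f"trend = {trend:.2f} +/- {trend_u:.2f} K (10 year)^-1")
```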

  20. Unrealized Global Temperature Increase: Implications of Current Uncertainties

    Science.gov (United States)

    Schwartz, Stephen E.

    2018-04-01

    Unrealized increase in global mean surface air temperature (GMST) may result from the climate system not being in steady state with forcings and/or from cessation of negative aerosol forcing that would result from decreases in emissions. An observation-constrained method is applied to infer the dependence of Earth's climate sensitivity on forcing by anthropogenic aerosols within the uncertainty on that forcing given by the Fifth (2013) Assessment Report of the Intergovernmental Panel on Climate Change. Within these uncertainty ranges the increase in GMST due to temperature lag for future forcings held constant is slight (0.09-0.19 K over 20 years; 0.12-0.26 K over 100 years). However, the incremental increase in GMST that would result from a hypothetical abrupt cessation of sources of aerosols could be quite large but is highly uncertain, 0.1-1.3 K over 20 years. Decrease in CO2 abundance and forcing following abrupt cessation of emissions would offset these increases in GMST over 100 years by as little as 0.09 K to as much as 0.8 K. The uncertainties quantified here greatly limit confidence in projections of change in GMST that would result from any strategy for future reduction of emissions.

  1. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
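The core SROM idea above can be sketched in a few lines: replace a random input with a small set of weighted sample points, then propagate uncertainty through the deterministic model with only those few model calls. This is a hedged toy version (quantile points, equal weights, invented numbers), not the authors' framework; real SROMs optimize both points and weights to match the input distribution:

```python
# Hedged sketch of the SROM idea: a few weighted samples stand in for a
# random input, so uncertainty propagation needs only m deterministic calls.
import random

def build_srom(samples, m=5):
    """Pick m representative points (sample quantiles) with equal weights.
    Real SROMs optimize points/weights to match CDF and moments."""
    s = sorted(samples)
    pts = [s[int((k + 0.5) * len(s) / m)] for k in range(m)]
    wts = [1.0 / m] * m
    return pts, wts

def deterministic_model(e_modulus):
    # stand-in for an expensive physics solve: compliance ~ 1/E
    return 1.0 / e_modulus

random.seed(0)
inputs = [random.gauss(200.0, 10.0) for _ in range(10000)]  # uncertain E (GPa)
pts, wts = build_srom(inputs, m=5)

# propagate with 5 model calls instead of 10000 Monte Carlo calls
mean_out = sum(w * deterministic_model(p) for p, w in zip(pts, wts))
mc_mean = sum(deterministic_model(x) for x in inputs) / len(inputs)
```

Here the 5-point surrogate reproduces the Monte Carlo mean of the output to well under 1%, which is the computational saving the record describes.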

  2. The Effect of Uncertainties on the Operating Temperature of U-Mo/Al Dispersion Fuel

    Energy Technology Data Exchange (ETDEWEB)

    Sweidana, Faris B.; Mistarihia, Qusai M.; Ryu Ho Jin [KAIST, Daejeon (Korea, Republic of); Yim, Jeong Sik [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    In this study, uncertainty and combined uncertainty analyses were carried out to evaluate the uncertainty of the parameters affecting the operating temperature of U-Mo/Al fuel. The uncertainties considered relate to the thermal conductivity of the fuel meat (comprising the effects of thermal diffusivity, density and specific heat capacity), the interaction layer (IL) that forms between the dispersed fuel and the matrix, the fuel plate dimensions, heat flux, heat transfer coefficient and the outer cladding temperature. As the development of low-enriched uranium (LEU) fuels has been pursued for research reactors to replace highly-enriched uranium (HEU) and improve the proliferation resistance of fuels and the fuel cycle, U-Mo particles dispersed in an Al matrix (U-Mo/Al) are a promising fuel for converting research reactors that currently use HEU fuels to LEU-fueled reactors, owing to their high density and good irradiation stability. Several models have been developed for estimating the thermal conductivity of U-Mo fuel, mainly based on best fits to the very few measured data, without providing uncertainty ranges. The purpose of this study is to provide a reasonable estimate of the upper and lower bounds of fuel temperature with burnup through evaluation of the uncertainties in the thermal conductivity of irradiated U-Mo/Al dispersion fuel. The combined uncertainty study, using the root-sum-square (RSS) method, evaluated the effect of applying the uncertainty values of all the parameters on the operating temperature of U-Mo/Al fuel. The overall influence on the operating temperature is 16.58 °C at the beginning of life, and it increases with burnup to reach 18.74 °C at a fuel meat fission density of 3.50 × 10²¹ fissions/cm³. Further studies are needed to evaluate the behavior more accurately by including other parameters' uncertainties, such as the interaction layer thermal conductivity.
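The RSS combination used in the study is simple to state: independent 1-sigma contributions add in quadrature. A minimal sketch, with invented component values rather than the paper's:

```python
# Hedged sketch of a root-sum-square (RSS) combination of independent
# uncertainty contributions to a temperature estimate; the component
# values below are invented for illustration.
def rss_combine(components):
    """Combine independent 1-sigma contributions u_i as sqrt(sum(u_i^2))."""
    return sum(u * u for u in components) ** 0.5

# e.g. contributions (degC) from conductivity, IL, geometry, heat flux, ...
contribs = [9.0, 8.0, 6.0, 5.0, 4.0]
combined = rss_combine(contribs)
```

Note that the RSS total is dominated by the largest contributors, which is why identifying the dominant parameter uncertainties matters.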

  3. Developing first time-series of land surface temperature from AATSR with uncertainty estimates

    Science.gov (United States)

    Ghent, Darren; Remedios, John

    2013-04-01

    Land surface temperature (LST) is the radiative skin temperature of the land, and is one of the key parameters in the physics of land-surface processes on regional and global scales. Earth Observation satellites provide the opportunity to obtain global coverage of LST approximately every 3 days or less. One such source of satellite-retrieved LST has been the Advanced Along-Track Scanning Radiometer (AATSR), with LST retrieval implemented in the AATSR Instrument Processing Facility in March 2004. Here we present the first regional and global time-series of LST data from AATSR with estimates of uncertainty. Mean changes in temperature over the last decade will be discussed along with regional patterns. Although time-series across all three ATSR missions have previously been constructed (Kogler et al., 2012), the use of low-resolution auxiliary data in the retrieval algorithm and non-optimal cloud masking resulted in time-series artefacts. As such, considerable ESA-supported development has been carried out on the AATSR data to address these concerns. This includes the integration of high-resolution auxiliary data into the retrieval algorithm and subsequent generation of coefficients and tuning parameters, plus the development of an improved cloud mask based on the simulation of clear-sky conditions from radiative transfer modelling (Ghent et al., in prep.). Any inference from this LST record is, however, of limited value without an accompanying uncertainty estimate; the Joint Committee for Guides in Metrology defines uncertainty as "a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand", that is, the value of the particular quantity to be measured. Furthermore, pixel-level uncertainty fields are a mandatory requirement in the ongoing preparation of the LST product for the upcoming Sea and Land Surface Temperature Radiometer (SLSTR) instrument on-board Sentinel-3.

  4. Ground surface temperature and continental heat gain: uncertainties from underground

    International Nuclear Information System (INIS)

    Beltrami, Hugo; Matharoo, Gurpreet S; Smerdon, Jason E

    2015-01-01

    Temperature changes at the Earth's surface propagate and are recorded underground as perturbations to the equilibrium thermal regime associated with the heat flow from the Earth's interior. Borehole climatology is concerned with the analysis and interpretation of these downward-propagating subsurface temperature anomalies in terms of surface climate. Proper determination of the steady-state geothermal regime is therefore crucial because it is the reference against which climate-induced subsurface temperature anomalies are estimated. Here, we examine the effects of data noise on the determination of the steady-state geothermal regime of the subsurface and the subsequent impact on estimates of ground surface temperature (GST) history and heat gain. We carry out a series of Monte Carlo experiments using 1000 Gaussian noise realizations, depth sections of 100 and 200 m as depth intervals for the steady-state estimates, and a range of data sampling intervals from 10 m to 0.02 m. Results indicate that typical uncertainties for 50 year averages are on the order of ±0.02 K for the most recent 100 year period. These uncertainties grow as the sampling becomes sparser, reaching about ±0.1 K for a 10 m sampling interval under identical conditions and target period. Uncertainties increase for progressively older periods, reaching ±0.3 K at 500 years before present for a 10 m sampling interval. The uncertainties in reconstructed GST histories for the Northern Hemisphere for the most recent 50 year period can reach a maximum of ±0.5 K in some areas. We suggest that continuous logging should be the preferred approach when measuring geothermal data for climate reconstructions, and that for those using the International Heat Flow Commission database for borehole climatology, the steady-state thermal conditions should be estimated from boreholes as deep as possible and using a large fitting depth range (∼100 m). (letter)
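The Monte Carlo experiment described above can be sketched in miniature: perturb a synthetic equilibrium geotherm with Gaussian noise, fit the steady-state gradient over a deep section, and look at the scatter of the inferred surface intercept across realizations. All numbers below (gradient, noise level, depths) are invented for illustration:

```python
# Hedged sketch of the noise-realization experiment: how does data noise on
# a borehole temperature log propagate into the steady-state estimate?
import random

def fit_line(z, t):
    """Least-squares line t = T0 + grad*z; return (T0, grad)."""
    n = len(z)
    zb = sum(z) / n
    tb = sum(t) / n
    szz = sum((x - zb) ** 2 for x in z)
    szt = sum((x - zb) * (y - tb) for x, y in zip(z, t))
    grad = szt / szz
    return tb - grad * zb, grad

rng = random.Random(42)
depths = [100 + 2 * k for k in range(51)]   # 100-200 m section, 2 m sampling
t0s = []
for _ in range(1000):                       # 1000 Gaussian noise realizations
    temps = [8.0 + 0.025 * z + rng.gauss(0, 0.05) for z in depths]
    t0, _ = fit_line(depths, temps)
    t0s.append(t0)

mean_t0 = sum(t0s) / len(t0s)
spread = (sum((x - mean_t0) ** 2 for x in t0s) / len(t0s)) ** 0.5
```

The `spread` of the extrapolated surface intercept is the kind of uncertainty the study quantifies; coarsening the sampling interval inflates it.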

  5. Measurement Uncertainty of Dew-Point Temperature in a Two-Pressure Humidity Generator

    Science.gov (United States)

    Martins, L. Lages; Ribeiro, A. Silva; Alves e Sousa, J.; Forbes, Alistair B.

    2012-09-01

    This article describes the measurement uncertainty evaluation of the dew-point temperature when using a two-pressure humidity generator as a reference standard. The estimation of the dew-point temperature involves the solution of a non-linear equation for which iterative solution techniques, such as the Newton-Raphson method, are required. Previous studies have already been carried out using the GUM method and the Monte Carlo method but have not discussed the impact of the approximate numerical method used to provide the temperature estimation. One of the aims of this article is to take this approximation into account. Following the guidelines presented in the GUM Supplement 1, two alternative approaches can be developed: the forward measurement uncertainty propagation by the Monte Carlo method when using the Newton-Raphson numerical procedure; and the inverse measurement uncertainty propagation by Bayesian inference, based on prior available information regarding the usual dispersion of values obtained by the calibration process. The measurement uncertainties obtained using these two methods can be compared with previous results. Other relevant issues concerning this research are the broad application to measurements that require hygrometric conditions obtained from two-pressure humidity generators and, also, the ability to provide a solution that can be applied to similar iterative models. The research also studied the factors influencing both the use of the Monte Carlo method (such as the seed value and the convergence parameter) and the inverse uncertainty propagation using Bayesian inference (such as the pre-assigned tolerance, prior estimate, and standard deviation) in terms of their accuracy and adequacy.
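The Newton-Raphson step at the heart of the dew-point estimation can be sketched with a simple saturation vapour pressure model. The Magnus approximation below is an assumption for illustration, not the generator's actual equation, and the numbers are invented:

```python
# Hedged sketch: solve sat_vapor_pressure(Td) = e for the dew point Td by
# Newton-Raphson, using the Magnus approximation (an assumed model).
import math

def sat_vapor_pressure(t_c):
    """Magnus approximation for saturation vapour pressure (hPa)."""
    return 6.112 * math.exp(17.62 * t_c / (243.12 + t_c))

def dew_point_newton(e_hpa, t0=10.0, tol=1e-10, max_iter=50):
    """Iterate Td <- Td - f(Td)/f'(Td) with f(Td) = es(Td) - e."""
    t = t0
    for _ in range(max_iter):
        f = sat_vapor_pressure(t) - e_hpa
        # analytic derivative of the Magnus formula
        dfdt = sat_vapor_pressure(t) * 17.62 * 243.12 / (243.12 + t) ** 2
        step = f / dfdt
        t -= step
        if abs(step) < tol:
            break
    return t

td = dew_point_newton(12.27)   # vapour pressure near saturation at ~10 degC
```

In an uncertainty evaluation of the kind described, this iterative solve would sit inside the Monte Carlo loop, so its convergence tolerance becomes one of the factors to examine.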

  6. Quantifying and Reducing Uncertainty in Correlated Multi-Area Short-Term Load Forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Yannan; Hou, Zhangshuan; Meng, Da; Samaan, Nader A.; Makarov, Yuri V.; Huang, Zhenyu

    2016-07-17

    In this study, we represent and reduce the uncertainties in short-term electric load forecasting by integrating time series analysis tools including ARIMA modeling, sequential Gaussian simulation, and principal component analysis. The approaches mainly focus on maintaining the inter-dependency between multiple geographically related areas. They are applied to the cross-correlated load time series as well as to their forecast errors. Multiple short-term prediction realizations are then generated from the reduced uncertainty ranges, which are useful for power system risk analyses.
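The key constraint above, preserving inter-area dependency when generating realizations, can be illustrated with a minimal sketch: draw correlated forecast-error pairs for two areas via a Cholesky factor of their error covariance. The standard deviations and correlation are invented placeholders, not the study's fitted values:

```python
# Hedged sketch: correlated forecast-error realizations for two areas, so
# joint (inter-dependent) load scenarios can be built from point forecasts.
import math
import random

def correlated_errors(n, sd_a, sd_b, rho, seed=1):
    """Draw n error pairs with given std devs and correlation rho,
    using the 2x2 Cholesky factor of the covariance matrix."""
    rng = random.Random(seed)
    l21 = rho * sd_b
    l22 = sd_b * math.sqrt(1.0 - rho * rho)
    out = []
    for _ in range(n):
        z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
        out.append((sd_a * z1, l21 * z1 + l22 * z2))
    return out

errs = correlated_errors(20000, sd_a=50.0, sd_b=80.0, rho=0.7)
# adding each pair to the two areas' point forecasts (in MW) yields joint
# realizations that respect the cross-correlation between areas
```

Sampling the two areas independently instead would understate the risk of simultaneous under-forecasts, which is exactly what the joint treatment avoids.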

  7. Characterizing Uncertainty In Electrical Resistivity Tomography Images Due To Subzero Temperature Variability

    Science.gov (United States)

    Herring, T.; Cey, E. E.; Pidlisecky, A.

    2017-12-01

    Time-lapse electrical resistivity tomography (ERT) is used to image changes in subsurface electrical conductivity (EC), e.g. due to a saline contaminant plume. Temperature variation also produces an EC response, which interferes with the signal of interest. Temperature compensation requires the temperature distribution and the relationship between EC and temperature, but this relationship at subzero temperatures is not well defined. The goal of this study is to examine how uncertainty in the subzero EC/temperature relationship manifests in temperature-corrected ERT images, especially with respect to relevant plume parameters (location, contaminant mass, etc.). First, a lab experiment was performed to determine the EC of fine-grained glass beads over a range of temperatures (-20 °C to 20 °C) and saturations. The measured EC/temperature relationship was then used to add temperature effects to a hypothetical EC model of a conductive plume. Forward simulations yielded synthetic field data to which temperature corrections were applied. Varying the temperature/EC relationship used in the temperature correction and comparing the temperature-corrected ERT results to the synthetic model enabled a quantitative analysis of the error in plume parameters associated with temperature variability. Modeling possible scenarios in this way helps to establish the feasibility of different time-lapse ERT applications by quantifying the uncertainty associated with the parameter(s) of interest.
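For context, the temperature correction itself is commonly done with a linear ratio model above freezing; the subzero regime is precisely what the study says is poorly constrained. A minimal sketch of the standard correction, with an assumed coefficient:

```python
# Hedged sketch of the common above-freezing EC temperature correction:
# EC_T = EC_ref * (1 + ALPHA * (T - Tref)). ALPHA = 0.02 per degC is a
# commonly assumed value, not a result from this study.
ALPHA = 0.02  # fractional EC change per degC (assumed)

def ec_to_reference(ec_t, temp_c, ref_c=25.0):
    """Convert a measured EC to its equivalent at the reference temperature."""
    return ec_t / (1.0 + ALPHA * (temp_c - ref_c))

ec25 = ec_to_reference(0.05, 10.0)  # EC in S/m measured at 10 degC
```

Uncertainty in `ALPHA` (and in its validity below 0 °C) maps directly into uncertainty in the corrected image, which is the effect the study quantifies.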

  8. Establishing requirements for the next generation of pressurized water reactors: reducing the uncertainty

    International Nuclear Information System (INIS)

    Chernock, W.P.; Corcoran, W.R.; Rasin, W.H.; Stahlkopf, K.E.

    1987-01-01

    The Electric Power Research Institute is managing a major effort to establish requirements for the next generation of U.S. light water reactors. This effort is the vital first step in preserving the viability of the nuclear option to help meet U.S. national electric power capacity needs in the next century. Combustion Engineering, Inc. and Duke Power Company formed a team to participate in the EPRI program, which is guided by a Utility Steering Committee consisting of experienced utility technical executives. A major thrust of the program is to reduce the uncertainties that would be faced by utility executives in choosing the nuclear option. The uncertainties to be reduced include those related to the safety, economic, operational, and regulatory aspects of advanced light water reactors. This paper gives an overview of the Requirements Document program as it relates to the U.S. Advanced Light Water Reactor (ALWR) effort to reduce these uncertainties and reports the status of efforts to establish requirements for the next generation of pressurized water reactors. It concentrates on progress made in reducing the uncertainties that would deter selection of the nuclear option for contributing to U.S. national electric power capacity needs in the next century, and updates previous reports in the same area. (author)

  9. Trends and associated uncertainty in the global mean temperature record

    Science.gov (United States)

    Poppick, A. N.; Moyer, E. J.; Stein, M.

    2016-12-01

    Physical models suggest that the Earth's mean temperature warms in response to changing CO2 concentrations (and hence increased radiative forcing); given physical uncertainties in this relationship, the historical temperature record is a source of empirical information about global warming. A persistent thread in many analyses of the historical temperature record, however, is the reliance on methods that appear to deemphasize both physical and statistical assumptions. Examples include regression models that treat time rather than radiative forcing as the relevant covariate, and time series methods that account for natural variability in nonparametric rather than parametric ways. We show here that methods that deemphasize assumptions can limit the scope of analysis and can lead to misleading inferences, particularly in the setting considered where the data record is relatively short and the scale of temporal correlation is relatively long. A proposed model that is simple but physically informed provides a more reliable estimate of trends and allows a broader array of questions to be addressed. In accounting for uncertainty, we also illustrate how parametric statistical models that are attuned to the important characteristics of natural variability can be more reliable than ostensibly more flexible approaches.

  10. An audit of the global carbon budget: identifying and reducing sources of uncertainty

    Science.gov (United States)

    Ballantyne, A. P.; Tans, P. P.; Marland, G.; Stocker, B. D.

    2012-12-01

    Uncertainties in our carbon accounting practices may limit our ability to objectively verify emission reductions on regional scales. Furthermore, uncertainties in the global C budget must be reduced to benchmark Earth System Models that incorporate carbon-climate interactions. Here we present an audit of the global C budget in which we try to identify the sources of uncertainty for its major terms. The atmospheric growth rate of CO2 has increased significantly over the last 50 years, while the uncertainty in calculating the global atmospheric growth rate has been reduced from 0.4 ppm/yr to 0.2 ppm/yr (95% confidence). Although we have greatly reduced global CO2 growth rate uncertainties, there remain regions, such as the Southern Hemisphere, Tropics and Arctic, where changes in regional sources/sinks will remain difficult to detect without additional observations. Increases in fossil fuel (FF) emissions are the primary factor driving the increase in the global CO2 growth rate; however, our confidence in FF emission estimates has actually gone down. Based on a comparison of multiple estimates, FF emissions have increased from 2.45 ± 0.12 PgC/yr in 1959 to 9.40 ± 0.66 PgC/yr in 2010. Major sources of increasing FF emission uncertainty are increased emissions from emerging economies, such as China and India, as well as subtle differences in accounting practices. Lastly, we evaluate emission estimates from Land Use Change (LUC). Although relative errors in emission estimates from LUC are quite high (2 sigma ~ 50%), LUC emissions have remained fairly constant in recent decades. We evaluate the three commonly used approaches to estimating LUC emissions (bookkeeping, satellite imagery, and model simulations) to identify their main sources of error and their ability to detect net emissions from LUC.

  11. Probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty from maximum temperature metric selection

    Science.gov (United States)

    DeWeber, Jefferson T.; Wagner, Tyler

    2018-01-01

    Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to the variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions.

  12. Probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty from maximum temperature metric selection.

    Science.gov (United States)

    DeWeber, Jefferson T; Wagner, Tyler

    2018-06-01

    Predictions of the projected changes in species distributions and potential adaptation action benefits can help guide conservation actions. There is substantial uncertainty in projecting species distributions into an unknown future, however, which can undermine confidence in predictions or misdirect conservation actions if not properly considered. Recent studies have shown that the selection of alternative climate metrics describing very different climatic aspects (e.g., mean air temperature vs. mean precipitation) can be a substantial source of projection uncertainty. It is unclear, however, how much projection uncertainty might stem from selecting among highly correlated, ecologically similar climate metrics (e.g., maximum temperature in July, maximum 30-day temperature) describing the same climatic aspect (e.g., maximum temperatures) known to limit a species' distribution. It is also unclear how projection uncertainty might propagate into predictions of the potential benefits of adaptation actions that might lessen climate change effects. We provide probabilistic measures of climate change vulnerability, adaptation action benefits, and related uncertainty stemming from the selection of four maximum temperature metrics for brook trout (Salvelinus fontinalis), a cold-water salmonid of conservation concern in the eastern United States. Projected losses in suitable stream length varied by as much as 20% among alternative maximum temperature metrics for mid-century climate projections, which was similar to the variation among three climate models. Similarly, the regional average predicted increase in brook trout occurrence probability under an adaptation action scenario of full riparian forest restoration varied by as much as 0.2 among metrics. Our use of Bayesian inference provides probabilistic measures of vulnerability and adaptation action benefits for individual stream reaches that properly address statistical uncertainty and can help guide conservation actions.

  13. Uncertainty Characterization of Reactor Vessel Fracture Toughness

    International Nuclear Information System (INIS)

    Li, Fei; Modarres, Mohammad

    2002-01-01

    To perform fracture mechanics analysis of a reactor vessel, fracture toughness (K_Ic) at various temperatures is necessary. In a best-estimate approach, K_Ic uncertainties resulting from both lack of sufficient knowledge and randomness in some of the variables of K_Ic must be characterized. Although it may be argued that there is only one type of uncertainty, namely lack of perfect knowledge about the subject under study, in practice K_Ic uncertainties can be divided into two types: aleatory and epistemic. Aleatory uncertainty is very difficult, if not impossible, to reduce; epistemic uncertainty, on the other hand, can be practically reduced. The distinction between aleatory and epistemic uncertainties facilitates decision-making under uncertainty and allows for proper propagation of uncertainties in the computation process. Typically, epistemic uncertainties representing, for example, parameters of a model are sampled (to generate a 'snapshot', a single value of the parameters), while the totality of aleatory uncertainties is carried through the calculation. In this paper an approach to account for these two types of uncertainties associated with K_Ic is described. (authors)
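The sampling scheme described above, epistemic parameters fixed per "snapshot" while aleatory variability is carried through in full, is often implemented as a two-loop Monte Carlo. A hedged sketch with invented numbers (not the paper's toughness model):

```python
# Hedged sketch of a two-loop (nested) Monte Carlo: the outer loop samples
# epistemic model parameters; the inner loop carries the aleatory scatter.
# All distributions and thresholds are invented for illustration.
import random

rng = random.Random(7)
outer = []
for _ in range(200):                 # epistemic loop: uncertain parameters
    mu = rng.uniform(95.0, 105.0)    # uncertain mean toughness (MPa*m^0.5)
    sigma = rng.uniform(8.0, 12.0)   # uncertain aleatory scatter
    # aleatory loop: irreducible material variability for this snapshot
    inner = [rng.gauss(mu, sigma) for _ in range(2000)]
    frac_below = sum(1 for k in inner if k < 80.0) / len(inner)
    outer.append(frac_below)

# the spread of frac_below across the outer loop shows how much of the
# total uncertainty is epistemic (reducible) rather than aleatory
lo, hi = min(outer), max(outer)
```

Keeping the two loops separate is what lets the analysis say how much the failure-probability estimate could tighten if the epistemic parameters were better known.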

  14. Quantifying Surface Energy Flux Estimation Uncertainty Using Land Surface Temperature Observations

    Science.gov (United States)

    French, A. N.; Hunsaker, D.; Thorp, K.; Bronson, K. F.

    2015-12-01

    Remote sensing in the thermal infrared is widely recognized as a good way to estimate surface heat fluxes, map crop water use, and detect water-stressed vegetation. When combined with net radiation and soil heat flux data, observations of sensible heat fluxes derived from surface temperatures (LST) are indicative of instantaneous evapotranspiration (ET). There are, however, substantial reasons LST data may not provide the best way to estimate ET. For example, it is well known that observations and models of LST, air temperature, or estimates of transport resistances may be so inaccurate that physically based models nevertheless yield non-meaningful results. Furthermore, using visible and near-infrared remote sensing observations collected at the same time as LST often yields physically plausible results because they are constrained by less dynamic surface conditions such as green fractional cover. Although sensitivity studies exist that help identify likely sources of error and uncertainty, ET studies typically do not provide a way to assess the relative importance of modeling ET with and without LST inputs. To better quantify model benefits and degradations due to LST observational inaccuracies, a Bayesian uncertainty study was undertaken using data collected in remote sensing experiments at Maricopa, Arizona. Visible, near-infrared and thermal infrared data were obtained from an airborne platform. The prior probability distribution of ET estimates was modeled using fractional cover, local weather data and a Penman-Monteith model, while the likelihood of LST data was modeled from a two-source energy balance model. Thus the posterior probabilities of ET represented the value added by using LST data. Results from an ET study over cotton grown in 2014 and 2015 showed significantly reduced ET confidence intervals when LST data were incorporated.
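The prior-times-likelihood structure described above can be sketched on a 1-D grid: a weather-driven prior over ET is updated by an LST-based likelihood, and the posterior narrows. All distributions and numbers are invented for illustration, not the Maricopa results:

```python
# Hedged sketch of the Bayesian combination: prior (weather-driven ET
# estimate) x likelihood (LST/energy-balance estimate) = posterior.
import math

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

et_grid = [i * 0.01 for i in range(0, 1001)]        # ET grid, 0-10 mm/h
prior = [normal_pdf(x, 5.0, 1.5) for x in et_grid]  # Penman-Monteith-style prior
like = [normal_pdf(x, 4.0, 0.8) for x in et_grid]   # LST-based likelihood

post = [p * l for p, l in zip(prior, like)]
norm = sum(post)
post = [p / norm for p in post]
post_mean = sum(x * p for x, p in zip(et_grid, post))
```

The posterior mean lands between the two estimates, weighted toward the sharper (LST-based) one, and its spread is smaller than either input's, which is the "value added by using LST data" the record describes.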

  15. Understanding uncertainty in temperature effects on vector-borne disease: a Bayesian approach

    Science.gov (United States)

    Johnson, Leah R.; Ben-Horin, Tal; Lafferty, Kevin D.; McNally, Amy; Mordecai, Erin A.; Paaijmans, Krijn P.; Pawar, Samraat; Ryan, Sadie J.

    2015-01-01

    Extrinsic environmental factors influence the distribution and population dynamics of many organisms, including insects that are of concern for human health and agriculture. This is particularly true for vector-borne infectious diseases like malaria, which is a major source of morbidity and mortality in humans. Understanding the mechanistic links between environment and population processes for these diseases is key to predicting the consequences of climate change on transmission and for developing effective interventions. An important measure of the intensity of disease transmission is the reproductive number R0. However, understanding the mechanisms linking R0 and temperature, an environmental factor driving disease risk, can be challenging because the data available for parameterization are often poor. To address this, we show how a Bayesian approach can help identify critical uncertainties in components of R0 and how this uncertainty is propagated into the estimate of R0. Most notably, we find that different parameters dominate the uncertainty at different temperature regimes: bite rate from 15°C to 25°C; fecundity across all temperatures, but especially ~25–32°C; mortality from 20°C to 30°C; parasite development rate at ~15–16°C and again at ~33–35°C. Focusing empirical studies on these parameters and corresponding temperature ranges would be the most efficient way to improve estimates of R0. While we focus on malaria, our methods apply to improving process-based models more generally, including epidemiological, physiological niche, and species distribution models.
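The propagation of parameter uncertainty into R0 can be sketched with Monte Carlo draws through a simplified Ross-Macdonald-style expression. The formula, distributions and constants below are invented placeholders for illustration, not the fitted thermal responses from the study:

```python
# Hedged sketch: propagate parameter uncertainty into R0 via Monte Carlo,
# using a simplified Ross-Macdonald-style expression (assumed, not the
# study's model); all distributions are invented.
import math
import random

def r0(a, mu, pdr, scale=25.0):
    # a: bite rate (1/day), mu: mosquito mortality (1/day),
    # pdr: parasite development rate (1/day); scale lumps other constants
    return scale * a * a * math.exp(-mu / pdr) / mu

rng = random.Random(3)
draws = []
for _ in range(5000):
    a = rng.gauss(0.3, 0.05)       # uncertain bite rate
    mu = rng.gauss(0.12, 0.02)     # uncertain mortality
    pdr = rng.gauss(0.10, 0.02)    # uncertain development rate
    if min(a, mu, pdr) > 0:        # discard non-physical draws
        draws.append(r0(a, mu, pdr))

draws.sort()
median = draws[len(draws) // 2]
ci = (draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))])
```

Repeating this at each temperature, and asking which parameter's spread dominates the width of `ci` there, is the kind of analysis that identifies where new empirical studies would tighten R0 the most.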

  16. Diagnosing Geospatial Uncertainty Visualization Challenges in Seasonal Temperature and Precipitation Forecasts

    Science.gov (United States)

    Speciale, A.; Kenney, M. A.; Gerst, M.; Baer, A. E.; DeWitt, D.; Gottschalk, J.; Handel, S.

    2017-12-01

    The uncertainty of future weather and climate conditions is important for many decisions made in communities and economic sectors. One tool that decision-makers use in gauging this uncertainty is forecasts, especially maps (or visualizations) of probabilistic forecast results. However, visualizing geospatial uncertainty is challenging because including probability introduces an extra variable to represent, and probability is often poorly understood by users. Using focus group and survey methods, this study seeks to understand the barriers to using probabilistic temperature and precipitation visualizations for specific decisions in the agriculture, energy, emergency management, and water resource sectors. Preliminary results shown here focus on the needs of emergency managers. Our experimental design uses the National Oceanic and Atmospheric Administration's (NOAA) Climate Prediction Center (CPC) climate outlooks, which provide probabilistic temperature and precipitation forecast visualizations at the 6-10 day, 8-14 day, 3-4 week, and 1 and 3 month timeframes. Users were asked to answer questions about how they use weather information, how uncertainty is represented, and the design elements (e.g., color, contour lines) of the visualizations. Preliminary results from the emergency management sector indicate there is significant confusion about how "normal" weather is defined, the boundaries between probability ranges, and the meaning of the contour lines. After a complete understandability diagnosis is made using results from all sectors, we will collaborate with CPC to suggest modifications to the climate outlook visualizations. These modifications will then be retested in similar focus groups and web-based surveys to confirm that they better meet the needs of users.

  17. Combining observations and models to reduce uncertainty in the cloud response to global warming

    Science.gov (United States)

    Norris, J. R.; Myers, T.; Chellappan, S.

    2017-12-01

    Currently there is large uncertainty on how subtropical low-level clouds will respond to global warming and whether they will act as a positive feedback or negative feedback. Global climate models substantially agree on what changes in atmospheric structure and circulation will occur with global warming but greatly disagree over how clouds will respond to these changes in structure and circulation. An examination of models with the most realistic simulations of low-level cloudiness indicates that the model cloud response to atmospheric changes associated with global warming is quantitatively similar to the model cloud response to atmospheric changes at interannual time scales. For these models, the cloud response to global warming predicted by multilinear regression using coefficients derived from interannual time scales is quantitatively similar to the cloud response to global warming directly simulated by the model. Since there is a large spread among cloud response coefficients even among models with the most realistic cloud simulations, substitution of coefficients derived from satellite observations reduces the uncertainty range of the low-level cloud feedback. Increased sea surface temperature associated with global warming acts to reduce low-level cloudiness, which is partially offset by increased lower tropospheric stratification that acts to enhance low-level cloudiness. Changes in free-tropospheric relative humidity, subsidence, and horizontal advection have only a small impact on low-level cloud. The net reduction in subtropical low-level cloudiness increases absorption of solar radiation by the climate system, thus resulting in a weak positive feedback.
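The substitution step described above, fitting cloud sensitivity coefficients at interannual time scales and applying them to climate-change conditions, can be sketched with a two-predictor regression. The predictors, coefficients and warming deltas below are invented for illustration, not the observed values:

```python
# Hedged sketch: regress low-cloud anomalies on SST and stability anomalies
# at "interannual" time scales, then apply the fitted coefficients to
# hypothesized global-warming changes in the same predictors.
import random

def fit_two_predictor(y, x1, x2):
    """Least squares for y = b1*x1 + b2*x2 (anomalies, so no intercept),
    via the 2x2 normal equations."""
    s11 = sum(a * a for a in x1)
    s22 = sum(a * a for a in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    sy1 = sum(a * b for a, b in zip(x1, y))
    sy2 = sum(a * b for a, b in zip(x2, y))
    det = s11 * s22 - s12 * s12
    return (sy1 * s22 - sy2 * s12) / det, (sy2 * s11 - sy1 * s12) / det

rng = random.Random(11)
sst = [rng.gauss(0, 0.5) for _ in range(300)]   # interannual SST anomalies (K)
eis = [rng.gauss(0, 0.3) for _ in range(300)]   # stability anomalies (K)
# synthetic cloud anomalies: SST reduces cloud, stability increases it
cloud = [-4.0 * s + 5.0 * e + rng.gauss(0, 0.5) for s, e in zip(sst, eis)]

b_sst, b_eis = fit_two_predictor(cloud, sst, eis)
# apply to hypothesized warming changes: SST +2 K, stability +0.5 K
d_cloud = b_sst * 2.0 + b_eis * 0.5
```

The partial cancellation in `d_cloud` (warming reduces cloud, increased stability partially offsets it) mirrors the competing terms described in the record; using observed rather than modeled coefficients is what narrows the feedback range.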

  18. Uncertainty: a discriminator for above and below boiling repository design decisions

    International Nuclear Information System (INIS)

    Wilder, D G; Lin, W; Buscheck, T A; Wolery, T J; Francis, N D

    2000-01-01

    The US nuclear waste disposal program is evaluating the Yucca Mountain (YM) site for possible disposal of nuclear waste. Radioactive decay of the waste, particularly spent fuel, generates sufficient heat to significantly raise repository temperatures. Environmental conditions in the repository system evolve in response to this heat. The amount of temperature increase, and thus the extent of environmental change, depends on repository design and operations. Because the evolving environment cannot be directly measured until after waste is emplaced, licensing decisions must be based upon model and analytical projections of the environmental conditions. These analyses have inherent uncertainties. There is concern that elevated temperatures increase uncertainty, because most chemical reaction rates increase with temperature and boiling introduces the additional complexity of vapor-phase reactions and transport. This concern was expressed by the NWTRB, particularly for above-boiling temperatures: "the cooler the repository, the lower the uncertainty about heat-driven water migration and the better the performance of waste package materials. Above this temperature, technical uncertainties tend to be significantly higher than those associated with below-boiling conditions" (Cohon 1999). However, not all uncertainties are reduced by lower temperatures; indeed, some may even be increased. This paper addresses the impacts of temperature on these uncertainties.

  19. Integrating uncertainties for climate change mitigation

    Science.gov (United States)

    Rogelj, Joeri; McCollum, David; Reisinger, Andy; Meinshausen, Malte; Riahi, Keywan

    2013-04-01

    …geophysical, future energy demand, and mitigation technology uncertainties. This provides central information for policy making, since it helps to clarify the relationship between mitigation costs and the potential to reduce the risk of exceeding 2 °C, or other temperature limits such as 3 °C or 1.5 °C, under a wide range of scenarios.

  20. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and to geostatistical information represented as prior means for log-permeability and porosity and as variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and in the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior means is incorporated properly into the model, one obtains realistic realizations of the rock property fields.
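    The way conditioning data shrink geostatistical uncertainty can be illustrated with a toy linear-Gaussian inverse problem: a prior covariance for a one-dimensional log-permeability field is updated with a few noisy, linearized "pressure" observations, and the posterior total variance drops below the prior level. The operator G, covariances, and dimensions below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
nm, nd = 10, 4                       # model cells, pressure observations

# Prior: m ~ N(m_prior, C_M) with an exponential covariance;
# data: d = G m + noise, noise ~ N(0, C_D).
idx = np.arange(nm)
m_prior = np.full(nm, 3.0)
C_M = 0.5 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 3.0)
G = rng.normal(size=(nd, nm)) / nm   # invented linearized pressure operator
C_D = 0.01 * np.eye(nd)

m_true = m_prior + rng.multivariate_normal(np.zeros(nm), C_M)
d_obs = G @ m_true + rng.multivariate_normal(np.zeros(nd), C_D)

# Linear-Gaussian (kriging-like) update: posterior mean and covariance.
K = C_M @ G.T @ np.linalg.inv(G @ C_M @ G.T + C_D)
m_post = m_prior + K @ (d_obs - G @ m_prior)
C_post = C_M - K @ G @ C_M

print("prior total variance    :", round(np.trace(C_M), 3))
print("posterior total variance:", round(np.trace(C_post), 3))
```

The same shrinkage logic underlies conditioning realizations to pressure data; with a biased prior mean, the data misfit d_obs - G @ m_prior is what reveals the error.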

  1. Temperature Data Evaluation

    International Nuclear Information System (INIS)

    Gillespie, David

    2003-01-01

    Groundwater temperature is sensitive to the competing processes of heat flow from below and the advective transport of heat by groundwater flow. Because groundwater temperature is sensitive to both conductive and advective processes, it may be used as a tracer to further constrain the uncertainty of predictions of advective radionuclide transport models constructed for the Nevada Test Site (NTS). Since the heat transport, geochemical, and hydrologic models for a given area must all be consistent, uncertainty can be reduced by down-weighting those models that do not match estimated heat flow. The objective of this study was to identify the quantity and quality of available heat flow data at the NTS. One hundred forty-five temperature logs from 63 boreholes were examined. Thirteen were found to have temperature profiles suitable for the determination of heat flow values from one or more intervals within the boreholes. If sufficient spatially distributed heat flow values are obtained, a heat transport model coupled to a hydrologic model may be used to reduce the uncertainty of a nonisothermal hydrologic model of the NTS.
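    For a purely conductive interval, the heat-flow determination described above reduces to Fourier's law: fit the geothermal gradient dT/dz to the temperature log and multiply by the thermal conductivity. A minimal sketch on a synthetic log follows; the conductivity value is a placeholder, not an NTS measurement.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic borehole temperature log over a conductive interval.
depth = np.arange(100.0, 400.0, 10.0)                            # m
temp = 15.0 + 0.030 * depth + rng.normal(0.0, 0.05, depth.size)  # deg C

# Least-squares fit of the geothermal gradient dT/dz.
gradient, surface_temp = np.polyfit(depth, temp, 1)

# Fourier's law: q = k * dT/dz, with an assumed (placeholder) thermal
# conductivity for volcanic tuff in W/(m K).
k = 1.8
heat_flow = k * gradient                                         # W/m^2
print(f"gradient ~ {gradient * 1000:.1f} K/km, "
      f"heat flow ~ {heat_flow * 1000:.1f} mW/m^2")
```

Intervals whose profiles depart strongly from such a linear fit indicate advective disturbance, which is why only some of the 145 logs are suitable for heat-flow determination.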

  2. Using performance indicators to reduce cost uncertainty of China's CO2 mitigation goals

    International Nuclear Information System (INIS)

    Xu, Yuan

    2013-01-01

    Goals on absolute emissions and intensity play key roles in CO2 mitigation. However, like cap-and-trade policies with price uncertainty, they suffer from significant uncertainty in abatement costs. This article examines whether an indicator could be established to complement CO2 mitigation goals and help reduce cost uncertainty, with a particular focus on China. Performance indicators on CO2 emissions per unit of energy consumption could satisfy three criteria: compared with the mitigation goals, (i) they are more closely associated with active mitigation efforts and (ii) their baselines have more stable projections from historical trajectories; (iii) their abatement costs are generally higher than those of other mitigation methods, particularly energy efficiency and conservation. Performance indicators could be used in the following way: if a CO2 goal on absolute emissions or intensity is attained, the performance indicator should still reach a lower threshold as a cost floor. If the goal cannot be attained, an upper performance threshold should be achieved as a cost ceiling. The narrower cost uncertainty may encourage wider and greater mitigation efforts. - Highlights: ► CO2 emissions per unit of energy consumption could act as performance indicators. ► Performance indicators are more closely related to active mitigation activities. ► Performance indicators have more stable historical trajectories. ► Abatement costs are higher for performance indicators than for other activities. ► Performance thresholds could reduce the cost uncertainty of CO2 mitigation goals.
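    The proposed threshold logic can be written down directly. The function below encodes one reading of it: if the main CO2 goal is attained, the indicator (emissions per unit of energy) must still beat a stricter lower threshold; otherwise the looser upper threshold applies. All numbers and threshold values are hypothetical.

```python
def co2_performance_check(co2_mt, energy_ej, goal_attained,
                          lower_threshold, upper_threshold):
    """Complement an absolute/intensity CO2 goal with a performance
    indicator: CO2 emissions per unit of energy consumption.

    If the main goal is attained, the indicator must still reach the
    lower threshold (cost floor); if not, reaching the upper threshold
    suffices (cost ceiling). Thresholds here are illustrative.
    """
    performance = co2_mt / energy_ej            # Mt CO2 per EJ
    if goal_attained:
        return performance <= lower_threshold   # cost floor still binding
    return performance <= upper_threshold       # cost ceiling

# Hypothetical numbers: 9000 Mt CO2, 140 EJ of primary energy.
print(co2_performance_check(9000, 140, goal_attained=False,
                            lower_threshold=55, upper_threshold=70))
```

Bounding the indicator from both sides is what narrows the cost uncertainty: the floor prevents under-investment when the goal is easy, the ceiling caps effort when it is hard.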

  3. Uncertainty in simulating wheat yields under climate change

    DEFF Research Database (Denmark)

    Asseng, A; Ewert, F; Rosenzweig, C

    2013-01-01

    …of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models … than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi…

  4. Reducing Uncertainty: Implementation of Heisenberg Principle to Measure Company Performance

    Directory of Open Access Journals (Sweden)

    Anna Svirina

    2015-08-01

    Full Text Available The paper addresses the problem of reducing uncertainty in the estimation of future company performance, which results from the wide range of probable efficiencies of an enterprise's intangible assets. To address this problem, the paper suggests using quantum economy principles, i.e., applying the Heisenberg principle to measure the efficiency and potential of a company's intangible assets. It is proposed that for intangibles it is not possible to estimate both potential and efficiency at a given point in time. To support this thesis, data on resource potential and efficiency from mid-Russian companies were evaluated within a deterministic approach, which did not allow the probability of achieving a certain resource efficiency to be evaluated, and within a quantum approach, which allowed estimation of the central point around which the probable efficiency of resources is concentrated. These approaches were visualized by means of LabVIEW software. It was shown that a deterministic approach should be used for estimating the performance of tangible assets, while for intangible assets the quantum approach allows better prediction of future performance. On the basis of these findings, we propose a holistic approach to estimating company resource efficiency in order to reduce uncertainty in modeling company performance.

  5. Can agent based models effectively reduce fisheries management implementation uncertainty?

    Science.gov (United States)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecasting the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations that properly consider implementation uncertainty in fisheries management.
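    A minimal sketch of such an agent-based model: heterogeneous vessels decide daily whether to fish by comparing expected revenue against their individual costs, and a catch quota acts as the management intervention. The cost ranges, price, and quota below are invented for illustration and are far simpler than the study's vessel-level model.

```python
import random

random.seed(42)

class Vessel:
    """Fishing agent with its own (hypothetical) cost structure."""
    def __init__(self, daily_cost):
        self.daily_cost = daily_cost
        self.days_fished = 0

    def decide(self, price, expected_catch):
        # Fish only if expected revenue covers this vessel's cost.
        if price * expected_catch > self.daily_cost:
            self.days_fished += 1
            return expected_catch
        return 0.0

def simulate(n_vessels=50, n_days=100, quota=2000.0, price=10.0):
    fleet = [Vessel(daily_cost=random.uniform(20, 120))
             for _ in range(n_vessels)]
    total_catch = 0.0
    for _ in range(n_days):
        for v in fleet:
            if total_catch >= quota:          # management intervention
                return total_catch, fleet
            total_catch += v.decide(price, expected_catch=8.0)
    return total_catch, fleet

catch, fleet = simulate()
print(f"total catch: {catch:.0f} t, active vessels: "
      f"{sum(v.days_fished > 0 for v in fleet)}/{len(fleet)}")
```

Even this toy version exhibits the key ABM property the abstract highlights: aggregate (macro) outcomes emerge from heterogeneous individual incentives rather than from a single representative actor.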

  6. Uncertainty Quantification of Calculated Temperatures for the U.S. Capsules in the AGR-2 Experiment

    Energy Technology Data Exchange (ETDEWEB)

    Lybeck, Nancy [Idaho National Lab. (INL), Idaho Falls, ID (United States); Einerson, Jeffrey J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Pham, Binh T. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hawkes, Grant L. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    A series of Advanced Gas Reactor (AGR) irradiation experiments are being conducted within the Advanced Reactor Technology (ART) Fuel Development and Qualification Program. The main objectives of the fuel experimental campaign are to provide the necessary data on fuel performance to support fuel process development, to qualify a fuel design and fabrication process for normal operation and accident conditions, and to support development and validation of fuel performance and fission product transport models and codes (PLN-3636). The AGR-2 test was inserted in the B-12 position in the Advanced Test Reactor (ATR) core at Idaho National Laboratory (INL) in June 2010 and successfully completed irradiation in October 2013, resulting in irradiation of the TRISO fuel for 559.2 effective full power days (EFPDs) over approximately 3.3 calendar years. The AGR-2 data, including the irradiation data and calculated results, were qualified and stored in the Nuclear Data Management and Analysis System (NDMAS) (Pham and Einerson 2014). To support the U.S. TRISO fuel performance assessment and to provide data for validation of fuel performance and fission product transport models and codes, a daily as-run thermal analysis has been performed separately on each of the four AGR-2 U.S. capsules for the entire irradiation, as discussed in (Hawkes 2014). The ABAQUS code's finite-element-based thermal model predicts the daily average volume-average fuel temperature and the peak fuel temperature in each capsule. This thermal model involves complex physical mechanisms (e.g., graphite holder and fuel compact shrinkage) and properties (e.g., conductivity and density). The thermal model predictions are therefore affected by uncertainty in the input parameters and by incomplete knowledge of the underlying physics, which leads to modeling assumptions. Alongside the deterministic predictions from a set of input thermal conditions, information about prediction uncertainty is thus instrumental for the ART

  7. The worth of data to reduce predictive uncertainty of an integrated catchment model by multi-constraint calibration

    Science.gov (United States)

    Koch, J.; Jensen, K. H.; Stisen, S.

    2017-12-01

    Hydrological models that integrate numerical process descriptions across compartments of the water cycle typically require thorough calibration in order to estimate suitable effective model parameters. In this study, we apply a spatially distributed hydrological model code which couples the saturated zone with the unsaturated zone and the energy partitioning at the land surface. We conduct a comprehensive multi-constraint model calibration against nine independent observational datasets which reflect both the temporal and the spatial behavior of the hydrological response of a 1000 km2 catchment in Denmark. The datasets are obtained from satellite remote sensing and in-situ measurements and cover five keystone hydrological variables: discharge, evapotranspiration, groundwater head, soil moisture, and land surface temperature. Results indicate that a balanced optimization can be achieved in which the errors on the objective functions for all nine observational datasets are reduced simultaneously. The applied calibration framework was tailored to focus on improving the spatial pattern performance; however, results suggest that the optimization is still more prone to improve the temporal dimension of model performance. This study features a post-calibration linear uncertainty analysis, which allows quantification of parameter identifiability, that is, the worth of a specific observational dataset for inferring values of model parameters through calibration. Furthermore, the ability of an observation to reduce predictive uncertainty is assessed as well. Such findings have concrete implications for the design of model calibration frameworks and, in more general terms, for the acquisition of data in hydrological observatories.

  8. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCAs) is a major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation is that the licensee must quantify the uncertainty of the calculations, so it is very important to determine the uncertainty distributions before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. A first uncertainty quantification is performed with various increments for two influential uncertainty parameters to get the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is then chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experimental tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  9. Revealing, Reducing, and Representing Uncertainties in New Hydrologic Projections for Climate-changed Futures

    Science.gov (United States)

    Arnold, Jeffrey; Clark, Martyn; Gutmann, Ethan; Wood, Andy; Nijssen, Bart; Rasmussen, Roy

    2016-04-01

    The United States Army Corps of Engineers (USACE) has had primary responsibility for multi-purpose water resource operations on most of the major river systems in the U.S. for more than 200 years. In that time, the USACE projects and programs making up those operations have proved mostly robust against the range of natural climate variability encountered over their operating life spans. However, in some watersheds and for some variables, climate change now is known to be shifting the hydroclimatic baseline around which that natural variability occurs and changing the range of that variability as well. This makes historical stationarity an inappropriate basis for assessing continued project operations under climate-changed futures. That means new hydroclimatic projections are required at multiple scales to inform decisions about specific threats and impacts, and for possible adaptation responses to limit water-resource vulnerabilities and enhance operational resilience. However, projections of possible future hydroclimatologies have myriad complex uncertainties that require explicit guidance for interpreting and using them to inform those decisions about climate vulnerabilities and resilience. Moreover, many of these uncertainties overlap and interact. Recent work, for example, has shown the importance of assessing the uncertainties from multiple sources including: global model structure [Meehl et al., 2005; Knutti and Sedlacek, 2013]; internal climate variability [Deser et al., 2012; Kay et al., 2014]; climate downscaling methods [Gutmann et al., 2012; Mearns et al., 2013]; and hydrologic models [Addor et al., 2014; Vano et al., 2014; Mendoza et al., 2015]. Revealing, reducing, and representing these uncertainties is essential for defining the plausible quantitative climate change narratives required to inform water-resource decision-making. 
And to be useful, such quantitative narratives, or storylines, of climate change threats and hydrologic impacts must sample

  10. Subpixel edge localization with reduced uncertainty by violating the Nyquist criterion

    Science.gov (United States)

    Heidingsfelder, Philipp; Gao, Jun; Wang, Kun; Ott, Peter

    2014-12-01

    In this contribution, the extent to which the Nyquist criterion can be violated in optical imaging systems with a digital sensor, e.g., a digital microscope, is investigated. In detail, we analyze the subpixel uncertainty of the detected position of a step edge, the edge of a stripe with a varying width, and that of a periodic rectangular pattern for varying pixel pitches of the sensor, thus also in aliased conditions. The analysis includes the investigation of different algorithms of edge localization based on direct fitting or based on the derivative of the edge profile, such as the common centroid method. In addition to the systematic error of these algorithms, the influence of the photon noise (PN) is included in the investigation. A simplified closed form solution for the uncertainty of the edge position caused by the PN is derived. The presented results show that, in the vast majority of cases, the pixel pitch can exceed the Nyquist sampling distance by about 50% without an increase of the uncertainty of edge localization. This allows one to increase the field-of-view without increasing the resolution of the sensor and to decrease the size of the setup by reducing the magnification. Experimental results confirm the simulation results.
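    The centroid method mentioned above localizes an edge as the centroid of the derivative of the sampled intensity profile. A minimal sketch on a synthetic, noise-free blurred step edge (the tanh edge profile and its width are assumptions for illustration):

```python
import numpy as np

def edge_position_centroid(profile, pitch=1.0):
    """Estimate the subpixel edge location as the centroid of the
    discrete derivative of an intensity profile (centroid method)."""
    d = np.diff(profile)                       # discrete derivative
    x = (np.arange(d.size) + 0.5) * pitch      # mid-points between pixels
    return np.sum(x * d) / np.sum(d)

# Sampled blurred step edge with true position 10.3 (pixel units).
x_pix = np.arange(21)
true_edge = 10.3
profile = 0.5 * (1 + np.tanh((x_pix - true_edge) / 1.2))

print(f"estimated edge: {edge_position_centroid(profile):.3f} px")
```

Adding photon noise to `profile` and repeating the estimate over many realizations would reproduce the kind of localization-uncertainty statistics the abstract analyzes as a function of pixel pitch.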

  11. Signal detection in global mean temperatures after "Paris": an uncertainty and sensitivity analysis

    Science.gov (United States)

    Visser, Hans; Dangendorf, Sönke; van Vuuren, Detlef P.; Bregman, Bram; Petersen, Arthur C.

    2018-02-01

    In December 2015, 195 countries agreed in Paris to hold the increase in global mean surface temperature (GMST) well below 2.0 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C. Since large financial flows will be needed to keep GMSTs below these targets, it is important to know how GMST has progressed since pre-industrial times. However, the Paris Agreement is not conclusive as regards the methods used to calculate this. Should trend progression be deduced from GCM simulations or from instrumental records by (statistical) trend methods? Which simulations or GMST datasets should be chosen, and which trend models? What counts as pre-industrial and, finally, are the Paris targets formulated for total warming, originating from both natural and anthropogenic forcing, or do they refer to anthropogenic warming only? To find answers to these questions we performed an uncertainty and sensitivity analysis in which dataset and model choices were varied. For all cases we evaluated trend progression along with uncertainty information. To do so, we analysed four trend approaches and applied them to the five leading observational GMST products. We find GMST progression to be largely independent of the trend model approach. However, GMST progression is significantly influenced by the choice of GMST dataset. Uncertainties due to natural variability are the largest in size. As a parallel path, we calculated GMST progression from an ensemble of 42 GCM simulations. Mean progression derived from GCM-based GMSTs appears to lie within the range of trend-dataset combinations. A difference between the two approaches is the width of the uncertainty bands: GCM simulations show a much wider spread. Finally, we discuss various choices of pre-industrial baseline and the role of warming definitions. Based on these findings we propose an estimate for signal progression in GMSTs since pre-industrial times.
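    One of the statistical trend approaches discussed, an ordinary least-squares trend fitted to an observational GMST series, can be sketched as follows. The series below is synthetic (a linear signal plus AR(1)-like natural variability), and the naive standard error shown understates the true uncertainty because it ignores the autocorrelation such analyses must account for.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual GMST anomalies: a linear anthropogenic signal plus
# AR(1)-like natural variability (all values invented).
years = np.arange(1880, 2018)
signal = 0.008 * (years - years[0])          # 0.8 deg C per century
noise = np.zeros(years.size)
for t in range(1, years.size):
    noise[t] = 0.6 * noise[t - 1] + rng.normal(0.0, 0.09)
gmst = signal + noise

# OLS trend with a naive standard error; a real analysis must widen
# this band to account for the autocorrelation of natural variability.
x = years - years.mean()
A = np.column_stack([x, np.ones_like(x, dtype=float)])
coefs = np.linalg.lstsq(A, gmst, rcond=None)[0]
trend = coefs[0]
resid = gmst - A @ coefs
se = np.sqrt(resid.var(ddof=2) / np.sum(x ** 2))
print(f"trend: {trend * 100:.2f} +/- {2 * se * 100:.2f} "
      f"deg C/century (naive 2-sigma)")
```

Repeating this fit across different datasets, baselines, and trend models is exactly the kind of sensitivity analysis the abstract describes.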

  12. A fractional factorial probabilistic collocation method for uncertainty propagation of hydrologic model parameters in a reduced dimensional space

    Science.gov (United States)

    Wang, S.; Huang, G. H.; Huang, W.; Fan, Y. R.; Li, Z.

    2015-10-01

    In this study, a fractional factorial probabilistic collocation method is proposed to reveal statistical significance of hydrologic model parameters and their multi-level interactions affecting model outputs, facilitating uncertainty propagation in a reduced dimensional space. The proposed methodology is applied to the Xiangxi River watershed in China to demonstrate its validity and applicability, as well as its capability of revealing complex and dynamic parameter interactions. A set of reduced polynomial chaos expansions (PCEs) only with statistically significant terms can be obtained based on the results of factorial analysis of variance (ANOVA), achieving a reduction of uncertainty in hydrologic predictions. The predictive performance of reduced PCEs is verified by comparing against standard PCEs and the Monte Carlo with Latin hypercube sampling (MC-LHS) method in terms of reliability, sharpness, and Nash-Sutcliffe efficiency (NSE). Results reveal that the reduced PCEs are able to capture hydrologic behaviors of the Xiangxi River watershed, and they are efficient functional representations for propagating uncertainties in hydrologic predictions.
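    The core idea of a reduced PCE, keeping only the significant terms of a polynomial chaos expansion fitted at collocation points, can be sketched in one dimension. Here significance is judged by a simple coefficient-magnitude threshold as a stand-in for the factorial ANOVA used in the study, and the model function is invented.

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(4)

# Model output as a function of one uncertain parameter xi ~ U(-1, 1)
# (a stand-in for a hydrologic response surface).
def model(xi):
    return 0.5 + 1.2 * xi + 0.8 * xi ** 2

xi = rng.uniform(-1, 1, 300)                 # collocation points
y = model(xi)

# Fit a degree-5 Legendre PCE by least squares.
coeffs = legendre.legfit(xi, y, deg=5)

# "Reduced" PCE: keep only the significant terms (magnitude threshold
# here stands in for the ANOVA-based significance test of the study).
reduced = np.where(np.abs(coeffs) > 1e-6, coeffs, 0.0)

xi_test = np.linspace(-1, 1, 101)
err = np.max(np.abs(legendre.legval(xi_test, reduced) - model(xi_test)))
print(f"kept {np.count_nonzero(reduced)} of {coeffs.size} terms, "
      f"max error {err:.2e}")
```

In higher dimensions the same pruning removes insignificant interaction terms, which is what makes uncertainty propagation tractable in a reduced dimensional space.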

  13. Global sensitivity analysis for identifying important parameters of nitrogen nitrification and denitrification under model uncertainty and scenario uncertainty

    Science.gov (United States)

    Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong

    2018-06-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. Using a new variance-based global sensitivity analysis method, this paper identifies important parameters for nitrogen reactive transport with simultaneous consideration of these three types of uncertainty. A combination of three scenarios of soil temperature and two scenarios of soil moisture creates a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that, for the nitrogen reactive transport problem, parameter importance varies substantially among different models and scenarios. The denitrification and nitrification processes are sensitive to soil moisture content status rather than to the moisture function parameter. The nitrification process becomes more important at low moisture content and low temperature. However, the changing importance of nitrification activity with respect to temperature change relies highly on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution while reducing the possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); and the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not of temperature, plays the predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting the important parameters when future temperature and soil moisture carry uncertainties or when modelers are faced with multiple ways of establishing nitrogen
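    Variance-based global sensitivity analysis of a reaction-rate model can be sketched with a pick-freeze (Saltelli-type) estimator of first-order Sobol' indices. The reduction-function model, parameter ranges, and resulting index values below are illustrative only and are not meant to reproduce the study's ranking.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 20000

# Invented reduction-function model of an actual denitrification rate:
# rate = Kden * Q10**((T - Tr)/10) * theta**m, with Tr, Q10, m fixed.
def rate(Kden, T, theta):
    Tr, Q10, m = 20.0, 2.0, 1.5
    return Kden * Q10 ** ((T - Tr) / 10.0) * theta ** m

def sample(n):
    return (rng.uniform(0.5, 2.0, n),    # optimal rate Kden
            rng.uniform(5.0, 30.0, n),   # soil temperature (deg C)
            rng.uniform(0.2, 0.9, n))    # relative moisture content

# Pick-freeze estimator of first-order Sobol' indices:
# S_i = Cov(y_A, y_AB_i) / Var(y), where y_AB_i reuses input i from A.
A, B = sample(N), sample(N)
yA = rate(*A)
S = []
for i in range(3):
    mixed = list(B)
    mixed[i] = A[i]                      # freeze input i from sample A
    yAB = rate(*mixed)
    S.append((np.mean(yA * yAB) - yA.mean() * yAB.mean()) / yA.var())
print("first-order indices (Kden, T, theta):", np.round(S, 2))
```

Repeating such an estimate under each model-scenario combination, as the study does, is what reveals how parameter importance shifts with structural and scenario uncertainty.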

  14. Reducing Dose Uncertainty for Spot-Scanning Proton Beam Therapy of Moving Tumors by Optimizing the Spot Delivery Sequence

    International Nuclear Information System (INIS)

    Li, Heng; Zhu, X. Ronald; Zhang, Xiaodong

    2015-01-01

    Purpose: To develop and validate a novel delivery strategy for reducing the respiratory motion–induced dose uncertainty of spot-scanning proton therapy. Methods and Materials: The spot delivery sequence was optimized to reduce dose uncertainty. The effectiveness of the delivery sequence optimization was evaluated using measurements and patient simulation. One hundred ninety-one 2-dimensional measurements using different delivery sequences of a single-layer uniform pattern were obtained with a detector array on a 1-dimensional moving platform. Intensity modulated proton therapy plans were generated for 10 lung cancer patients, and dose uncertainties for different delivery sequences were evaluated by simulation. Results: Without delivery sequence optimization, the maximum absolute dose error can be up to 97.2% in a single measurement, whereas the optimized delivery sequence results in a maximum absolute dose error of ≤11.8%. In patient simulation, the optimized delivery sequence reduces the mean of fractional maximum absolute dose error compared with the regular delivery sequence by 3.3% to 10.6% (32.5-68.0% relative reduction) for different patients. Conclusions: Optimizing the delivery sequence can reduce dose uncertainty due to respiratory motion in spot-scanning proton therapy, assuming the 4-dimensional CT is a true representation of the patients' breathing patterns.

  15. How incorporating more data reduces uncertainty in recovery predictions

    Energy Technology Data Exchange (ETDEWEB)

    Campozana, F.P.; Lake, L.W.; Sepehrnoori, K. [Univ. of Texas, Austin, TX (United States)

    1997-08-01

    From the discovery to the abandonment of a petroleum reservoir, there are many decisions that involve economic risks because of uncertainty in the production forecast. This uncertainty may be quantified by performing stochastic reservoir modeling (SRM); however, it is not practical to apply SRM every time the model is updated to account for new data. This paper suggests a novel procedure to estimate reservoir uncertainty (and its reduction) as a function of the amount and type of data used in the reservoir modeling. Two types of data are analyzed: conditioning data and well-test data. However, the same procedure can be applied to any other data type. Three performance parameters are suggested to quantify uncertainty. SRM is performed for the following typical stages: discovery, primary production, secondary production, and infill drilling. From those results, a set of curves is generated that can be used to estimate (1) the uncertainty for any other situation and (2) the uncertainty reduction caused by the introduction of new wells (with and without well-test data) into the description.

  16. Reducing structural uncertainty in conceptual hydrological modeling in the semi-arid Andes

    Science.gov (United States)

    Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.

    2014-10-01

    The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modeling of a meso-scale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modeling process was decomposed into six model-building decisions related to the following aspects of the system behavior: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modeling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional space of performance measures using a fuzzy c-means clustering technique and a differential split-sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to retaining 8 model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modeling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
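    The Pareto-optimality step, retaining only model structures not dominated on all calibration criteria, can be sketched directly. The scores below are hypothetical error values for five competing structures on four criteria (lower is better), not results from the study.

```python
import numpy as np

def pareto_front(scores):
    """Return indices of non-dominated rows, where every column is an
    error measure to be minimized (one column per calibration criterion)."""
    scores = np.asarray(scores)
    keep = []
    for i, s in enumerate(scores):
        dominated = any(np.all(t <= s) and np.any(t < s)
                        for j, t in enumerate(scores) if j != i)
        if not dominated:
            keep.append(i)
    return keep

# Hypothetical error scores of five competing model structures on
# three streamflow criteria and one snow criterion.
scores = [
    [0.20, 0.30, 0.25, 0.40],
    [0.25, 0.25, 0.30, 0.35],   # trades off against the first row
    [0.30, 0.35, 0.40, 0.50],   # dominated by row 0
    [0.18, 0.40, 0.22, 0.45],
    [0.50, 0.50, 0.50, 0.60],   # dominated
]
print("non-dominated structures:", pareto_front(scores))
```

Applied to 72 structures and four criteria, this kind of non-dominated filtering is the first stage of the winnowing that ultimately retained 14, then 8, hypotheses.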

  17. Reducing structural uncertainty in conceptual hydrological modelling in the semi-arid Andes

    Science.gov (United States)

    Hublart, P.; Ruelland, D.; Dezetter, A.; Jourde, H.

    2015-05-01

    The use of lumped, conceptual models in hydrological impact studies requires placing more emphasis on the uncertainty arising from deficiencies and/or ambiguities in the model structure. This study provides an opportunity to combine a multiple-hypothesis framework with a multi-criteria assessment scheme to reduce structural uncertainty in the conceptual modelling of a mesoscale Andean catchment (1515 km2) over a 30-year period (1982-2011). The modelling process was decomposed into six model-building decisions related to the following aspects of the system behaviour: snow accumulation and melt, runoff generation, redistribution and delay of water fluxes, and natural storage effects. Each of these decisions was provided with a set of alternative modelling options, resulting in a total of 72 competing model structures. These structures were calibrated using the concept of Pareto optimality with three criteria pertaining to streamflow simulations and one to the seasonal dynamics of snow processes. The results were analyzed in the four-dimensional (4-D) space of performance measures using a fuzzy c-means clustering technique and a differential split sample test, leading to the identification of 14 equally acceptable model hypotheses. A filtering approach was then applied to these best-performing structures in order to minimize the overall uncertainty envelope while maximizing the number of enclosed observations. This led to the retention of eight model hypotheses as a representation of the minimum structural uncertainty that could be obtained with this modelling framework. Future work to better consider model predictive uncertainty should include a proper assessment of parameter equifinality and data errors, as well as the testing of new or refined hypotheses to allow for the use of additional auxiliary observations.
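    Both records above cluster the 72 competing structures in a four-dimensional performance space with fuzzy c-means. A minimal sketch of that clustering step (not the authors' code; the toy data, cluster count, and fuzziness exponent m = 2 are illustrative assumptions):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, seed=0):
    """Cluster rows of X into c fuzzy clusters (fuzziness exponent m)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1 per point
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # squared distance from each point to each centre (small eps avoids /0)
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2) + 1e-12
        U = 1.0 / (d2 ** (1.0 / (m - 1)))
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# two well-separated blobs in a 4-D "performance measure" space
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (20, 4)), rng.normal(1, 0.1, (20, 4))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)    # hard assignment from fuzzy memberships
```

    Unlike k-means, each structure keeps a graded membership in every cluster, which is what makes "equally acceptable" groups of hypotheses identifiable.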

  18. Uncertainty in climate-carbon-cycle projections associated with the sensitivity of soil respiration to temperature

    International Nuclear Information System (INIS)

    Jones, Chris D.; Cox, Peter; Huntingford, Chris

    2003-01-01

    Carbon-cycle feedbacks have been shown to be very important in predicting climate change over the next century, with a potentially large positive feedback coming from the release of carbon from soils as global temperatures increase. The magnitude of this feedback and whether or not it drives the terrestrial carbon cycle to become a net source of carbon dioxide during the next century depends particularly on the response of soil respiration to temperature. Observed global atmospheric CO2 concentration, and its response to naturally occurring climate anomalies, is used to constrain the behaviour of soil respiration in our coupled climate-carbon-cycle GCM. This constraint is used to quantify some of the uncertainties in predictions of future CO2 levels. The uncertainty is large, emphasizing the importance of carbon-cycle research with respect to future climate change predictions.

  19. How uncertainty analysis of streamflow data can reduce costs and promote robust decisions in water management applications

    Science.gov (United States)

    McMillan, Hilary; Seibert, Jan; Petersen-Overleir, Asgeir; Lang, Michel; White, Paul; Snelder, Ton; Rutherford, Kit; Krueger, Tobias; Mason, Robert; Kiang, Julie

    2017-07-01

    Streamflow data are used for important environmental and economic decisions, such as specifying and regulating minimum flows, managing water supplies, and planning for flood hazards. Despite significant uncertainty in most flow data, the flow series for these applications are often communicated and used without uncertainty information. In this commentary, we argue that proper analysis of uncertainty in river flow data can reduce costs and promote robust conclusions in water management applications. We substantiate our argument by providing case studies from Norway and New Zealand where streamflow uncertainty analysis has uncovered economic costs in the hydropower industry, improved public acceptance of a controversial water management policy, and tested the accuracy of water quality trends. We discuss the need for practical uncertainty assessment tools that generate multiple flow series realizations rather than simple error bounds. Although examples of such tools are in development, considerable barriers for uncertainty analysis and communication still exist for practitioners, and future research must aim to provide easier access and usability of uncertainty estimates. We conclude that flow uncertainty analysis is critical for good water management decisions.
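    The commentary calls for tools that generate multiple flow-series realizations rather than simple error bounds. A hedged sketch of one such generator, assuming multiplicative, temporally correlated (AR(1)) lognormal rating-curve errors; the error model and its parameter values are illustrative, not taken from the paper:

```python
import numpy as np

def flow_realizations(q_rated, n_real=500, sigma=0.1, rho=0.8, seed=0):
    """Perturb a rated flow series with AR(1)-correlated multiplicative
    lognormal errors. sigma: std of log-error; rho: lag-1 autocorrelation."""
    rng = np.random.default_rng(seed)
    n = len(q_rated)
    eps = np.empty((n_real, n))
    eps[:, 0] = rng.normal(0.0, sigma, n_real)
    innov_sd = sigma * np.sqrt(1.0 - rho**2)   # keeps marginal std at sigma
    for t in range(1, n):
        eps[:, t] = rho * eps[:, t - 1] + rng.normal(0.0, innov_sd, n_real)
    return q_rated * np.exp(eps)

q = np.full(365, 20.0)              # a constant 20 m^3/s series for illustration
ens = flow_realizations(q)          # shape (n_real, n_days)
annual_mean = ens.mean(axis=1)      # distribution of the annual mean flow
```

    Any downstream statistic (minimum-flow exceedance, trend slope, hydropower revenue) can then be computed per realization, giving a full uncertainty distribution instead of a single number.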

  20. Using Uncertainty Analysis to Guide the Development of Accelerated Stress Tests (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Kempe, M.

    2014-03-01

    Extrapolation of accelerated testing to the long-term results expected in the field has uncertainty associated with the acceleration factors and the range of possible stresses in the field. When multiple stresses (such as temperature and humidity) can be used to increase the acceleration, the uncertainty may be reduced according to which stress factors are used to accelerate the degradation.
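    The interaction between multiple stress factors and extrapolation uncertainty can be illustrated with a Peck-type temperature/humidity acceleration model; the model choice and every parameter value below are assumptions for illustration, not taken from the presentation:

```python
import numpy as np

K_B = 8.617e-5   # Boltzmann constant, eV/K

def accel_factor(ea, n, t_test=358.15, rh_test=85.0, t_field=298.15, rh_field=50.0):
    """Peck-type acceleration factor: humidity power law times Arrhenius term.
    Default stresses: 85 C / 85 %RH test vs. an assumed 25 C / 50 %RH field."""
    return (rh_test / rh_field) ** n * np.exp(ea / K_B * (1.0 / t_field - 1.0 / t_test))

# propagate uncertainty in the activation energy Ea and humidity exponent n
rng = np.random.default_rng(0)
ea = rng.normal(0.7, 0.1, 100_000)   # eV, illustrative prior
n = rng.normal(2.5, 0.5, 100_000)    # illustrative prior
af = accel_factor(ea, n)
lo, hi = np.percentile(af, [5, 95])  # 90 % interval on the acceleration factor
```

    The width of (lo, hi) is the uncertainty the abstract refers to: it shrinks or grows depending on which stress terms dominate and how well their parameters are known.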

  1. A review of reactor physics uncertainties and validation requirements for the modular high-temperature gas-cooled reactor

    International Nuclear Information System (INIS)

    Baxter, A.M.; Lane, R.K.; Hettergott, E.; Lefler, W.

    1991-01-01

    The important, safety-related, physics parameters for the low-enriched Modular High-Temperature Gas-Cooled Reactor (MHTGR), such as control rod worth, shutdown margins, temperature coefficients, and reactivity worths, are considered, and estimates are presented of the uncertainties in the calculated values of these parameters. The basis for the uncertainty estimate in several of the important calculated parameters is reviewed, including the available experimental data used in obtaining these estimates. Based on this review, the additional experimental data needed to complete the validation of the methods used to calculate these parameters are presented. The role of benchmark calculations in validating MHTGR reactor physics data is also considered. (author). 10 refs, 5 figs, 3 tabs

  2. A new approach to reduce uncertainties in space radiation cancer risk predictions.

    Directory of Open Access Journals (Sweden)

    Francis A Cucinotta

    Full Text Available The prediction of space radiation induced cancer risk carries large uncertainties, with two of the largest being radiation quality and dose-rate effects. In risk models the ratio of the quality factor (QF) to the dose and dose-rate reduction effectiveness factor (DDREF) parameter is used to scale organ doses for cosmic ray proton and high charge and energy (HZE) particles to a hazard rate for γ-rays derived from human epidemiology data. In previous work, particle track structure concepts were used to formulate a space radiation QF function that is dependent on particle charge number Z, and kinetic energy per atomic mass unit, E. QF uncertainties were represented by subjective probability distribution functions (PDFs) for the three QF parameters that described its maximum value and shape parameters for Z and E dependences. Here I report on an analysis of a maximum QF parameter and its uncertainty using mouse tumor induction data. Because experimental data for risks at low doses of γ-rays are highly uncertain, which impacts estimates of maximum values of relative biological effectiveness (RBEmax), I developed an alternate QF model, denoted QFγAcute, where QFs are defined relative to higher acute γ-ray doses (0.5 to 3 Gy). The alternate model reduces the dependence of risk projections on the DDREF; however, a DDREF is still needed for risk estimates for high-energy protons and other primary or secondary sparsely ionizing space radiation components. Risk projections (upper confidence levels (CL)) for space missions show a reduction of about 40% (CL ∼50%) using the QFγAcute model compared to the QFs based on RBEmax, and about 25% (CL ∼35%) compared to previous estimates. In addition, I discuss how a possible qualitative difference leading to increased tumor lethality for HZE particles compared to low LET radiation and background tumors remains a large uncertainty in risk estimates.

  3. High-temperature uncertainty

    International Nuclear Information System (INIS)

    Timusk, T.

    2005-01-01

    Recent experiments reveal that the mechanism responsible for the superconducting properties of cuprate materials is even more mysterious than we thought. Two decades ago, Georg Bednorz and Alex Mueller of IBM's research laboratory in Zurich rocked the world of physics when they discovered a material that lost all resistance to electrical current at the record temperature of 36 K. Until then, superconductivity was thought to be a strictly low-temperature phenomenon that required costly refrigeration. Moreover, the IBM discovery - for which Bednorz and Mueller were awarded the 1987 Nobel Prize for Physics - was made in a ceramic copper-oxide material that nobody expected to be particularly special. Proposed applications for these 'cuprates' abounded. High-temperature superconductivity, particularly if it could be extended to room temperature, offered the promise of levitating trains, ultra-efficient power cables, and even supercomputers based on superconducting quantum interference devices. But these applications have been slow to materialize. Moreover, almost 20 years on, the physics behind this strange state of matter remains a mystery. (U.K.)

  4. Accounting for downscaling and model uncertainty in fine-resolution seasonal climate projections over the Columbia River Basin

    Science.gov (United States)

    Ahmadalipour, Ali; Moradkhani, Hamid; Rana, Arun

    2018-01-01

    Climate change is expected to have severe impacts on natural systems as well as various socio-economic aspects of human life. This has urged scientific communities to improve the understanding of future climate and reduce the uncertainties associated with projections. In the present study, ten statistically downscaled CMIP5 GCMs at 1/16th deg. spatial resolution from two different downscaling procedures are utilized over the Columbia River Basin (CRB) to assess the changes in climate variables and characterize the associated uncertainties. Three climate variables, i.e. precipitation, maximum temperature, and minimum temperature, are studied for the historical period of 1970-2000 as well as the future period of 2010-2099, simulated with representative concentration pathways RCP4.5 and RCP8.5. Bayesian Model Averaging (BMA) is employed to reduce the model uncertainty and develop a probabilistic projection for each variable in each scenario. Historical comparison of long-term attributes of GCMs and observations suggests a more accurate representation for BMA than for individual models. Furthermore, BMA projections are used to investigate future seasonal to annual changes of climate variables. Projections indicate a significant increase in annual precipitation and temperature, with a varied degree of change across different sub-basins of the CRB. We then characterized the uncertainty of future projections for each season over the CRB. Results reveal that model uncertainty is the largest source of uncertainty. However, downscaling uncertainty considerably contributes to the total uncertainty of future projections, especially in summer. For precipitation, in contrast, downscaling uncertainty appears to be higher than scenario uncertainty.
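    A minimal sketch of likelihood-weighted model averaging in the spirit of BMA (full BMA estimates the weights by EM; the Gaussian error scale and the toy observations/simulations here are assumptions for illustration):

```python
import numpy as np

def bma_weights(obs, sims, sigma=1.0):
    """Weight each model by its Gaussian likelihood of reproducing the
    historical observations; sigma is an assumed observation-error scale."""
    loglik = np.array([-0.5 * np.sum((obs - s) ** 2) / sigma**2 for s in sims])
    w = np.exp(loglik - loglik.max())   # subtract max for numerical stability
    return w / w.sum()

obs = np.array([10.0, 12.0, 11.0, 13.0])
sims = [obs + 0.1, obs + 2.0, obs - 0.05]   # model 3 fits best, model 2 worst
w = bma_weights(obs, sims)
projection = sum(wi * s for wi, s in zip(w, sims))   # weighted ensemble mean
```

    The weighted mean down-weights models with poor historical skill, which is how BMA produces a sharper probabilistic projection than an unweighted ensemble.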

  5. Optimal portfolio design to reduce climate-related conservation uncertainty in the Prairie Pothole Region.

    Science.gov (United States)

    Ando, Amy W; Mallory, Mindy L

    2012-04-24

    Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit-cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change-induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area.
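    The core MPT computation, allocating effort across subregions to minimize the variance of total returns, can be sketched as follows (the covariance values are invented; the negatively correlated subregions illustrate the diversification benefit the authors describe):

```python
import numpy as np

def min_variance_weights(cov):
    """Allocation across subregions minimising the variance of total returns
    (closed-form minimum-variance portfolio, weights summing to 1)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

# two subregions with equal expected returns but negatively correlated outcomes
cov = np.array([[1.0, -0.6],
                [-0.6, 1.0]])
w = min_variance_weights(cov)
port_var = w @ cov @ w    # variance of the diversified conservation portfolio
```

    Here the diversified portfolio's variance (0.2) is far below the variance of investing in either subregion alone (1.0), the mechanism behind the paper's reported uncertainty reductions.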

  6. Embracing uncertainty in climate change policy

    Science.gov (United States)

    Otto, Friederike E. L.; Frame, David J.; Otto, Alexander; Allen, Myles R.

    2015-10-01

    The 'pledge and review' approach to reducing greenhouse-gas emissions presents an opportunity to link mitigation goals explicitly to the evolving climate response. This seems desirable because the progression from the Intergovernmental Panel on Climate Change's fourth to fifth assessment reports has seen little reduction in uncertainty. A common reaction to persistent uncertainties is to advocate mitigation policies that are robust even under worst-case scenarios, thereby focusing attention on upper extremes of both the climate response and the costs of impacts and mitigation, all of which are highly contestable. Here we ask whether those contributing to the formation of climate policies can learn from 'adaptive management' techniques. Recognizing that long-lived greenhouse gas emissions have to be net zero by the time temperatures reach a target stabilization level, such as 2 °C above pre-industrial levels, and anchoring commitments to an agreed index of attributable anthropogenic warming would provide a transparent approach to meeting such a temperature goal without prior consensus on the climate response.

  7. Reducing uncertainties in volumetric image based deformable organ registration

    International Nuclear Information System (INIS)

    Liang, J.; Yan, D.

    2003-01-01

    Applying volumetric image feedback in radiotherapy requires image based deformable organ registration. The foundation of this registration is the ability to track subvolume displacement in organs of interest. Subvolume displacement can be calculated by applying a biomechanics model and the finite element method to human organs manifested on the multiple volumetric images. The calculation accuracy, however, is highly dependent on the determination of the corresponding organ boundary points. Lacking sufficient information for such determination, uncertainties are inevitable, thus diminishing the registration accuracy. In this paper, a method of consuming energy minimization was developed to reduce these uncertainties. Starting from an initial selection of organ boundary point correspondence on volumetric image sets, the subvolume displacement and stress distribution of the whole organ are calculated and the consumed energy due to the subvolume displacements is computed accordingly. The corresponding positions of the initially selected boundary points are then iteratively optimized to minimize the consuming energy under geometry and stress constraints. In this study, a rectal wall delineated from a patient CT image was artificially deformed using a computer simulation and utilized to test the optimization. Subvolume displacements calculated based on the optimized boundary point correspondence were compared to the true displacements, and the calculation accuracy was thereby evaluated. Results demonstrate that a significant improvement in the accuracy of the deformable organ registration can be achieved by applying the consuming energy minimization in the organ deformation calculation.

  8. Reducing the uncertainty in robotic machining by modal analysis

    Science.gov (United States)

    Alberdi, Iñigo; Pelegay, Jose Angel; Arrazola, Pedro Jose; Ørskov, Klaus Bonde

    2017-10-01

    The use of industrial robots for machining could lead to high cost and energy savings for the manufacturing industry. Machining robots offer several advantages with respect to CNC machines, such as flexibility, a wide working space, adaptability, and relatively low cost. However, there are some drawbacks that are preventing a widespread adoption of robotic solutions, namely lower stiffness, vibration/chatter problems, and lower accuracy and repeatability. Normally, due to these issues, conservative cutting parameters are chosen, resulting in a low material removal rate (MRR). In this article, an example of a modal analysis of a robot is presented. For that purpose the tap-testing technology is introduced, which aims at maximizing productivity, reducing the uncertainty in the selection of cutting parameters and offering a stable process free from chatter vibrations.

  9. Using FOSM-Based Data Worth Analyses to Design Geophysical Surveys to Reduce Uncertainty in a Regional Groundwater Model Update

    Science.gov (United States)

    Smith, B. D.; White, J.; Kress, W. H.; Clark, B. R.; Barlow, J.

    2016-12-01

    Hydrogeophysical surveys have become an integral part of understanding hydrogeological frameworks used in groundwater models. Regional models cover a large area where water well data is, at best, scattered and irregular. Since budgets are finite, priorities must be assigned to select optimal areas for geophysical surveys. For airborne electromagnetic (AEM) geophysical surveys, optimization of mapping depth and line spacing needs to take in account the objectives of the groundwater models. The approach discussed here uses a first-order, second-moment (FOSM) uncertainty analysis which assumes an approximate linear relation between model parameters and observations. This assumption allows FOSM analyses to be applied to estimate the value of increased parameter knowledge to reduce forecast uncertainty. FOSM is used to facilitate optimization of yet-to-be-completed geophysical surveying to reduce model forecast uncertainty. The main objective of geophysical surveying is assumed to estimate values and spatial variation in hydrologic parameters (i.e. hydraulic conductivity) as well as map lower permeability layers that influence the spatial distribution of recharge flux. The proposed data worth analysis was applied to the Mississippi Embayment Regional Aquifer Study (MERAS), which is being updated. The objective of MERAS is to assess the ground-water availability (status and trends) of the Mississippi embayment aquifer system. The study area covers portions of eight states including Alabama, Arkansas, Illinois, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. The active model grid covers approximately 70,000 square miles, and incorporates some 6,000 miles of major rivers and over 100,000 water wells. In the FOSM analysis, a dense network of pilot points was used to capture uncertainty in hydraulic conductivity and recharge. To simulate the effect of AEM flight lines, the prior uncertainty for hydraulic conductivity and recharge pilots along potential flight lines was
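    The FOSM data worth calculation described above amounts to a Schur-complement update of the prior parameter covariance: conditioning on candidate observations shrinks the forecast variance, and the shrinkage is the data's worth. A toy sketch (dimensions, sensitivities, and noise levels are invented, not MERAS values):

```python
import numpy as np

def fosm_forecast_var(J, y, C_p, C_obs):
    """FOSM forecast variance after conditioning on observations.
    J: Jacobian (obs x par), y: forecast sensitivity vector,
    C_p: prior parameter covariance, C_obs: observation-noise covariance."""
    S = J @ C_p @ J.T + C_obs
    C_post = C_p - C_p @ J.T @ np.linalg.solve(S, J @ C_p)  # Schur complement
    return y @ C_post @ y

n_par = 5
C_p = np.eye(n_par)             # prior parameter covariance (toy)
y = np.ones(n_par)              # forecast sensitivities (toy)
prior_var = y @ C_p @ y         # forecast variance before any new data

rng = np.random.default_rng(0)
J_new = rng.normal(size=(3, n_par))     # candidate survey: 3 new observations
post_var = fosm_forecast_var(J_new, y, C_p, 0.1 * np.eye(3))
worth = prior_var - post_var    # uncertainty reduction bought by the survey
```

    Ranking candidate AEM flight lines by `worth` is exactly the prioritization problem the record describes, without ever running the survey.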

  10. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    Science.gov (United States)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or when the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
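    The dependence of minimum detectable change on sample size and spatial variability can be illustrated with a standard two-sample power calculation (a textbook formula, not the Network's exact procedure; the coefficient-of-variation values are illustrative):

```python
import math

def min_detectable_change(cv, n, z_alpha=1.96, z_beta=0.84):
    """Smallest relative change in mean mass flux detectable with n samples
    per period/site (two-sample test, alpha = 0.05, power = 0.80), given a
    coefficient of variation cv among samplers."""
    return (z_alpha + z_beta) * math.sqrt(2.0 / n) * cv

# high spatial variability and few samplers -> only huge changes detectable
mdc_few = min_detectable_change(cv=1.5, n=3)     # ~3.43, i.e. ~340 % change
mdc_many = min_detectable_change(cv=1.5, n=30)   # ~1.08, i.e. ~110 % change
```

    This reproduces the qualitative result in the abstract: with small sample sizes and large spatial variation, only changes of several hundred percent are statistically detectable.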

  11. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries

    Science.gov (United States)

    Sutton, Abigail M.; Rudd, Murray A.

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economical processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  12. Crossing Science-Policy-Societal Boundaries to Reduce Scientific and Institutional Uncertainty in Small-Scale Fisheries.

    Science.gov (United States)

    Sutton, Abigail M; Rudd, Murray A

    2016-10-01

    The governance of small-scale fisheries (SSF) is challenging due to the uncertainty, complexity, and interconnectedness of social, political, ecological, and economical processes. Conventional SSF management has focused on a centralized and top-down approach. A major criticism of conventional management is the over-reliance on 'expert science' to guide decision-making and poor consideration of fishers' contextually rich knowledge. That is thought to exacerbate the already low governance potential of SSF. Integrating scientific knowledge with fishers' knowledge is increasingly popular and is often assumed to help reduce levels of biophysical and institutional uncertainties. Many projects aimed at encouraging knowledge integration have, however, been unsuccessful. Our objective in this research was to assess factors that influence knowledge integration and the uptake of integrated knowledge into policy-making. We report results from 54 semi-structured interviews with SSF researchers and practitioners from around the globe. Our analysis is framed in terms of scientific credibility, societal legitimacy, and policy saliency, and we discuss cases that have been partially or fully successful in reducing uncertainty via push-and-pull-oriented boundary crossing initiatives. Our findings suggest that two important factors affect the science-policy-societal boundary: a lack of consensus among stakeholders about what constitutes credible knowledge and institutional uncertainty resulting from shifting policies and leadership change. A lack of training for scientific leaders and an apparent 'shelf-life' for community organizations highlight the importance of ongoing institutional support for knowledge integration projects. Institutional support may be enhanced through investments such as capacity building and specialized platforms for knowledge integration.

  13. A Best-Estimate Reactor Core Monitor Using State Feedback Strategies to Reduce Uncertainties

    International Nuclear Information System (INIS)

    Martin, Robert P.; Edwards, Robert M.

    2000-01-01

    The development and demonstration of a new algorithm to reduce modeling and state-estimation uncertainty in best-estimate simulation codes has been investigated. Demonstration is given by way of a prototype reactor core monitor. The architecture of this monitor integrates a control-theory-based, distributed-parameter estimation technique into a production-grade best-estimate simulation code. The Kalman Filter-Sequential Least-Squares (KFSLS) parameter estimation algorithm has been extended for application into the computational environment of the best-estimate simulation code RELAP5-3D. In control system terminology, this configuration can be thought of as a 'best-estimate' observer. The application to a distributed-parameter reactor system involves a unique modal model that approximates physical components, such as the reactor, by describing both states and parameters by an orthogonal expansion. The basic KFSLS parameter estimation is used to dynamically refine a spatially varying (distributed) parameter. The application of the distributed-parameter estimator is expected to complement a traditional nonlinear best-estimate simulation code by providing a mechanism for reducing both code input (modeling) and output (state-estimation) uncertainty in complex, distributed-parameter systems.
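    The parameter-refinement idea behind the KFSLS estimator can be illustrated with a scalar Kalman filter tracking a slowly varying parameter (a toy sketch, not the RELAP5-3D implementation; the observation model and all values are invented):

```python
import numpy as np

def kf_parameter_estimate(xs, ys, q=1e-4, r=0.25, theta0=0.0, p0=10.0):
    """Scalar Kalman filter tracking theta in y_t = theta_t * x_t + noise,
    with random-walk parameter dynamics (process variance q, obs variance r)."""
    theta, p = theta0, p0
    out = []
    for x, y in zip(xs, ys):
        p += q                           # predict: random-walk variance growth
        k = p * x / (x * x * p + r)      # Kalman gain
        theta += k * (y - theta * x)     # update with the innovation
        p *= (1.0 - k * x)               # posterior variance
        out.append(theta)
    return np.array(out)

rng = np.random.default_rng(0)
xs = rng.normal(1.0, 0.3, 400)
ys = 2.5 * xs + rng.normal(0.0, 0.5, 400)    # true parameter = 2.5
est = kf_parameter_estimate(xs, ys)
```

    The estimate converges toward the true value while the filter's variance `p` quantifies the remaining parameter uncertainty, the same role the KFSLS observer plays inside the simulation code.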

  14. Optimal portfolio design to reduce climate-related conservation uncertainty in the Prairie Pothole Region

    Science.gov (United States)

    Ando, Amy W.; Mallory, Mindy L.

    2012-01-01

    Climate change is likely to alter the spatial distributions of species and habitat types but the nature of such change is uncertain. Thus, climate change makes it difficult to implement standard conservation planning paradigms. Previous work has suggested some approaches to cope with such uncertainty but has not harnessed all of the benefits of risk diversification. We adapt Modern Portfolio Theory (MPT) to optimal spatial targeting of conservation activity, using wetland habitat conservation in the Prairie Pothole Region (PPR) as an example. This approach finds the allocations of conservation activity among subregions of the planning area that maximize the expected conservation returns for a given level of uncertainty or minimize uncertainty for a given expected level of returns. We find that using MPT instead of simple diversification in the PPR can achieve a value of the conservation objective per dollar spent that is 15% higher for the same level of risk. MPT-based portfolios can also have 21% less uncertainty over benefits or 6% greater expected benefits than the current portfolio of PPR conservation. Total benefits from conservation investment are higher if returns are defined in terms of benefit–cost ratios rather than benefits alone. MPT-guided diversification can work to reduce the climate-change–induced uncertainty of future ecosystem-service benefits from many land policy and investment initiatives, especially when outcomes are negatively correlated between subregions of a planning area. PMID:22451914

  15. A hierarchical bayesian model to quantify uncertainty of stream water temperature forecasts.

    Directory of Open Access Journals (Sweden)

    Guillaume Bal

    Full Text Available Providing generic and cost-effective modelling approaches to reconstruct and forecast freshwater temperature using predictors such as air temperature and water discharge is a prerequisite to understanding ecological processes underlying the impact of water temperature and of global warming on continental aquatic ecosystems. Using air temperature as a simple linear predictor of water temperature can lead to significant bias in forecasts as it does not disentangle seasonality and long term trends in the signal. Here, we develop an alternative approach based on hierarchical Bayesian statistical time series modelling of water temperature, air temperature and water discharge using seasonal sinusoidal periodic signals and time varying means and amplitudes. Fitting and forecasting performances of this approach are compared with those of simple linear regression between water and air temperatures using (i) an emotive simulated example and (ii) application to three French coastal streams with contrasting bio-geographical conditions and sizes. The time series modelling approach better fits the data and does not exhibit forecasting bias in long-term trends, in contrast to the linear regression. This new model also allows for more accurate forecasts of water temperature than linear regression together with a fair assessment of the uncertainty around forecasting. Warming of water temperature forecast by our hierarchical Bayesian model was slower and more uncertain than that expected with the classical regression approach. These new forecasts are in a form that is readily usable in further ecological analyses and will allow weighting of outcomes from different scenarios to manage climate change impacts on freshwater wildlife.
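    The sinusoidal seasonal structure at the core of the model can be sketched with an ordinary least-squares fit; the paper's hierarchical Bayesian treatment additionally allows time-varying means and amplitudes plus full uncertainty quantification, so this is only the deterministic skeleton (data are simulated):

```python
import numpy as np

def fit_seasonal(t, temp, period=365.25):
    """Least-squares fit of T(t) = mu + a*cos(2*pi*t/P) + b*sin(2*pi*t/P);
    returns the mean level mu and the seasonal amplitude sqrt(a^2 + b^2)."""
    w = 2.0 * np.pi * t / period
    A = np.column_stack([np.ones_like(t), np.cos(w), np.sin(w)])
    coef, *_ = np.linalg.lstsq(A, temp, rcond=None)
    mu, a, b = coef
    return mu, np.hypot(a, b)

# three years of simulated daily water temperature: 12 C mean, 6 C amplitude
t = np.arange(0, 365 * 3, dtype=float)
temp = 12.0 + 6.0 * np.sin(2.0 * np.pi * t / 365.25) \
       + np.random.default_rng(0).normal(0.0, 0.5, t.size)
mu, amp = fit_seasonal(t, temp)
```

    Writing the signal in sine/cosine terms keeps the fit linear; making mu and the amplitude drift slowly in time is what separates long-term trend from seasonality in the full model.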

  16. Use of Atmospheric Budget to Reduce Uncertainty in Estimated Water Availability over South Asia from Different Reanalyses

    Science.gov (United States)

    Sebastian, Dawn Emil; Pathak, Amey; Ghosh, Subimal

    2016-07-01

    Disagreements across different reanalyses over South Asia result in uncertainty in the assessment of water availability, which is computed as the difference between Precipitation and Evapotranspiration (P-E). Here, we compute P-E directly from the atmospheric budget with divergence of moisture flux for different reanalyses and find improved correlation with observed values of P-E, acquired from station and satellite data. We also find reduced closure terms for the water cycle computed with the atmospheric budget, analysed over the South Asian landmass, when compared to that obtained with individual values of P and E. The P-E value derived with the atmospheric budget is more consistent with the energy budget, when we use top-of-atmosphere radiation for the same. For analysing the water cycle, we use runoff from the Global Land Data Assimilation System, and water storage from the Gravity Recovery and Climate Experiment. We find improvements in agreement across different reanalyses, in terms of inter-annual cross correlation, when the atmospheric budget is used to estimate P-E, and hence emphasize its use for estimates of water availability in South Asia to reduce uncertainty. Our results on water availability with reduced uncertainty over highly populated, monsoon-driven South Asia will be useful for water management and agricultural decision making.
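    The atmospheric-budget estimate used in this record, P - E = -div(Q) - dW/dt with Q the vertically integrated moisture flux and W the precipitable water, can be sketched with finite differences on a regular grid (grid spacing and flux fields below are invented for illustration):

```python
import numpy as np

def p_minus_e(qu, qv, dW_dt, dx, dy):
    """Atmospheric budget estimate P - E = -div(Q) - dW/dt, with
    Q = (qu, qv) the vertically integrated moisture flux (kg m^-1 s^-1)
    on a regular grid; result in kg m^-2 s^-1 (equivalently mm/s)."""
    div = np.gradient(qu, dx, axis=1) + np.gradient(qv, dy, axis=0)
    return -div - dW_dt

# uniform moisture-flux convergence: zonal flux qu decreases eastward
ny, nx, dx, dy = 4, 5, 1e5, 1e5
x = np.arange(nx) * dx
qu = np.broadcast_to(100.0 - 1e-4 * x, (ny, nx)).copy()
qv = np.zeros((ny, nx))
pe = p_minus_e(qu, qv, dW_dt=0.0, dx=dx, dy=dy)
```

    Because the divergence is computed from a single, dynamically consistent flux field, P-E estimated this way avoids compounding the separate errors in a reanalysis's P and E fields, which is the source of the improved agreement reported.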

  18. Signal detection in global mean temperatures after “Paris”: an uncertainty and sensitivity analysis

    Directory of Open Access Journals (Sweden)

    H. Visser

    2018-02-01

    Full Text Available In December 2015, 195 countries agreed in Paris to hold the increase in global mean surface temperature (GMST) well below 2.0 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C. Since large financial flows will be needed to keep GMSTs below these targets, it is important to know how GMST has progressed since pre-industrial times. However, the Paris Agreement is not conclusive as regards methods to calculate it. Should trend progression be deduced from GCM simulations or from instrumental records by (statistical) trend methods? Which simulations or GMST datasets should be chosen, and which trend models? What is pre-industrial and, finally, are the Paris targets formulated for total warming, originating from both natural and anthropogenic forcing, or do they refer to anthropogenic warming only? To find answers to these questions we performed an uncertainty and sensitivity analysis in which datasets and model choices were varied. For all cases we evaluated trend progression along with uncertainty information. To do so, we analysed four trend approaches and applied these to the five leading observational GMST products. We find GMST progression to be largely independent of the various trend model approaches. However, GMST progression is significantly influenced by the choice of GMST dataset. Uncertainties due to natural variability are largest in size. As a parallel path, we calculated GMST progression from an ensemble of 42 GCM simulations. Mean progression derived from GCM-based GMSTs appears to lie in the range of trend-dataset combinations. A difference between the two approaches is the width of the uncertainty bands: GCM simulations show a much wider spread. Finally, we discuss various choices for pre-industrial baselines and the role of warming definitions. Based on these findings we propose an estimate for signal progression in GMSTs since pre-industrial times.
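
The simplest of the statistical trend approaches alluded to can be sketched as an OLS linear trend with a standard error on the slope. The anomaly series below is synthetic, not one of the five observational GMST products:

```python
import numpy as np

# Synthetic GMST-like anomaly series: linear trend + interannual noise.
rng = np.random.default_rng(1)
years = np.arange(1880, 2018)
anom = -0.3 + 0.008 * (years - 1880) + rng.normal(0, 0.1, years.size)

# OLS slope and its standard error (centred predictor keeps the algebra short).
n = years.size
x = years - years.mean()
slope = (x * anom).sum() / (x * x).sum()
resid = anom - anom.mean() - slope * x
se = np.sqrt(resid @ resid / (n - 2) / (x * x).sum())
print(f"trend: {slope*100:.2f} +/- {2*se*100:.2f} °C per century (2 s.e.)")
```

The record's point survives even in this toy: the fitted trend is stable across reasonable method choices, while the noise term (natural variability) dominates the uncertainty band.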

  19. Determination of temperature measurements uncertainties of the heat transport primary system of Embalse nuclear power plant

    International Nuclear Information System (INIS)

    Pomerantz, Marcelo E.; Coutsiers, Eduardo E.; Moreno, Carlos A.

    1999-01-01

    In this work, the systematic errors in temperature measurements in the inlet and outlet headers of the HTPS coolant channels of the Embalse nuclear power plant are evaluated. These uncertainty estimates are necessary for a later evaluation of the maps of channel power transferred to the coolant. The power maps calculated in this way are used for comparison with power distributions computed with neutronic codes. Therefore, a methodology to correct systematic errors in temperature at outlet feeders and inlet headers is developed in this work. (author)

  20. Forest management under climatic and social uncertainty: trade-offs between reducing climate change impacts and fostering adaptive capacity.

    Science.gov (United States)

    Seidl, Rupert; Lexer, Manfred J

    2013-01-15

    The unabated continuation of anthropogenic greenhouse gas emissions and the lack of an international consensus on a stringent climate change mitigation policy underscore the importance of adaptation for coping with the all but inevitable changes in the climate system. Adaptation measures in forestry have particularly long lead times. A timely implementation is thus crucial for reducing the considerable climate vulnerability of forest ecosystems. However, since future environmental conditions as well as future societal demands on forests are inherently uncertain, a core requirement for adaptation is robustness to a wide variety of possible futures. Here we explicitly address the roles of climatic and social uncertainty in forest management, and tackle the question of robustness of adaptation measures in the context of multi-objective sustainable forest management (SFM). We used the Austrian Federal Forests (AFF) as a case study, and employed a comprehensive vulnerability assessment framework based on ecosystem modeling, multi-criteria decision analysis, and practitioner participation. We explicitly considered climate uncertainty by means of three climate change scenarios, and accounted for uncertainty in future social demands by means of three societal preference scenarios regarding SFM indicators. We found that the effects of climatic and social uncertainty on the projected performance of management were of the same order of magnitude, underlining the notion that climate change adaptation requires an integrated social-ecological perspective. Furthermore, our analysis of adaptation measures revealed considerable trade-offs between reducing adverse impacts of climate change and facilitating adaptive capacity.
This finding implies that prioritization between these two general aims of adaptation is necessary in management planning, which we suggest can draw on uncertainty analysis: Where the variation induced by social-ecological uncertainty renders measures aiming to

  1. County-Level Climate Uncertainty for Risk Assessments: Volume 2 Appendix A - Historical Near-Surface Air Temperature.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous-areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  2. Uncertainty in Simulating Wheat Yields Under Climate Change

    Science.gov (United States)

    Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.; et al.

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models are difficult. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.
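
The headline finding, that crop-model spread exceeds GCM spread, can be illustrated with a toy two-way variance decomposition on a matrix of simulated yield impacts (all numbers invented; this is not the intercomparison's actual analysis):

```python
import numpy as np

# Rows = crop models, columns = GCM drivers; impacts in percent yield change.
rng = np.random.default_rng(2)
n_crop, n_gcm = 25, 15
crop_effect = rng.normal(0, 10, n_crop)[:, None]   # larger spread across crop models
gcm_effect = rng.normal(0, 4, n_gcm)[None, :]      # smaller spread across GCMs
impacts = -10 + crop_effect + gcm_effect + rng.normal(0, 2, (n_crop, n_gcm))

# Variance of row means isolates the crop-model contribution; column means, the GCMs'.
var_crop = impacts.mean(axis=1).var(ddof=1)
var_gcm = impacts.mean(axis=0).var(ddof=1)
print(f"crop-model variance: {var_crop:.1f}, GCM variance: {var_gcm:.1f}")
```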

  3. Intolerance of uncertainty mediates reduced reward anticipation in major depressive disorder.

    Science.gov (United States)

    Nelson, Brady D; Shankman, Stewart A; Proudfit, Greg H

    2014-04-01

    Reduced reward sensitivity has long been considered a fundamental deficit of major depressive disorder (MDD). One way this deficit has been measured is by an asymmetry in electroencephalogram (EEG) activity between left and right frontal brain regions. MDD has been associated with a reduced frontal EEG asymmetry (i.e., decreased left relative to right) while anticipating reward. However, the mechanism (or mediator) of this association is unclear. The present study examined whether intolerance of uncertainty (IU) mediated the association between depression and reduced reward anticipation. Data were obtained from a prior study reporting reduced frontal EEG asymmetry while anticipating reward in early-onset MDD. Participants included 156 individuals with early-onset MDD-only, panic disorder-only, both (comorbids), or controls. Frontal EEG asymmetry was recorded during an uncertain reward anticipation task. Participants completed a self-report measure of IU. All three psychopathology groups reported greater IU relative to controls. Across all participants, greater IU was associated with a reduced frontal EEG asymmetry. Furthermore, IU mediated the relationship between MDD and frontal EEG asymmetry and results remained significant after controlling for neuroticism, suggesting effects were not due to broad negative affectivity. MDD participants were limited to those with early-onset depression. Measures were collected cross-sectionally, precluding causal relationships. IU mediated the relationship between MDD and reduced reward anticipation, independent of neuroticism. Explanations are provided regarding how IU may contribute to reduced reward anticipation in depression. Overall, IU appears to be an important mechanism for the association between depression and reduced reward anticipation. Copyright © 2014 Elsevier B.V. All rights reserved.
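
The mediation logic used here (comparing the total effect of diagnosis with the direct effect after adding the mediator) can be sketched with simulated data and plain OLS; this is a generic illustration, not the study's analysis pipeline, and the sample size and effect sizes are invented:

```python
import numpy as np

# Simulated data: MDD raises intolerance of uncertainty (IU), and IU in turn
# lowers frontal asymmetry; MDD has no direct effect in this toy setup.
rng = np.random.default_rng(6)
n = 2000
mdd = rng.integers(0, 2, n).astype(float)       # group indicator
iu = 2.0 * mdd + rng.normal(0, 1, n)            # mediator
asym = -0.8 * iu + rng.normal(0, 1, n)          # outcome

def ols_slope(y, cols):
    # Returns coefficients of [intercept, *cols] from an OLS fit.
    X = np.column_stack([np.ones(len(y))] + list(cols))
    return np.linalg.lstsq(X, y, rcond=None)[0]

total = ols_slope(asym, [mdd])[1]               # total effect of MDD
direct = ols_slope(asym, [mdd, iu])[1]          # direct effect controlling for IU
print(f"total: {total:.2f}, direct: {direct:.2f}, indirect: {total - direct:.2f}")
```

When the mediator carries the effect, the direct coefficient shrinks toward zero while the total stays negative, which is the pattern the abstract reports for IU.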

  4. Religion in the face of uncertainty: an uncertainty-identity theory account of religiousness.

    Science.gov (United States)

    Hogg, Michael A; Adelman, Janice R; Blagg, Robert D

    2010-02-01

    The authors characterize religions as social groups and religiosity as the extent to which a person identifies with a religion, subscribes to its ideology or worldview, and conforms to its normative practices. They argue that religions have attributes that make them well suited to reduce feelings of self-uncertainty. According to uncertainty-identity theory, people are motivated to reduce feelings of uncertainty about or reflecting on self; and identification with groups, particularly highly entitative groups, is a very effective way to reduce uncertainty. All groups provide belief systems and normative prescriptions related to everyday life. However, religions also address the nature of existence, invoking sacred entities and associated rituals and ceremonies. They are entitative groups that provide a moral compass and rules for living that pervade a person's life, making them particularly attractive in times of uncertainty. The authors document data supporting their analysis and discuss conditions that transform religiosity into religious zealotry and extremism.

  5. Uncertainty in temperature-based determination of time of death

    Science.gov (United States)

    Weiser, Martin; Erdmann, Bodo; Schenkl, Sebastian; Muggenthaler, Holger; Hubig, Michael; Mall, Gita; Zachow, Stefan

    2018-03-01

    Temperature-based estimation of time of death (ToD) can be performed either with the help of simple phenomenological models of corpse cooling or with detailed mechanistic (thermodynamic) heat transfer models. The latter are much more complex, but allow a higher accuracy of ToD estimation as in principle all relevant cooling mechanisms can be taken into account. The potentially higher accuracy depends on the accuracy of tissue and environmental parameters as well as on the geometric resolution. We investigate the impact of parameter variations and geometry representation on the estimated ToD. For this, numerical simulation of analytic heat transport models is performed on a highly detailed 3D corpse model, that has been segmented and geometrically reconstructed from a computed tomography (CT) data set, differentiating various organs and tissue types. From that and prior information available on thermal parameters and their variability, we identify the most crucial parameters to measure or estimate, and obtain an a priori uncertainty quantification for the ToD.
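
For context, the simplest phenomenological cooling model contrasted with the mechanistic approach above is Newtonian cooling, inverted for elapsed time. All numbers below are illustrative placeholders, not validated forensic values:

```python
import math

# Newtonian cooling: T(t) = T_env + (T0 - T_env) * exp(-k t).
# Invert for the time elapsed since death given a measured body temperature.
T_env = 18.0      # ambient temperature, °C
T0 = 37.0         # body temperature at death, °C
k = 0.08          # cooling constant, 1/h (illustrative)
T_measured = 27.0 # measured core temperature, °C

t = -math.log((T_measured - T_env) / (T0 - T_env)) / k
print(f"estimated time since death: {t:.1f} h")
```

The record's point is that this single-exponential picture hides exactly the parameters (tissue properties, geometry, environment) whose uncertainty the mechanistic model makes explicit.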

  6. Reducing prediction uncertainty of weather controlled systems

    NARCIS (Netherlands)

    Doeswijk, T.G.

    2007-01-01

    In closed agricultural systems the weather acts both as a disturbance and as a resource. By using weather forecasts in control strategies the effects of disturbances can be minimized whereas the resources can be utilized. In this situation weather forecast uncertainty and model based control are

  7. Quantum-memory-assisted entropic uncertainty in spin models with Dzyaloshinskii-Moriya interaction

    Science.gov (United States)

    Huang, Zhiming

    2018-02-01

    In this article, we investigate the dynamics and correlations of quantum-memory-assisted entropic uncertainty, the tightness of the uncertainty, entanglement, quantum correlation and mixedness for various spin chain models with Dzyaloshinskii-Moriya (DM) interaction, including the XXZ model with DM interaction, the XY model with DM interaction and the Ising model with DM interaction. We find that the uncertainty grows to a stable value with increasing temperature but is reduced as the coupling coefficient, anisotropy parameter and DM values increase. It is found that the entropic uncertainty is closely correlated with the mixedness of the system. Increasing quantum correlation can result in a decrease in the uncertainty, and quantum correlation is more robust than entanglement, since entanglement undergoes sudden death and birth. The tightness of the uncertainty drops to zero, apart from slight volatility, as the various parameters increase. Furthermore, we propose an effective approach to steering the uncertainty by weak measurement reversal.
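
For background, the quantum-memory-assisted entropic uncertainty relation that such analyses are measured against is commonly stated (in the Berta et al. form; given here as standard context, not quoted from this record) as

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad c = \max_{i,j} \bigl|\langle x_i \vert z_j \rangle\bigr|^2 ,
```

where S(X|B) and S(Z|B) are the conditional entropies of the two incompatible measurement outcomes given the memory B, and c is the maximal overlap of the measurement bases. Since S(A|B) can be negative for entangled states, entanglement with the memory can lower the bound, which is why the uncertainty tracks the correlation measures discussed above.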

  8. Reducing, Maintaining, or Escalating Uncertainty? The Development and Validation of Four Uncertainty Preference Scales Related to Cancer Information Seeking and Avoidance.

    Science.gov (United States)

    Carcioppolo, Nick; Yang, Fan; Yang, Qinghua

    2016-09-01

    Uncertainty is a central characteristic of many aspects of cancer prevention, screening, diagnosis, and treatment. Brashers's (2001) uncertainty management theory details the multifaceted nature of uncertainty and describes situations in which uncertainty can both positively and negatively affect health outcomes. The current study extends theory on uncertainty management by developing four scale measures of uncertainty preferences in the context of cancer. Two national surveys were conducted to validate the scales and assess convergent and concurrent validity. Results support the factor structure of each measure and provide general support across multiple validity assessments. These scales can advance research on uncertainty and cancer communication by providing researchers with measures that address multiple aspects of uncertainty management.

  9. County-Level Climate Uncertainty for Risk Assessments: Volume 4 Appendix C - Historical Maximum Near-Surface Air Temperature.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous-areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  10. County-Level Climate Uncertainty for Risk Assessments: Volume 6 Appendix E - Historical Minimum Near-Surface Air Temperature.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous-areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  11. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis establishes response relations between input parameter uncertainties and output uncertainties using a suitable method. Applying parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy while maintaining reasonable safety assurance. The AP1000 LBLOCA was analyzed in this paper, and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method and the traditional error propagation analysis method can all provide a quite large peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve plant economy. Additionally, the random sampling statistical analysis method, which applies mathematical statistics theory, yields the largest safety margin owing to reduced conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods can only provide 50-60 K. (authors)
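
Random-sampling statistical analysis in LBLOCA studies is typically based on non-parametric order statistics (Wilks' formula); assuming the common first-order, one-sided 95%/95% criterion (an assumption about this record's method, stated here as background), the minimum sample size can be computed directly:

```python
# First-order Wilks: using the maximum of n runs as a one-sided tolerance bound,
# the smallest n with coverage beta at confidence gamma satisfies 1 - beta**n >= gamma.
def wilks_first_order(beta=0.95, gamma=0.95):
    n = 1
    while 1 - beta ** n < gamma:
        n += 1
    return n

print(wilks_first_order())  # -> 59
```

This is why 59 code runs is the canonical sample size quoted for 95%/95% PCT statements.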

  12. Temperature acclimation of photosynthesis and respiration: A key uncertainty in the carbon cycle-climate feedback

    Science.gov (United States)

    Lombardozzi, Danica L.; Bonan, Gordon B.; Smith, Nicholas G.; Dukes, Jeffrey S.; Fisher, Rosie A.

    2015-10-01

    Earth System Models typically use static responses to temperature to calculate photosynthesis and respiration, but experimental evidence suggests that many plants acclimate to prevailing temperatures. We incorporated representations of photosynthetic and leaf respiratory temperature acclimation into the Community Land Model, the terrestrial component of the Community Earth System Model. These processes increased terrestrial carbon pools by 20 Pg C (22%) at the end of the 21st century under a business-as-usual (Representative Concentration Pathway 8.5) climate scenario. Including the less certain estimates of stem and root respiration acclimation increased terrestrial carbon pools by an additional 17 Pg C (~40% overall increase). High latitudes gained the most carbon with acclimation, and tropical carbon pools increased least. However, results from both of these regions remain uncertain; few relevant data exist for tropical and boreal plants or for extreme temperatures. Constraining these uncertainties will produce more realistic estimates of land carbon feedbacks throughout the 21st century.

  13. Optimum design of forging process parameters and preform shape under uncertainties

    International Nuclear Information System (INIS)

    Repalle, Jalaja; Grandhi, Ramana V.

    2004-01-01

    Forging is a highly complex non-linear process that is vulnerable to various uncertainties, such as variations in billet geometry, die temperature, material properties, workpiece and forging equipment positional errors and process parameters. A combination of these uncertainties could induce heavy manufacturing losses through premature die failure, final part geometric distortion and production risk. Identifying the sources of uncertainties, quantifying and controlling them will reduce risk in the manufacturing environment, which will minimize the overall cost of production. In this paper, various uncertainties that affect forging tool life and preform design are identified, and their cumulative effect on the forging process is evaluated. Since the forging process simulation is computationally intensive, the response surface approach is used to reduce time by establishing a relationship between the system performance and the critical process design parameters. Variability in system performance due to randomness in the parameters is computed by applying Monte Carlo Simulations (MCS) on generated Response Surface Models (RSM). Finally, a Robust Methodology is developed to optimize forging process parameters and preform shape. The developed method is demonstrated by applying it to an axisymmetric H-cross section disk forging to improve the product quality and robustness
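
A sketch of the Monte Carlo-on-response-surface idea described above, with a hypothetical quadratic surface and invented input statistics rather than the paper's actual forging model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical quadratic response surface standing in for the expensive forging
# simulation: e.g. maximum die stress as a function of billet-temperature
# deviation x1 and friction deviation x2 (coefficients invented).
def response_surface(x1, x2):
    return 500.0 + 12.0 * x1 - 30.0 * x2 + 1.5 * x1**2 + 4.0 * x2**2 + 2.0 * x1 * x2

# Monte Carlo propagation of random input variation through the cheap surrogate.
x1 = rng.normal(0.0, 1.0, 100_000)
x2 = rng.normal(0.0, 0.5, 100_000)
y = response_surface(x1, x2)
print(f"mean: {y.mean():.1f}, std: {y.std():.1f}, 99th pct: {np.percentile(y, 99):.1f}")
```

Evaluating the surrogate 100,000 times costs milliseconds, which is the reason the paper fits response surface models before running MCS.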

  14. Photovoltaic Calibrations at the National Renewable Energy Laboratory and Uncertainty Analysis Following the ISO 17025 Guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Emery, Keith [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2016-09-01

    The measurement of photovoltaic (PV) performance with respect to reference conditions requires measuring current versus voltage for a given tabular reference spectrum, junction temperature, and total irradiance. This report presents the procedures implemented by the PV Cell and Module Performance Characterization Group at the National Renewable Energy Laboratory (NREL) to achieve the lowest practical uncertainty. A rigorous uncertainty analysis of these procedures is presented, which follows the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement. This uncertainty analysis is required for the team’s laboratory accreditation under ISO standard 17025, “General Requirements for the Competence of Testing and Calibration Laboratories.” The report also discusses additional areas where the uncertainty can be reduced.
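
A minimal GUM-style combination, with invented component values rather than NREL's actual budget: independent standard uncertainties are added in quadrature, then expanded with a coverage factor k = 2 (~95% coverage):

```python
import math

# Illustrative uncertainty components, in percent (standard uncertainties).
components = {"reference cell": 0.4, "spectral mismatch": 0.3,
              "temperature": 0.2, "area": 0.1}

# GUM root-sum-square combination and expansion.
u_c = math.sqrt(sum(u * u for u in components.values()))
U = 2 * u_c
print(f"combined: {u_c:.2f}%, expanded (k=2): {U:.2f}%")
```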

  15. MRS role in reducing technical uncertainties in geological disposal

    International Nuclear Information System (INIS)

    Ramspott, L.D.

    1990-06-01

    A high-level nuclear waste repository has inherent technical uncertainty due to its first-of-a-kind nature and the unprecedented time over which it must function. Three possible technical modifications to the currently planned US high-level nuclear waste system are reviewed in this paper. These modifications would be facilitated by inclusion of a monitored retrievable storage (MRS) in the system. The modifications are (1) an underground MRS at Yucca Mountain, (2) a phased repository, and (3) a 'cold' repository. These modifications are intended to enhance scientific confidence that a repository system would function satisfactorily despite technical uncertainty. 12 refs

  16. A universal reduced glass transition temperature for liquids

    Science.gov (United States)

    Fedors, R. F.

    1979-01-01

    Data on the dependence of the glass transition temperature on the molecular structure for low-molecular-weight liquids are analyzed in order to determine whether Boyer's reduced glass transition temperature (1952) is a universal constant as proposed. It is shown that the Boyer ratio varies widely depending on the chemical nature of the molecule. It is pointed out that a characteristic temperature ratio, defined by the ratio of the sum of the melting temperature and the boiling temperature to the sum of the glass transition temperature and the boiling temperature, is a universal constant independent of the molecular structure of the liquid. The average value of the ratio obtained from data for 65 liquids is 1.15.
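
The characteristic ratio defined in this record, (Tm + Tb)/(Tg + Tb), is easy to evaluate for any liquid. Using approximate literature values for glycerol (illustrative only; absolute temperatures in kelvin):

```python
# Approximate characteristic temperatures of glycerol, in K:
# glass transition Tg, melting Tm, boiling Tb (rounded literature values).
Tg, Tm, Tb = 190.0, 291.0, 563.0

# Fedors' characteristic ratio, reported to average ~1.15 over 65 liquids.
ratio = (Tm + Tb) / (Tg + Tb)
print(f"characteristic ratio: {ratio:.3f}")
```

For these values the ratio comes out near 1.13, i.e. within a few percent of the proposed universal average of 1.15.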

  17. Stream temperature estimated in situ from thermal-infrared images: best estimate and uncertainty

    International Nuclear Information System (INIS)

    Iezzi, F; Todisco, M T

    2015-01-01

    The paper aims to show a technique to estimate the stream temperature in situ from thermal-infrared images, examining its best estimate and uncertainty. Stream temperature is an important indicator of water quality, and nowadays its assessment is particularly important for monitoring thermal pollution in water bodies. Stream temperature changes are due especially to anthropogenic heat input from urban wastewater and from water used as a coolant by power plants and industrial manufacturers. The assessment of stream temperature using ordinary techniques (e.g. appropriate thermometers) is limited by sparse sampling in space, owing to a necessarily punctual spatial discretization. The latest and most advanced techniques assess stream temperature using thermal-infrared remote sensing based on thermal imagers usually placed on aircraft, or using satellite images. These techniques assess only the surface water temperature and are suitable for detecting the temperature of vast water bodies, but they do not allow a detailed and precise surface water temperature assessment in limited areas of the water body. The technique shown in this research is based on the assessment of thermal-infrared images obtained in situ via a portable thermal imager. As in all thermographic techniques, only the surface water temperature can be estimated. A stream with a discharge of urban wastewater is proposed as a case study to validate the technique and to show its application limits. Since the technique analyzes areas of limited extension within the water body, it allows a detailed and precise assessment of the water temperature. In general, the punctual and average stream temperatures are respectively uncorrected and corrected. An appropriate statistical method that minimizes the errors in the average stream temperature is proposed. The correct measurement of this temperature through the assessment of thermal-infrared images obtained in situ via portable

  18. Simultaneous determination of reference free-stream temperature and convective heat transfer coefficients

    International Nuclear Information System (INIS)

    Jeong, Gi Ho; Song, Ki Bum; Kim, Kui Soon

    2001-01-01

    This paper deals with the development of a new method that can obtain the heat transfer coefficient and the reference free-stream temperature simultaneously. The method is based on transient heat transfer experiments using two narrow-band thermochromic liquid crystals (TLCs). The method is validated through error analysis in terms of the random uncertainties in the measured temperatures. It is shown how the uncertainties in the heat transfer coefficient and free-stream temperature can be reduced. The general method described in this paper is applicable to many heat transfer models with unknown free-stream temperature.
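
A sketch of how two temperature events can pin down both unknowns, assuming a 1-D semi-infinite-solid transient conduction model (a common choice for TLC experiments, assumed here; material properties and "measurements" are invented):

```python
import math

k, alpha = 0.2, 1.2e-7         # W/m-K, m^2/s (plexiglass-like, illustrative)
T_i = 20.0                     # initial wall temperature, °C

def theta(t, h):
    # Dimensionless surface-temperature rise of a semi-infinite solid under
    # convection: 1 - exp(h^2 a t / k^2) * erfc(h * sqrt(a t) / k).
    b = h * math.sqrt(alpha * t) / k
    return 1.0 - math.exp(b * b) * math.erfc(b)

# Two synthetic "TLC events" (time, surface temperature), generated with
# h = 80 W/m^2-K and T_inf = 60 °C so the recovery can be checked.
h_true, T_inf_true = 80.0, 60.0
events = [(t, T_i + theta(t, h_true) * (T_inf_true - T_i)) for t in (5.0, 30.0)]

def residual(h):
    # T_inf implied by event 1 must also reproduce event 2.
    (t1, T1), (t2, T2) = events
    T_inf = T_i + (T1 - T_i) / theta(t1, h)
    return T_i + theta(t2, h) * (T_inf - T_i) - T2

# Bisection on h (residual is monotone decreasing over this bracket).
lo, hi = 10.0, 300.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if residual(lo) * residual(mid) <= 0:
        hi = mid
    else:
        lo = mid
h_est = 0.5 * (lo + hi)
t1, T1 = events[0]
T_inf_est = T_i + (T1 - T_i) / theta(t1, h_est)
print(f"h = {h_est:.1f} W/m2K, T_inf = {T_inf_est:.1f} °C")
```

With one measurement the problem is underdetermined; the second event is what makes the simultaneous determination possible.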

  19. A framework to quantify uncertainties of seafloor backscatter from swath mapping echosounders

    Science.gov (United States)

    Malik, Mashkoor; Lurton, Xavier; Mayer, Larry

    2018-06-01

    Multibeam echosounders (MBES) have become a widely used acoustic remote sensing tool to map and study the seafloor, providing co-located bathymetry and seafloor backscatter. Although the uncertainty associated with MBES-derived bathymetric data has been studied extensively, the question of backscatter uncertainty has been addressed only minimally, which hinders the quantitative use of MBES seafloor backscatter. This paper explores approaches to identifying the uncertainty sources associated with MBES-derived backscatter measurements. The major sources of uncertainty are catalogued and the magnitudes of their relative contributions to the backscatter uncertainty budget are evaluated. These major uncertainty sources include seafloor insonified area (1-3 dB), absorption coefficient (up to > 6 dB), random fluctuations in echo level (5.5 dB for a Rayleigh distribution), and sonar calibration (device dependent). The magnitudes of these uncertainty sources vary based on how these effects are compensated for during data acquisition and processing. Various cases (no compensation, partial compensation and full compensation) for seafloor insonified area, transmission losses and random fluctuations were modeled to estimate the uncertainties in different scenarios. Uncertainty related to the seafloor insonified area can be reduced significantly by accounting for seafloor slope during backscatter processing, while transmission losses can be constrained by collecting full water column absorption coefficient profiles (temperature and salinity profiles). To reduce random fluctuations to below 1 dB, it is recommended that at least 20 samples be used when computing mean values. The estimation of uncertainty in backscatter measurements is constrained by the fact that not all instrumental components are characterized and documented sufficiently for commercially available MBES. Further involvement from manufacturers in providing this essential information is critically required.
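
Why roughly 20 samples bring the random fluctuation below ~1 dB can be checked by simulation: the intensity of a Rayleigh-amplitude echo is exponentially distributed, and averaging N intensity samples before converting to dB shrinks the spread of the level estimate:

```python
import numpy as np

rng = np.random.default_rng(4)

def level_std_db(n_avg, n_trials=20_000):
    # Average n_avg exponential intensity samples (Rayleigh amplitude, mean 1),
    # convert the averages to dB, and return the spread of the level estimates.
    intensity = rng.exponential(1.0, (n_trials, n_avg)).mean(axis=1)
    return (10 * np.log10(intensity)).std()

s1, s20 = level_std_db(1), level_std_db(20)
print(f"std of level, 1 sample:   {s1:.1f} dB")
print(f"std of level, 20 samples: {s20:.1f} dB")
```

The single-sample spread reproduces the ~5.5 dB Rayleigh figure quoted in the record, and the 20-sample average falls just below 1 dB.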

  20. Do regional methods really help reduce uncertainties in flood frequency analyses?

    Science.gov (United States)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods to two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis incorporating the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to simulate a large number of discharge series with characteristics similar to the observed ones (type of statistical distributions, number of sites and records) to evaluate the extent to which the results obtained on these case studies can be generalized. These two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, these results show that the incorporation of information on extreme events, either historical flood events at gauged
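    The trade-off described above can be illustrated with a toy Monte Carlo experiment (a sketch assuming a perfectly homogeneous region and a simple method-of-moments Gumbel fit, not the authors' exact simulation design):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 1.0 - 1.0 / 100.0                  # 100-year flood
true_q = -np.log(-np.log(p))           # quantile of a standard Gumbel

def gumbel_quantile(sample, p):
    # Method-of-moments Gumbel fit: scale from std dev, location from mean
    beta = sample.std(ddof=1) * np.sqrt(6.0) / np.pi
    mu = sample.mean() - 0.5772 * beta
    return mu - beta * np.log(-np.log(p))

local_err, regional_err = [], []
for _ in range(2000):
    # 10 perfectly homogeneous sites, 30 years of record each
    data = rng.gumbel(loc=0.0, scale=1.0, size=(10, 30))
    local_err.append(gumbel_quantile(data[0], p) - true_q)
    regional_err.append(gumbel_quantile(data.ravel(), p) - true_q)

spread_local = np.std(local_err)        # wide: single short record
spread_regional = np.std(regional_err)  # much narrower: pooled records
```

    In the ideal homogeneous case pooling shrinks the estimation spread roughly by sqrt(10); the case studies show how quickly undetected heterogeneity erodes this gain.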

  1. Reduced dose uncertainty in MRI-based polymer gel dosimetry using parallel RF transmission with multiple RF sources

    International Nuclear Information System (INIS)

    Sang-Young Kim; Jung-Hoon Lee; Jin-Young Jung; Do-Wan Lee; Seu-Ran Lee; Bo-Young Choe; Hyeon-Man Baek; Korea University of Science and Technology, Daejeon; Dae-Hyun Kim; Jung-Whan Min; Ji-Yeon Park

    2014-01-01

    In this work, we present the feasibility of using a parallel RF transmit with multiple RF sources imaging method (MultiTransmit imaging) in polymer gel dosimetry. Image quality and B1 field homogeneity were statistically better with the MultiTransmit imaging method than with the conventional single-source RF transmission imaging method. In particular, the standard uncertainty of R2 was lower on the MultiTransmit images than on the conventional images. Furthermore, the MultiTransmit measurement showed improved dose resolution. Improved image quality and B1 homogeneity result in reduced dose uncertainty, thereby suggesting the feasibility of MultiTransmit MR imaging in gel dosimetry. (author)

  2. One Strategy for Reducing Uncertainty in Climate Change Communications

    Science.gov (United States)

    Romm, J.

    2011-12-01

    Future impacts of climate change are invariably presented with a very wide range of impacts reflecting two different sets of uncertainties. The first concerns our uncertainty about precisely how much greenhouse gas humanity will emit into the atmosphere. The second concerns our uncertainty about precisely what impact those emissions will have on the climate. By failing to distinguish between these two types of uncertainties, climate scientists have not clearly explained to the public and policymakers what the scientific literature suggests is likely to happen if we don't substantially alter our current emissions path. Indeed, much of climate communications has been built around describing the range of impacts from emissions paths that are increasingly implausible given political and technological constraints, such as stabilization at 450 or 550 parts per million of atmospheric carbon dioxide. For the past decade, human emissions of greenhouse gases have trended near the worst-case scenarios of the Intergovernmental Panel on Climate Change, emissions paths that reach 800 ppm or even 1000 ppm. The current policies of the two biggest emitters, the United States and China, coupled with the ongoing failure of international negotiations to come to an agreement on restricting emissions, suggest that recent trends will continue for the foreseeable future. This in turn suggests that greater clarity in climate change communications could be achieved by more clearly explaining to the public the range of impacts that the scientific literature suggests for our current high-emissions path. It also suggests that more focus should be given in the scientific literature to better constraining the range of impacts from high-emissions scenarios.

  3. Integrating geophysics and hydrology for reducing the uncertainty of groundwater model predictions and improved prediction performance

    DEFF Research Database (Denmark)

    Christensen, Nikolaj Kruse; Christensen, Steen; Ferre, Ty

    A major purpose of groundwater modeling is to help decision-makers in efforts to manage the natural environment. Increasingly, it is recognized that both the predictions of interest and their associated uncertainties should be quantified to support robust decision making. In particular, decision... ...the integration of geophysical data in the construction of a groundwater model increases the prediction performance. We suggest that modelers should perform a hydrogeophysical “test-bench” analysis of the likely value of geophysics data for improving groundwater model prediction performance before actually... ...and the resulting predictions can be compared with predictions from the ‘true’ model. By performing this analysis we expect to give the modeler insight into how the uncertainty of model-based prediction can be reduced.

  4. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-01-01

    Uncertainties of computer results are of primary interest in applications such as high-level waste (HLW) repository performance assessment in which experimental validation is not possible or practical. This work presents an alternate deterministic approach for calculating uncertainties that has the potential to significantly reduce the number of computer runs required for conventional statistical analysis. 7 refs., 1 fig
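    The appeal of a deterministic approach, a handful of sensitivity runs in place of many sampled runs, can be sketched on a toy model (an illustrative first-order propagation, not the specific method of the report):

```python
import numpy as np

def model(x):
    # Stand-in for an expensive performance-assessment code
    return x[0] ** 2 + 3.0 * x[1] + np.sin(x[2])

x0 = np.array([1.0, 2.0, 0.5])    # nominal inputs
sig = np.array([0.1, 0.2, 0.05])  # input standard deviations (independent)

# Deterministic first-order propagation: one extra run per input for the
# finite-difference sensitivities, instead of thousands of sampled runs.
h = 1e-6
grad = np.array([(model(x0 + h * e) - model(x0)) / h for e in np.eye(3)])
u_det = np.sqrt(np.sum((grad * sig) ** 2))

# Brute-force Monte Carlo reference for comparison
rng = np.random.default_rng(2)
samples = x0[:, None] + sig[:, None] * rng.standard_normal((3, 50000))
u_mc = model(samples).std()
```

    Four model evaluations reproduce, to first order, the output standard deviation that the 50 000-run Monte Carlo estimates; the saving is what motivates deterministic uncertainty analysis for codes too expensive to sample.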

  5. Reduced temperature phase diagrams of the silver-rare earths binary systems

    International Nuclear Information System (INIS)

    Ferro, R.; Delfino, S.; Capelli, R.; Borsese, A.

    1975-01-01

    Phase equilibria of the silver-rare earth binary systems have been reported in ''reduced temperature'' diagrams (the ''reduced temperature'' being defined as the ratio between a characteristic temperature of the Agsub(x)R.E. phase and the melting temperature of the corresponding R.E. metal, both in K). The smooth trends of the various characteristic reduced temperatures, when plotted against the R.E. atomic number, have been demonstrated. On passing from the light- to the heavy-rare-earths, a correlation has been found between the crossing of these curves and other phenomena, such as the disappearance of the Agsub(5)R.E. phases and the change from incongruently to congruently melting compounds. The trends of the reduced-temperature curves have been briefly discussed in terms of the treatment suggested by Gschneidner together with the volumetric data known for the different Agsub(x)R.E. phases. In addition, the characteristic data of the 1:1 AgR.E. compounds have been compared with those of the analogous AuR.E. phases. (Auth.)

  6. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are pdfs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
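    A minimal sketch of the quadrature route for a beta-distributed input: the escape-probability model below is a hypothetical one-parameter stand-in (the study itself treats the first 21 resonances of 232Th), and a Monte Carlo run cross-checks the quadrature mean.

```python
import numpy as np
from math import gamma as G

# Hypothetical escape-probability model (illustration only)
def p_escape(width):
    return np.exp(-0.8 * width)

# Resonance width assumed Beta(4, 4)-distributed on [0.9, 1.1]
a, b, lo, hi = 4.0, 4.0, 0.9, 1.1
norm = G(a + b) / (G(a) * G(b))

def pdf(x):
    t = (x - lo) / (hi - lo)
    return norm * t ** (a - 1) * (1 - t) ** (b - 1) / (hi - lo)

# 16-point Gauss-Legendre quadrature: 16 model evaluations stand in
# for a large random sample.
xg, wg = np.polynomial.legendre.leggauss(16)
xq = lo + (hi - lo) * (xg + 1.0) / 2.0
wq = wg * (hi - lo) / 2.0 * pdf(xq)
q_mean = np.sum(wq * p_escape(xq))

# Monte Carlo cross-check
rng = np.random.default_rng(3)
samples = lo + (hi - lo) * rng.beta(a, b, size=200000)
mc_mean = p_escape(samples).mean()
```

    Quadrature (and, similarly, a polynomial chaos expansion in the beta-distributed variable) converges with far fewer model evaluations than plain sampling, which is the viability argument made in the abstract.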

  7. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    ...more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models and to quantify the uncertainties in the estimated property values from a process design point-of-view. This includes: (i) parameter estimation using... ...The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column... ...the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce the most critical uncertainties.

  8. Uncertainties in Climatological Seawater Density Calculations

    Science.gov (United States)

    Dai, Hao; Zhang, Xining

    2018-03-01

    In most applications, with seawater conductivity, temperature, and pressure data measured in situ by various observation instruments, e.g., Conductivity-Temperature-Depth (CTD) instruments, the density, which has strong ties to ocean dynamics, is computed according to equations of state for seawater. This paper, based on density computational formulae in the Thermodynamic Equation of Seawater 2010 (TEOS-10), follows the Guide to the Expression of Uncertainty in Measurement (GUM) and assesses the main sources of uncertainties. By virtue of climatological decades-average temperature/Practical Salinity/pressure data sets in the global ocean provided by the National Oceanic and Atmospheric Administration (NOAA), correlation coefficients between uncertainty sources are determined and the combined standard uncertainties uc(ρ) in seawater density calculations are evaluated. For grid points in the world ocean with 0.25° resolution, the standard deviations of uc(ρ) in vertical profiles are of the order of 10−4 kg m−3. The uc(ρ) means in vertical profiles of the Baltic Sea are about 0.028 kg m−3 due to the larger scatter of the Absolute Salinity anomaly. The distribution of the uc(ρ) means in vertical profiles of the world ocean except for the Baltic Sea, which covers the range (0.004, 0.01) kg m−3, is related to the correlation coefficient r(SA, p) between Absolute Salinity SA and pressure p. The results in the paper are based on the measurement uncertainties of high-accuracy CTD sensors. Larger uncertainties in density calculations may arise with lower-specification sensors. This work may provide valuable uncertainty information required for reliability considerations of ocean circulation and global climate models.
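    The GUM combination with one correlated input pair can be sketched as follows. The linear equation of state and all numbers below are illustrative stand-ins (the paper uses the full TEOS-10 formulae and NOAA climatology-derived correlations):

```python
import numpy as np

# Simplified linear equation of state; coefficients are illustrative
# stand-ins for the full TEOS-10 formulation.
rho0, aT, bS, cP = 1027.0, -0.15, 0.78, 4.5e-3

def rho(S, T, p):
    return rho0 + bS * (S - 35.0) + aT * (T - 10.0) + cP * p

# Assumed standard uncertainties of a high-accuracy CTD
u_S, u_T, u_p = 0.002, 0.002, 1.0   # g/kg, degC, dbar
r_Sp = 0.3                          # assumed S-p correlation coefficient

# Sensitivity coefficients (analytic for the linear model)
dS, dT, dp = bS, aT, cP

# GUM law of propagation of uncertainty with one correlated pair (S, p)
u_c = np.sqrt((dS * u_S) ** 2 + (dT * u_T) ** 2 + (dp * u_p) ** 2
              + 2.0 * r_Sp * (dS * u_S) * (dp * u_p))
```

    With these assumed inputs u_c lands inside the (0.004, 0.01) kg m−3 range quoted for the open ocean; the cross term shows why the S-p correlation matters for the geographic pattern of uc(ρ).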

  9. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics that make many sources of uncertainty difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore the question of how to adopt an overall alternative attitude to uncertainty, one that accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, ignoring of early warning signs, policy and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales – from adaptive management at the local scale to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  10. Calibrating airborne measurements of airspeed, pressure and temperature using a Doppler laser air-motion sensor

    Directory of Open Access Journals (Sweden)

    W. A. Cooper

    2014-09-01

    A new laser air-motion sensor measures the true airspeed with a standard uncertainty of less than 0.1 m s−1 and so reduces uncertainty in the measured component of the relative wind along the longitudinal axis of the aircraft to about the same level. The calculated pressure expected from that airspeed at the inlet of a pitot tube then provides a basis for calibrating the measurements of dynamic and static pressure, reducing standard uncertainty in those measurements to less than 0.3 hPa and the precision applicable to steady flight conditions to about 0.1 hPa. These improved measurements of pressure, combined with high-resolution measurements of geometric altitude from the global positioning system, then indicate (via integrations of the hydrostatic equation during climbs and descents) that the offset and uncertainty in temperature measurement for one research aircraft are +0.3 ± 0.3 °C. For airspeed, pressure and temperature, these are significant reductions in uncertainty vs. those obtained from calibrations using standard techniques. Finally, it is shown that although the initial calibration of the measured static and dynamic pressures requires a measured temperature, once calibrated these measured pressures and the measurement of airspeed from the new laser air-motion sensor provide a measurement of temperature that does not depend on any other temperature sensor.
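    The temperature check via the hydrostatic equation amounts to the following (a sketch using the dry-air gas constant; combining hydrostatic balance with the ideal gas law gives dz/d(ln p) = -R*T/g, so a layer-mean temperature follows from calibrated pressure and GPS altitude alone):

```python
import numpy as np

g, R = 9.80665, 287.05  # gravity (m s^-2), dry-air gas constant (J kg^-1 K^-1)

def layer_mean_temperature(p1, p2, z1, z2):
    # Hydrostatic balance + ideal gas law: T = -(g/R) * dz / d(ln p)
    return -g * (z2 - z1) / (R * np.log(p2 / p1))

# Synthetic climb through an isothermal 260 K layer
T_true = 260.0
p1, z1 = 700.0, 3000.0  # hPa, m
z2 = 4000.0
p2 = p1 * np.exp(-g * (z2 - z1) / (R * T_true))  # hydrostatically consistent

T_est = layer_mean_temperature(p1, p2, z1, z2)
```

    Comparing such hydrostatically derived temperatures with the aircraft's sensor is what yields the +0.3 ± 0.3 °C offset quoted above.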

  11. Visualizing uncertainty : Towards a better understanding of weather forecasts

    NARCIS (Netherlands)

    Toet, A.; Tak, S.; Erp, J.B.F. van

    2016-01-01

    Uncertainty visualizations are increasingly used in communications to the general public. A well-known example is the weather forecast. Rather than providing an exact temperature value, weather forecasts often show the range in which the temperature will lie. But uncertainty visualizations are also

  12. Isotopic effects in the neon fixed point: uncertainty of the calibration data correction

    Science.gov (United States)

    Steur, Peter P. M.; Pavese, Franco; Fellmuth, Bernd; Hermier, Yves; Hill, Kenneth D.; Seog Kim, Jin; Lipinski, Leszek; Nagao, Keisuke; Nakano, Tohru; Peruzzi, Andrea; Sparasci, Fernando; Szmyrka-Grzebyk, Anna; Tamura, Osamu; Tew, Weston L.; Valkiers, Staf; van Geel, Jan

    2015-02-01

    The neon triple point is one of the defining fixed points of the International Temperature Scale of 1990 (ITS-90). Although recognizing that natural neon is a mixture of isotopes, the ITS-90 definition only states that the neon should be of ‘natural isotopic composition’, without any further requirements. A preliminary study in 2005 indicated that most of the observed variability in the realized neon triple point temperatures within a range of about 0.5 mK can be attributed to the variability in isotopic composition among different samples of ‘natural’ neon. Based on the results of an International Project (EUROMET Project No. 770), the Consultative Committee for Thermometry decided to improve the realization of the neon fixed point by assigning the ITS-90 temperature value 24.5561 K to neon with the isotopic composition recommended by IUPAC, accompanied by a quadratic equation to take the deviations from the reference composition into account. In this paper, the uncertainties of the equation are discussed and an uncertainty budget is presented. The resulting standard uncertainty due to the isotopic effect (k = 1) after correction of the calibration data is reduced to (4 to 40) μK when using neon of ‘natural’ isotopic composition or to 30 μK when using 20Ne. For comparison, an uncertainty component of 0.15 mK should be included in the uncertainty budget for the neon triple point if the isotopic composition is unknown, i.e. whenever the correction cannot be applied.

  13. Multi-model analysis of terrestrial carbon cycles in Japan: reducing uncertainties in model outputs among different terrestrial biosphere models using flux observations

    Science.gov (United States)

    Ichii, K.; Suzuki, T.; Kato, T.; Ito, A.; Hajima, T.; Ueyama, M.; Sasai, T.; Hirata, R.; Saigusa, N.; Ohtani, Y.; Takagi, K.

    2009-08-01

    Terrestrial biosphere models show large uncertainties when simulating carbon and water cycles, and reducing these uncertainties is a priority for developing more accurate estimates of both terrestrial ecosystem statuses and future climate changes. To reduce uncertainties and improve the understanding of these carbon budgets, we investigated the ability of flux datasets to improve model simulations and reduce variabilities among multi-model outputs of terrestrial biosphere models in Japan. Using 9 terrestrial biosphere models (Support Vector Machine-based regressions, TOPS, CASA, VISIT, Biome-BGC, DAYCENT, SEIB, LPJ, and TRIFFID), we conducted two simulations: (1) point simulations at four flux sites in Japan and (2) spatial simulations for Japan with a default model (based on original settings) and an improved model (based on calibration using flux observations). Generally, models using default model settings showed large deviations in model outputs from observation with large model-by-model variability. However, after we calibrated the model parameters using flux observations (GPP, RE and NEP), most models successfully simulated seasonal variations in the carbon cycle, with less variability among models. We also found that interannual variations in the carbon cycle are mostly consistent among models and observations. Spatial analysis also showed a large reduction in the variability among model outputs, and model calibration using flux observations significantly improved the model outputs. These results show that to reduce uncertainties among terrestrial biosphere models, we need to conduct careful validation and calibration with available flux observations. Flux observation data significantly improved terrestrial biosphere models, not only on a point scale but also on spatial scales.
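    The calibration step can be sketched with a deliberately tiny stand-in model (a hypothetical light-use-efficiency relation GPP = eps * PAR with synthetic "flux tower" data, not any of the nine models used in the study):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical light-use-efficiency model standing in for a terrestrial
# biosphere model's photosynthesis scheme.
par = rng.uniform(5.0, 40.0, size=200)                  # "observed" radiation
gpp_obs = 0.02 * par + rng.normal(0.0, 0.05, size=200)  # "flux tower" GPP

eps_default = 0.035  # default (uncalibrated) parameter setting
# Least-squares calibration against the flux observations
eps_cal = np.sum(par * gpp_obs) / np.sum(par * par)

def rmse(eps):
    return np.sqrt(np.mean((eps * par - gpp_obs) ** 2))
```

    The default parameter overestimates GPP systematically; fitting against the observations removes that bias, which is the per-site analogue of the multi-model variance reduction reported above.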

  14. Results from the Application of Uncertainty Methods in the CSNI Uncertainty Methods Study (UMS)

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Methods Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges.

  15. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  16. Uncertainty in solid precipitation and snow depth prediction for Siberia using the Noah and Noah-MP land surface models

    Science.gov (United States)

    Suzuki, Kazuyoshi; Zupanski, Milija

    2018-01-01

    In this study, we investigate the uncertainties associated with land surface processes in an ensemble prediction context. Specifically, we compare the uncertainties produced by a coupled atmosphere-land modeling system with two different land surface models, the Noah-MP land surface model (LSM) and the Noah LSM, by using the Maximum Likelihood Ensemble Filter (MLEF) data assimilation system as a platform for ensemble prediction. We carried out 24-hour prediction simulations in Siberia with 32 ensemble members beginning at 00:00 UTC on 5 March 2013. We then compared the model prediction uncertainty of snow depth and solid precipitation with observation-based research products and evaluated the standard deviation of the ensemble spread. The prediction skill and ensemble spread exhibited high positive correlation for both LSMs, indicating a realistic uncertainty estimation. The inclusion of a multiple snow-layer model in the Noah-MP LSM was beneficial for reducing the uncertainties of snow depth and snow depth change compared to the Noah LSM, but the uncertainty in daily solid precipitation showed minimal difference between the two LSMs. The impact of LSM choice in reducing temperature uncertainty was limited to surface layers of the atmosphere. In summary, we found that the more sophisticated Noah-MP LSM reduces uncertainties associated with land surface processes compared to the Noah LSM. Thus, using prediction models with improved skill implies improved predictability and greater certainty of prediction.
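    The spread-skill diagnostic used above can be illustrated with a synthetic ensemble (a sketch: a well-calibrated 32-member ensemble whose per-point spread varies, so spread and error magnitude correlate positively):

```python
import numpy as np

rng = np.random.default_rng(5)
n_members, n_points = 32, 500

# Synthetic truth and an ensemble whose spread varies by location
true_spread = rng.uniform(0.2, 2.0, size=n_points)  # per-point uncertainty
truth = rng.normal(0.0, 1.0, size=n_points)
ens = truth + true_spread * rng.standard_normal((n_members, n_points))

ens_mean = ens.mean(axis=0)
ens_spread = ens.std(axis=0, ddof=1)
abs_error = np.abs(ens_mean - truth)

# A realistic uncertainty estimate shows positive spread-skill correlation
r = np.corrcoef(ens_spread, abs_error)[0, 1]
```

    A correlation well above zero, as obtained here, is the "realistic uncertainty estimation" signature the study reports for both LSMs.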

  17. Vacuum Radiance-Temperature Standard Facility for Infrared Remote Sensing at NIM

    Science.gov (United States)

    Hao, X. P.; Song, J.; Xu, M.; Sun, J. P.; Gong, L. Y.; Yuan, Z. D.; Lu, X. F.

    2018-06-01

    As infrared remote sensors are very important parts of Earth observation satellites, they must be calibrated against the radiance temperature of a blackbody in a vacuum chamber prior to launch. The uncertainty of this temperature is thus an essential component of the sensors' uncertainty. This paper describes the vacuum radiance-temperature standard facility (VRTSF) at the National Institute of Metrology of China, which will serve to calibrate infrared remote sensors on Chinese meteorological satellites. The VRTSF is used to calibrate the radiance temperature of vacuum blackbodies, including those used to calibrate infrared remote sensors. The components of the VRTSF are described in this paper, including the VMTBB, the LNBB, the FTIR spectrometer, the reduced-background optical system, the vacuum chamber used to calibrate customer blackbodies, the vacuum-pumping system and the liquid-nitrogen-support system. The experimental methods and results are described. The uncertainty of the radiance temperature of the VMTBB is 0.026 °C at 30 °C over 10 μm.

  18. Uncertainty Propagation in OMFIT

    Science.gov (United States)

    Smith, Sterling; Meneghini, Orso; Sung, Choongki

    2017-10-01

    A rigorous comparison of power balance fluxes and turbulent model fluxes requires the propagation of uncertainties in the kinetic profiles and their derivatives. Making extensive use of the python uncertainties package, the OMFIT framework has been used to propagate covariant uncertainties to provide an uncertainty in the power balance calculation from the ONETWO code, as well as through the turbulent fluxes calculated by the TGLF code. The covariant uncertainties arise from fitting 1D (constant on flux surface) density and temperature profiles and associated random errors with parameterized functions such as a modified tanh. The power balance and model fluxes can then be compared with quantification of the uncertainties. No effort is made at propagating systematic errors. A case study will be shown for the effects of resonant magnetic perturbations on the kinetic profiles and fluxes at the top of the pedestal. A separate attempt at modeling the random errors with Monte Carlo sampling will be compared to the method of propagating the fitting function parameter covariant uncertainties. Work supported by US DOE under DE-FC02-04ER54698, DE-FG2-95ER-54309, DE-SC 0012656.
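    The covariant-propagation step can be sketched without the `uncertainties` package by applying the Jacobian "sandwich" rule directly. The simplified pedestal shape, parameter values, and covariance below are all hypothetical stand-ins for OMFIT's modified-tanh fit:

```python
import numpy as np

# Simplified pedestal profile standing in for a modified tanh:
# f(x) = h/2 * (1 - tanh((x - x0) / w))
def profile(x, p):
    h, x0, w = p
    return 0.5 * h * (1.0 - np.tanh((x - x0) / w))

p_hat = np.array([1.0, 0.95, 0.05])          # hypothetical fitted parameters
cov = np.diag([0.02, 0.005, 0.004]) ** 2     # hypothetical covariance
cov[1, 2] = cov[2, 1] = 0.5 * 0.005 * 0.004  # assumed x0-w correlation

x = np.linspace(0.85, 1.05, 5)

# Jacobian of the profile w.r.t. the parameters by finite differences
step = 1e-7
J = np.stack([(profile(x, p_hat + step * e) - profile(x, p_hat)) / step
              for e in np.eye(3)], axis=1)

# Sandwich rule: pointwise profile variance diag(J C J^T)
u_f = np.sqrt(np.einsum('ij,jk,ik->i', J, cov, J))
```

    The uncertainty peaks near the steep pedestal region, where the profile is most sensitive to x0; the `uncertainties` package automates exactly this linear propagation, including the covariant cross terms.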

  19. The trade-off between short- and long-lived greenhouse gases under uncertainty and learning

    International Nuclear Information System (INIS)

    Aaheim, H. Asbjoern; Brekke, Kjell Arne; Lystad, Terje; Torvanger, Asbjoern

    2001-01-01

    To find an optimal climate policy we must balance abatement of different greenhouse gases. There is substantial uncertainty about future damages from climate change, but we will learn more over the next few decades. Gases vary in terms of how long they remain in the atmosphere, which means that equivalent pulse emissions have very different climate impacts. Such differences between gases are important in consideration of uncertainty and learning about future damages, but they are disregarded by the conventional concept of Global Warming Potential. We have developed a numerical model to analyze how uncertainty and learning affect optimal emissions of both CO2 and CH4. In the model, emissions of these greenhouse gases lead to global temperature increases and production losses. New information about the severity of the climate problem arrives either in 2010 or in 2020. We find that uncertainty causes increased optimal abatement of both gases, compared to the certainty case. This effect amounts to 0.08 °C less expected temperature increase by year 2200. Learning leads to less abatement for both gases since expected future marginal damages from emissions are reduced. This effect is less pronounced for the short-lived CH4. (author)
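    Why learning matters less for the short-lived gas can be seen from pulse decay alone (a sketch: single-lifetime exponential decay, with a 12-year CH4 lifetime and an illustrative single 300-year timescale for CO2, whose real decay is multi-modal):

```python
import numpy as np

# One-timescale pulse decay; the 300-year CO2 lifetime is illustrative.
tau_ch4, tau_co2 = 12.0, 300.0  # years

def fraction_remaining(t, tau):
    return np.exp(-t / tau)

# Fifty years after a pulse, almost no CH4 remains while most of the
# CO2 is still airborne, so information about damages arriving at that
# horizon changes optimal abatement less for the short-lived gas.
ch4_left = fraction_remaining(50.0, tau_ch4)
co2_left = fraction_remaining(50.0, tau_co2)
```

    This asymmetry is exactly what a single Global Warming Potential number discards, and what the paper's numerical model retains.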

  20. Uncertainty analysis of light water reactor unit fuel pin cells

    Energy Technology Data Exchange (ETDEWEB)

    Kamerow, S.; Ivanov, K., E-mail: sln107@PSU.EDU, E-mail: kni1@PSU.EDU [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, PA (United States); Moreno, C. Arenas, E-mail: cristina.arenas@UPC.EDU [Department of Physics and Nuclear Engineering, Technical University of Catalonia, Barcelona (Spain)

    2011-07-01

    The study explored the calculation of uncertainty based on available covariance data and computational tools. Uncertainties due to temperature changes and different fuel compositions are the main focus of this analysis. Selected unit fuel pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-1D sequence in SCALE 6.0. It was found that uncertainties increase with increasing temperature while keff decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor of uncertainty, namely the nuclide reaction 238U (n, gamma). The sensitivity grew larger as the capture cross-section of 238U expanded due to Doppler broadening. In addition, three different fuel cell compositions (UOx, MOx, and UOxGd2O3) were analyzed. A remarkable increase in the uncertainty in keff was found for the MOx and UOxGd2O3 fuel cells. The increase in the uncertainty of keff in UOxGd2O3 fuel was nearly twice that in MOx fuel and almost four times that in UOx fuel. The components of the uncertainties in keff in each case were examined and it was found that the neutron-nuclide reaction of 238U, mainly (n,n'), contributed the most to the uncertainties in the MOx and UOxGd2O3 cases. At higher energy, the covariance matrix of the 238U (n,n') cross-section showed very large values. Further examination of the UOxGd2O3 case found that 238U (n,n') became the dominant contributor to the uncertainty because most of the thermal neutrons in the cell were absorbed by gadolinium, thus shifting the neutron spectrum to higher energy. For the MOx case, on the other hand, 239Pu has a very strong absorption cross-section at low energy
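    The propagation behind such results is the "sandwich rule" used by TSUNAMI-style analyses: var(keff) = S C S^T, with S the relative sensitivities of keff to each cross section and C their relative covariance matrix. The sketch below uses purely illustrative numbers, not evaluated nuclear data:

```python
import numpy as np

# Illustrative sensitivity vector and relative covariance matrix
S = np.array([-0.12, 0.05, -0.03])  # e.g. capture, fission, inelastic
C = np.array([[0.0025, 0.0,    0.0],
              [0.0,    0.0009, 0.0],
              [0.0,    0.0,    0.0100]])

u_k = np.sqrt(S @ C @ S)  # relative uncertainty in keff

# Doppler broadening at higher temperature raises the magnitude of the
# capture sensitivity; the propagated uncertainty grows with it.
S_hot = S * np.array([1.3, 1.0, 1.0])
u_k_hot = np.sqrt(S_hot @ C @ S_hot)
```

    The same mechanism explains the composition effect: gadolinium or 239Pu absorption shifts the spectrum, enlarging the sensitivity to cross sections with large covariances and hence the keff uncertainty.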

  1. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    Science.gov (United States)

    Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.

    2018-01-01

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  2. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. We also seek to identify the data types that best reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany, an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Monte-Carlo simulations with these models are then used to estimate the predictive uncertainty. Results indicate that uncertainty in HETT is relatively small at early times and increases with transit time; that uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; and that introducing more data to a poor model structure may reduce predictive variance but does not reduce predictive bias. Hydraulic head observations alone cannot constrain the uncertainty of HETT; an estimate of the hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model (`virtual reality') is then developed based on that conceptual model
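The virtual-reality workflow described above can be sketched in a few lines: a complex reference model supplies the "truth", a simpler surrogate is run under Monte-Carlo parameter sampling, and predictive variance and bias are read off the resulting ensemble. The surrogate function, parameter distribution, and all numerical values below are hypothetical stand-ins, not the HydroGeoSphere setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "virtual reality": the true transit time (days) at one
# observation point, taken from the complex reference model.
true_transit_time = 12.0

# Simplified surrogate model: transit time as a function of one uncertain
# log-conductivity parameter (illustrative only, not HydroGeoSphere).
def surrogate_transit_time(log_k):
    return 10.0 * np.exp(-0.5 * log_k)

# Monte-Carlo sampling over the prior parameter uncertainty.
log_k = rng.normal(loc=0.0, scale=0.5, size=5000)
predictions = surrogate_transit_time(log_k)

# Predictive variance can shrink with more data, but the bias reflects the
# simplified model structure and does not.
predictive_variance = predictions.var()
predictive_bias = predictions.mean() - true_transit_time

print(f"predictive variance: {predictive_variance:.2f}")
print(f"predictive bias:     {predictive_bias:.2f}")
```

The split between variance and bias is the point of the paper's finding: adding data to a structurally poor model attacks only the first term.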

  3. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    Full Text Available The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, including pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributing reaction, namely the neutron capture reaction 238U(n, γ), due to Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in kinf was observed for the case of MOX fuel, nearly twice the corresponding value in UOX fuel. The neutron-nuclide reaction of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, whose neutron spectrum is shifted to higher energy compared to the UOX fuel.
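Uncertainty propagation of the kind performed by TSUNAMI follows the first-order "sandwich rule", in which the relative variance of kinf is obtained from a vector of sensitivity coefficients and a cross-section covariance matrix. A minimal sketch, with hypothetical sensitivity and covariance values for two 238U reactions (not the benchmark data):

```python
import numpy as np

# Illustrative relative sensitivity coefficients of k-eff with respect to two
# 238U reaction cross sections (values are hypothetical, not benchmark output).
S = np.array([-0.25, -0.05])  # 238U(n,gamma), 238U(n,n')

# Hypothetical relative covariance matrix of the same two cross sections.
C = np.array([[4.0e-4, 1.0e-5],
              [1.0e-5, 9.0e-4]])

# First-order "sandwich rule": relative variance of k-eff = S^T C S.
rel_var_k = S @ C @ S
rel_unc_k = np.sqrt(rel_var_k)

print(f"relative standard uncertainty in k-eff: {100 * rel_unc_k:.3f} %")
```

The abstract's observation that uncertainty grows with temperature while kinf falls corresponds here to the capture sensitivity (first entry of S) growing in magnitude under Doppler broadening.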

  4. Scenario and modelling uncertainty in global mean temperature change derived from emission driven Global Climate Models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D.

    2012-09-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission driven rather than concentration driven perturbed parameter ensemble of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration driven simulations (with 10-90 percentile ranges of 1.7 K for the aggressive mitigation scenario up to 3.9 K for the high end business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission driven experiments, they do not change existing expectations (based on previous concentration driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. In both the SRES A1B case and the Representative Concentration Pathways (RCPs), the concentration pathways used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration driven experiments to under-sample strong feedback responses. Our ensemble of emission driven simulations spans the global temperature response of other multi-model frameworks except at the low end, where combinations of low climate sensitivity and low carbon cycle feedbacks lead to responses outside our ensemble range. The ensemble simulates a number of high end responses which lie above the CMIP5 carbon
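The 10-90th percentile ranges quoted above are straightforward to compute from an ensemble of projected warming values. A minimal sketch with a synthetic, skewed ensemble (the distribution and values are illustrative, not the GCM output):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical ensemble of end-of-century warming (K) for one scenario;
# a skewed distribution stands in for the emission-driven ensemble spread.
warming = rng.lognormal(mean=1.0, sigma=0.35, size=1000)

p10, p90 = np.percentile(warming, [10, 90])
range_10_90 = p90 - p10

print(f"10th-90th percentile range: {p10:.2f} K to {p90:.2f} K "
      f"(width {range_10_90:.2f} K)")
```

A right-skewed distribution like this one is why a "small minority" of ensemble members can sit far above the central range without moving the percentiles much.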

  5. Scenario and modelling uncertainty in global mean temperature change derived from emission-driven global climate models

    Science.gov (United States)

    Booth, B. B. B.; Bernie, D.; McNeall, D.; Hawkins, E.; Caesar, J.; Boulton, C.; Friedlingstein, P.; Sexton, D. M. H.

    2013-04-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10-90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. In both the SRES A1B case and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end: combinations of low climate sensitivity and low carbon cycle feedbacks cause a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high-end responses which lie

  6. Scenario and modelling uncertainty in global mean temperature change derived from emission-driven global climate models

    Directory of Open Access Journals (Sweden)

    B. B. B. Booth

    2013-04-01

    Full Text Available We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a global climate model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concentration-driven simulations (with 10–90th percentile ranges of 1.7 K for the aggressive mitigation scenario, up to 3.9 K for the high-end, business as usual scenario). A small minority of simulations resulting from combinations of strong atmospheric feedbacks and carbon cycle responses show temperature increases in excess of 9 K (RCP8.5) and, even under aggressive mitigation (RCP2.6), temperatures in excess of 4 K. While the simulations point to much larger temperature ranges for emission-driven experiments, they do not change existing expectations (based on previous concentration-driven experiments) on the timescales over which different sources of uncertainty are important. The new simulations sample a range of future atmospheric concentrations for each emission scenario. In both the SRES A1B case and the Representative Concentration Pathways (RCPs), the concentration scenarios used to drive GCM ensembles lie towards the lower end of our simulated distribution. This design decision (a legacy of previous assessments) is likely to lead concentration-driven experiments to under-sample strong feedback responses in future projections. Our ensemble of emission-driven simulations spans the global temperature response of the CMIP5 emission-driven simulations, except at the low end: combinations of low climate sensitivity and low carbon cycle feedbacks cause a number of CMIP5 responses to lie below our ensemble range. The ensemble simulates a number of high

  7. Planck 2013 results. III. LFI systematic uncertainties

    CERN Document Server

    Aghanim, N; Arnaud, M; Ashdown, M; Atrio-Barandela, F; Aumont, J; Baccigalupi, C; Banday, A J; Barreiro, R B; Battaner, E; Benabed, K; Benoît, A; Benoit-Lévy, A; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bobin, J; Bock, J J; Bonaldi, A; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Bridges, M; Bucher, M; Burigana, C; Butler, R C; Cardoso, J -F; Catalano, A; Chamballu, A; Chiang, L -Y; Christensen, P R; Church, S; Colombi, S; Colombo, L P L; Crill, B P; Cruz, M; Curto, A; Cuttaia, F; Danese, L; Davies, R D; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Dick, J; Dickinson, C; Diego, J M; Dole, H; Donzelli, S; Doré, O; Douspis, M; Dupac, X; Efstathiou, G; Enßlin, T A; Eriksen, H K; Finelli, F; Forni, O; Frailis, M; Franceschi, E; Gaier, T C; Galeotta, S; Ganga, K; Giard, M; Giraud-Héraud, Y; Gjerløw, E; González-Nuevo, J; Górski, K M; Gratton, S; Gregorio, A; Gruppuso, A; Hansen, F K; Hanson, D; Harrison, D; Henrot-Versillé, S; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hivon, E; Hobson, M; Holmes, W A; Hornstrup, A; Hovest, W; Huffenberger, K M; Jaffe, T R; Jaffe, A H; Jewell, J; Jones, W C; Juvela, M; Kangaslahti, P; Keihänen, E; Keskitalo, R; Kiiveri, K; Kisner, T S; Knoche, J; Knox, L; Kunz, M; Kurki-Suonio, H; Lagache, G; Lähteenmäki, A; Lamarre, J -M; Lasenby, A; Laureijs, R J; Lawrence, C R; Leahy, J P; Leonardi, R; Lesgourgues, J; Liguori, M; Lilje, P B; Lindholm, V; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Macías-Pérez, J F; Maino, D; Mandolesi, N; Maris, M; Marshall, D J; Martin, P G; Martínez-González, E; Masi, S; Matarrese, S; Matthai, F; Mazzotta, P; Meinhold, P R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Naselsky, P; Natoli, P; Netterfield, C B; Nørgaard-Nielsen, H U; Novikov, D; Novikov, I; O'Dwyer, I J; Osborne, S; Paci, F; Pagano, L; Paladini, R; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, 
D; Peel, M; Perdereau, O; Perotto, L; Perrotta, F; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Platania, P; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Poutanen, T; Pratt, G W; Prézeau, G; Prunet, S; Puget, J -L; Rachen, J P; Rebolo, R; Reinecke, M; Remazeilles, M; Ricciardi, S; Riller, T; Rocha, G; Rosset, C; Rossetti, M; Roudier, G; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Scott, D; Seiffert, M D; Shellard, E P S; Spencer, L D; Starck, J -L; Stolyarov, V; Stompor, R; Sureau, F; Sutton, D; Suur-Uski, A -S; Sygnet, J -F; Tauber, J A; Tavagnacco, D; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Tuovinen, J; Türler, M; Umana, G; Valenziano, L; Valiviita, J; Van Tent, B; Varis, J; Vielva, P; Villa, F; Vittorio, N; Wade, L A; Wandelt, B D; Watson, R; Wilkinson, A; Yvon, D; Zacchei, A; Zonca, A

    2014-01-01

    We present the current estimate of instrumental and systematic effect uncertainties for the Planck Low Frequency Instrument relevant to the first release of the Planck cosmological results. We give an overview of the main effects and of the tools and methods applied to assess residuals in maps and power spectra. We also present an overall budget of known systematic effect uncertainties, which is dominated by sidelobe straylight pick-up and imperfect calibration. However, even these two effects are at least two orders of magnitude weaker than the cosmic microwave background (CMB) fluctuations as measured in terms of the angular temperature power spectrum. A residual signal above the noise level is present in the multipole range $\\ell<20$, most notably at 30 GHz, and is likely caused by residual Galactic straylight contamination. Current analysis aims to further reduce the level of spurious signals in the data and to improve the systematic effects modelling, in particular with respect to straylight and calibra...

  8. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)

  9. Chemical kinetic model uncertainty minimization through laminar flame speed measurements

    Science.gov (United States)

    Park, Okjoo; Veloo, Peter S.; Sheen, David A.; Tao, Yujie; Egolfopoulos, Fokion N.; Wang, Hai

    2016-01-01

    Laminar flame speed measurements were carried out for mixtures of air with eight C3-C4 hydrocarbons (propene, propane, 1,3-butadiene, 1-butene, 2-butene, iso-butene, n-butane, and iso-butane) at room temperature and ambient pressure. Along with C1-C2 hydrocarbon data reported in a recent study, the entire dataset was used to demonstrate how laminar flame speed data can be utilized to explore and minimize the uncertainties in a reaction model for foundation fuels. The USC Mech II kinetic model was chosen as a case study. The method of uncertainty minimization using polynomial chaos expansions (MUM-PCE) (D.A. Sheen and H. Wang, Combust. Flame 2011, 158, 2358–2374) was employed to constrain the model uncertainty for laminar flame speed predictions. Results demonstrate that a reaction model constrained only by the laminar flame speed values of methane/air flames notably reduces the uncertainty in the predictions of the laminar flame speeds of C3 and C4 alkanes, because the key chemical pathways of all of these flames are similar to each other. The uncertainty in model predictions for flames of unsaturated C3-C4 hydrocarbons remains significant without fuel-specific laminar flame speeds in the constraining target data set, because the secondary rate-controlling reaction steps differ from those in the saturated alkanes. It is shown that the constraints provided by the laminar flame speeds of the foundation fuels could notably reduce the uncertainties in the predictions of laminar flame speeds of C4 alcohol/air mixtures. Furthermore, it is demonstrated that an accurate prediction of the laminar flame speed of a particular C4 alcohol/air mixture is better achieved through measurements for key molecular intermediates formed during the pyrolysis and oxidation of the parent fuel. PMID:27890938

  10. Iterative Boltzmann plot method for temperature and pressure determination in a xenon high pressure discharge lamp

    Energy Technology Data Exchange (ETDEWEB)

    Zalach, J.; Franke, St. [INP Greifswald, Felix-Hausdorff-Str. 2, 17489 Greifswald (Germany)

    2013-01-28

    The Boltzmann plot method allows plasma temperatures and pressures to be calculated when absolutely calibrated emission coefficients of spectral lines are available. However, xenon arcs are not well suited to this kind of analysis, as only a limited number of lines with known atomic data are available, and these lines have high excitation energies confined to a small interval between 9.8 and 11.5 eV. Uncertainties in the experimental method and in the atomic data further limit the accuracy of the evaluation procedure. This may result in implausible values of temperature and pressure with inadmissible uncertainty. To overcome these shortcomings, an iterative scheme is proposed that makes use of additional information about the xenon fill pressure. The method proves robust against noisy data and significantly reduces the uncertainties. Intentionally distorted synthetic data are used to illustrate the performance of the method, and measurements performed on a laboratory xenon high pressure discharge lamp are analyzed, yielding reasonable temperatures and pressures with significantly reduced uncertainties.
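The standard (non-iterative) Boltzmann plot underlying the method fits the logarithm of the scaled emission coefficient against the upper-level excitation energy; the slope is -1/(k_B T). A sketch with synthetic, noise-free lines spanning the narrow 9.8-11.5 eV interval mentioned above (with real, noisy data this narrow energy span is exactly what makes the fitted slope, and hence the temperature, so uncertain):

```python
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant, eV/K

# Synthetic emission data for a hypothetical 11000 K plasma: upper-level
# energies E (eV) in the narrow xenon interval, and emission coefficients
# already divided by the g*A line factors (arbitrary units).
T_true = 11000.0
E = np.array([9.8, 10.2, 10.7, 11.1, 11.5])
eps_scaled = np.exp(-E / (K_B_EV * T_true))

# Boltzmann plot: ln(eps_scaled) vs. E is linear with slope -1/(k_B*T).
slope, intercept = np.polyfit(E, np.log(eps_scaled), 1)
T_fit = -1.0 / (K_B_EV * slope)

print(f"fitted temperature: {T_fit:.0f} K")
```

The paper's iterative scheme adds an external constraint (the known fill pressure) on top of this fit; the sketch shows only the baseline plot that the iteration refines.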

  11. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  12. A reduced low-temperature electro-thermal coupled model for lithium-ion batteries

    International Nuclear Information System (INIS)

    Jiang, Jiuchun; Ruan, Haijun; Sun, Bingxiang; Zhang, Weige; Gao, Wenzhong; Wang, Le Yi; Zhang, Linjing

    2016-01-01

    Highlights: • A reduced low-temperature electro-thermal coupled model is proposed. • A novel frequency-dependent equation for polarization parameters is presented. • The model is validated under different frequency and low-temperature conditions. • The reduced model exhibits a high accuracy with a low computational effort. • The adaptability of the proposed methodology for model reduction is verified. - Abstract: A low-temperature electro-thermal coupled model, based on the electrochemical mechanism, is developed to accurately capture both the electrical and thermal behaviors of batteries. Activation energies reveal that the temperature dependence of the resistances is greater than that of the capacitances. The influence of frequency on polarization voltage and irreversible heat is discussed, and the frequency dependence of the polarization resistance and capacitance is obtained. Based on the frequency-dependent equation, a reduced low-temperature electro-thermal coupled model is proposed and experimentally validated under different temperature, frequency and amplitude conditions. Simulation results exhibit good agreement with experimental data, with maximum relative voltage error and temperature error below 2.65% and 1.79 °C, respectively. The reduced model is demonstrated to have almost the same accuracy as the original model while requiring a lower computational effort. The effectiveness and adaptability of the proposed methodology for model reduction are verified using batteries with three different cathode materials from different manufacturers. The reduced model, thanks to its high accuracy and simplicity, provides a promising candidate for the development of rapid internal heating and optimal charging strategies at low temperature, and for evaluation of battery state of health in on-board battery management systems.
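The temperature dependence of the polarization resistance implied by the activation energies can be illustrated with an Arrhenius-type expression. All parameter values below are hypothetical, chosen only to show the strong growth of resistance as temperature drops; they are not fitted to the paper's cells.

```python
import numpy as np

R_GAS = 8.314  # universal gas constant, J/(mol*K)

# Hypothetical Arrhenius parameters for a polarization resistance.
R_ref = 5.0e-3   # ohm, resistance at the reference temperature (assumed)
T_ref = 298.15   # K, reference temperature
E_a = 35000.0    # J/mol, activation energy (assumed)

def polarization_resistance(T):
    """Arrhenius-type increase of resistance as temperature T (K) drops."""
    return R_ref * np.exp(E_a / R_GAS * (1.0 / T - 1.0 / T_ref))

for T_c in (25.0, 0.0, -20.0):
    T = T_c + 273.15
    print(f"{T_c:6.1f} C -> {polarization_resistance(T) * 1e3:8.2f} mOhm")
```

A larger activation energy for resistances than for capacitances, as the abstract reports, means the resistive terms dominate the low-temperature behavior in a model of this form.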

  13. The Uncertainty Multiplier and Business Cycles

    OpenAIRE

    Saijo, Hikaru

    2013-01-01

    I study a business cycle model where agents learn about the state of the economy by accumulating capital. During recessions, agents invest less, and this generates noisier estimates of macroeconomic conditions and an increase in uncertainty. The endogenous increase in aggregate uncertainty further reduces economic activity, which in turn leads to more uncertainty, and so on. Thus, through changes in uncertainty, learning gives rise to a multiplier effect that amplifies business cycles. I use ...

  14. A study on the temperature distribution in the hot leg pipe

    International Nuclear Information System (INIS)

    Choe, Yoon-Jae; Baik, Se-Jin; Jang, Ho-Cheol; Lee, Byung-Jin; Im, In-Young; Ro, Tae-Sun

    2003-01-01

    In the hot leg pipes of the reactor coolant system of the Korean Standard Nuclear Power Plant (KSNP), a non-uniform temperature distribution has been observed across the cross-section. It is attributed to the non-uniformity of the power distribution in the reactor core, which usually peaks in the center region, and to the colder coolant bypass flow through the reactor vessel outlet nozzle clearances. As a result, the arithmetic mean temperature of the four Resistance Temperature Detectors (RTDs) installed in each hot leg - two in the upper region and two in the lower region around the pipe wall - may not correctly represent the actual coolant bulk temperature. It is also believed that there is a skewness in the velocity profile in the hot leg pipe due to the sudden changes in flow direction and area from the core to the hot leg pipe, through the reactor vessel outlet plenum. The temperature non-uniformity and velocity skewness affect the measurement of plant parameters such as the reactor coolant flow rate, which is calculated using the bulk temperature of the hot leg pipes. A computational analysis has been performed to simulate the temperature and velocity distributions and to evaluate the uncertainty of the temperature correction offset in the hot leg pipe. A commercial CFD code, FLUENT, is used for this analysis. The analysis results are compared with the operational data of the KSNP and with the scaled-down model test data for System 80. From the comparisons, an uncertainty of the correction offset is obtained so that the bulk temperature of the hot leg can be measured more accurately; this can also be applied to the operating plants, leading to a reduction of the temperature measurement uncertainty. Since the uncertainty of the temperature in the hot leg pipe is one of the major parameters in calculating the uncertainty of the reactor coolant flow rate, the analysis results can contribute to the improvement of plant performance and safety by reducing the uncertainty of temperature measurement

  15. Reducing consistency in human realism increases the uncanny valley effect; increasing category uncertainty does not.

    Science.gov (United States)

    MacDorman, Karl F; Chattopadhyay, Debaleena

    2016-01-01

    Human replicas may elicit unintended cold, eerie feelings in viewers, an effect known as the uncanny valley. Masahiro Mori, who proposed the effect in 1970, attributed it to inconsistencies in the replica's realism with some of its features perceived as human and others as nonhuman. This study aims to determine whether reducing realism consistency in visual features increases the uncanny valley effect. In three rounds of experiments, 548 participants categorized and rated humans, animals, and objects that varied from computer animated to real. Two sets of features were manipulated to reduce realism consistency. (For humans, the sets were eyes-eyelashes-mouth and skin-nose-eyebrows.) Reducing realism consistency caused humans and animals, but not objects, to appear eerier and colder. However, the predictions of a competing theory, proposed by Ernst Jentsch in 1906, were not supported: The most ambiguous representations-those eliciting the greatest category uncertainty-were neither the eeriest nor the coldest. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Assessing the Expected Value of Research Studies in Reducing Uncertainty and Improving Implementation Dynamics.

    Science.gov (United States)

    Grimm, Sabine E; Dixon, Simon; Stevens, John W

    2017-07-01

    With low implementation of cost-effective health technologies being a problem in many health systems, it is worth considering the potential effects of research on implementation at the time of health technology assessment. Meaningful and realistic implementation estimates must be dynamic in nature. We extend existing methods for assessing the value of research studies in terms of both reduction of uncertainty and improvement in implementation by considering diffusion based on expert beliefs, with and without further research, conditional on the strength of evidence. We use the expected value of sample information and expected value of specific implementation measure concepts, accounting for the effects of specific research studies on implementation and on the reduction of uncertainty. Diffusion theory and elicitation of expert beliefs about the shape of diffusion curves inform the implementation dynamics. We illustrate the use of the resulting dynamic expected value of research for a preterm birth screening technology, and results are compared with those from a static analysis. Allowing for diffusion based on expert beliefs had a significant impact on the expected value of research in the case study, suggesting that mistakes are made where static implementation levels are assumed. Incorporating the effects of research on implementation resulted in an increase in the expected value of research compared to the expected value of sample information alone. Assessing the expected value of research in reducing uncertainty and improving implementation dynamics has the potential to complement currently used analyses in health technology assessments, especially in recommendations for further research. The combination of expected value of research, diffusion theory, and elicitation described in this article is an important addition to the existing methods of health technology assessment.
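Diffusion curves of the kind elicited from experts are commonly parameterized as logistic functions of time. A minimal sketch comparing hypothetical uptake curves with and without further research (the ceilings, midpoints, and rates are invented placeholders, not elicited values):

```python
import numpy as np

def logistic_diffusion(t, ceiling, midpoint, rate):
    """Logistic uptake curve: fraction of eligible patients receiving the
    technology at time t (years). In practice the three parameters would
    come from expert elicitation."""
    return ceiling / (1.0 + np.exp(-rate * (t - midpoint)))

years = np.arange(0, 11)

# Hypothetical elicited curves: uptake without vs. with further research.
uptake_without = logistic_diffusion(years, ceiling=0.6, midpoint=5.0, rate=0.8)
uptake_with = logistic_diffusion(years, ceiling=0.9, midpoint=3.0, rate=1.0)

# Extra implementation attributable to the research, year by year; this
# difference is what feeds a dynamic expected-value-of-research calculation.
extra_uptake = uptake_with - uptake_without
print(np.round(extra_uptake, 3))
```

Assuming a static implementation level amounts to replacing both curves with horizontal lines, which is the simplification the abstract argues against.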

  17. Explaining Delusions: Reducing Uncertainty Through Basic and Computational Neuroscience.

    Science.gov (United States)

    Feeney, Erin J; Groman, Stephanie M; Taylor, Jane R; Corlett, Philip R

    2017-03-01

    Delusions, the fixed false beliefs characteristic of psychotic illness, have long defied understanding despite their response to pharmacological treatments (e.g., D2 receptor antagonists). However, it can be challenging to discern what makes beliefs delusional compared with other unusual or erroneous beliefs. We suggest mapping the putative biology to clinical phenomenology with a cognitive psychology of belief, culminating in a teleological approach to beliefs and brain function supported by animal and computational models. We argue that organisms strive to minimize uncertainty about their future states by forming and maintaining a set of beliefs (about the organism and the world) that are robust, but flexible. If uncertainty is generated endogenously, beliefs begin to depart from consensual reality and can manifest into delusions. Central to this scheme is the notion that formal associative learning theory can provide an explanation for the development and persistence of delusions. Beliefs, in animals and humans, may be associations between representations (e.g., of cause and effect) that are formed by minimizing uncertainty via new learning and attentional allocation. Animal research has equipped us with a deep mechanistic basis of these processes, which is now being applied to delusions. This work offers the exciting possibility of completing revolutions of translation, from the bedside to the bench and back again. The more we learn about animal beliefs, the more we may be able to apply to human beliefs and their aberrations, enabling a deeper mechanistic understanding. © The Author 2017. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  18. The trade-off between short- and long-lived greenhouse gases under uncertainty and learning

    Energy Technology Data Exchange (ETDEWEB)

    Aaheim, H. Asbjoern; Brekke, Kjell Arne; Lystad, Terje; Torvanger, Asbjoern

    2001-07-01

    To find an optimal climate policy we must balance abatement of different greenhouse gases. There is substantial uncertainty about future damages from climate change, but we will learn more over the next few decades. Gases vary in terms of how long they remain in the atmosphere, which means that equivalent pulse emissions have very different climate impacts. Such differences between gases are important in consideration of uncertainty and learning about future damages, but they are disregarded by the conventional concept of Global Warming Potential We have developed a numerical model to analyze how uncertainty and learning affect optimal emissions of both CO{sub 2} and CH{sub 4}. In the model, emissions of these greenhouse gases lead to global temperature increases and production losses. New information about the severity of the climate problem arrives either in 2010 or in 2020. We find that uncertainty causes increased optimal abatement of both gases, compared to the certainty case. This effect amounts to 0.08 {sup o}C less expected temperature increase by year 2200. Learning leads to less abatement for both gases since expected future marginal damages from emissions are reduced. This effect is less pronounced for the short-lived CH{sub 4}. (author)

  19. Implicit Treatment of Technical Specification and Thermal Hydraulic Parameter Uncertainties in Gaussian Process Model to Estimate Safety Margin

    Directory of Open Access Journals (Sweden)

    Douglas A. Fynan

    2016-06-01

    Full Text Available The Gaussian process model (GPM) is a flexible surrogate model that can be used for nonparametric regression in multivariate problems. A unique feature of the GPM is that a prediction variance is automatically provided along with the regression function. In this paper, we estimate the safety margin of a nuclear power plant by performing regression on the output of best-estimate simulations of a large-break loss-of-coolant accident, with sampling of safety system configuration, sequence timing, technical specification, and thermal hydraulic parameter uncertainties. The key aspect of our approach is that the GPM regression is performed only on the dominant input variables - the safety injection flow rate and the delay time for AC-powered pumps to start (representing sequence timing uncertainty) - providing a predictive model for the peak clad temperature during the reflood phase. The other uncertainties are interpreted as contributors to the measurement noise of the code output and are treated implicitly in the noise variance term of the GPM, providing local uncertainty bounds for the peak clad temperature. We discuss the applicability of the foregoing method to reducing the use of conservative assumptions in best estimate plus uncertainty (BEPU) and Level 1 probabilistic safety assessment (PSA) success criteria definitions while dealing with a large number of uncertainties.
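A bare-bones version of this approach can be sketched with a squared-exponential Gaussian process in which the implicitly treated uncertainties enter only through the noise variance term. All training points and hyperparameters below are invented for illustration; they are not the paper's simulation data.

```python
import numpy as np

def rbf_kernel(a, b, length_scale, variance):
    """Squared-exponential covariance between two 1-D input arrays."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / length_scale**2)

# Hypothetical training data: peak clad temperature (K) versus one dominant
# input (normalized safety injection flow rate). All other uncertain inputs
# are folded into the observation noise, mirroring the paper's idea.
x_train = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
y_train = np.array([1450.0, 1380.0, 1300.0, 1240.0, 1200.0])

signal_var = 150.0**2  # prior variance of the regression function (assumed)
noise_var = 25.0**2    # variance attributed to the implicitly treated inputs

K = rbf_kernel(x_train, x_train, length_scale=0.3, variance=signal_var)
K_noisy = K + noise_var * np.eye(len(x_train))

# Posterior mean and variance of a noisy prediction at a new flow rate.
k_star = rbf_kernel(np.array([0.4]), x_train, length_scale=0.3,
                    variance=signal_var)[0]
mean = k_star @ np.linalg.solve(K_noisy, y_train)
var = signal_var + noise_var - k_star @ np.linalg.solve(K_noisy, k_star)

print(f"predicted PCT: {mean:.0f} K, local std: {np.sqrt(var):.0f} K")
```

The noise variance term is what turns the high-dimensional sampling problem into a tractable low-dimensional regression: every input left out of x contributes scatter rather than structure.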

  20. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 2: Ozone DIAL uncertainty budget

    Science.gov (United States)

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Godin-Beekmann, Sophie; Haefele, Alexander; Trickl, Thomas; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the ozone differential absorption lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One essential aspect of the proposed approach is the propagation in parallel of all independent uncertainty components through the data processing chain before they are combined to form the ozone combined standard uncertainty. The independent uncertainty components contributing to the overall budget include random noise associated with signal detection, uncertainty due to saturation correction, background noise extraction, the absorption cross sections of O3, NO2, SO2, and O2, the molecular extinction cross sections, and the number densities of air, NO2, and SO2. The expressions for the individual uncertainty components and their step-by-step propagation through the ozone differential absorption lidar (DIAL) processing chain are described in detail. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which requires knowledge of the covariance matrix when the lidar signal is vertically filtered. In addition, the covariance terms must be taken into account if the same detection hardware is shared by the lidar receiver channels at the absorbed and non-absorbed wavelengths. The ozone uncertainty budget is presented as much as possible in a generic form (i.e., as a function of instrument performance and wavelength) so that all NDACC ozone DIAL investigators across the network can estimate, for their own instrument and in a straightforward manner, the expected impact of each reviewed uncertainty component. 
In addition, two actual examples of full uncertainty budget are provided, using nighttime measurements from the tropospheric ozone DIAL located at the Jet Propulsion Laboratory (JPL) Table Mountain Facility, California, and nighttime measurements from the JPL
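
    The quadrature combination at the heart of such a budget can be sketched as follows; the component values and the correlation coefficient are hypothetical placeholders, not the NDACC numbers:

```python
import numpy as np

# Hypothetical standard uncertainties (in %) for one altitude bin of an
# ozone DIAL retrieval: detection noise, saturation correction,
# background extraction, absorption cross sections (values invented)
u = np.array([1.2, 0.4, 0.3, 0.5])

# If components are independent, combine in quadrature (root-sum-square)
u_independent = np.sqrt(np.sum(u**2))

# If two components are correlated (e.g. channels sharing detection
# hardware), a covariance term appears:
#   u_c^2 = u1^2 + u2^2 + 2*r*u1*u2  for correlation coefficient r
r = 0.5
u_correlated = np.sqrt(u[0]**2 + u[1]**2 + 2.0 * r * u[0] * u[1])

print(f"independent RSS of all components: {u_independent:.2f} %")
print(f"first two components with r={r}:   {u_correlated:.2f} %")
```

    Neglecting the covariance term when it is positive understates the combined uncertainty, which is why the budget requires the covariance matrix whenever hardware or vertical filtering correlates the components.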

  1. On treatment of uncertainty in system planning

    International Nuclear Information System (INIS)

    Flage, R.; Aven, T.

    2009-01-01

    In system planning and operation, considerable efforts and resources are spent to reduce uncertainties as part of project management, uncertainty management and safety management. The basic idea seems to be that uncertainties are purely negative and should be reduced. In this paper we challenge this way of thinking, using a common industry practice as an example. In accordance with this industry practice, three uncertainty interval categories are used: ±40% intervals for the feasibility phase, ±30% intervals for the concept development phase and ±20% intervals for the engineering phase. The problem is that such a regime could easily lead to a conservative management regime encouraging the use of existing methods and tools, as new activities and novel solutions and arrangements necessarily mean increased uncertainties. In the paper we suggest an alternative approach based on uncertainty and risk descriptions, but with no predefined uncertainty reduction structures. The approach makes use of risk assessments and economic optimisation tools such as the expected net present value, but acknowledges the need for broad risk management processes which extend beyond the analyses. Different concerns need to be balanced, including economic aspects, uncertainties and risk, and practicability.

  2. Evaluating uncertainties in regional climate simulations over South America at the seasonal scale

    Energy Technology Data Exchange (ETDEWEB)

    Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)

    2012-07-15

    This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America, while for the mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions dramatically reduces the internal variability. The uncertainty due to the domain choice displays a similar spatial pattern to the internal variability, except for the mean sea level pressure field, though its magnitude is larger over the whole model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)
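
    The spread metric used here is simply the standard deviation across ensemble members at each grid point, as in this toy sketch (synthetic data; the grid size and member count are invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy four-member ensemble of a seasonal-mean temperature field on a
# small lat-lon grid (deg C; purely synthetic values)
members = 20.0 + rng.normal(0.0, 1.5, size=(4, 10, 12))

# Spread = sample standard deviation across the ensemble dimension,
# evaluated separately at each grid point
spread = members.std(axis=0, ddof=1)

print("domain-mean spread:", round(float(spread.mean()), 2), "deg C")
```

    Comparing maps of this spread between the internal-variability, domain-choice, and perturbed-physics ensembles is what ranks the uncertainty sources in the study.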

  3. Bookending the Opportunity to Lower Wind’s LCOE by Reducing the Uncertainty Surrounding Annual Energy Production

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Energy Analysis and Environmental Impacts Div.

    2017-06-01

    Reducing the performance risk surrounding a wind project can potentially lead to a lower weighted-average cost of capital (WACC), and hence a lower levelized cost of energy (LCOE), through an advantageous shift in capital structure, and possibly also a reduction in the cost of capital. Specifically, a reduction in performance risk will move the 1-year P99 annual energy production (AEP) estimate closer to the P50 AEP estimate, which in turn reduces the minimum debt service coverage ratio (DSCR) required by lenders, thereby allowing the project to be financed with a greater proportion of low-cost debt. In addition, a reduction in performance risk might also reduce the cost of one or more of the three sources of capital that are commonly used to finance wind projects: sponsor or cash equity, tax equity, and/or debt. Preliminary internal LBNL analysis of the maximum possible LCOE reduction attainable from reducing the performance risk of a wind project found a potentially significant opportunity for LCOE reduction of ~$10/MWh, by reducing the P50 DSCR to its theoretical minimum value of 1.0 (Bolinger 2015b, 2014) and by reducing the cost of sponsor equity and debt by one-third to one-half each (Bolinger 2015a, 2015b). However, with FY17 funding from the U.S. Department of Energy’s Atmosphere to Electrons (A2e) Performance Risk, Uncertainty, and Finance (PRUF) initiative, LBNL has been revisiting this “bookending” exercise in more depth, and now believes that its earlier preliminary assessment of the LCOE reduction opportunity was overstated. This reassessment is based on two new-found understandings: (1) Due to ever-present and largely irreducible inter-annual variability (IAV) in the wind resource, the minimum required DSCR cannot possibly fall to 1.0 (on a P50 basis), and (2) A reduction in AEP uncertainty will not necessarily lead to a reduction in the cost of capital, meaning that a shift in capital structure is perhaps the best that can be expected (perhaps
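
    A stylized sketch of the P50/P99/DSCR mechanics, assuming normally distributed annual energy production; all numbers are hypothetical and the debt-sizing rule is simplified to "P99 revenue just covers debt service":

```python
from scipy.stats import norm

def p99(p50, cv):
    """1-year P99 AEP for normally distributed production with
    coefficient of variation cv (z for the 1st percentile is negative)."""
    return p50 * (1.0 + norm.ppf(0.01) * cv)

p50_aep = 350_000.0  # MWh/yr, hypothetical P50 estimate
for cv in (0.15, 0.10):  # total 1-year AEP uncertainty, including IAV
    p99_aep = p99(p50_aep, cv)
    # If lenders size debt so that P99 revenue just covers debt service,
    # the implied DSCR on a P50 basis is p50/p99, which exceeds 1.0
    # whenever cv > 0 -- i.e., irreducible IAV keeps the DSCR above 1.0
    dscr_p50 = p50_aep / p99_aep
    print(f"cv={cv:.0%}: P99={p99_aep:,.0f} MWh, implied P50 DSCR={dscr_p50:.2f}")
```

    Shrinking the AEP uncertainty moves P99 toward P50 and lowers the implied P50-basis DSCR, allowing more low-cost debt, but the DSCR cannot reach 1.0 while inter-annual variability remains.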

  4. Effects of Uncertainties in Hydrological Modelling. A Case Study of a Mountainous Catchment in Southern Norway

    Science.gov (United States)

    Engeland, Kolbjorn; Steinsland, Ingelin

    2016-04-01

    The aim of this study is to investigate how the inclusion of uncertainties in inputs and observed streamflow influences parameter estimation, streamflow predictions and model evaluation. In particular we wanted to answer the following research questions: • What is the effect of including a random error in the precipitation and temperature inputs? • What is the effect of decreased information about precipitation by excluding the nearest precipitation station? • What is the effect of the uncertainty in streamflow observations? • What is the effect of reduced information about the true streamflow by using a rating curve where the measurements of the highest and lowest streamflow are excluded when estimating the rating curve? To answer these questions, we designed a set of calibration experiments and evaluation strategies. We used the elevation-distributed HBV model operating on daily time steps, combined with a Bayesian formulation and the MCMC routine DREAM for parameter inference. The uncertainties in inputs were represented by creating ensembles of precipitation and temperature. The precipitation ensembles were created using a meta-Gaussian random field approach. The temperature ensembles were created using 3D Bayesian kriging with random sampling of the temperature lapse rate. The streamflow ensembles were generated by a Bayesian multi-segment rating curve model. Precipitation and temperatures were randomly sampled for every day, whereas the streamflow ensembles were generated from rating curve ensembles, and the same rating curve was always used for the whole time series in a calibration or evaluation run. We chose a catchment with a meteorological station measuring precipitation and temperature, and a rating curve of relatively high quality. This allowed us to investigate and further test the effect of having less information on precipitation and streamflow during model calibration, predictions and evaluation. The results showed that including uncertainty
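
    A drastically simplified stand-in for the input ensemble generation (the paper's meta-Gaussian random fields and 3D Bayesian kriging are replaced here by a sampled lapse rate and an independent lognormal multiplier; all values are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
n_days, n_members = 365, 100

# Synthetic station observations
t_station = 5.0 + 10.0 * np.sin(2 * np.pi * np.arange(n_days) / 365)  # deg C
p_station = rng.gamma(0.5, 4.0, n_days)                               # mm/day

dz = 800.0  # elevation difference, station -> catchment band (m, assumed)

# Temperature ensemble: sample an uncertain lapse rate per member and
# extrapolate the station series to the catchment elevation
lapse = rng.normal(-0.0065, 0.001, size=(n_members, 1))  # deg C per m
t_ens = t_station + lapse * dz

# Precipitation ensemble: independent multiplicative lognormal error
# per member and day (a crude 1-D stand-in for a random field)
p_err = rng.lognormal(mean=0.0, sigma=0.3, size=(n_members, n_days))
p_ens = p_station * p_err

print(t_ens.shape, p_ens.shape)  # (members, days) for each input
```

    Each calibration run then draws one member per input, so the parameter posterior reflects the input uncertainty rather than treating the forcings as exact.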

  5. Uncertainties associated with inertial-fusion ignition

    International Nuclear Information System (INIS)

    McCall, G.H.

    1981-01-01

    An estimate is made of a worst-case driving energy, derived from analytic and computer calculations. It will be shown that the uncertainty can be reduced by a factor of 10 to 100 if certain physical effects are understood. That is not to say that the energy requirement can necessarily be reduced below that of the worst case, but it is possible to reduce the uncertainty associated with the ignition energy. With laser costs in the $0.5 to 1 billion per MJ range, it can be seen that such an exercise is worthwhile.

  6. Reducing the uncertainty in the fidelity of seismic imaging results

    Science.gov (United States)

    Zhou, H. W.; Zou, Z.

    2017-12-01

    A key aspect of geoscientific inversion is quantifying the quality of the results. In seismic imaging, we must quantify the uncertainty of every imaging result based on field data, because data noise and methodology limitations may produce artifacts. Detection of artifacts is therefore an important aspect of uncertainty quantification in geoscientific inversion. Quantifying the uncertainty of seismic imaging solutions means assessing their fidelity, which defines the truthfulness of the imaged targets in terms of their resolution, position errors and artifacts. Key challenges to achieving fidelity in seismic imaging include: (1) difficulty in distinguishing signal from artifacts and noise; (2) limitations in signal-to-noise ratio and seismic illumination; and (3) the multi-scale nature of the data space and model space. Most seismic imaging studies of the Earth's crust and mantle have employed inversion or modeling approaches. Though they map in opposite directions between the data space and model space, both inversion and modeling seek the best model to minimize the misfit in the data space, which unfortunately is not the output space. The fact that the selection and uncertainty of the output model are not judged in the output space has exacerbated the nonuniqueness problem for inversion and modeling. In contrast, the practice in exploration seismology has long established a two-fold approach to seismic imaging: using velocity model building to establish the long-wavelength reference velocity models, and using seismic migration to map the short-wavelength reflectivity structures. Most interestingly, seismic migration maps the data into an output space called the imaging space, where the output reflection images of the subsurface are formed based on an imaging condition. A good example is reverse time migration, which seeks the reflectivity image as the best fit in the image space between the extrapolation of time-reversed waveform data and the prediction

  7. Why are agricultural impacts of climate change so uncertain? The importance of temperature relative to precipitation

    International Nuclear Information System (INIS)

    Lobell, David B; Burke, Marshall B

    2008-01-01

    Estimates of climate change impacts are often characterized by large uncertainties that reflect ignorance of many physical, biological, and socio-economic processes, and which hamper efforts to anticipate and adapt to climate change. A key to reducing these uncertainties is improved understanding of the relative contributions of individual factors. We evaluated uncertainties for projections of climate change impacts on crop production for 94 crop-region combinations that account for the bulk of calories consumed by malnourished populations. Specifically, we focused on the relative contributions of four factors: climate model projections of future temperature and precipitation, and the sensitivities of crops to temperature and precipitation changes. Surprisingly, uncertainties related to temperature represented a greater contribution to climate change impact uncertainty than those related to precipitation for most crops and regions, and in particular the sensitivity of crop yields to temperature was a critical source of uncertainty. These findings occurred despite rainfall's important contribution to year-to-year variability in crop yields and large disagreements among global climate models over the direction of future regional rainfall changes, and reflect the large magnitude of future warming relative to historical variability. We conclude that progress in understanding crop responses to temperature and the magnitude of regional temperature changes are two of the most important needs for climate change impact assessments and adaptation efforts for agriculture.

  8. Verification and uncertainty evaluation of HELIOS/MASTER nuclear design system

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jae Seung; Kim, J. C.; Cho, B. O. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-03-01

    A nuclear design system, HELIOS/MASTER, was established and core follow calculations were performed for Yonggwang Unit 1 cycles 1 through 7 and Yonggwang Unit 3 cycles 1 through 2. The accuracy of the HELIOS/MASTER system was evaluated by estimating the uncertainties of reactivity and peaking factors and by comparing the maximum differences of the isothermal temperature coefficient, inverse boron worth and control rod worth with the CASMO-3/MASTER uncertainties. The reactivity uncertainty was estimated to be 362 pcm, and the uncertainties of the three-dimensional, axially integrated radial, and planar peaking factors were evaluated to be 0.048, 0.034, and 0.044 in relative power units, respectively. The maximum differences of the isothermal temperature coefficient, inverse boron worth and control rod worth were within the CASMO-3/MASTER uncertainties. 17 refs., 17 figs., 10 tabs. (Author)

  9. Predicting Statistical Response and Extreme Events in Uncertainty Quantification through Reduced-Order Models

    Science.gov (United States)

    Qi, D.; Majda, A.

    2017-12-01

    A low-dimensional reduced-order statistical closure model is developed for quantifying the uncertainty in statistical sensitivity and intermittency in the principal model directions with largest variability in high-dimensional turbulent systems and turbulent transport models. Imperfect model sensitivity is improved through a recent mathematical strategy for calibrating model errors in a training phase, where information theory and linear statistical response theory are combined in a systematic fashion to achieve optimal model performance. The reduced-order method draws on a self-consistent mathematical framework for general systems with quadratic nonlinearity, where crucial high-order statistics are approximated by a systematic model calibration procedure. Model efficiency is improved through additional damping and noise corrections that replace the expensive energy-conserving nonlinear interactions. Model errors due to the imperfect nonlinear approximation are corrected by tuning the model parameters using linear response theory with an information metric in a training phase before prediction. A statistical energy principle is adopted to introduce a global scaling factor that characterizes the higher-order moments in a consistent way and improves model sensitivity. Stringent models of barotropic and baroclinic turbulence are used to display the feasibility of the reduced-order methods. Principal statistical responses in mean and variance can be captured by the reduced-order models with accuracy and efficiency. In addition, the reduced-order models are used to capture the crucial passive tracer field advected by the baroclinic turbulent flow. It is demonstrated that crucial principal statistical quantities, such as the tracer spectrum and fat tails in the tracer probability density functions at the most important large scales, can be captured efficiently and accurately using the reduced-order tracer model in various dynamical regimes of the flow field with

  10. Uncertainty analysis for hot channel

    International Nuclear Information System (INIS)

    Panka, I.; Kereszturi, A.

    2006-01-01

    The fulfillment of the safety analysis acceptance criteria is usually evaluated by separate hot channel calculations using the results of neutronic and/or thermal hydraulic system calculations. In case of an ATWS event (inadvertent withdrawal of a control assembly), according to the analysis, a number of fuel rods experience DNB for a longer time and must be regarded as failed. Their number must be determined for a further evaluation of the radiological consequences. In the deterministic approach, the global power history must be multiplied by different hot channel factors (kx) taking into account the radial power peaking factors for each fuel pin. If DNB occurs, it is necessary to perform a small number of hot channel calculations to determine the limiting kx leading just to DNB and fuel failure (the conservative DNBR limit is 1.33). Knowing the pin power distribution from the core design calculation, the number of failed fuel pins can be calculated. The above procedure can be performed with conservative assumptions (e.g. conservative input parameters in the hot channel calculations) as well. In case of hot channel uncertainty analysis, the relevant input parameters of the hot channel calculations (kx, mass flow, inlet temperature of the coolant, pin average burnup, initial gap size, selection of the power history influencing the gap conductance value) and the DNBR limit are varied considering their respective uncertainties. An uncertainty analysis methodology was elaborated combining the response surface method with the one-sided tolerance limit method of Wilks. The results of deterministic and uncertainty hot channel calculations are compared with respect to the number of failed fuel rods, the maximum temperature of the clad surface and the maximum temperature of the fuel (Authors)
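
    The first-order, one-sided Wilks formula referenced above can be computed directly; the classic 95%/95% case gives the familiar 59 runs:

```python
import math

def wilks_n(coverage=0.95, confidence=0.95):
    """First-order one-sided Wilks formula: smallest N such that the maximum
    of N random code runs bounds the `coverage` quantile of the output
    with probability `confidence`, i.e. 1 - coverage**N >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

# Classic 95%/95% tolerance limit used in BEPU-style uncertainty analyses
print(wilks_n())            # -> 59
print(wilks_n(0.95, 0.99))  # -> 90
```

    The attraction of the method is that N is independent of the number of uncertain input parameters being varied, which is what makes the combined response surface plus Wilks approach tractable.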

  11. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind, and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time.

  12. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    International Nuclear Information System (INIS)

    Muelaner, J E; Wang, Z; Keogh, P S; Brownell, J; Fisher, D

    2016-01-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large-scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered negligible. Before optimization the dominant source of uncertainty was the tooling design; after optimization the dominant source was thermal expansion of the engine, meaning that no further improvement can be made without measurement in a temperature-controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as on the reliability of these products. (paper)
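
    The thermal expansion contribution to a length measurement can be estimated with the usual rule u_L = alpha * L * u_T; the material, feature length and temperature uncertainty below are assumed values for illustration, not those of the engine in the paper:

```python
# Uncertainty contribution from thermal expansion of a large measured part:
# a temperature standard uncertainty u_T maps to a length standard
# uncertainty u_L = alpha * L * u_T.
alpha = 23e-6  # 1/K, coefficient of thermal expansion (aluminium alloy, assumed)
L = 3.0        # m, measured datum span (hypothetical engine-scale feature)
u_T = 1.0      # K, standard uncertainty of the part temperature

u_L = alpha * L * u_T
print(f"thermal expansion uncertainty: {u_L * 1e6:.0f} um")  # -> 69 um
```

    At this scale a 1 K temperature uncertainty already dominates typical laser tracker uncertainties, which is why the paper concludes that further improvement requires a temperature-controlled environment.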

  13. Understanding the origin of Paris Agreement emission uncertainties

    Science.gov (United States)

    Rogelj, Joeri; Fricko, Oliver; Meinshausen, Malte; Krey, Volker; Zilliacus, Johanna J. J.; Riahi, Keywan

    2017-06-01

    The UN Paris Agreement puts in place a legally binding mechanism to increase mitigation action over time. Countries put forward pledges called nationally determined contributions (NDC) whose impact is assessed in global stocktaking exercises. Subsequently, actions can then be strengthened in light of the Paris climate objective: limiting global mean temperature increase to well below 2 °C and pursuing efforts to limit it further to 1.5 °C. However, pledged actions are currently described ambiguously and this complicates the global stocktaking exercise. Here, we systematically explore possible interpretations of NDC assumptions, and show that this results in estimated emissions for 2030 ranging from 47 to 63 GtCO2e yr-1. We show that this uncertainty has critical implications for the feasibility and cost to limit warming well below 2 °C and further to 1.5 °C. Countries are currently working towards clarifying the modalities of future NDCs. We identify salient avenues to reduce the overall uncertainty by about 10 percentage points through simple, technical clarifications regarding energy accounting rules. Remaining uncertainties depend to a large extent on politically valid choices about how NDCs are expressed, and therefore raise the importance of a thorough and robust process that keeps track of where emissions are heading over time.

  14. Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.

    2015-01-15

    Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled, and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work focuses on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.
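
    One common way to expose such a dominant uncertainty subspace is an SVD of the scaled sensitivity matrix; the sketch below uses random synthetic sensitivities and a diagonal prior covariance, not the PWR assembly data from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical sensitivity matrix S = d(attributes)/d(nuclear data):
# 2 attributes (e.g. k-eff, fission rate) x 500 nuclear data parameters
S = rng.normal(size=(2, 500))
cov = np.diag(rng.uniform(0.01, 0.05, 500) ** 2)  # prior data covariance

# "Sandwich rule": attribute covariance induced by the data uncertainties
attr_cov = S @ cov @ S.T

# Dominant uncertainty subspace: SVD of the covariance-scaled sensitivities.
# With only 2 attributes, at most 2 directions in the 500-dim data space
# carry all of the attribute uncertainty -- the search space collapses.
U, s, Vt = np.linalg.svd(S @ np.sqrt(cov), full_matrices=False)
print("singular values:", np.round(s, 4))
```

    Restricting the inverse-UQ optimization to the span of these few singular directions is what makes the search over hundreds of nuclear data parameters tractable.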

  15. Dynamics of entropic uncertainty for atoms immersed in thermal fluctuating massless scalar field

    Science.gov (United States)

    Huang, Zhiming

    2018-04-01

    In this article, the dynamics of the quantum-memory-assisted entropic uncertainty relation for two atoms immersed in a thermal bath of a fluctuating massless scalar field is investigated. The master equation that governs the system's evolution is derived. It is found that the mixedness is closely associated with the entropic uncertainty. For the equilibrium state, the tightness of the uncertainty vanishes. For the initial maximally entangled state, the tightness of the uncertainty undergoes a slight increase and then declines to zero with evolution time. It is found that temperature can increase the uncertainty, but the two-atom separation does not always increase the uncertainty. The uncertainty evolves to different relatively stable values for different temperatures and converges to a fixed value for different two-atom distances with evolution time. Furthermore, weak measurement reversal is employed to control the entropic uncertainty.
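
    The quantum-memory-assisted entropic uncertainty relation underlying this kind of analysis is usually written (following Berta et al.) as:

```latex
S(X|B) + S(Z|B) \;\ge\; \log_2 \frac{1}{c} + S(A|B),
\qquad
c = \max_{x,z} \left| \langle \psi_x | \phi_z \rangle \right|^2
```

    Here S(X|B) and S(Z|B) are the conditional von Neumann entropies of the outcomes of measuring the incompatible observables X and Z on system A given the memory B, c is the maximal overlap of the two measurement bases, and S(A|B) can be negative for entangled states, which is how entanglement with the memory lowers the uncertainty bound.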

  16. Uncertainty for calculating transport on Titan: A probabilistic description of bimolecular diffusion parameters

    Science.gov (United States)

    Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.

    2015-11-01

    Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere, to better understand the impact of this parameter's uncertainty on models. Because temperature and pressure conditions are much lower than the laboratory conditions at which bimolecular diffusion parameters were measured, we apply a problem-agnostic Bayesian framework to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also propagates the uncertainties in the calibrated parameters to the temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated with temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate, with uncertainty due to bimolecular diffusion, of the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.
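
    A sketch of this kind of propagation, assuming the commonly used Massman-style scaling D(T, p) = D0 (p0/p)(T/T0)^s and invented parameter uncertainties (the calibrated values and posteriors in the paper will differ):

```python
import numpy as np

rng = np.random.default_rng(3)

# Massman-style scaling of a binary diffusion coefficient (form assumed):
#   D(T, p) = D0 * (p0 / p) * (T / T0)**s
def diffusion(T, p, D0, s, T0=273.15, p0=101325.0):
    return D0 * (p0 / p) * (T / T0) ** s

# Hypothetical calibrated parameters with uncertainty (purely illustrative)
D0_samples = rng.normal(1.9e-5, 0.1e-5, 10_000)  # m^2/s at (T0, p0)
s_samples = rng.normal(1.75, 0.05, 10_000)       # temperature exponent

# Monte Carlo propagation to conditions far from the lab regime
T, p = 150.0, 0.1  # K, Pa (cold, near-vacuum upper-atmosphere conditions)
D = diffusion(T, p, D0_samples, s_samples)
print(f"D = {D.mean():.3e} m^2/s, relative uncertainty {D.std() / D.mean():.1%}")
```

    Because the extrapolation in T is large, the uncertainty in the exponent s contributes strongly to the spread in D, which is consistent with the reported correlation of the diffusion uncertainty with temperature.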

  17. Entropic uncertainty relation of a two-qutrit Heisenberg spin model in nonuniform magnetic fields and its dynamics under intrinsic decoherence

    Science.gov (United States)

    Zhang, Zuo-Yuan; Wei, DaXiu; Liu, Jin-Ming

    2018-06-01

    The precision of measurements of two incompatible observables in a physical system can be improved with the assistance of quantum memory. In this paper, we investigate the quantum-memory-assisted entropic uncertainty relation for a spin-1 Heisenberg model in the presence of external magnetic fields; the system's quantum entanglement (characterized by the negativity) is analyzed for comparison. Our results show that for the XY spin chain in thermal equilibrium, the entropic uncertainty can be reduced by reinforcing the coupling between the two particles or decreasing the temperature of the environment. At zero temperature, a strong magnetic field can result in growth of the entropic uncertainty. Moreover, in the Ising case, the variation trends of the uncertainty depend on the choice of anisotropy parameters. Taking the influence of intrinsic decoherence into account, we find that strong coupling accelerates the inflation of the uncertainty over time, whereas a high magnetic field contributes to its reduction during the temporal evolution. Furthermore, we also verify that the evolution of the entropic uncertainty is roughly anti-correlated with that of the entanglement throughout the dynamical process. Our results could offer new insights into quantum precision measurement for high-spin solid-state systems.

  18. Use of screening techniques to reduce uncertainty in risk assessment at a former manufactured gas plant site

    International Nuclear Information System (INIS)

    Logan, C.M.; Walden, R.H.; Baker, S.R.; Pekar, Z.; LaKind, J.S.; MacFarlane, I.D.

    1995-01-01

    Preliminary analysis of risks from a former manufactured gas plant (MGP) site revealed six media associated with potential exposure pathways: soils, air, surface water, groundwater, estuarine sediments, and aquatic biota. Contaminants of concern (COCs) include polycyclic aromatic hydrocarbons, volatile organic hydrocarbons, metals, cyanide, and PCBs. Available chemical data, including site-specific measurements and existing data from other sources (e.g., agency monitoring programs, Chesapeake Bay Program), were evaluated for potential utility in risk assessment. Where sufficient data existed, risk calculations were performed using central tendency and reasonable maximum exposure estimates. Where site-specific data were not available, risks were estimated using conservatively high default assumptions for dose and/or exposure duration. Because of the large number of potential exposure pathways and COCs, a sensitivity analysis was conducted to determine which information most influences the risk assessment outcome, so that any additional data collection to reduce uncertainty can be cost-effectively targeted. The sensitivity analysis utilized two types of information: (1) the impact that uncertainty in risk input values has on output risk estimates, and (2) the potential improvement in key risk input values, and consequently output values, if better site-specific data were available. A decision matrix using both quantitative and qualitative information was developed to prioritize sampling strategies to minimize uncertainty in the final risk assessment.

  19. Regulatory risk assessments: Is there a need to reduce uncertainty and enhance robustness?

    Science.gov (United States)

    Snodin, D J

    2015-12-01

    A critical evaluation of several recent regulatory risk assessments has been undertaken. These relate to propyl paraben (as a food additive, cosmetic ingredient or pharmaceutical excipient), cobalt (in terms of a safety-based limit for pharmaceuticals) and the cancer Threshold of Toxicological Concern as applied to food contaminants and pharmaceutical impurities. In all cases, a number of concerns can be raised regarding the reliability of the current assessments, some examples being absence of data audits, use of single-dose and/or non-good laboratory practice studies to determine safety metrics, use of a biased data set and questionable methodology and lack of consistency with precedents and regulatory guidance. Drawing on these findings, a set of recommendations is provided to reduce uncertainty and improve the quality and robustness of future regulatory risk assessments. © The Author(s) 2015.

  20. It's the parameters, stupid! Moving beyond multi-model and multi-physics approaches to characterize and reduce predictive uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, Martyn; Samaniego, Luis; Freer, Jim

    2014-05-01

    Multi-model and multi-physics approaches are a popular tool in environmental modelling, with many studies focusing on optimally combining output from multiple model simulations to reduce predictive errors and better characterize predictive uncertainty. However, a careful and systematic analysis of different hydrological models reveals that individual models are simply small permutations of a master modeling template, and inter-model differences are overwhelmed by uncertainty in the choice of the parameter values in the model equations. Furthermore, inter-model differences do not explicitly represent the uncertainty in modeling a given process, leading to many situations where different models provide the wrong results for the same reasons. In other cases, the available morphological data does not support the very fine spatial discretization of the landscape that typifies many modern applications of process-based models. To make the uncertainty characterization problem worse, the uncertain parameter values in process-based models are often fixed (hard-coded), and the models lack the agility necessary to represent the tremendous heterogeneity in natural systems. This presentation summarizes results from a systematic analysis of uncertainty in process-based hydrological models, where we explicitly analyze the myriad of subjective decisions made throughout both the model development and parameter estimation process. Results show that much of the uncertainty is aleatory in nature - given a "complete" representation of dominant hydrologic processes, uncertainty in process parameterizations can be represented using an ensemble of model parameters. Epistemic uncertainty associated with process interactions and scaling behavior is still important, and these uncertainties can be represented using an ensemble of different spatial configurations. Finally, uncertainty in forcing data can be represented using ensemble methods for spatial meteorological analysis. 
Our systematic
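
    As a toy illustration of the parameter-ensemble idea described above — representing parameter uncertainty by running a single model structure under many equally plausible parameter sets — consider a minimal, hypothetical single-bucket rainfall-runoff model. The model, forcing, and parameter ranges below are invented for illustration and are not taken from the presentation:

```python
import numpy as np

rng = np.random.default_rng(0)

def bucket_model(precip, pet, capacity, frac_runoff):
    """Toy single-bucket rainfall-runoff model (illustrative only)."""
    storage = 0.0
    runoff = []
    for p, e in zip(precip, pet):
        storage = max(storage + p - e, 0.0)   # wet the bucket, take evaporation
        spill = max(storage - capacity, 0.0)  # saturation excess
        q = spill + frac_runoff * (storage - spill)
        storage -= q
        runoff.append(q)
    return np.array(runoff)

# Synthetic daily forcing (mm)
precip = rng.gamma(2.0, 5.0, size=365)
pet = np.full(365, 3.0)

# An ensemble of equally plausible parameter sets stands in for parameter
# uncertainty under a single "complete" model structure
params = [(rng.uniform(50, 150), rng.uniform(0.05, 0.3)) for _ in range(200)]
ensemble = np.array([bucket_model(precip, pet, c, f) for c, f in params])

# The spread of annual runoff across the ensemble characterizes predictive
# uncertainty attributable to the parameters alone
annual = ensemble.sum(axis=1)
lo, hi = np.percentile(annual, [5, 95])
```

The same ensemble machinery extends naturally to the other uncertainty sources mentioned: swapping spatial configurations or forcing replicates in place of parameter sets.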

  1. Crop model improvement reduces the uncertainty of the response to temperature of multi-model ensembles

    DEFF Research Database (Denmark)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold

    2017-01-01

    of models needed in a MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size using detailed field experimental data from the USDA...... ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures >24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME...

  2. Temperature dependent thermoelectric property of reduced graphene oxide-polyaniline composite

    Energy Technology Data Exchange (ETDEWEB)

    Mitra, Mousumi, E-mail: mousumimitrabesu@gmail.com; Banerjee, Dipali, E-mail: dipalibanerjeebesu@gmail.com [Department of Physics, Indian Institute of Engineering Science and Technology (IIEST), Howrah-711103 (India); Kargupta, Kajari, E-mail: karguptakajari2010@gmail.com [Department of Chemical Engineering, Jadavpur University, Kolkata (India); Ganguly, Saibal, E-mail: gangulysaibal2011@gmail.com [Chemical Engineering department, Universiti Teknologi Petronas, Perak, Tronoh (Malaysia)

    2016-05-06

    A composite material of reduced graphene oxide (rG) nanosheets with polyaniline (PANI) protonated by 5-sulfosalicylic acid has been synthesized via an in situ oxidative polymerization method. The morphological and spectral characterizations have been carried out using FESEM and XRD measurements. The thermoelectric (TE) properties of the reduced graphene oxide-polyaniline composite (rG-P) have been studied in the temperature range from 300 K to 400 K. The electrical conductivity and the Seebeck coefficient of rG-P are higher than those of pure PANI, while the thermal conductivity of the composite remains much lower, ensuring an increase in the dimensionless figure of merit (ZT) over the whole temperature range.

  3. Reconstruction of regional mean temperature for East Asia since 1900s and its uncertainties

    Science.gov (United States)

    Hua, W.

    2017-12-01

    Regional average surface air temperature (SAT) is one of the key variables used to investigate climate change. Unfortunately, because of the limited observations over East Asia, there are gaps in the observational sampling available for regional mean SAT analysis, which is important for estimating past climate change. In this study, the regional average temperature of East Asia since the 1900s is calculated by an Empirical Orthogonal Function (EOF)-based optimal interpolation (OI) method that takes the data errors into account. The results show that this estimate is more precise and robust than results from a simple average, providing a better basis for past climate reconstruction. In addition to the reconstructed regional average SAT anomaly time series, we also estimated the uncertainties of the reconstruction. The root mean square error (RMSE) results show that the error decreases with time and is not sufficiently large to alter the conclusions on the persistent warming in East Asia during the twenty-first century. Moreover, a test of the influence of data error on the reconstruction clearly shows the sensitivity of the reconstruction to the size of the data error.
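
    A minimal sketch of an EOF-based, error-weighted reconstruction in the spirit described above: leading EOF patterns are fitted to sparse, noisy station data by error-weighted least squares, then used to reconstruct the full field and its regional mean. The study's OI method and data-error model are more involved; all arrays and noise levels here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: 100 grid cells, 3 leading EOF spatial patterns
n_cells, n_modes = 100, 3
eofs = rng.standard_normal((n_cells, n_modes))  # E: space x modes
true_coef = np.array([2.0, -1.0, 0.5])          # "true" mode amplitudes
field = eofs @ true_coef                        # full SAT anomaly field

# Sparse, noisy station observations with a known data-error covariance
obs_idx = rng.choice(n_cells, size=20, replace=False)
obs = field[obs_idx] + 0.1 * rng.standard_normal(20)
obs_err = 0.1**2 * np.eye(20)

# Error-weighted least squares for the EOF coefficients
E = eofs[obs_idx]
A = E.T @ np.linalg.solve(obs_err, E)
b = E.T @ np.linalg.solve(obs_err, obs)
coef = np.linalg.solve(A, b)

# Reconstructed field over the whole region and its regional mean
recon = eofs @ coef
regional_mean = recon.mean()
```

Because the EOF patterns carry spatial covariance information, the reconstructed regional mean is less sensitive to station gaps than a simple average over the observed cells.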

  4. Uncertainty and sensitivity analysis of the nuclear fuel thermal behavior

    Energy Technology Data Exchange (ETDEWEB)

    Boulore, A., E-mail: antoine.boulore@cea.fr [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Struzik, C. [Commissariat a l' Energie Atomique (CEA), DEN, Fuel Research Department, 13108 Saint-Paul-lez-Durance (France); Gaudier, F. [Commissariat a l' Energie Atomique (CEA), DEN, Systems and Structure Modeling Department, 91191 Gif-sur-Yvette (France)

    2012-12-15

    Highlights: ► A complete quantitative method for uncertainty propagation and sensitivity analysis is applied. ► The thermal conductivity of UO{sub 2} is modeled as a random variable. ► The first source of uncertainty is the linear heat rate. ► The second source of uncertainty is the thermal conductivity of the fuel. - Abstract: In the global framework of nuclear fuel behavior simulation, the response of the models describing the physical phenomena occurring during irradiation in reactor is mainly conditioned by confidence in the calculated fuel temperature. Among all the parameters influencing the temperature calculation in our fuel rod simulation code (METEOR V2), several sources of uncertainty have been identified as the most sensitive: the thermal conductivity of UO{sub 2}, the radial power distribution in the fuel pellet, the local linear heat rate in the fuel rod, the geometry of the pellet, and the thermal transfer in the gap. Expert judgment and inverse methods have been used to model the uncertainty of these parameters using theoretical distributions and correlation matrices. These uncertainties have been propagated through the METEOR V2 code, using the URANIE framework and a Monte Carlo technique, for several experimental irradiations of UO{sub 2} fuel. At every time step of the simulated experiments, we obtain a statistical distribution of temperature that results from the initial distributions of the uncertain parameters, from which confidence intervals on the calculated temperature can be estimated. To quantify the sensitivity of the calculated temperature to each of the uncertain input parameters, we have also performed a sensitivity analysis using first-order Sobol' indices.
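
    A hedged sketch of the propagation-plus-sensitivity workflow this abstract describes: Monte Carlo sampling of uncertain inputs through a temperature model, followed by first-order Sobol' indices estimated with the pick-freeze method. The steady-state centerline formula T_c = T_surf + q'/(4·pi·k) is a textbook conduction result used here as a stand-in for METEOR V2, and the input distributions are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

def fuel_temp(q_lin, k_fuel, t_surf):
    """Steady-state centerline temperature of a cylindrical fuel pellet,
    T_c = T_surf + q' / (4*pi*k): a textbook conduction result standing in
    for the full fuel-performance code."""
    return t_surf + q_lin / (4 * np.pi * k_fuel)

def sample(m):
    """Illustrative input distributions (not the code's actual ones)."""
    return {
        "q_lin": rng.normal(20e3, 1e3, m),     # linear heat rate, W/m
        "k_fuel": rng.normal(3.0, 0.1, m),     # UO2 conductivity, W/(m K)
        "t_surf": rng.normal(700.0, 10.0, m),  # pellet surface temperature, K
    }

# Monte Carlo propagation: a statistical distribution of temperature
a, b = sample(n), sample(n)
y_a = fuel_temp(**a)
ci_low, ci_high = np.percentile(y_a, [2.5, 97.5])  # 95% confidence interval

# First-order Sobol' indices via the pick-freeze estimator:
# S1_i = Cov(Y(A), Y(B with column i taken from A)) / Var(Y)
var_y = y_a.var()
s1 = {}
for name in a:
    mixed = {k: (a[k] if k == name else b[k]) for k in a}
    s1[name] = np.cov(y_a, fuel_temp(**mixed))[0, 1] / var_y
```

With these illustrative spreads the linear heat rate dominates, followed by the fuel conductivity, echoing the ordering reported in the highlights.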

  5. On Commitments and Other Uncertainty Reduction Tools in Joint Action

    Directory of Open Access Journals (Sweden)

    Michael John

    2015-01-01

    In this paper, we evaluate the proposal that a central function of commitments within joint action is to reduce various kinds of uncertainty, and that this accounts for the prevalence of commitments in joint action. While this idea is prima facie attractive, we argue that it faces two serious problems. First, commitments can only reduce uncertainty if they are credible, and accounting for the credibility of commitments proves not to be straightforward. Second, there are many other ways in which uncertainty is commonly reduced within joint actions, which raises the possibility that commitments may be superfluous. Nevertheless, we argue that the existence of these alternative uncertainty reduction processes does not make commitments superfluous after all but, rather, helps to explain how commitments may contribute in various ways to uncertainty reduction.

  6. Forecasting the Number of Soil Samples Required to Reduce Remediation Cost Uncertainty

    OpenAIRE

    Demougeot-Renard, Hélène; de Fouquet, Chantal; Renard, Philippe

    2008-01-01

    Sampling scheme design is an important step in the management of polluted sites. It largely controls the accuracy of remediation cost estimates. In practice, however, sampling is seldom designed to comply with a given level of remediation cost uncertainty. In this paper, we present a new technique that allows one to estimate the number of samples that should be taken at a given stage of investigation to reach a forecasted level of accuracy. The uncertainty is expressed both in terms of vol...
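
    For intuition only, the classical non-spatial baseline for this question — how many independent samples are needed to bound the uncertainty of a mean estimate — is the textbook sample-size formula. The paper's geostatistical technique accounts for spatial correlation and remediation volumes, which this sketch does not:

```python
from math import ceil

def n_samples(sigma, margin, z=1.96):
    """Textbook sample-size formula: number of independent samples needed so
    that a 95% confidence interval on the mean has half-width `margin`, for
    a population standard deviation `sigma`. Assumes independent samples,
    unlike the geostatistical approach of the paper."""
    return ceil((z * sigma / margin) ** 2)

# Hypothetical: concentration sd of 50 units, target half-width of 10 units
n = n_samples(sigma=50.0, margin=10.0)
```

Spatial correlation between nearby samples reduces their effective information content, which is precisely why the geostatistical forecast of the paper generally differs from this independence-based count.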

  7. Investigation of uncertainty components in Coulomb blockade thermometry

    International Nuclear Information System (INIS)

    Hahtela, O. M.; Heinonen, M.; Manninen, A.; Meschke, M.; Savin, A.; Pekola, J. P.; Gunnarsson, D.; Prunnila, M.; Penttilä, J. S.; Roschier, L.

    2013-01-01

    Coulomb blockade thermometry (CBT) has proven to be a feasible method for primary thermometry in everyday laboratory use at cryogenic temperatures from ca. 10 mK to a few tens of kelvins. The operation of CBT is based on single-electron charging effects in normal-metal tunnel junctions. In this paper, we discuss the typical error sources and uncertainty components that limit the present absolute accuracy of CBT measurements to the level of about 1% in the optimum temperature range. Identifying the influence of different uncertainty sources is a good starting point for improving the measurement accuracy to a level that would allow CBT to be more widely used in high-precision low-temperature metrological applications and for realizing thermodynamic temperature in accordance with the upcoming new definition of the kelvin.
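
    The primary-thermometry relation behind CBT — for a uniform array of N tunnel junctions, the full width at half minimum of the conductance dip satisfies V_1/2 ≈ 5.439·N·k_B·T/e — can be inverted directly for temperature. The sensor parameters below are hypothetical:

```python
# CODATA values of the Boltzmann constant and elementary charge
K_B = 1.380649e-23      # J/K
E_CH = 1.602176634e-19  # C

def cbt_temperature(v_half, n_junctions):
    """Invert the CBT primary relation V_1/2 = 5.439 * N * k_B * T / e,
    where V_1/2 is the full width at half minimum of the conductance dip
    of a uniform array of N tunnel junctions."""
    return E_CH * v_half / (5.439 * n_junctions * K_B)

# Example: a (hypothetical) 100-junction sensor with a 4.7 mV half-width
# corresponds to an electron temperature of about 0.1 K
t = cbt_temperature(4.7e-3, 100)
```

Because the relation contains only fundamental constants and the junction count, no calibration against another thermometer is needed — which is what makes CBT a primary method, and why the uncertainty components discussed in the record bear directly on the absolute accuracy.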

  8. Investigation of uncertainty components in Coulomb blockade thermometry

    Energy Technology Data Exchange (ETDEWEB)

    Hahtela, O. M.; Heinonen, M.; Manninen, A. [MIKES Centre for Metrology and Accreditation, Tekniikantie 1, 02150 Espoo (Finland); Meschke, M.; Savin, A.; Pekola, J. P. [Low Temperature Laboratory, Aalto University, Tietotie 3, 02150 Espoo (Finland); Gunnarsson, D.; Prunnila, M. [VTT Technical Research Centre of Finland, Tietotie 3, 02150 Espoo (Finland); Penttilä, J. S.; Roschier, L. [Aivon Oy, Tietotie 3, 02150 Espoo (Finland)

    2013-09-11

    Coulomb blockade thermometry (CBT) has proven to be a feasible method for primary thermometry in everyday laboratory use at cryogenic temperatures from ca. 10 mK to a few tens of kelvins. The operation of CBT is based on single-electron charging effects in normal-metal tunnel junctions. In this paper, we discuss the typical error sources and uncertainty components that limit the present absolute accuracy of CBT measurements to the level of about 1% in the optimum temperature range. Identifying the influence of different uncertainty sources is a good starting point for improving the measurement accuracy to a level that would allow CBT to be more widely used in high-precision low-temperature metrological applications and for realizing thermodynamic temperature in accordance with the upcoming new definition of the kelvin.

  9. Treatment of uncertainty in low-level waste performance assessment

    International Nuclear Information System (INIS)

    Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.

    1991-01-01

    Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs

  10. A programme of research to set priorities and reduce uncertainties for the prevention and treatment of skin disease

    OpenAIRE

    Thomas, K. S.; Batchelor, J. M.; Bath-Hextall, F.; Chalmers, J. R.; Clarke, T.; Crowe, S.; Delamere, F. M.; Eleftheriadou, V.; Evans, N.; Firkins, L.; Greenlaw, N.; Lansbury, L.; Lawton, S.; Layfield, C.; Leonardi-Bee, J.

    2016-01-01

    BACKGROUND: Skin diseases are very common and can have a large impact on the quality of life of patients and caregivers. This programme addressed four diseases: (1) eczema, (2) vitiligo, (3) squamous cell skin cancer (SCC) and (4) pyoderma gangrenosum (PG). OBJECTIVE: To set priorities and reduce uncertainties for the treatment and prevention of skin disease in our four chosen diseases. DESIGN: Mixed methods including eight systematic reviews, three prioritisation exercises, tw...

  11. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and the development of climate policy. This paper presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study examines model and parametric uncertainties for population, total factor productivity, and climate sensitivity, and estimates the probability density functions (pdfs) of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. The resulting pdfs also provide insight on tail events.

  12. Sources of uncertainty in future changes in local precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Rowell, David P. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-10-15

    This study considers the large uncertainty in projected changes in local precipitation. It aims to map, and begin to understand, the relative roles of uncertain modelling and natural variability, using 20-year mean data from four perturbed-physics or multi-model ensembles. The largest - 280-member - ensemble illustrates a rich pattern in the varying contribution of modelling uncertainty, with similar features found using a CMIP3 ensemble (despite its limited sample size, which restricts its value in this context). The contribution of modelling uncertainty to the total uncertainty in local precipitation change is found to be highest in the deep tropics, particularly over South America, Africa, the east and central Pacific, and the Atlantic. In the moist maritime tropics, the highly uncertain modelling of sea-surface temperature changes translates into large modelling uncertainty in local rainfall changes. Over tropical land and summer mid-latitude continents (and, to a lesser extent, the tropical oceans), uncertain modelling of atmospheric processes, land surface processes and the terrestrial carbon cycle all appear to play an additional substantial role in driving the uncertainty of local rainfall changes. In polar regions, inter-model variability of anomalous sea ice drives an uncertain precipitation response, particularly in winter. In all these regions, there is therefore the potential to reduce the uncertainty of local precipitation changes through targeted model improvements and observational constraints. In contrast, over much of the arid subtropical and mid-latitude oceans, over Australia, and over the Sahara in winter, internal atmospheric variability dominates the uncertainty in projected precipitation changes. Here, model improvements and observational constraints will have little impact on the uncertainty of time means shorter than at least 20 years.
Last, a supplementary application of the metric developed here is that it can be interpreted as a measure
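
    One common way to quantify the relative contribution of modelling uncertainty versus internal variability from such ensembles is a simple variance partition across models and runs. The ensemble below is synthetic and the metric is a generic one, not necessarily the exact metric developed in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical ensemble: 10 models, 28 initial-condition runs each, giving
# 20-year-mean local precipitation changes (%)
n_models, n_runs = 10, 28
model_signal = rng.normal(5.0, 4.0, n_models)  # inter-model differences
changes = model_signal[:, None] + rng.normal(0.0, 6.0, (n_models, n_runs))

# Variance partition: between-model variance of the run means versus the
# mean within-model (internal) variance. (The between term still carries a
# small internal-variability contamination of order within/n_runs.)
between = changes.mean(axis=1).var(ddof=1)
within = changes.var(axis=1, ddof=1).mean()
frac_model = between / (between + within)
```

A high `frac_model` marks regions where model improvements can pay off; a low value marks regions where internal variability dominates and 20-year means remain irreducibly uncertain.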

  13. High resolution remote sensing for reducing uncertainties in urban forest carbon offset life cycle assessments.

    Science.gov (United States)

    Tigges, Jan; Lakes, Tobia

    2017-10-04

    Urban forests reduce greenhouse gas emissions by storing and sequestering considerable amounts of carbon. However, few studies have considered the local scale of urban forests needed to effectively evaluate their potential long-term carbon offset. The lack of precise, consistent and up-to-date forest details makes long-term prognoses challenging. This review therefore aims to identify uncertainties in urban forest carbon offset assessment and to discuss the extent to which such uncertainties can be reduced by recent progress in high resolution remote sensing. We do this by performing an extensive literature review and a case study combining remote sensing and life cycle assessment of urban forest carbon offset in Berlin, Germany. Recent progress in high resolution remote sensing and methods is adequate for delivering more precise details on the urban tree canopy, individual tree metrics, species, and age structures than conventional land use/cover class approaches. These area-wide, consistent details can update life cycle inventories for more precise future prognoses. Additional improvements in classification accuracy can be achieved by a higher number of features derived from remote sensing data of increasing resolution, but first studies on this subject indicate that a smart selection of features already provides sufficient data, avoids redundancies and enables more efficient data processing. Our case study from Berlin could use remotely sensed individual tree species as a consistent inventory for a life cycle assessment. However, a lack of growth, mortality and planting data forced us to make assumptions, thereby creating uncertainty in the long-term prognoses. Regarding temporal changes and reliable long-term estimates, more attention is required to detect gradual changes from growth and pruning as well as abrupt changes from tree planting and mortality. As such, precise long-term urban ecological monitoring using high resolution remote sensing should be intensified.

  14. Section summary: Uncertainty and design considerations

    Science.gov (United States)

    Stephen Hagen

    2013-01-01

    Well planned sampling designs and robust approaches to estimating uncertainty are critical components of forest monitoring. The importance of uncertainty estimation increases as deforestation and degradation issues become more closely tied to financing incentives for reducing greenhouse gas emissions in the forest sector. Investors like to know risk and risk is tightly...

  15. Simulation codes and the impact of validation/uncertainty requirements

    International Nuclear Information System (INIS)

    Sills, H.E.

    1995-01-01

    Several of the OECD/CSNI members have adopted a proposed methodology for code validation and uncertainty assessment. Although the validation process adopted by members has a high degree of commonality, the uncertainty assessment processes selected are more variable, ranging from subjective to formal. This paper describes the validation and uncertainty assessment process, the sources of uncertainty, methods of reducing uncertainty, and methods of assessing uncertainty. Examples are presented from the Ontario Hydro application of the validation methodology and uncertainty assessment to the system thermal hydraulics discipline and the TUF (1) system thermal hydraulics code. (author)

  16. Parameter Uncertainty for Repository Thermal Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hardin, Ernest [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Greenberg, Harris [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dupont, Mark [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-10-01

    This report is one follow-on to a study of reference geologic disposal design concepts (Hardin et al. 2011a). Based on an analysis of maximum temperatures, that study concluded that certain disposal concepts would require extended decay storage prior to emplacement, or the use of small waste packages, or both. The study used nominal values for thermal properties of host geologic media and engineered materials, demonstrating the need for uncertainty analysis to support the conclusions. This report is a first step that identifies the input parameters of the maximum temperature calculation, surveys published data on measured values, uses an analytical approach to determine which parameters are most important, and performs an example sensitivity analysis. Using results from this first step, temperature calculations planned for FY12 can focus on only the important parameters and can use the uncertainty ranges reported here. The survey of published information on thermal properties of geologic media and engineered materials is intended to be sufficient for use in generic calculations to evaluate the feasibility of reference disposal concepts. A full compendium of literature data is beyond the scope of this report. The term “uncertainty” is used here to represent both measurement uncertainty and spatial variability, or variability across host geologic units. For the most important parameters (e.g., buffer thermal conductivity), the extent of literature data surveyed samples these different forms of uncertainty and variability. Finally, this report is intended to be one chapter or section of a larger FY12 deliverable summarizing all the work on design concepts and thermal load management for geologic disposal (M3FT-12SN0804032, due 15Aug2012).

  17. A simplified analysis of uncertainty propagation in inherently controlled ATWS events

    International Nuclear Information System (INIS)

    Wade, D.C.

    1987-01-01

    The quasi-static approach can be used to provide useful insight concerning the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulic and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase from power reduction and the reactivity decrease from core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature - which includes the fuel Doppler effect.

  18. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    Science.gov (United States)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times, each run having slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles - hundreds of runs - for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date, within-GCM uncertainty has received little attention in the hydrologic climate change impact literature, and this analysis provides an approximation of the uncertainty in projected runoff and reservoir yield due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al.
(2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better performing GCMs from
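
    A minimal stationary analogue of the stochastic-replicate idea is an AR(1) generator that approximately preserves the seasonal means and standard deviations of a monthly GCM series, along with the lag-1 autocorrelation of its standardized anomalies. The paper's replicates are non-stationary (they track the GCM trend); everything below is an illustrative sketch on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(4)

def ar1_replicate(monthly, n_years):
    """One stochastic replicate of a monthly series that (approximately)
    preserves its seasonal means and standard deviations and the lag-1
    autocorrelation of the standardized anomalies. Stationary sketch only."""
    x = np.asarray(monthly, float).reshape(-1, 12)
    mean, std = x.mean(axis=0), x.std(axis=0, ddof=1)
    z = ((x - mean) / std).ravel()            # standardized anomalies
    rho = np.corrcoef(z[:-1], z[1:])[0, 1]    # lag-1 autocorrelation

    n = n_years * 12
    sim = np.empty(n)
    sim[0] = rng.standard_normal()
    eps = rng.standard_normal(n) * np.sqrt(1.0 - rho**2)
    for t in range(1, n):
        sim[t] = rho * sim[t - 1] + eps[t]    # AR(1) innovation step

    months = np.tile(np.arange(12), n_years)
    return mean[months] + std[months] * sim   # restore seasonality

# 30 years of synthetic "GCM" monthly temperature, and one replicate of it
t_gcm = 15 + 10 * np.sin(2 * np.pi * np.arange(360) / 12) + rng.standard_normal(360)
rep = ar1_replicate(t_gcm, n_years=30)
```

Feeding many such replicates through a hydrologic model yields a distribution of annual runoff whose spread approximates within-GCM uncertainty when only a handful of actual GCM runs exist.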

  19. The Uncertainties of Risk Management

    DEFF Research Database (Denmark)

    Vinnari, Eija; Skærbæk, Peter

    2014-01-01

    Purpose – The purpose of this paper is to analyse the implementation of risk management as a tool for internal audit activities, focusing on unexpected effects or uncertainties generated during its application. Design/methodology/approach – Public and confidential documents as well as semi-structured interviews are analysed through the lens of actor-network theory to identify the effects of risk management devices in a Finnish municipality. Findings – The authors found that risk management, rather than reducing uncertainty, itself created unexpected uncertainties that would otherwise not have emerged...... for expanding risk management. More generally, such uncertainties relate to the professional identities and responsibilities of operational managers as defined by the framing devices. Originality/value – The paper offers three contributions to the extant literature: first, it shows how risk management itself......

  20. Application of stochastic programming to reduce uncertainty in quality-based supply planning of slaughterhouses

    NARCIS (Netherlands)

    Rijpkema, W.A.; Hendrix, E.M.T.; Rossi, R.; Vorst, van der J.G.A.J.

    2016-01-01

    To match products of different quality with end market preferences under supply uncertainty, it is crucial to integrate product quality information in logistics decision making. We present a case of this integration in a meat processing company that faces uncertainty in delivered livestock quality.

  1. Reduced one-body density matrix of Tonks–Girardeau gas at finite temperature

    International Nuclear Information System (INIS)

    Fu Xiao-Chen; Hao Ya-Jiang

    2015-01-01

    With the thermal Bose–Fermi mapping method, we investigate the Tonks–Girardeau gas at finite temperature. It is shown that at low temperature the Tonks gas displays Fermi-like density profiles, and with increasing temperature the gas spreads over a wider region. The reduced one-body density matrix is diagonally dominant over the whole temperature range, and its off-diagonal elements vanish rapidly with distance from the diagonal at high temperature. (paper)

  2. Understanding the origin of Paris Agreement emission uncertainties.

    Science.gov (United States)

    Rogelj, Joeri; Fricko, Oliver; Meinshausen, Malte; Krey, Volker; Zilliacus, Johanna J J; Riahi, Keywan

    2017-06-06

    The UN Paris Agreement puts in place a legally binding mechanism to increase mitigation action over time. Countries put forward pledges called nationally determined contributions (NDCs) whose impact is assessed in global stocktaking exercises. Subsequently, actions can then be strengthened in light of the Paris climate objective: limiting global mean temperature increase to well below 2 °C and pursuing efforts to limit it further to 1.5 °C. However, pledged actions are currently described ambiguously and this complicates the global stocktaking exercise. Here, we systematically explore possible interpretations of NDC assumptions, and show that this results in estimated emissions for 2030 ranging from 47 to 63 GtCO2e yr-1. We show that this uncertainty has critical implications for the feasibility and cost of limiting warming well below 2 °C and further to 1.5 °C. Countries are currently working towards clarifying the modalities of future NDCs. We identify salient avenues to reduce the overall uncertainty by about 10 percentage points through simple, technical clarifications regarding energy accounting rules. Remaining uncertainties depend to a large extent on politically valid choices about how NDCs are expressed, and therefore raise the importance of a thorough and robust process that keeps track of where emissions are heading over time.

  3. Impacts of Spatial Climatic Representation on Hydrological Model Calibration and Prediction Uncertainty: A Mountainous Catchment of Three Gorges Reservoir Region, China

    Directory of Open Access Journals (Sweden)

    Yan Li

    2016-02-01

    Full Text Available Sparse climatic observations represent a major challenge for hydrological modeling of mountain catchments, with implications for decision-making in water resources management. Employing elevation bands in the Soil and Water Assessment Tool-Sequential Uncertainty Fitting (SWAT2012-SUFI2) model enabled representation of precipitation and temperature variation with altitude in the Daning river catchment (Three Gorges Reservoir Region, China), where meteorological inputs are limited in spatial extent and are derived from observations at relatively low-lying locations. Inclusion of elevation bands produced better model performance for 1987–1993, with the Nash–Sutcliffe efficiency (NSE) increasing by at least 0.11 prior to calibration. During calibration, prediction uncertainty was greatly reduced. With similar R-factors in the earlier calibration iterations, a further 11% of observations were included within the 95% prediction uncertainty band (95PPU) compared to the model without elevation bands. For behavioral simulations defined in SWAT calibration using an NSE threshold of 0.3, an additional 3.9% of observations fell within the 95PPU, while the uncertainty was reduced by 7.6% in the model with elevation bands. The calibrated model with elevation bands reproduced observed river discharges well, with performance in the calibration period improving from “poor” without elevation bands to “very good” with them. The output uncertainty of the calibrated model with elevation bands was satisfactory, with 85% of flow observations included within the 95PPU. These results clearly demonstrate the need to account for orographic effects on precipitation and temperature in hydrological models of mountainous catchments.
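
    The NSE and 95PPU coverage metrics cited above are simple to compute; the sketch below uses invented observation and band values, not the Daning catchment data.

```python
# Nash–Sutcliffe efficiency (NSE) and 95PPU coverage (p-factor),
# the two SWAT-SUFI2 calibration metrics cited above.
def nse(obs, sim):
    """NSE = 1 - SS_res / SS_tot; 1 is a perfect fit, <=0 is no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def p_factor(obs, lower, upper):
    """Fraction of observations falling inside the 95% prediction band."""
    inside = sum(1 for o, lo, hi in zip(obs, lower, upper) if lo <= o <= hi)
    return inside / len(obs)

obs = [3.1, 4.0, 5.2, 4.8]   # invented discharge observations
sim = [3.0, 4.2, 5.0, 4.9]   # invented simulated discharges
print(round(nse(obs, sim), 3))   # close to 1 = good fit
```

    SUFI-2 additionally reports an R-factor (band width relative to the standard deviation of observations); the p-factor above is the share of observations the band captures.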

  4. Learning-induced uncertainty reduction in perceptual decisions is task-dependent

    Directory of Open Access Journals (Sweden)

    Feitong Yang

    2014-05-01

    Full Text Available Perceptual decision making in which decisions are reached primarily from extracting and evaluating sensory information requires close interactions between the sensory system and decision-related networks in the brain. Uncertainty pervades every aspect of this process and can be considered related to either the stimulus signal or decision criterion. Here, we investigated the learning-induced reduction of both the signal and criterion uncertainty in two perceptual decision tasks based on two Glass pattern stimulus sets. This was achieved by manipulating spiral angle and signal level of radial and concentric Glass patterns. The behavioral results showed that the participants trained with a task based on criterion comparison improved their categorization accuracy for both tasks, whereas the participants who were trained on a task based on signal detection improved their categorization accuracy only on their trained task. We fitted the behavioral data with a computational model that can dissociate the contribution of the signal and criterion uncertainties. The modeling results indicated that the participants trained on the criterion comparison task reduced both the criterion and signal uncertainty. By contrast, the participants who were trained on the signal detection task only reduced their signal uncertainty after training. Our results suggest that the signal uncertainty can be resolved by training participants to extract signals from noisy environments and to discriminate between clear signals, which are evidenced by reduced perception variance after both training procedures. Conversely, the criterion uncertainty can only be resolved by the training of fine discrimination. These findings demonstrate that uncertainty in perceptual decision-making can be reduced with training but that the reduction of different types of uncertainty is task-dependent.

  5. Essential information: Uncertainty and optimal control of Ebola outbreaks.

    Science.gov (United States)

    Li, Shou-Li; Bjørnstad, Ottar N; Ferrari, Matthew J; Mummah, Riley; Runge, Michael C; Fonnesbeck, Christopher J; Tildesley, Michael J; Probert, William J M; Shea, Katriona

    2017-05-30

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.
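
    The VoI calculation the authors describe follows the standard expected-value-of-perfect-information recipe; the toy sketch below uses invented payoffs (cases averted) and model weights, not numbers from the study.

```python
# Expected value of perfect information (EVPI) for choosing among
# interventions under model uncertainty -- hypothetical payoffs.
weights = [0.5, 0.3, 0.2]                # weight on each candidate model
payoff = {                               # cases averted by action, per model
    "reduce_funerals":  [40, 35, 10],
    "reduce_community": [38, 30, 25],
    "vaccinate":        [20, 45, 15],
}

def expected(action):
    return sum(w * v for w, v in zip(weights, payoff[action]))

# Best action if we must decide now, averaging over model uncertainty:
best_now = max(expected(a) for a in payoff)
# Best achievable if the true model were revealed before deciding:
best_informed = sum(w * max(payoff[a][i] for a in payoff)
                    for i, w in enumerate(weights))
evpi = best_informed - best_now          # value of resolving model uncertainty
print(round(evpi, 2))
```

    If the same action is best under every model, EVPI is zero, which is the paper's point: uncertainty only matters for management when it changes the ranking of actions.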

  7. Association between obesity and reduced body temperature in dogs.

    Science.gov (United States)

    Piccione, G; Giudice, E; Fazio, F; Refinetti, R

    2011-08-01

    Industrialized nations are currently experiencing an obesity epidemic, the causes of which are not fully known. One possible mechanism of enhanced energy efficiency that has received almost no attention is a reduction in the metabolic cost of homeothermy, which could be achieved by a modest lowering of body core temperature. We evaluated the potential of this obesity-inducing mechanism in a canine model of the metabolic syndrome. We compared the rectal temperature of lean dogs and obese dogs by (a) conducting cross-sectional measurements in 287 dogs of many breeds varying greatly in body size, (b) conducting longitudinal measurements in individual dogs over 7-10 years and (c) tracking rectal temperature of lean and obese dogs at 3-h intervals for 48 consecutive hours in the laboratory. We found that larger dogs have lower rectal temperatures than smaller dogs and that, for the same body mass, obese dogs have lower rectal temperatures than lean dogs. The results were consistent in the cross-sectional, longitudinal and around-the-clock measurements. These findings document an association between obesity and reduced body temperature in dogs and support the hypothesis that obesity in this and other species of homeotherms may result from an increase in metabolic efficiency achieved by a regulated lowering of body temperature.

  8. Can you put too much on your plate? Uncertainty exposure in servitized triads

    DEFF Research Database (Denmark)

    Kreye, Melanie E.

    2017-01-01

    -national servitized triad in a European-North African set-up which was collected through 29 semi-structured interviews and secondary data. Findings: The empirical study identified the existence of the three uncertainty types and directional knock-on effects between them. Specifically, environmental uncertainty...... relational governance reduced relational uncertainty. The knock-on effects were reduced through organisational and relational responses. Originality: This paper makes two contributions. First, a structured analysis of the uncertainty exposure in servitized triads is presented which shows the existence...... of three individual uncertainty types and the knock-on effects between them. Second, organisational responses to reduce the three uncertainty types individually and the knock-on effects between them are presented....

  9. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    Science.gov (United States)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    Current trends in hydro-meteorological variables indicate the potential impact of climate change on hydrological extremes, and therefore increase the importance of climate adaptation strategies in water management. The impact of climate change on hydro-meteorological and hydrological extremes is, however, highly uncertain. This is due to uncertainties introduced by the climate models, the internal variability inherent to the climate system, the greenhouse gas scenarios and the statistical downscaling methods. In view of the need to define sustainable climate adaptation strategies, these uncertainties have to be assessed. This is commonly done by means of ensemble approaches. Because more and more climate models and statistical downscaling methods become available, there is a need to facilitate the climate impact and uncertainty analysis. A Climate Perturbation Tool has been developed for that purpose; it combines a set of statistical downscaling methods including weather typing, weather generators, transfer functions and advanced perturbation-based approaches. By use of an interactive interface, climate impact modelers can apply these statistical downscaling methods in a semi-automatic way to an ensemble of climate model runs. The tool is applicable to any region, but has so far been demonstrated on cases in Belgium, Suriname, Vietnam and Bangladesh. Time series representing future local-scale precipitation, temperature and potential evapotranspiration (PET) conditions were obtained, starting from time series of historical observations. Uncertainties in the future meteorological conditions are represented in two different ways: through an ensemble of time series, and through a reduced set of synthetic scenarios. Both aim to span the full uncertainty range as assessed from the ensemble of climate model runs and downscaling methods. For Belgium, for instance, use was made of 100-year time series of 10-minute precipitation observations and daily
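
    Of the downscaling families listed, the simplest is a delta-change perturbation of the observed series; the monthly change factors below are invented for illustration, not tool output.

```python
# Delta-change perturbation: scale each observed value by the climate-model
# relative change for its calendar month. Factors here are made up.
change_factor = {1: 1.12, 7: 0.85}   # e.g. wetter Januaries, drier Julys

def perturb(series, months):
    """Apply the month-specific change factor to each observation."""
    return [p * change_factor.get(m, 1.0) for p, m in zip(series, months)]

obs = [10.0, 4.0]                    # daily precipitation [mm]
print(perturb(obs, [1, 7]))          # January and July values, perturbed
```

    The more advanced perturbation approaches mentioned in the abstract also alter wet-day frequency and extremes rather than applying a single multiplicative factor.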

  10. Uncertainties in modeling and scaling in the prediction of fuel stored energy and thermal response

    International Nuclear Information System (INIS)

    Wulff, W.

    1987-01-01

    The steady-state temperature distribution and the stored energy in nuclear fuel elements are computed by analytical methods and used to rank, in the order of importance, the effects on stored energy from statistical uncertainties in modeling parameters, in boundary and in operating conditions. An integral technique is used to calculate the transient fuel temperature and to estimate the uncertainties in predicting the fuel thermal response and the peak clad temperature during a large-break loss of coolant accident. The uncertainty analysis presented here is an important part of evaluating the applicability, the uncertainties and the scaling capabilities of computer codes for nuclear reactor safety analyses. The methods employed in this analysis merit general attention because of their simplicity. It is shown that the blowdown peak is dominated by fuel stored energy alone or, equivalently, by linear heating rate. Gap conductance, peaking factors and fuel thermal conductivity are the three most important fuel modeling parameters affecting peak clad temperature uncertainty. 26 refs., 10 figs., 6 tabs
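
    The dominance of the linear heating rate noted above reflects the standard steady-state conduction result for a cylindrical fuel pellet; the values below are illustrative, not the paper's data.

```python
# Steady-state fuel centerline temperature rise from the linear heating
# rate q' [W/m], via the classical conduction result for a cylinder:
#   T_center - T_surface = q' / (4 * pi * k)
import math

def centerline_rise(q_lin, k_fuel):
    """Temperature rise [K] from pellet surface to centerline."""
    return q_lin / (4.0 * math.pi * k_fuel)

q_lin = 20e3      # W/m, a typical LWR linear heating rate (illustrative)
k_fuel = 3.0      # W/(m K), UO2 conductivity (temperature-dependent in reality)
print(round(centerline_rise(q_lin, k_fuel)))   # about 530 K for these values
```

    Because the rise depends only on q' and k, stored energy tracks the linear heating rate closely, consistent with the ranking reported in the abstract.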

  11. Uncertainty in Simulating Wheat Yields Under Climate Change

    Energy Technology Data Exchange (ETDEWEB)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O' Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas, and improved quantification of uncertainty through multi-model ensembles, are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts in order to develop adaptation strategies and aid policymaking.
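
    The finding that the crop-model component of uncertainty exceeds the GCM component comes from partitioning ensemble spread; the toy two-way sketch below uses made-up yield impacts (%), not the study's ensemble.

```python
# Partitioning ensemble spread into crop-model and climate-model (GCM)
# components: variance of row means vs. variance of column means.
yields = {                     # yields[crop_model] = impact (%) under 3 GCMs
    "model_a": [-5.0, -8.0, -3.0],
    "model_b": [-12.0, -15.0, -10.0],
    "model_c": [-2.0, -6.0, -1.0],
}
names = list(yields)
n_gcm = 3
all_vals = [v for row in yields.values() for v in row]
grand = sum(all_vals) / len(all_vals)

model_means = [sum(yields[m]) / n_gcm for m in names]
gcm_means = [sum(yields[m][j] for m in names) / len(names) for j in range(n_gcm)]

var_model = sum((v - grand) ** 2 for v in model_means) / len(names)
var_gcm = sum((v - grand) ** 2 for v in gcm_means) / n_gcm
print(var_model > var_gcm)     # crop-model spread dominates in this toy case
```

    A full analysis would also separate the interaction and residual terms, but the row-vs-column comparison captures the headline result.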

  12. Uncertainty analysis of thermal quantities measurement in a centrifugal compressor

    Science.gov (United States)

    Hurda, Lukáš; Matas, Richard

    2017-09-01

    The process of evaluating compressor performance characteristics, based on measurements of pressure, temperature and other quantities, is examined to find uncertainties for directly measured and derived quantities. CFD is used as a tool to quantify the influence of different sources of measurement uncertainty for single- and multi-thermocouple total temperature probes. The heat conduction through the body of the thermocouple probe and the heat-up of the air in the intake piping are the main phenomena of interest.

  13. Measurement uncertainty: Friend or foe?

    Science.gov (United States)

    Infusino, Ilenia; Panteghini, Mauro

    2018-02-02

    The definition and enforcement of a reference measurement system, based on the implementation of metrological traceability of patients' results to higher order reference methods and materials, together with a clinically acceptable level of measurement uncertainty, are fundamental requirements to produce accurate and equivalent laboratory results. The uncertainty associated with each step of the traceability chain should be governed to obtain a final combined uncertainty on clinical samples fulfilling the requested performance specifications. It is important that end-users (i.e., clinical laboratory) may know and verify how in vitro diagnostics (IVD) manufacturers have implemented the traceability of their calibrators and estimated the corresponding uncertainty. However, full information about traceability and combined uncertainty of calibrators is currently very difficult to obtain. Laboratory professionals should investigate the need to reduce the uncertainty of the higher order metrological references and/or to increase the precision of commercial measuring systems. Accordingly, the measurement uncertainty should not be considered a parameter to be calculated by clinical laboratories just to fulfil the accreditation standards, but it must become a key quality indicator to describe both the performance of an IVD measuring system and the laboratory itself. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
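
    Combining the uncertainty contributed at each step of the traceability chain follows the standard GUM root-sum-of-squares rule for independent components; the numbers below are illustrative, not from any IVD system.

```python
# Combined standard uncertainty of a result traceable through several
# steps: root-sum-of-squares of independent standard uncertainties (GUM).
import math

def combined_uncertainty(*components):
    """Quadrature combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

u_reference = 0.8    # higher-order reference material (illustrative units)
u_calibrator = 1.2   # manufacturer's calibrator value assignment
u_routine = 1.5      # within-laboratory imprecision on clinical samples
u_total = combined_uncertainty(u_reference, u_calibrator, u_routine)
print(round(u_total, 2))
```

    Because the components add in quadrature, the largest term dominates: shrinking the biggest contributor (here, routine imprecision) pays off far more than polishing the smallest.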

  14. Reducing uncertainty in wind turbine blade health inspection with image processing techniques

    Science.gov (United States)

    Zhang, Huiyi

    Structural health inspection has been widely applied in the operation of wind farms to find early cracks in wind turbine blades (WTBs). Increased numbers of turbines and expanded rotor diameters are driving up the workloads and safety risks for site employees. Therefore, it is important to automate the inspection process as well as minimize the uncertainties involved in routine blade health inspection. In addition, crack documentation and trending are vital to assessing rotor blade and turbine reliability over the 20-year design life span. A new crack recognition and classification algorithm is described that can support automated structural health inspection of the surface of large composite WTBs. The first part of the study investigated the feasibility of digital image processing in WTB health inspection and defined the capability of numerically detecting cracks as small as hairline thickness. The second part of the study identified and analyzed the uncertainty of the digital image processing method. A self-learning algorithm was proposed to recognize and classify cracks without comparing a blade image to a library of crack images. The last part of the research quantified the uncertainty in the field conditions and the image processing methods.

  15. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)

  16. Uncertainty quantification and sensitivity analysis with CASL Core Simulator VERA-CS

    International Nuclear Information System (INIS)

    Brown, C.S.; Zhang, Hongbin

    2016-01-01

    VERA-CS (Virtual Environment for Reactor Applications, Core Simulator) is a coupled neutron transport and thermal-hydraulics code under development by the Consortium for Advanced Simulation of Light Water Reactors (CASL). An approach to uncertainty quantification and sensitivity analysis with VERA-CS was developed and a new toolkit was created to perform uncertainty quantification and sensitivity analysis. A 2 × 2 fuel assembly model was developed and simulated by VERA-CS, and uncertainty quantification and sensitivity analysis were performed with fourteen uncertain input parameters. The minimum departure from nucleate boiling ratio (MDNBR), maximum fuel center-line temperature, and maximum outer clad surface temperature were chosen as the selected figures of merit. Pearson, Spearman, and partial correlation coefficients were considered for all of the figures of merit in sensitivity analysis and coolant inlet temperature was consistently the most influential parameter. Parameters used as inputs to the critical heat flux calculation with the W-3 correlation were shown to be the most influential on the MDNBR, maximum fuel center-line temperature, and maximum outer clad surface temperature.
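
    The correlation-based ranking described above can be sketched with a plain Pearson coefficient between sampled inputs and a figure of merit; the data below are synthetic, with an invented inlet-temperature effect size, not VERA-CS output.

```python
# Sampling-based sensitivity ranking: correlate an uncertain input with a
# figure of merit over Monte Carlo samples (synthetic data).
import math
import random

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

random.seed(0)
t_inlet = [random.gauss(290.0, 2.0) for _ in range(200)]   # inlet T [C]
noise = [random.gauss(0.0, 0.5) for _ in range(200)]       # other influences
mdnbr = [3.0 - 0.2 * (t - 290.0) + e for t, e in zip(t_inlet, noise)]

r = pearson(t_inlet, mdnbr)
print(r < 0)   # a hotter inlet erodes the MDNBR margin
```

    Spearman's coefficient is the same computation applied to ranks, and partial correlation additionally regresses out the other thirteen inputs before correlating.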

  17. Where do uncertainties reside within environmental risk assessments? Expert opinion on uncertainty distributions for pesticide risks to surface water organisms.

    Science.gov (United States)

    Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T

    2016-12-01

    A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations, in which subject-matter experts, for pesticide risks to surface water organisms, validate and assess three dimensions of uncertainty: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty described will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  18. Uncertainty relation on a world crystal and its applications to micro black holes

    International Nuclear Information System (INIS)

    Jizba, Petr; Kleinert, Hagen; Scardigli, Fabio

    2010-01-01

    We formulate generalized uncertainty relations in a crystal-like universe - a 'world crystal' - whose lattice spacing is of the order of the Planck length. In the particular case when energies lie near the border of the Brillouin zone, i.e., for Planckian energies, the uncertainty relation for position and momenta does not pose any lower bound on the involved uncertainties. We apply our results to micro black hole physics, where we derive a new mass-temperature relation for Schwarzschild micro black holes. In contrast to standard results based on the Heisenberg and stringy uncertainty relations, our mass-temperature formula predicts both a finite Hawking temperature and a zero rest-mass remnant at the end of micro black hole evaporation. We also briefly mention some connections of the world-crystal paradigm with 't Hooft's quantization and doubly special relativity.
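
    For context, the standard Heisenberg-based benchmark that the world-crystal relation modifies can be sketched as follows (textbook estimate, not the paper's formula):

```latex
% Applying the Heisenberg relation at a Schwarzschild horizon of size
% r_s \sim 2GM/c^2 gives the usual Hawking temperature, which diverges
% as M -> 0 and so leaves no remnant:
\Delta x\,\Delta p \gtrsim \frac{\hbar}{2}
\quad\Longrightarrow\quad
T_H \simeq \frac{\hbar c^{3}}{8\pi G k_{B} M}
\;\xrightarrow[\;M\to 0\;]{}\;\infty .
```

    Against this benchmark, the abstract's claim is that the world-crystal uncertainty relation instead yields a finite evaporation endpoint temperature and a zero rest-mass remnant.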

  19. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    Science.gov (United States)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate

  20. Reducing Multisensor Satellite Monthly Mean Aerosol Optical Depth Uncertainty: 1. Objective Assessment of Current AERONET Locations

    Science.gov (United States)

    Li, Jing; Li, Xichen; Carlson, Barbara E.; Kahn, Ralph A.; Lacis, Andrew A.; Dubovik, Oleg; Nakajima, Teruyuki

    2016-01-01

    Various space-based sensors have been designed and corresponding algorithms developed to retrieve aerosol optical depth (AOD), the very basic aerosol optical property, yet considerable disagreement still exists across these different satellite data sets. Surface-based observations aim to provide ground truth for validating satellite data; hence, their deployment locations should preferably contain as much spatial information as possible, i.e., high spatial representativeness. Using a novel Ensemble Kalman Filter (EnKF)-based approach, we objectively evaluate the spatial representativeness of current Aerosol Robotic Network (AERONET) sites. Multisensor monthly mean AOD data sets from Moderate Resolution Imaging Spectroradiometer, Multiangle Imaging Spectroradiometer, Sea-viewing Wide Field-of-view Sensor, Ozone Monitoring Instrument, and Polarization and Anisotropy of Reflectances for Atmospheric Sciences coupled with Observations from a Lidar are combined into a 605-member ensemble, and AERONET data are considered as the observations to be assimilated into this ensemble using the EnKF. The assessment is made by comparing the analysis error variance (that has been constrained by ground-based measurements), with the background error variance (based on satellite data alone). Results show that the total uncertainty is reduced by approximately 27% on average and could reach above 50% over certain places. The uncertainty reduction pattern also has distinct seasonal patterns, corresponding to the spatial distribution of seasonally varying aerosol types, such as dust in the spring for Northern Hemisphere and biomass burning in the fall for Southern Hemisphere. Dust and biomass burning sites have the highest spatial representativeness, rural and oceanic sites can also represent moderate spatial information, whereas the representativeness of urban sites is relatively localized. A spatial score ranging from 1 to 3 is assigned to each AERONET site based on the uncertainty
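
    The variance reduction quoted above comes from the EnKF analysis step; in the scalar case the mechanics reduce to the Kalman update below (variances are illustrative, not the study's values).

```python
# Scalar Kalman analysis step: how assimilating one ground observation
# shrinks the background (satellite-ensemble) error variance.
def analysis_variance(var_background, var_obs):
    """Posterior (analysis) error variance after one scalar update."""
    k = var_background / (var_background + var_obs)   # Kalman gain
    return (1.0 - k) * var_background

var_b = 0.04   # satellite-ensemble AOD error variance (illustrative)
var_o = 0.01   # AERONET observation error variance (illustrative)
var_a = analysis_variance(var_b, var_o)
reduction = 1.0 - var_a / var_b
print(round(reduction, 2))   # fraction of uncertainty removed by the site
```

    Sites whose observations constrain many ensemble members' shared error (high spatial representativeness) yield the largest reduction, which is what the spatial score summarizes.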

  1. Mid-infrared response of reduced graphene oxide and its high-temperature coefficient of resistance

    Directory of Open Access Journals (Sweden)

    Haifeng Liang

    2014-10-01

    Full Text Available Much effort has been made to study the formation mechanisms of photocurrents in graphene and reduced graphene oxide films under visible and near-infrared light irradiation. A built-in field and photo-thermal electrons have been applied to explain the experiments. However, much less attention has been paid to clarifying the mid-infrared response of reduced graphene oxide films at room temperature. Thus, mid-infrared photoresponse and annealing temperature-dependent resistance experiments were carried out on reduced graphene oxide films. A maximum photocurrent of 75 μA was observed at room temperature, which was dominated by the bolometer effect, where the resistance of the films decreased as the temperature increased after they had absorbed light. The electrons localized in the defect states and the residual oxygen groups were thermally excited into the conduction band, forming a photocurrent. In addition, a temperature increase of 2 °C for the films after light irradiation for 2 minutes was observed using absorption power calculations. This work details a way to use reduced graphene oxide films that contain appropriate defects and residual oxygen groups as bolometer-sensitive materials in the mid-infrared range.
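
    The figure of merit for a bolometer film like this is its temperature coefficient of resistance (TCR); the finite-difference sketch below uses invented resistance values, not the paper's measurements.

```python
# Temperature coefficient of resistance: TCR = (1/R) * dR/dT,
# approximated between two (T, R) measurement points.
def tcr(r1, r2, t1, t2):
    """Mean TCR [1/K] between two temperature-resistance points."""
    return (r2 - r1) / (r1 * (t2 - t1))

# Invented values for a film whose resistance falls as it warms,
# i.e. a negative TCR, as in the bolometer effect described above.
print(round(tcr(1000.0, 980.0, 25.0, 27.0), 4))
```

    A larger |TCR| means a bigger resistance change, hence a bigger photocurrent, for the same mid-infrared heating.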

  2. The Findings from the OECD/NEA/CSNI UMS (Uncertainty Method Study)

    International Nuclear Information System (INIS)

    D'Auria, F.; Glaeser, H.

    2013-01-01

    Within licensing procedures there is an incentive to replace the conservative requirements for code application by a 'best estimate' concept supplemented by an uncertainty analysis to account for predictive uncertainties of code results. Methods have been developed to quantify these uncertainties. The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI (Committee on the Safety of Nuclear Installations) of the OECD/NEA (Organisation for Economic Co-operation and Development / Nuclear Energy Agency), has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes. Most of the methods identify and combine input uncertainties. The major differences between the predictions of the methods came from the choice of uncertain parameters and the quantification of the input uncertainties, i.e. the width of the uncertainty ranges. Therefore, suitable experimental and analytical information has to be selected to specify these uncertainty ranges or distributions. After the closure of the Uncertainty Method Study (UMS) and after the report was issued, comparison calculations of experiment LSTF-SB-CL-18 were performed by the University of Pisa using different versions of the RELAP5 code. It turned out that the version used by two of the participants calculated a 170 K higher peak clad temperature compared with other versions using the same input deck. This may contribute to the differences in the upper limit of the uncertainty ranges. A 'bifurcation' analysis was also performed by the same research group, providing another way of interpreting the high temperature peak calculated by two of the participants. (authors)
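
    Several of the propagation methods compared in the UMS set their number of code runs with Wilks' formula; the sketch below gives the standard first-order, one-sided result (a textbook formula, not taken from the report).

```python
# Wilks' formula: the minimum number of code runs n such that the largest
# sampled output bounds the 95th percentile with 95% confidence satisfies
#   1 - coverage**n >= confidence.
import math

def wilks_runs(coverage=0.95, confidence=0.95):
    """First-order, one-sided sample size for a tolerance limit."""
    return math.ceil(math.log(1.0 - confidence) / math.log(coverage))

print(wilks_runs())  # → 59, the classic 95/95 run count
```

    This is why the sampling-based methods in such comparisons typically use on the order of 59-100 code runs regardless of how many uncertain input parameters are varied.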

  3. Uncertainty analysis of suppression pool heating during an ATWS in a BWR-5 plant

    International Nuclear Information System (INIS)

    Wulff, W.; Cheng, H.S.; Mallen, A.N.; Johnsen, G.W.; Lellouche, G.S.

    1994-03-01

The uncertainty in predicting the peak temperature in the suppression pool of a BWR power plant undergoing an NRC-postulated Anticipated Transient Without Scram (ATWS) has been estimated. The ATWS is initiated by recirculation-pump trips and then leads to power and flow oscillations such as occurred at the LaSalle-2 Power Station in March of 1988. After limit-cycle oscillations have been established, the turbines are tripped, but without MSIV closure, allowing steam discharge through the turbine bypass into the condenser. Postulated operator actions, namely to lower the reactor vessel pressure and the level elevation in the downcomer, are simulated by a robot model which accounts for operator uncertainty. All balance-of-plant and control-system modeling uncertainties were part of the statistical uncertainty analysis, which was patterned after the Code Scaling, Applicability and Uncertainty (CSAU) evaluation methodology. The analysis showed that the predicted suppression-pool peak temperature of 329.3 K (133 degrees F) has a 95-percentile uncertainty of 14.4 K (26 degrees F), and that the size of this uncertainty bracket is dominated by the experimental uncertainty of measuring Safety and Relief Valve mass flow rates under critical-flow conditions. The analysis also showed that the probability of exceeding the suppression-pool temperature limit of 352.6 K (175 degrees F) is most likely zero (it is estimated as < 5 × 10⁻⁴). The square root of the sum of the squares of all the computed peak pool temperatures is 350.7 K (171.6 degrees F).
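The CSAU-style statistical treatment, in which sampled input uncertainties are propagated through the model and a high percentile of the response is read off, can be sketched as follows. The surrogate model, the input distributions, and the sensitivity coefficients are illustrative assumptions, not the study's actual plant model:

```python
import random
import statistics

random.seed(42)

def peak_pool_temp(srv_flow_factor: float, op_delay_s: float) -> float:
    """Hypothetical surrogate for the plant simulation: peak suppression-pool
    temperature (K) as a function of two uncertain inputs."""
    base = 329.3  # nominal peak temperature reported in the study, K
    return base + 20.0 * (srv_flow_factor - 1.0) + 0.01 * op_delay_s

samples = []
for _ in range(10_000):
    srv = random.gauss(1.0, 0.03)       # SRV mass-flow factor (assumed 3% uncertainty)
    delay = random.uniform(0.0, 120.0)  # operator-action delay, s (assumed)
    samples.append(peak_pool_temp(srv, delay))

samples.sort()
p95 = samples[int(0.95 * len(samples))]
print(f"95th-percentile peak pool temperature: {p95:.1f} K")
```

With a real thermal-hydraulic code each sample would be a full transient run, which is why CSAU puts so much weight on identifying and ranging the few dominant parameters first.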

  4. The reliability of structural systems operating at high temperature: Replacing engineering judgement with operational experience

    International Nuclear Information System (INIS)

    Chevalier, M.J.; Smith, D.J.; Dean, D.W.

    2012-01-01

Deterministic assessments are used to assess the integrity of structural systems operating at high temperature by providing a lower-bound lifetime prediction, which requires considerable engineering judgement. Such a result may not satisfy the purpose of the structural integrity assessment if it is overly conservative; conversely, plant observations (such as failures) could undermine the assessment result if observed before the lower-bound lifetime. This paper develops a reliability methodology for high temperature assessments and illustrates the impact and importance of managing the uncertainties within such an analysis. This is done by separating uncertainties into three classifications: aleatory uncertainty, quantifiable epistemic uncertainty and unquantifiable epistemic uncertainty. The result is a reliability model that can predict the behaviour of a structural system based upon plant observations, including failure and survival data. This can be used to reduce the over-reliance upon engineering judgement which is prevalent in deterministic assessments. Highlights: ► Deterministic assessments are shown to be heavily reliant upon engineering judgement. ► Based upon the R5 procedure, a reliability model for a structural system is developed. ► Variables must be classified as either aleatory or epistemic to model their impact on reliability. ► Operational experience is then used to reduce reliance upon engineering judgement. ► This results in a model which can predict system behaviour and learn from operational experience.

  5. Potential for improved radiation thermometry measurement uncertainty through implementing a primary scale in an industrial laboratory

    Science.gov (United States)

    Willmott, Jon R.; Lowe, David; Broughton, Mick; White, Ben S.; Machin, Graham

    2016-09-01

    A primary temperature scale requires realising a unit in terms of its definition. For high temperature radiation thermometry in terms of the International Temperature Scale of 1990 this means extrapolating from the signal measured at the freezing temperature of gold, silver or copper using Planck’s radiation law. The difficulty in doing this means that primary scales above 1000 °C require specialist equipment and careful characterisation in order to achieve the extrapolation with sufficient accuracy. As such, maintenance of the scale at high temperatures is usually only practicable for National Metrology Institutes, and calibration laboratories have to rely on a scale calibrated against transfer standards. At lower temperatures it is practicable for an industrial calibration laboratory to have its own primary temperature scale, which reduces the number of steps between the primary scale and end user. Proposed changes to the SI that will introduce internationally accepted high temperature reference standards might make it practicable to have a primary high temperature scale in a calibration laboratory. In this study such a scale was established by calibrating radiation thermometers directly to high temperature reference standards. The possible reduction in uncertainty to an end user as a result of the reduced calibration chain was evaluated.
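The extrapolation step the abstract describes, inferring a high temperature from the ratio of a thermometer's signal to its signal at a fixed-point calibration via Planck's radiation law, can be illustrated numerically. The operating wavelength is an assumed value, and the closed-form inversion uses the Wien approximation rather than a full ITS-90 realisation:

```python
import math

C2 = 0.014388   # second radiation constant c2, m*K
LAM = 650e-9    # thermometer operating wavelength, m (an assumed value)
T_CU = 1357.77  # freezing point of copper on ITS-90, K

def signal_ratio(t: float, t_ref: float = T_CU) -> float:
    """Planck-law ratio of monochromatic signals at temperature t vs. t_ref."""
    return (math.exp(C2 / (LAM * t_ref)) - 1.0) / (math.exp(C2 / (LAM * t)) - 1.0)

def temperature_from_ratio(ratio: float, t_ref: float = T_CU) -> float:
    """Invert the ratio with the Wien approximation, valid when c2/(lam*T) >> 1."""
    return C2 / (LAM * (C2 / (LAM * t_ref) - math.log(ratio)))

# Measure the signal ratio against the copper point, then recover the temperature.
r = signal_ratio(1800.0)
print(f"recovered temperature: {temperature_from_ratio(r):.3f} K")
```

At these wavelengths and temperatures the Wien error is far below a millikelvin; the practical difficulty lies in measuring the signal ratio itself with sufficient accuracy over several decades of signal.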

  6. LOWERING UNCERTAINTY IN CRUDE OIL MEASUREMENT BY SELECTING OPTIMIZED ENVELOPE COLOR OF A PIPELINE

    Directory of Open Access Journals (Sweden)

    Morteza Saadat

    2011-01-01

Full Text Available Lowering uncertainty in crude oil volume measurement is widely considered one of the main goals at an oil export terminal. Crude oil temperature at the metering station has a large effect on the measured volume and can introduce significant uncertainty at the metering point. As crude oil flows through an aboveground pipeline, it picks up solar radiation and heats up. This causes the oil temperature at the metering point to rise and higher uncertainty to be created. The amount of temperature rise depends on the paint color of the exterior surface. On Kharg Island, there is about 3 km of pipeline between the oil storage tanks and the metering point. The oil flows through the pipeline by gravity, as the storage tanks are located 60 m higher than the metering point. In this study, an analytical model has been developed for predicting the oil temperature at the pipeline exit (the metering point) based on the climate and geographical conditions of Kharg Island. The temperature at the metering point has been calculated and the effects of envelope color have been investigated. Further, the uncertainty in the measurement system due to the temperature rise has been studied.
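The color dependence can be sketched with a steady-state energy balance in which the power absorbed over the pipe's projected area scales with the paint's solar absorptivity. All values below (flux, geometry, flow rate, absorptivities) are assumed for illustration and are not the Kharg Island study's data:

```python
# Assumed representative values, not the study's measured data.
SOLAR_FLUX = 800.0  # W/m^2 incident on the pipe's projected area
DIAMETER = 1.0      # pipe outer diameter, m
LENGTH = 3000.0     # distance from the tanks to the metering point, m
MASS_FLOW = 500.0   # crude oil mass flow rate, kg/s
CP_OIL = 2000.0     # specific heat of crude oil, J/(kg*K)

def exit_temp_rise(absorptivity: float) -> float:
    """Steady-state temperature rise, neglecting convective/radiative losses."""
    absorbed_w = absorptivity * SOLAR_FLUX * DIAMETER * LENGTH  # projected area D*L
    return absorbed_w / (MASS_FLOW * CP_OIL)

for color, alpha in [("white", 0.25), ("aluminium", 0.40), ("black", 0.95)]:
    print(f"{color:9s} paint: +{exit_temp_rise(alpha):.2f} K at the metering point")
```

Even this loss-free sketch shows why a low-absorptivity envelope color reduces the temperature rise, and hence the volume-measurement uncertainty, at the metering point.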

  7. Uncertainty analysis of the FRAP code

    International Nuclear Information System (INIS)

    Peck, S.O.

    1978-01-01

A user-oriented, automated uncertainty analysis capability has been built into the Fuel Rod Analysis Program (FRAP) code and has been applied to a pressurized water reactor (PWR) fuel rod undergoing a loss-of-coolant accident (LOCA). The method of uncertainty analysis is the response surface method. The automated version significantly reduced the time required to complete the analysis and, at the same time, greatly increased the problem scope. Results of the analysis showed a significant difference in the total and relative contributions to the uncertainty of the response parameters between steady-state and transient conditions.
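The response surface method replaces the expensive code with a cheap fitted surface for the statistical part of the analysis. A minimal sketch, with a stand-in function instead of FRAP and a one-parameter quadratic surface:

```python
import random

random.seed(1)

def expensive_code(gap: float) -> float:
    """Stand-in for a full FRAP transient run: peak clad temperature (K) as a
    function of one uncertain input (a normalized gap conductance)."""
    return 1100.0 + 150.0 * (gap - 1.0) ** 2 - 80.0 * (gap - 1.0)

# Step 1: a small design of experiments -- three runs of the "expensive" code.
xs = [0.8, 1.0, 1.2]
ys = [expensive_code(x) for x in xs]

# Step 2: fit the quadratic through the three points (Lagrange interpolation);
# this cheap response surface replaces the code in the statistical analysis.
def surface(x: float) -> float:
    (x0, x1, x2), (y0, y1, y2) = xs, ys
    return (y0 * (x - x1) * (x - x2) / ((x0 - x1) * (x0 - x2))
            + y1 * (x - x0) * (x - x2) / ((x1 - x0) * (x1 - x2))
            + y2 * (x - x0) * (x - x1) / ((x2 - x0) * (x2 - x1)))

# Step 3: propagate the input distribution through the surface, not the code.
pcts = sorted(surface(random.gauss(1.0, 0.1)) for _ in range(20_000))
print(f"95th-percentile peak clad temperature: {pcts[int(0.95 * len(pcts))]:.1f} K")
```

In practice the surface is fitted over several parameters from a factorial design of code runs; the Monte Carlo step then costs microseconds per sample instead of a full transient simulation.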

  8. Assessing uncertainty in high-resolution spatial climate data across the US Northeast.

    Science.gov (United States)

    Bishop, Daniel A; Beier, Colin M

    2013-01-01

    Local and regional-scale knowledge of climate change is needed to model ecosystem responses, assess vulnerabilities and devise effective adaptation strategies. High-resolution gridded historical climate (GHC) products address this need, but come with multiple sources of uncertainty that are typically not well understood by data users. To better understand this uncertainty in a region with a complex climatology, we conducted a ground-truthing analysis of two 4 km GHC temperature products (PRISM and NRCC) for the US Northeast using 51 Cooperative Network (COOP) weather stations utilized by both GHC products. We estimated GHC prediction error for monthly temperature means and trends (1980-2009) across the US Northeast and evaluated any landscape effects (e.g., elevation, distance from coast) on those prediction errors. Results indicated that station-based prediction errors for the two GHC products were similar in magnitude, but on average, the NRCC product predicted cooler than observed temperature means and trends, while PRISM was cooler for means and warmer for trends. We found no evidence for systematic sources of uncertainty across the US Northeast, although errors were largest at high elevations. Errors in the coarse-scale (4 km) digital elevation models used by each product were correlated with temperature prediction errors, more so for NRCC than PRISM. In summary, uncertainty in spatial climate data has many sources and we recommend that data users develop an understanding of uncertainty at the appropriate scales for their purposes. To this end, we demonstrate a simple method for utilizing weather stations to assess local GHC uncertainty and inform decisions among alternative GHC products.
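The station-based ground-truthing the authors recommend reduces to computing simple error statistics between each station observation and its co-located grid-cell prediction. A minimal sketch with made-up numbers:

```python
# Hypothetical monthly mean temperatures (deg C) at five COOP stations, paired
# with the co-located 4 km grid-cell values from a gridded historical product.
observed  = [21.3, 18.9, 15.2, 9.8, 4.1]
predicted = [20.8, 18.4, 15.5, 9.1, 3.2]

errors = [p - o for p, o in zip(predicted, observed)]
bias = sum(errors) / len(errors)                 # mean prediction error
mae = sum(abs(e) for e in errors) / len(errors)  # mean absolute error

print(f"bias {bias:+.2f} deg C (negative = product runs cool), MAE {mae:.2f} deg C")
```

Repeating this per station and regressing the errors against elevation or distance from the coast is one way to check for the landscape effects the study examined.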

  9. Regional amplification of projected changes in extreme temperatures strongly controlled by soil moisture-temperature feedbacks

    Science.gov (United States)

    Vogel, M. M.; Orth, R.; Cheruy, F.; Hagemann, S.; Lorenz, R.; Hurk, B. J. J. M.; Seneviratne, S. I.

    2017-02-01

    Regional hot extremes are projected to increase more strongly than global mean temperature, with substantially larger changes than 2°C even if global warming is limited to this level. We investigate the role of soil moisture-temperature feedbacks for this response based on multimodel experiments for the 21st century with either interactive or fixed (late 20th century mean seasonal cycle) soil moisture. We analyze changes in the hottest days in each year in both sets of experiments, relate them to the global mean temperature increase, and investigate processes leading to these changes. We find that soil moisture-temperature feedbacks significantly contribute to the amplified warming of the hottest days compared to that of global mean temperature. This contribution reaches more than 70% in Central Europe and Central North America. Soil moisture trends are more important for this response than short-term soil moisture variability. These results are relevant for reducing uncertainties in regional temperature projections.

  10. Accounting for uncertainty in marine reserve design.

    Science.gov (United States)

    Halpern, Benjamin S; Regan, Helen M; Possingham, Hugh P; McCarthy, Michael A

    2006-01-01

    Ecosystems and the species and communities within them are highly complex systems that defy predictions with any degree of certainty. Managing and conserving these systems in the face of uncertainty remains a daunting challenge, particularly with respect to developing networks of marine reserves. Here we review several modelling frameworks that explicitly acknowledge and incorporate uncertainty, and then use these methods to evaluate reserve spacing rules given increasing levels of uncertainty about larval dispersal distances. Our approach finds similar spacing rules as have been proposed elsewhere - roughly 20-200 km - but highlights several advantages provided by uncertainty modelling over more traditional approaches to developing these estimates. In particular, we argue that uncertainty modelling can allow for (1) an evaluation of the risk associated with any decision based on the assumed uncertainty; (2) a method for quantifying the costs and benefits of reducing uncertainty; and (3) a useful tool for communicating to stakeholders the challenges in managing highly uncertain systems. We also argue that incorporating rather than avoiding uncertainty will increase the chances of successfully achieving conservation and management goals.

  11. Using internal discharge data in a distributed conceptual model to reduce uncertainty in streamflow simulations

    Science.gov (United States)

    Guerrero, J.; Halldin, S.; Xu, C.; Lundin, L.

    2011-12-01

Distributed hydrological models are important tools in water management as they account for the spatial variability of the hydrological data, and they are able to produce spatially distributed outputs. They can directly incorporate and assess potential changes in the characteristics of our basins. A recognized problem for models in general is equifinality, which is only exacerbated for distributed models, which tend to have a large number of parameters. We need to deal with the fundamentally ill-posed nature of the problem that such models force us to face, i.e. a large number of parameters and very few variables that can be used to constrain them, often only the catchment discharge. There is a growing but still limited literature showing how the internal states of a distributed model can be used to calibrate/validate its predictions. In this paper, a distributed version of WASMOD, a conceptual rainfall-runoff model with only three parameters, combined with a routing algorithm based on the high-resolution HydroSHEDS data, was used to simulate the discharge in the Paso La Ceiba basin in Honduras. The parameter space was explored using Monte Carlo simulations, and the region of space containing the parameter sets considered behavioral according to two different criteria was delimited using the geometric concept of alpha-shapes. The discharge data from five internal sub-basins were used to aid in the calibration of the model and to answer the following questions: Can this information improve the simulations at the outlet of the catchment, or decrease their uncertainty? Also, after reducing the number of model parameters needing calibration through sensitivity analysis: Is it possible to relate them to basin characteristics? The analysis revealed that in most cases the internal discharge data can be used to reduce the uncertainty in the discharge at the outlet, albeit with little improvement in the overall simulation results.
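The behavioral-filtering idea, Monte Carlo sampling of parameters and keeping only the sets that reproduce both the outlet and an internal gauge, can be sketched with a toy one-parameter model (a stand-in, not WASMOD); all data below are synthetic:

```python
import random

random.seed(7)

rain = [random.uniform(0.0, 20.0) for _ in range(100)]  # synthetic daily rainfall, mm

def runoff(coeff: float) -> list:
    """Toy one-parameter rainfall-runoff model (a stand-in, not WASMOD)."""
    return [coeff * r for r in rain]

truth = runoff(0.35)
obs_outlet = [q + random.gauss(0.0, 0.5) for q in truth]          # noisy outlet gauge
obs_internal = [0.5 * q + random.gauss(0.0, 0.3) for q in truth]  # internal sub-basin

def nse(sim: list, obs: list) -> float:
    """Nash-Sutcliffe efficiency: 1 is perfect, <= 0 is no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

# Monte Carlo sampling; a parameter set is "behavioral" if it exceeds a threshold.
params = [random.uniform(0.1, 0.6) for _ in range(2000)]
behavioral_outlet = [p for p in params if nse(runoff(p), obs_outlet) > 0.8]
behavioral_both = [p for p in behavioral_outlet
                   if nse([0.5 * q for q in runoff(p)], obs_internal) > 0.8]

print(len(behavioral_outlet), "sets are behavioral at the outlet;",
      len(behavioral_both), "remain after adding the internal gauge")
```

Any parameter set rejected by the internal gauge narrows the behavioral region, which is exactly the uncertainty reduction at the outlet that the paper reports.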

  12. Quantifying reactor safety margins: Application of CSAU [Code Scalability, Applicability and Uncertainty] methodology to LBLOCA: Part 3, Assessment and ranging of parameters for the uncertainty analysis of LBLOCA codes

    International Nuclear Information System (INIS)

    Wulff, W.; Boyack, B.E.; Duffey, R.B.

    1988-01-01

    Comparisons of results from TRAC-PF1/MOD1 code calculations with measurements from Separate Effects Tests, and published experimental data for modeling parameters have been used to determine the uncertainty ranges of code input and modeling parameters which dominate the uncertainty in predicting the Peak Clad Temperature for a postulated Large Break Loss of Coolant Accident (LBLOCA) in a four-loop Westinghouse Pressurized Water Reactor. The uncertainty ranges are used for a detailed statistical analysis to calculate the probability distribution function for the TRAC code-predicted Peak Clad Temperature, as is described in an attendant paper. Measurements from Separate Effects Tests and Integral Effects Tests have been compared with results from corresponding TRAC-PF1/MOD1 code calculations to determine globally the total uncertainty in predicting the Peak Clad Temperature for LBLOCAs. This determination is in support of the detailed statistical analysis mentioned above. The analyses presented here account for uncertainties in input parameters, in modeling and scaling, in computing and in measurements. The analyses are an important part of the work needed to implement the Code Scalability, Applicability and Uncertainty (CSAU) methodology. CSAU is needed to determine the suitability of a computer code for reactor safety analyses and the uncertainty in computer predictions. The results presented here are used to estimate the safety margin of a particular nuclear reactor power plant for a postulated accident. 25 refs., 10 figs., 11 tabs

  13. Reduced density matrix functional theory at finite temperature

    Energy Technology Data Exchange (ETDEWEB)

    Baldsiefen, Tim

    2012-10-15

Density functional theory (DFT) is highly successful in many fields of research. There are, however, areas in which its performance is rather limited. An important example is the description of thermodynamical variables of a quantum system in thermodynamical equilibrium. Although the finite-temperature version of DFT (FT-DFT) rests on a firm theoretical basis and is only one year younger than its brother, groundstate DFT, it has been successfully applied to only a few problems. Because FT-DFT, like DFT, is in principle exact, these shortcomings can be attributed to the difficulties of deriving valuable functionals for FT-DFT. In this thesis, we are going to present an alternative theoretical description of quantum systems in thermal equilibrium. It is based on the 1-reduced density matrix (1RDM) of the system, rather than on its density, and will rather cumbersomely be called finite-temperature reduced density matrix functional theory (FT-RDMFT). Its zero-temperature counterpart (RDMFT) proved to be successful in several fields, formerly difficult to address via DFT. These fields include, for example, the calculation of dissociation energies or the calculation of the fundamental gap, also for Mott insulators. This success is mainly due to the fact that the 1RDM carries more directly accessible 'many-body' information than the density alone, leading for example to an exact description of the kinetic energy functional. This sparks the hope that a description of thermodynamical systems employing the 1RDM via FT-RDMFT can yield an improvement over FT-DFT. Giving a short review of RDMFT and pointing out difficulties when describing spin-polarized systems initiates our work. We then lay the theoretical framework for FT-RDMFT by proving the required Hohenberg-Kohn-like theorems, investigating and determining the domain of FT-RDMFT functionals and by deriving several properties of the exact functional. Subsequently, we present a perturbative method to

  14. Reduced density matrix functional theory at finite temperature

    International Nuclear Information System (INIS)

    Baldsiefen, Tim

    2012-10-01

Density functional theory (DFT) is highly successful in many fields of research. There are, however, areas in which its performance is rather limited. An important example is the description of thermodynamical variables of a quantum system in thermodynamical equilibrium. Although the finite-temperature version of DFT (FT-DFT) rests on a firm theoretical basis and is only one year younger than its brother, groundstate DFT, it has been successfully applied to only a few problems. Because FT-DFT, like DFT, is in principle exact, these shortcomings can be attributed to the difficulties of deriving valuable functionals for FT-DFT. In this thesis, we are going to present an alternative theoretical description of quantum systems in thermal equilibrium. It is based on the 1-reduced density matrix (1RDM) of the system, rather than on its density, and will rather cumbersomely be called finite-temperature reduced density matrix functional theory (FT-RDMFT). Its zero-temperature counterpart (RDMFT) proved to be successful in several fields, formerly difficult to address via DFT. These fields include, for example, the calculation of dissociation energies or the calculation of the fundamental gap, also for Mott insulators. This success is mainly due to the fact that the 1RDM carries more directly accessible 'many-body' information than the density alone, leading for example to an exact description of the kinetic energy functional. This sparks the hope that a description of thermodynamical systems employing the 1RDM via FT-RDMFT can yield an improvement over FT-DFT. Giving a short review of RDMFT and pointing out difficulties when describing spin-polarized systems initiates our work. We then lay the theoretical framework for FT-RDMFT by proving the required Hohenberg-Kohn-like theorems, investigating and determining the domain of FT-RDMFT functionals and by deriving several properties of the exact functional. Subsequently, we present a perturbative method to iteratively construct
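For orientation, the central objects of such a theory can be stated in standard notation (textbook definitions, not quoted from the thesis): the 1RDM of a statistical operator, and the grand-potential minimization that defines thermal equilibrium.

```latex
% One-body reduced density matrix (1RDM) of a statistical operator \hat{\rho}:
\gamma(x,x') = \operatorname{Tr}\!\left[\hat{\rho}\,\hat{\psi}^{\dagger}(x')\,\hat{\psi}(x)\right]

% Grand-canonical equilibrium: \hat{\rho} minimizes the grand potential
\Omega[\hat{\rho}] = \operatorname{Tr}\!\left[\hat{\rho}\,\bigl(\hat{H}-\mu\hat{N}+\tfrac{1}{\beta}\ln\hat{\rho}\bigr)\right]
```

FT-RDMFT then recasts the grand potential as a functional of $\gamma$ alone, in analogy with how FT-DFT works with the density.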

  15. Evaluation of Sources of Uncertainties in Solar Resource Measurement

    Energy Technology Data Exchange (ETDEWEB)

    Habte, Aron M [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sengupta, Manajit [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-25

    This poster presents a high-level overview of sources of uncertainties in solar resource measurement, demonstrating the impact of various sources of uncertainties -- such as cosine response, thermal offset, spectral response, and others -- on the accuracy of data from several radiometers. The study provides insight on how to reduce the impact of some of the sources of uncertainties.

  16. Impacts of uncertainties in weather and streamflow observations in calibration and evaluation of an elevation distributed HBV-model

    Science.gov (United States)

    Engeland, K.; Steinsland, I.; Petersen-Øverleir, A.; Johansen, S.

    2012-04-01

The aim of this study is to assess the uncertainties in streamflow simulations when uncertainties in both the observed inputs (precipitation and temperature) and the streamflow observations used to calibrate the hydrological model are explicitly accounted for. To achieve this goal we applied the elevation-distributed HBV model, operating on daily time steps, to a small high-elevation catchment in southern Norway where the seasonal snow cover is important. The uncertainties in precipitation inputs were quantified using conditional simulation. This procedure accounts for the uncertainty related to the density of the precipitation network, but neglects uncertainties related to measurement bias/errors and possible elevation gradients in precipitation. The uncertainties in temperature inputs were quantified using a Bayesian temperature interpolation procedure where the temperature lapse rate is re-estimated every day. The uncertainty in the lapse rate was accounted for, whereas the sampling uncertainty related to network density was neglected. For every day, a random sample of precipitation and temperature inputs was drawn to be applied as inputs to the hydrologic model. The uncertainties in observed streamflow were assessed based on the uncertainties in the rating-curve model. A Bayesian procedure was applied to estimate the probability of rating-curve models with 1 to 3 segments and the uncertainties in their parameters. This method neglects uncertainties related to errors in observed water levels. Note that one rating curve was drawn to make one realisation of a whole time series of streamflow; thus the rating-curve errors lead to a systematic bias in the streamflow observations. All these uncertainty sources were linked together in both calibration and evaluation of the hydrologic model using a DREAM-based MCMC routine. Effects of having less information (e.g. missing one streamflow measurement for defining the rating curve or missing one precipitation station
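The systematic character of the rating-curve error, one curve drawn per realisation of the whole streamflow series, can be sketched as follows. The power-law form Q = a(h - h0)^b is standard, but the parameter distributions are illustrative stand-ins, not the study's posterior:

```python
import random

random.seed(3)

# Rating curve Q = a * (h - h0)^b with uncertain parameters; the distributions
# below are illustrative stand-ins for a Bayesian posterior.
def draw_rating_curve():
    a = random.gauss(12.0, 1.0)    # scale
    b = random.gauss(1.6, 0.05)    # exponent
    h0 = random.gauss(0.2, 0.02)   # stage of zero flow, m
    return lambda h: a * max(h - h0, 0.0) ** b

stage = [0.8, 1.1, 1.5, 2.0]  # observed water levels, m

# One curve drawn per realisation: every streamflow value in that realisation
# shares the same curve error, i.e. a systematic bias, not independent noise.
for _ in range(3):
    curve = draw_rating_curve()
    print([round(curve(h), 1) for h in stage])
```

Each printed series is shifted coherently up or down relative to the others, which is precisely why the abstract stresses that rating-curve errors act as a bias rather than as daily scatter.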

  17. Reduced uncertainty of regional scale CLM predictions of net carbon fluxes and leaf area indices with estimated plant-specific parameters

    Science.gov (United States)

    Post, Hanna; Hendricks Franssen, Harrie-Jan; Han, Xujun; Baatz, Roland; Montzka, Carsten; Schmidt, Marius; Vereecken, Harry

    2016-04-01

Reliable estimates of carbon fluxes and states at regional scales are required to reduce uncertainties in regional carbon balance estimates and to support decision making in environmental politics. In this work the Community Land Model version 4.5 (CLM4.5-BGC) was applied at a high spatial resolution (1 km2) for the Rur catchment in western Germany. In order to improve the model-data consistency of net ecosystem exchange (NEE) and leaf area index (LAI) for this study area, five plant functional type (PFT)-specific CLM4.5-BGC parameters were estimated with time series of half-hourly NEE data for one year in 2011/2012, using the DiffeRential Evolution Adaptive Metropolis (DREAM) algorithm, a Markov Chain Monte Carlo (MCMC) approach. The parameters were estimated separately for four different plant functional types (needleleaf evergreen temperate tree, broadleaf deciduous temperate tree, C3-grass and C3-crop) at four different sites. The four sites are located inside or close to the Rur catchment. We evaluated modeled NEE for one year in 2012/2013 with NEE measured at seven eddy covariance sites in the catchment, including the four parameter estimation sites. Modeled LAI was evaluated by means of LAI derived from remotely sensed RapidEye images of about 18 days in 2011/2012. Performance indices were based on a comparison between measurements and (i) a reference run with CLM default parameters, and (ii) a 60-instance CLM ensemble with parameters sampled from the DREAM posterior probability density functions (pdfs). The difference between the observed and simulated NEE sum was reduced by 23% when estimated rather than default parameters were used as input. The mean absolute difference between modeled and measured LAI was reduced by 59% on average. Simulated LAI was not only improved in terms of the absolute value but in some cases also in terms of the timing (beginning of vegetation onset), which was directly related to a substantial improvement of the NEE estimates in
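The DREAM algorithm used for the parameter estimation runs multiple adaptive MCMC chains; in much simplified single-chain Metropolis form, with a toy one-parameter model standing in for CLM4.5-BGC and synthetic observations, the idea looks like this:

```python
import math
import random

random.seed(11)

# Synthetic "observations" from a toy one-parameter model plus Gaussian noise.
def model(p: float) -> list:
    return [p * math.sin(t / 5.0) for t in range(50)]

obs = [y + random.gauss(0.0, 0.3) for y in model(2.5)]

def log_like(p: float) -> float:
    """Gaussian log-likelihood (up to a constant) with known noise sigma = 0.3."""
    return -sum((m - o) ** 2 for m, o in zip(model(p), obs)) / (2.0 * 0.3 ** 2)

# Minimal single-chain Metropolis sampler; DREAM runs several chains whose
# proposals adapt to each other, but the accept/reject core is the same.
p, chain = 1.0, []
for _ in range(5000):
    cand = p + random.gauss(0.0, 0.1)
    if math.log(random.random()) < log_like(cand) - log_like(p):
        p = cand
    chain.append(p)

posterior = chain[1000:]  # discard burn-in
mean_p = sum(posterior) / len(posterior)
print(f"posterior mean {mean_p:.2f} (true value 2.5)")
```

Sampling parameters from the retained chain, rather than taking a single best fit, is what produces the posterior-sampled model ensemble used in the study's evaluation.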

  18. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, estimates of some true value plus an uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample whose matrix differs significantly from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement itself. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements will be reviewed in this report and recommendations made for improving measurement uncertainties
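Combining the individual error sources into a single uncertainty estimate is commonly done GUM-style, by root-sum-of-squares of the independent standard uncertainties. A sketch with an invented uncertainty budget:

```python
import math

# Illustrative (invented) uncertainty budget for one measurement, with each
# component expressed as a standard uncertainty in the measurand's units.
components = {
    "calibration standard": 0.10,
    "repeatability":        0.25,
    "matrix effect":        0.15,
    "instrument drift":     0.05,
}

# GUM-style combination: root-sum-of-squares of independent components.
u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
U_expanded = 2.0 * u_combined  # coverage factor k = 2, roughly 95 % coverage

print(f"combined standard uncertainty u = {u_combined:.3f}")
print(f"expanded uncertainty U (k = 2)  = {U_expanded:.3f}")
```

The quadrature sum means the budget is dominated by its largest component (here repeatability); shrinking an already-small term barely moves the total, which is why a measurement control program should target the dominant source first.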

  19. Potential effects of organizational uncertainty on safety

    Energy Technology Data Exchange (ETDEWEB)

    Durbin, N.E. [MPD Consulting Group, Kirkland, WA (United States); Lekberg, A. [Swedish Nuclear Power Inspectorate, Stockholm (Sweden); Melber, B.D. [Melber Consulting, Seattle WA (United States)

    2001-12-01

When organizations face significant change - reorganization, mergers, acquisitions, downsizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in the organization - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations facing significant change, particularly those using high-risk technologies, need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change: technical, economic, emotional, and productivity-related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals.

  20. Potential effects of organizational uncertainty on safety

    International Nuclear Information System (INIS)

    Durbin, N.E.; Lekberg, A.; Melber, B.D.

    2001-12-01

When organizations face significant change - reorganization, mergers, acquisitions, downsizing, plant closures or decommissioning - both the organizations and the workers in those organizations experience significant uncertainty about the future. This uncertainty affects the organization and the people working in the organization - adversely affecting morale, reducing concentration on safe operations, and resulting in the loss of key staff. Hence, organizations facing significant change, particularly those using high-risk technologies, need to consider and plan for the effects of organizational uncertainty on safety - as well as planning for other consequences of change: technical, economic, emotional, and productivity-related. This paper reviews some of what is known about the effects of uncertainty on organizations and individuals, discusses the potential consequences of uncertainty on organizational and individual behavior, and presents some of the implications for safety professionals

  1. Use of Paired Simple and Complex Models to Reduce Predictive Bias and Quantify Uncertainty

    DEFF Research Database (Denmark)

    Doherty, John; Christensen, Steen

    2011-01-01

    -constrained uncertainty analysis. Unfortunately, however, many system and process details on which uncertainty may depend are, by design, omitted from simple models. This can lead to underestimation of the uncertainty associated with many predictions of management interest. The present paper proposes a methodology...... of these details born of the necessity for model outputs to replicate observations of historical system behavior. In contrast, the rapid run times and general numerical reliability of simple models often promulgates good calibration and ready implementation of sophisticated methods of calibration...... that attempts to overcome the problems associated with complex models on the one hand and simple models on the other hand, while allowing access to the benefits each of them offers. It provides a theoretical analysis of the simplification process from a subspace point of view, this yielding insights...

  2. Deterministic uncertainty analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1987-12-01

This paper presents a deterministic uncertainty analysis (DUA) method for calculating uncertainties that has the potential to significantly reduce the number of computer runs compared to conventional statistical analysis. The method is based upon the availability of derivative and sensitivity data such as that calculated using the well-known direct or adjoint sensitivity analysis techniques. Formation of response surfaces using derivative data and the propagation of input probability distributions are discussed relative to their role in the DUA method. A sample problem that models the flow of water through a borehole is used as a basis to compare the cumulative distribution function of the flow rate as calculated by the standard statistical methods and the DUA method. Propagation of uncertainties by the DUA method is compared for ten cases in which the number of reference model runs was varied from one to ten. The DUA method gives a more accurate representation of the true cumulative distribution of the flow rate based upon as few as two model executions compared to fifty model executions using a statistical approach. 16 refs., 4 figs., 5 tabs
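A minimal sketch of the derivative-based propagation the DUA method builds on: sensitivities obtained around one reference run (here via finite differences standing in for direct/adjoint derivatives) are combined with input uncertainties by the first-order delta method. The borehole-like model and all numbers are illustrative assumptions, not the paper's actual problem:

```python
import math

def flow_rate(k: float, dp: float, mu: float) -> float:
    """Toy borehole-like flow model: permeability * pressure drop / viscosity."""
    return k * dp / mu

# Nominal inputs and their standard uncertainties (assumed values).
x0 = {"k": 2.0e-12, "dp": 5.0e5, "mu": 1.0e-3}
ux = {"k": 0.2e-12, "dp": 0.5e5, "mu": 0.05e-3}

# Sensitivities around the reference point, via central finite differences
# standing in for the direct/adjoint derivatives the DUA method assumes.
def sensitivity(name: str, h_rel: float = 1e-6) -> float:
    hi, lo = dict(x0), dict(x0)
    h = x0[name] * h_rel
    hi[name] += h
    lo[name] -= h
    return (flow_rate(**hi) - flow_rate(**lo)) / (2.0 * h)

# First-order (delta-method) propagation: u_y^2 = sum_i (dy/dx_i)^2 * u_xi^2.
u_y = math.sqrt(sum((sensitivity(n) * ux[n]) ** 2 for n in x0))
y0 = flow_rate(**x0)
print(f"Q = {y0:.3e} +/- {u_y:.3e} (one standard uncertainty)")
```

Because the sensitivities come from derivative information around a single reference run, the propagation itself needs no further model executions, which is the source of the run-count savings the abstract reports.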

  3. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    Science.gov (United States)

    Fletcher, S.; Strzepek, K.

    2017-12-01

Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of 3.5 billion. Our analysis shows that flexible design, in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future, can mitigate uncertainty and reduce the expected lifetime costs by up to 1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and

  4. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design built from well-understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and especially, due to nuances of launch vehicle logic, for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.
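One common mechanism behind the underestimation the abstract describes is treating a shared state-of-knowledge uncertainty as if it were independent per component. The toy sketch below, with assumed numbers and a two-component redundant system, shows the effect; it is an illustration of the general phenomenon, not the paper's launch vehicle model:

```python
import random

# Two redundant components share the same epistemically uncertain failure
# probability p (lack of knowledge, not randomness). Numbers are assumed.
random.seed(1)
N = 200_000

def draw_p():
    return random.uniform(0.01, 0.10)   # epistemic distribution of p

# Wrong: resample p independently for each component -> correlation lost
wrong = [draw_p() * draw_p() for _ in range(N)]
# Right: one state of knowledge per trial -> p is shared by both components
right = [draw_p() ** 2 for _ in range(N)]

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# Independent sampling understates the spread of the system failure
# probability relative to the fully correlated (shared-knowledge) case.
print(std(wrong) < std(right))  # True
```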

  5. Numerical Investigation of Temperature Distribution in an Eroded Bend Pipe and Prediction of Erosion Reduced Thickness

    Science.gov (United States)

    Zhu, Hongjun; Feng, Guang; Wang, Qijun

    2014-01-01

Accurate prediction of erosion thickness is essential for pipe engineering. The objective of the present paper is to study the temperature distribution in an eroded bend pipe and find a new method to predict the erosion reduced thickness. Computational fluid dynamic (CFD) simulations with FLUENT software are carried out to investigate the temperature field, and the effects of oil inlet rate, oil inlet temperature, and erosion reduced thickness are examined. The presence of an erosion pit brings about an obvious fluctuation of the temperature drop along the extrados of the bend, and the minimum temperature drop appears at the most severe erosion point. A low inlet temperature or a large inlet velocity leads to a small temperature drop, while a shallow erosion pit causes a great temperature drop. The dimensionless minimum temperature drop is analyzed and a fitting formula is obtained. Using the formula, the erosion reduced thickness can be calculated from monitoring only the outer surface temperature of the bend pipe. This new method can provide useful guidance for pipeline monitoring and replacement. PMID:24719576

  6. Reducing Reliability Uncertainties for Marine Renewable Energy

    Directory of Open Access Journals (Sweden)

    Sam D. Weller

    2015-11-01

Full Text Available Technology Readiness Levels (TRLs) are a widely used metric of technology maturity and risk for marine renewable energy (MRE) devices. To date, a large number of device concepts have been proposed which have reached the early validation stages of development (TRLs 1–3). Only a handful of mature designs have attained pre-commercial development status following prototype sea trials (TRLs 7–8). In order to navigate through the aptly named “valley of death” (TRLs 4–6) towards commercial realisation, it is necessary for new technologies to be de-risked in terms of component durability and reliability. In this paper the scope of the reliability assessment module of the DTOcean Design Tool is outlined, including aspects of Tool integration, data provision and how prediction uncertainties are accounted for. In addition, two case studies of mooring component fatigue testing are reported, providing insight into long-term component use and system design for MRE devices. The case studies are used to highlight how test data could be utilised to improve the prediction capabilities of statistical reliability assessment approaches, such as the bottom–up statistical method.

  7. Graphene synthesis on SiC: Reduced graphitization temperature by C-cluster and Ar-ion implantation

    International Nuclear Information System (INIS)

    Zhang, R.; Li, H.; Zhang, Z.D.; Wang, Z.S.; Zhou, S.Y.; Wang, Z.; Li, T.C.; Liu, J.R.; Fu, D.J.

    2015-01-01

Thermal decomposition of SiC is a promising method for high-quality production of wafer-scale graphene layers, provided that the high decomposition temperature of SiC can be substantially reduced. The high decomposition temperature of SiC, around 1400 °C, is a technical obstacle. In this work, we report on graphene synthesis on 6H–SiC with reduced graphitization temperature via ion implantation. When energetic Ar, C1, and C6-cluster ions are implanted into 6H–SiC substrates, some of the Si–C bonds are broken due to electronic and nuclear collisions. Owing to the radiation-damage-induced bond breaking and to the implanted C atoms serving as an additional C source, the graphitization temperature was reduced by up to 200 °C

  8. Incorporating outcome uncertainty and prior outcome beliefs in stated preferences

    DEFF Research Database (Denmark)

    Lundhede, Thomas; Jacobsen, Jette Bredahl; Hanley, Nick

    2015-01-01

    Stated preference studies tell respondents that policies create environmental changes with varying levels of uncertainty. However, respondents may include their own a priori assessments of uncertainty when making choices among policy options. Using a choice experiment eliciting respondents......’ preferences for conservation policies under climate change, we find that higher outcome uncertainty reduces utility. When accounting for endogeneity, we find that prior beliefs play a significant role in this cost of uncertainty. Thus, merely stating “objective” levels of outcome uncertainty...

  9. Radiotherapy for breast cancer: respiratory and set-up uncertainties

    International Nuclear Information System (INIS)

    Saliou, M.G.; Giraud, P.; Simon, L.; Fournier-Bidoz, N.; Fourquet, A.; Dendale, R.; Rosenwald, J.C.; Cosset, J.M.

    2005-01-01

Adjuvant radiotherapy has been shown to significantly reduce locoregional recurrence, but this advantage is associated with increased cardiovascular and pulmonary morbidities. All uncertainties inherent to conformal radiation therapy must be identified in order to increase the precision of treatment; misestimation of these uncertainties increases the potential risk of geometrical misses with, as a consequence, under-dosage of the tumor and/or overdosage of healthy tissues. Geometric uncertainties due to respiratory movements or set-up errors are well known. Two strategies have been proposed to limit their effect: quantification of these uncertainties, which are then taken into account in the final calculation of safety margins, and/or reduction of respiratory and set-up uncertainties by efficient immobilization or gating systems. Measured on portal films with two tangential fields, the CLD (central lung distance), defined as the distance between the deep field edge and the interior chest wall at the central axis, seems to be the best predictor of set-up uncertainties. Using CLD, estimated mean set-up errors from the literature are 3.8 and 3.2 mm for the systematic and random errors respectively. These depend partly on the type of immobilization device and could be reduced by the use of portal imaging systems. Furthermore, the breast is mobile during respiration, with motion amplitude as high as 0.8 to 10 mm in the anteroposterior direction. Respiratory gating techniques, currently under evaluation, have the potential to reduce the effect of these movements. Each radiotherapy department should perform its own assessments and determine the geometric uncertainties with respect to the equipment used and its particular treatment practices. This paper is a review of the main geometric uncertainties in breast treatment, due to respiration and set-up, and solutions proposed to limit their impact. (author)
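A hedged illustration of how the quoted systematic and random set-up errors can feed a safety-margin calculation: the widely used population margin recipe of van Herk et al. is assumed here for the sake of example; the abstract itself does not specify which margin formula is applied.

```python
# Commonly used population margin recipe (van Herk):
#   margin = 2.5 * Sigma + 0.7 * sigma
# where Sigma is the systematic and sigma the random set-up error (1 SD).
# NOTE: this recipe is an assumption for illustration; the review only
# states that the uncertainties enter the safety-margin calculation.
Sigma = 3.8   # mm, mean systematic set-up error quoted in the abstract
sigma = 3.2   # mm, mean random set-up error quoted in the abstract

margin = 2.5 * Sigma + 0.7 * sigma
print(f"CTV-to-PTV margin ~ {margin:.1f} mm")  # ~11.7 mm
```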

  10. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    Science.gov (United States)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability with sufficient accuracy at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions by using a fuel balance and a lambda sensor, and differences below 1% were found.
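The core idea, a deterministic temperature criterion plus Gaussian noise on the temperature estimate, can be reduced to a one-line probability. The threshold form and all numbers below are illustrative assumptions; the paper propagates a full error distribution through an Arrhenius-like autoignition integral rather than using a fixed critical temperature:

```python
import math

# Minimal sketch: knock occurs when the (noisy) in-cylinder temperature
# exceeds a critical value implied by the Arrhenius-like criterion.
# All numbers below are illustrative assumptions.
T_crit = 1050.0   # K, assumed autoignition threshold
T_est = 1020.0    # K, deterministic in-cylinder temperature estimate
sigma_T = 25.0    # K, std of the exogenous temperature noise

# Knock probability = P(T_est + noise > T_crit) for Gaussian noise
z = (T_crit - T_est) / sigma_T
p_knock = 0.5 * math.erfc(z / math.sqrt(2))
print(f"knock probability ~ {p_knock:.3f}")
```

With a single noise parameter `sigma_T` to calibrate, the probability estimate can be adapted online, which mirrors the one-parameter calibration the abstract highlights.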

  11. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

Severe accidents involve inherently large uncertainty owing to the wide range of possible conditions, and performing experiments, validation, and practical application are extremely difficult because of the high temperatures and pressures involved. Although domestic and foreign research has been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analyses such as the probabilistic safety assessment have not applied the newest methodology. In addition, the probability of containment failure in level 2 PSA is identified from the containment pressure expressed as a point value resulting from thermal-hydraulic analysis. In this paper, uncertainty analysis methods for the severe-accident phenomena influencing early containment failure were developed, an uncertainty analysis of Korean nuclear plants was performed using the MELCOR code, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Because early containment failure is an important factor in the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making in nuclear power plants, it was selected in this paper among the various modes of containment failure. Important phenomena of early containment failure in severe accidents were identified based on previous research, and a seven-step methodology to evaluate uncertainty was developed. A MELCOR input for severe-accident analysis reflecting natural circulation flow was developed, and station blackout, a representative initiating event for early containment failure, was chosen as the accident scenario. By reviewing the internal MELCOR models and correlations for the important phenomena of early containment failure, the factors which could affect the uncertainty were found, and the major factors were finally identified through a sensitivity analysis. In order to determine the total number of MELCOR calculations which can

  12. Climate change impacts on groundwater hydrology – where are the main uncertainties and can they be reduced?

    DEFF Research Database (Denmark)

    Refsgaard, Jens C.; Sonnenborg, Torben; Butts, Michael

    2016-01-01

    This paper assesses how various sources of uncertainty propagate through the uncertainty cascade from emission scenarios through climate models and hydrological models to impacts with particular focus on groundwater aspects for a number of coordinated studies in Denmark. We find results similar...... to surface water studies showing that climate model uncertainty dominates for projections of climate change impacts on streamflow and groundwater heads. However, we find uncertainties related to geological conceptualisation and hydrological model discretisation to be dominating for projections of well field...... climate-hydrology models....

  13. Dynamical attribution of oceanic prediction uncertainty in the North Atlantic: application to the design of optimal monitoring systems

    Science.gov (United States)

    Sévellec, Florian; Dijkstra, Henk A.; Drijfhout, Sybren S.; Germe, Agathe

    2017-11-01

In this study, the relation between two approaches to assess ocean predictability on interannual to decadal time scales is investigated. The first, pragmatic, approach consists of sampling the initial condition uncertainty and assessing the predictability through the divergence of this ensemble in time. The second approach is provided by a theoretical framework to determine error growth by estimating optimal linear growing modes. In this paper, it is shown that under the assumption of linearized dynamics and normal distributions of the uncertainty, the exact quantitative spread of the ensemble can be determined from the theoretical framework. This spread is at least an order of magnitude less expensive to compute than the approximate solution given by the pragmatic approach. This result is applied to a state-of-the-art Ocean General Circulation Model to assess the predictability in the North Atlantic of four typical oceanic metrics: the strength of the Atlantic Meridional Overturning Circulation (AMOC), the intensity of its heat transport, the two-dimensional spatially-averaged Sea Surface Temperature (SST) over the North Atlantic, and the three-dimensional spatially-averaged temperature in the North Atlantic. For all tested metrics except SST, ~75% of the total uncertainty on interannual time scales can be attributed to oceanic initial condition uncertainty rather than atmospheric stochastic forcing. The theoretical method also provides the sensitivity pattern to the initial condition uncertainty, allowing for targeted measurements to improve the skill of the prediction. It is suggested that a relatively small fleet of autonomous underwater vehicles can reduce the uncertainty in AMOC strength prediction by 70% for 1–5 year lead times.
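The equivalence the abstract invokes, that under linear dynamics with Gaussian initial uncertainty the ensemble spread equals a single covariance propagation, can be checked on a toy system. The 2-D matrix and numbers below are illustrative assumptions, not the ocean model:

```python
import random

# Under linear dynamics x(t) = M x(0) with Gaussian initial covariance C0,
# the forecast covariance is exactly M C0 M^T; an ensemble only
# approximates it. Toy 2-D system, numbers assumed.
M = [[1.1, 0.3],
     [0.0, 0.8]]
C0 = [[0.04, 0.0],
      [0.0, 0.01]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(A):
    return [[A[j][i] for j in range(2)] for i in range(2)]

# Theoretical forecast covariance: one matrix computation
Ct = matmul(matmul(M, C0), transpose(M))

# Pragmatic approach: propagate a large ensemble of initial perturbations
random.seed(2)
members = []
for _ in range(100_000):
    x = [random.gauss(0, 0.2), random.gauss(0, 0.1)]  # sqrt of C0 diagonals
    members.append([M[0][0] * x[0] + M[0][1] * x[1],
                    M[1][0] * x[0] + M[1][1] * x[1]])
var0 = sum(m[0] ** 2 for m in members) / len(members)

# Ensemble variance converges to the theoretical C(t)[0][0]
print(var0, Ct[0][0])
```

The theoretical column costs one matrix product regardless of how finely the uncertainty is resolved, which is the order-of-magnitude saving the abstract reports.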

  14. Method for estimating effects of unknown correlations in spectral irradiance data on uncertainties of spectrally integrated colorimetric quantities

    Science.gov (United States)

    Kärhä, Petri; Vaskuri, Anna; Mäntynen, Henrik; Mikkonen, Nikke; Ikonen, Erkki

    2017-08-01

    Spectral irradiance data are often used to calculate colorimetric properties, such as color coordinates and color temperatures of light sources by integration. The spectral data may contain unknown correlations that should be accounted for in the uncertainty estimation. We propose a new method for estimating uncertainties in such cases. The method goes through all possible scenarios of deviations using Monte Carlo analysis. Varying spectral error functions are produced by combining spectral base functions, and the distorted spectra are used to calculate the colorimetric quantities. Standard deviations of the colorimetric quantities at different scenarios give uncertainties assuming no correlations, uncertainties assuming full correlation, and uncertainties for an unfavorable case of unknown correlations, which turn out to be a significant source of uncertainty. With 1% standard uncertainty in spectral irradiance, the expanded uncertainty of the correlated color temperature of a source corresponding to the CIE Standard Illuminant A may reach as high as 37.2 K in unfavorable conditions, when calculations assuming full correlation give zero uncertainty, and calculations assuming no correlations yield the expanded uncertainties of 5.6 K and 12.1 K, with wavelength steps of 1 nm and 5 nm used in spectral integrations, respectively. We also show that there is an absolute limit of 60.2 K in the error of the correlated color temperature for Standard Illuminant A when assuming 1% standard uncertainty in the spectral irradiance. A comparison of our uncorrelated uncertainties with those obtained using analytical methods by other research groups shows good agreement. We re-estimated the uncertainties for the colorimetric properties of our 1 kW photometric standard lamps using the new method. The revised uncertainty of color temperature is a factor of 2.5 higher than the uncertainty assuming no correlations.
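The Monte Carlo scheme described above, distorting the spectrum with error functions built from smooth base functions, can be sketched generically. For brevity the colorimetric quantity (e.g. correlated color temperature) is replaced here by a simple weighted spectral integral, and the spectrum, weighting, and base functions are all illustrative assumptions:

```python
import math
import random

# Sketch of the method: distort a spectrum with error functions composed of
# smooth base functions and observe the spread of an integrated quantity.
wl = list(range(380, 781, 5))                       # wavelength grid, nm
spectrum = [1.0 + 0.001 * (w - 380) for w in wl]    # toy spectral irradiance
weight = [math.exp(-((w - 555) / 80) ** 2) for w in wl]  # toy weighting

def integrate(spec):
    return sum(s * v for s, v in zip(spec, weight))

u_rel = 0.01   # 1% standard uncertainty per spectral point
ref = integrate(spectrum)

random.seed(3)
results = []
for _ in range(5000):
    # error function = random mix of slowly varying base functions
    a = [random.gauss(0, u_rel) for _ in range(3)]
    err = [a[0]
           + a[1] * math.sin(math.pi * (w - 380) / 400)
           + a[2] * math.cos(math.pi * (w - 380) / 400) for w in wl]
    distorted = [s * (1 + e) for s, e in zip(spectrum, err)]
    results.append(integrate(distorted) / ref - 1)

std = (sum(r * r for r in results) / len(results)) ** 0.5
print(f"relative uncertainty with correlated spectral errors: {std:.4f}")
```

Running over many such correlation scenarios and taking the worst case is what yields the "unfavorable unknown correlations" bound discussed in the abstract.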

  15. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

This traditional probabilistic approach can calculate relatively accurate results. However, it requires a long computation time because of the repetitive sampling of the MC method. In addition, when informative data for statistical analysis are not sufficient or some events are mainly caused by human error, the probabilistic approach may not be possible because the uncertainties of these events are difficult to express by probabilistic distributions. In order to reduce the computation time and quantify the uncertainties of top events when there exist basic events whose uncertainties are difficult to express by probabilistic distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following the large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code is implemented and tested for the fault tree of the radiation release accident. We apply this code to the fault tree of the core damage accident after the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time, covering the results obtained by the probabilistic uncertainty propagation.
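A minimal sketch of fuzzy uncertainty propagation through a fault tree using alpha-cuts of triangular fuzzy probabilities; the gate structure and numbers are illustrative, not the LLOCA tree from the paper:

```python
# Fuzzy propagation: at each alpha level, a fuzzy probability reduces to an
# interval, and intervals propagate through the gates by interval arithmetic
# (valid here because the gate functions are monotonic in each argument).

def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, m, b) at level alpha."""
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

def gate_and(iv1, iv2):   # AND gate: probabilities multiply
    return (iv1[0] * iv2[0], iv1[1] * iv2[1])

def gate_or(iv1, iv2):    # OR gate: 1 - (1 - p1)(1 - p2)
    return (1 - (1 - iv1[0]) * (1 - iv2[0]),
            1 - (1 - iv1[1]) * (1 - iv2[1]))

# Basic events as triangular fuzzy probabilities (min, mode, max) - assumed
A = (0.01, 0.02, 0.04)
B = (0.005, 0.01, 0.02)
C = (0.001, 0.002, 0.005)

# Top = (A AND B) OR C, evaluated at a few alpha levels
for alpha in (0.0, 0.5, 1.0):
    iv = gate_or(gate_and(alpha_cut(A, alpha), alpha_cut(B, alpha)),
                 alpha_cut(C, alpha))
    print(f"alpha={alpha}: top event in [{iv[0]:.6f}, {iv[1]:.6f}]")
```

Each alpha level costs one deterministic pass through the tree, which is why this runs far faster than repeated Monte Carlo sampling.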

  16. Evaluating a multispecies adaptive management framework: Must uncertainty impede effective decision-making?

    Science.gov (United States)

    Smith, David R.; McGowan, Conor P.; Daily, Jonathan P.; Nichols, James D.; Sweka, John A.; Lyons, James E.

    2013-01-01

Application of adaptive management to complex natural resource systems requires careful evaluation to ensure that the process leads to improved decision-making. As part of that evaluation, adaptive policies can be compared with alternative nonadaptive management scenarios. Also, the value of reducing structural (ecological) uncertainty for achieving management objectives can be quantified. A multispecies adaptive management framework was recently adopted by the Atlantic States Marine Fisheries Commission for sustainable harvest of Delaware Bay horseshoe crabs Limulus polyphemus, while maintaining adequate stopover habitat for migrating red knots Calidris canutus rufa, the focal shorebird species. The predictive model set encompassed the structural uncertainty in the relationships between horseshoe crab spawning, red knot weight gain and red knot vital rates. Stochastic dynamic programming was used to generate a state-dependent strategy for harvest decisions given that uncertainty. In this paper, we employed a management strategy evaluation approach to evaluate the performance of this adaptive management framework. Active adaptive management was used by including model weights as state variables in the optimization and reducing structural uncertainty by model weight updating. We found that the value of information for reducing structural uncertainty is expected to be low, because the uncertainty does not appear to impede effective management. Harvest policy responded to abundance levels of both species regardless of uncertainty in the specific relationship that generated those abundances. Thus, the expected horseshoe crab harvest and red knot abundance were similar when the population generating model was uncertain or known, and harvest policy was robust to structural uncertainty as specified. Synthesis and applications. The combination of management strategy evaluation with state-dependent strategies from stochastic dynamic programming was an informative approach to
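The model-weight updating step of active adaptive management is a Bayesian update: after each observation, a model's weight is multiplied by the likelihood it assigned to the observed outcome and the weights are renormalized. The two models and their likelihoods below are assumed for illustration:

```python
# Sketch of model-weight updating in active adaptive management.
# Models and likelihood values are illustrative assumptions.
weights = {"model_A": 0.5, "model_B": 0.5}      # prior structural weights
likelihood = {"model_A": 0.8, "model_B": 0.3}   # P(observed outcome | model)

posterior = {m: weights[m] * likelihood[m] for m in weights}
total = sum(posterior.values())
posterior = {m: p / total for m, p in posterior.items()}

# Weight shifts toward the model that better predicted the outcome
print(posterior)
```

In the framework described above, these weights enter the optimization as state variables, so the harvest policy itself anticipates future learning.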

  17. Projected uranium measurement uncertainties for the Gas Centrifuge Enrichment Plant

    International Nuclear Information System (INIS)

    Younkin, J.M.

    1979-02-01

An analysis was made of the uncertainties associated with the measurements of the declared uranium streams in the Portsmouth Gas Centrifuge Enrichment Plant (GCEP). The total uncertainty for the GCEP is projected to be from 54 to 108 kg 235U/year out of a measured total of 200,000 kg 235U/year. The systematic component of uncertainty of the UF6 streams is the largest and the dominant contributor to the total uncertainty. A possible scheme for reducing the total uncertainty is given
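Why the systematic component dominates a plant-level balance can be illustrated with a standard error-combination sketch: random errors of many independent measurements partially cancel (grow as the square root of the count), while a shared systematic bias adds linearly. The numbers below are illustrative, not the GCEP values:

```python
import math

# Combining random and systematic measurement uncertainties over a year.
# Numbers are illustrative assumptions, not the GCEP figures.
n = 100            # measured batches per year
u_rand = 5.0       # kg 235U random uncertainty per batch
u_sys = 0.5        # kg 235U systematic uncertainty per batch (shared bias)

total_random = math.sqrt(n) * u_rand   # independent errors add in quadrature
total_systematic = n * u_sys           # common bias adds linearly
total = math.hypot(total_random, total_systematic)

# Even a per-batch systematic error 10x smaller than the random error
# contributes as much to the annual total.
print(total_random, total_systematic, total)
```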

  18. How to correct the ambient temperature influence on the thermal response test results

    International Nuclear Information System (INIS)

    Borinaga-Treviño, Roque; Norambuena-Contreras, Jose; Castro-Fresno, Daniel

    2015-01-01

Due to global warming and the increasing energy demand, it is necessary to improve the energy efficiency of buildings. In this context, Ground-Coupled Heat Pumps (GCHP) have proved to be the most efficient heating and cooling system. The main parameters defining a ground heat exchanger are obtained via an in situ test called the Thermal Response Test (TRT). However, the ambient air influence on this test is remarkable due to the exposure of the testing equipment, and even the undisturbed ground temperature varies with the ambient temperature oscillations. Therefore, although the influence of ambient conditions on TRT results is an important topic for the design of ground heat exchangers, there is as yet limited literature on theoretical methods to correct the ambient temperature influence on the ground thermal conductivity predicted via TRT. This paper presents a new methodology to analyse and mitigate the influence of ambient conditions on TRT results, with the main advantage that it is not necessary to know their physical origin in advance. The method focuses on reducing the mean fluid temperature oscillations caused by the ambient temperature, by analysing the influence of the time interval chosen to fit the data to the infinite line source theory formulae from which the ground thermal conductivity is finally predicted. For this purpose, results of two different TRTs were analysed, each with different equipment and ambient exposure. Results using the proposed method showed that the thermal conductivity oscillations were reduced in both tests. For the first test, the uncertainty associated with the chosen time interval was diminished by 33%, significantly reducing its predicted value and thus avoiding possible under-design of the future installation. However, because of the equipment insulation improvements and the smoother ambient temperature variations, the method obtained similar results for the predicted
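The infinite line source fit that the method revolves around can be sketched directly: at large time the mean fluid temperature grows as T(t) = (Q / (4*pi*k*H)) * ln(t) + const, so the ground thermal conductivity follows from the slope of T against ln(t) over a chosen time interval. The synthetic data and parameter values below are illustrative assumptions standing in for a real TRT record:

```python
import math
import random

Q = 4000.0    # W, injected heat rate (assumed)
H = 100.0     # m, borehole length (assumed)
k_true = 2.0  # W/(m K), ground thermal conductivity to recover

# Synthetic TRT record: line-source growth plus ambient-like noise
random.seed(4)
times = [3600.0 * h for h in range(10, 72)]          # 10 h .. 71 h
temps = [(Q / (4 * math.pi * k_true * H)) * math.log(t) + 15.0
         + random.gauss(0, 0.05) for t in times]

# Least-squares slope of T against ln(t) over the chosen time interval
x = [math.log(t) for t in times]
mx, my = sum(x) / len(x), sum(temps) / len(temps)
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, temps))
         / sum((xi - mx) ** 2 for xi in x))

k_est = Q / (4 * math.pi * H * slope)
print(f"estimated conductivity: {k_est:.2f} W/(m K)")  # close to 2.0
```

Repeating the fit over different candidate time intervals, as the paper proposes, exposes how much the ambient-driven oscillations move the conductivity estimate.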

  19. A reduced model for ion temperature gradient turbulent transport in helical plasmas

    International Nuclear Information System (INIS)

    Nunami, M.; Watanabe, T.-H.; Sugama, H.

    2013-07-01

    A novel reduced model for ion temperature gradient (ITG) turbulent transport in helical plasmas is presented. The model enables one to predict nonlinear gyrokinetic simulation results from linear gyrokinetic analyses. It is shown from nonlinear gyrokinetic simulations of the ITG turbulence in helical plasmas that the transport coefficient can be expressed as a function of the turbulent fluctuation level and the averaged zonal flow amplitude. Then, the reduced model for the turbulent ion heat diffusivity is derived by representing the nonlinear turbulent fluctuations and zonal flow amplitude in terms of the linear growth rate of the ITG instability and the linear response of the zonal flow potentials. It is confirmed that the reduced transport model results are in good agreement with those from nonlinear gyrokinetic simulations for high ion temperature plasmas in the Large Helical Device. (author)

  20. Reducing uncertainty in nitrogen budgets for African livestock systems

    International Nuclear Information System (INIS)

    Rufino, M C; Brandt, P; Herrero, M; Butterbach-Bahl, K

    2014-01-01

Livestock is poorly represented in N budgets for the African continent, although some studies have examined livestock-related N flows at different levels. Livestock plays an important role in N cycling, and N budgets should therefore include livestock-related flows. This study reviews the literature on N budgets for Africa to identify factors contributing to uncertainties. Livestock densities are usually modelled because of the lack of observational spatial data. Even though feed availability and quality vary across seasons, most studies use constant livestock excretion rates, and excreta are usually assumed to be uniformly distributed onto the land. Major uncertainties originate in the fraction of manure that is managed, and in emission factors which may not reflect the situation in Africa. N budgets use coarse assumptions on the production, availability, and use of crop residues as livestock feed. The absence of flows between croplands, livestock and rangelands reflects the lack of data. Joint efforts are needed for spatial collection of livestock data; crowdsourcing appears to be a promising option. The focus of the assessment of N budgets must go beyond croplands to include livestock and crop–livestock flows. We propose a nested systems definition of livestock systems to link the local, regional, and continental levels and to increase the usefulness of point measurements of N losses. Scientists working at all levels should generate data to calibrate process-based models. Measurements in the field should not only concentrate on greenhouse gas emissions, but need to include crop and livestock production measurements, soil stock changes and other N loss pathways such as leaching, run-off and volatilization to assess management practices and trade-offs. Compared to the research done on other continents on N flows in livestock systems, there are few data for Africa, and therefore a concerted effort will be needed to generate sufficient data for modelling. (paper)

  1. Scientific Uncertainties in Climate Change Detection and Attribution Studies

    Science.gov (United States)

    Santer, B. D.

    2017-12-01

    It has been claimed that the treatment and discussion of key uncertainties in climate science is "confined to hushed sidebar conversations at scientific conferences". This claim is demonstrably incorrect. Climate change detection and attribution studies routinely consider key uncertainties in observational climate data, as well as uncertainties in model-based estimates of natural variability and the "fingerprints" in response to different external forcings. The goal is to determine whether such uncertainties preclude robust identification of a human-caused climate change fingerprint. It is also routine to investigate the impact of applying different fingerprint identification strategies, and to assess how detection and attribution results are impacted by differences in the ability of current models to capture important aspects of present-day climate. The exploration of the uncertainties mentioned above will be illustrated using examples from detection and attribution studies with atmospheric temperature and moisture.

  2. Comments on Uncertainty in Groundwater Governance in the Volcanic Canary Islands, Spain

    OpenAIRE

    Custodio, Emilio; Cabrera, María; Poncela, Roberto; Cruz-Fuentes, Tatiana; Naranjo, Gema; Miguel, Luis de

    2015-01-01

    The uncertainty associated with natural magnitudes and processes is conspicuous in water resources and groundwater evaluation. This uncertainty has an essential component and a part that can be reduced to some extent by increasing knowledge, improving monitoring coverage, continuous elaboration of data and accuracy and addressing the related economic and social aspects involved. Reducing uncertainty has a cost that may not be justified by the improvement that is obtainable, but that has to be...

  3. The uncertainty analysis of model results a practical guide

    CERN Document Server

    Hofer, Eduard

    2018-01-01

    This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.

  4. How Well Does Fracture Set Characterization Reduce Uncertainty in Capture Zone Size for Wells Situated in Sedimentary Bedrock Aquifers?

    Science.gov (United States)

    West, A. C.; Novakowski, K. S.

    2005-12-01

    Regional groundwater flow models are rife with uncertainty. The three-dimensional flux vector fields must generally be inferred by inverse modelling from sparse measurements of hydraulic head, from measurements of hydraulic parameters at a scale that is minuscule in comparison to that of the domain, and from few, if any, measurements of recharge or discharge rate. Despite the inherent uncertainty in these models they are routinely used to delineate steady-state or time-of-travel capture zones for the purpose of wellhead protection. The latter are defined as the volume of the aquifer within which released particles will arrive at the well within the specified time, and their delineation requires the additional step of dividing the magnitudes of the flux vectors by the assumed porosity to arrive at the ``average linear groundwater velocity'' vector field. Since the porosity is usually assumed constant over the domain, one could be forgiven for thinking that the uncertainty introduced at this step is minor in comparison to the flow model calibration step. We consider this question when the porosity in question is fracture porosity in flat-lying sedimentary bedrock. We also consider whether the diffusive uptake of solute into the rock matrix which lies between the source and the production well reduces or enhances the uncertainty. To evaluate the uncertainty, an aquifer cross section is conceptualized as an array of horizontal, randomly-spaced, parallel-plate fractures of random aperture, with adjacent horizontal fractures connected by vertical fractures, again of random spacing and aperture. The source is assumed to be a continuous concentration (i.e., a Dirichlet boundary condition) representing a leaking tank or a DNAPL pool, and the receptor is a fully penetrating well located in the down-gradient direction. In this context the time-of-travel capture zone is defined as the separation distance required such that the source does not contaminate the well
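The porosity-division step the abstract highlights can be sketched numerically; the flux, porosity, and distance values below are illustrative assumptions, not figures from the study:

```python
def average_linear_velocity(darcy_flux, porosity):
    """Average linear groundwater velocity: Darcy flux divided by effective porosity.

    In fractured rock the effective porosity is the fracture porosity, which can be
    orders of magnitude smaller than the matrix porosity of a granular aquifer.
    """
    return darcy_flux / porosity


def advective_travel_time(distance, darcy_flux, porosity):
    """Advective travel time from source to well, ignoring matrix diffusion."""
    return distance / average_linear_velocity(darcy_flux, porosity)


# Illustrative values: 500 m source-well separation, Darcy flux 0.01 m/day.
t_fracture = advective_travel_time(500.0, 0.01, 1e-4)  # fracture porosity ~1e-4
t_granular = advective_travel_time(500.0, 0.01, 0.25)  # granular aquifer porosity 0.25
```

A two-orders-of-magnitude uncertainty in fracture porosity translates directly into a two-orders-of-magnitude uncertainty in travel time, which is why the division by porosity is far from a minor step in fractured bedrock.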

  5. One Approach to the Fire PSA Uncertainty Analysis

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2002-01-01

    Practical experience and findings from a number of fire probabilistic safety assessment (PSA) studies show that fire has high relative importance for nuclear power plant safety. Fire PSA is very challenging, and a number of issues are still in the area of research and development. This has a major impact on the conservatism of fire PSA findings. One way to reduce the level of conservatism is to conduct uncertainty analysis. At the top level, uncertainty of the fire PSA can be separated into three segments. The first segment is related to fire initiating event frequencies. The second segment concerns the uncertainty of fire damage. Finally, there is uncertainty related to the PSA model, which propagates this fire-initiated damage to core damage or other analyzed risk. This paper discusses all three segments of uncertainty. Some recent experience with fire PSA study uncertainty analysis, usage of the fire analysis code COMPBRN IIIe, and the importance of uncertainty evaluation to the final result is presented. (author)

  6. Uncertainties assessment for safety margins evaluation in MTR reactors core thermal-hydraulic design

    International Nuclear Information System (INIS)

    Gimenez, M.; Schlamp, M.; Vertullo, A.

    2002-01-01

    This report contains a bibliographic review and a critical analysis of different methodologies used for uncertainty evaluation of safety-related parameters in research reactor cores. The parameters for which uncertainties are considered are also presented and discussed, as well as their intrinsic nature with regard to the way their uncertainties must be combined. Finally, a combined statistical method with direct propagation of uncertainties is proposed, together with a set of basic parameters, such as wall and DNB temperatures, CHF, PRD and their respective ratios, for which uncertainties should be considered. (author)

  7. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. 
In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding

  8. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

    A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives and helps set priorities for research so that the outcome ambiguities faced by decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, policies for emissions mitigation, and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties we find that the choice of policy is often dominated by the model structure choice, rather than by parameter uncertainties.

  9. Observational uncertainty and regional climate model evaluation: A pan-European perspective

    Science.gov (United States)

    Kotlarski, Sven; Szabó, Péter; Herrera, Sixto; Räty, Olle; Keuler, Klaus; Soares, Pedro M.; Cardoso, Rita M.; Bosshard, Thomas; Pagé, Christian; Boberg, Fredrik; Gutiérrez, José M.; Jaczewski, Adam; Kreienkamp, Frank; Liniger, Mark A.; Lussana, Cristian; Szepszo, Gabriella

    2017-04-01

    Local and regional climate change assessments based on downscaling methods crucially depend on the existence of accurate and reliable observational reference data. In dynamical downscaling via regional climate models (RCMs), observational data can influence model development itself and, later on, model evaluation, parameter calibration and added value assessment. In empirical-statistical downscaling, observations serve as predictand data and directly influence model calibration, with corresponding effects on downscaled climate change projections. Focusing on the evaluation of RCMs, we here analyze the influence of uncertainties in observational reference data on evaluation results in a well-defined performance assessment framework and on a European scale. For this purpose we employ three different gridded observational reference grids, namely (1) the well-established E-OBS dataset, (2) the recently developed EURO4M-MESAN regional re-analysis, and (3) several national high-resolution and quality-controlled gridded datasets that recently became available. In terms of climate models, five reanalysis-driven experiments carried out by five different RCMs within the EURO-CORDEX framework are used. Two variables (temperature and precipitation) and a range of evaluation metrics that reflect different aspects of RCM performance are considered. We furthermore include an illustrative model ranking exercise and relate observational spread to RCM spread. The results obtained indicate a varying influence of observational uncertainty on model evaluation depending on the variable, the season, the region and the specific performance metric considered. Over most parts of the continent, the influence of the choice of the reference dataset for temperature is rather small for seasonal mean values and inter-annual variability. Here, model uncertainty (as measured by the spread between the five RCM simulations considered) is typically much larger than reference data uncertainty. For

  10. Optimization Under Uncertainty for Wake Steering Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-08-03

    Offsetting turbines' yaw orientations from the incoming wind is a powerful tool that may be leveraged to reduce undesirable wake effects on downstream turbines. First, we examine a simple two-turbine case to gain intuition as to how inflow direction uncertainty affects the optimal solution. The turbines are modeled with unidirectional inflow such that one turbine directly wakes the other, using ten rotor diameters of spacing. We perform optimization under uncertainty (OUU) via a parameter sweep of the front turbine. The OUU solution generally prefers less steering. We then repeat this optimization for a 60-turbine wind farm with unidirectional inflow, varying the degree of inflow uncertainty and approaching this OUU problem by nesting a polynomial chaos expansion uncertainty quantification routine within an outer optimization. We examine how different levels of uncertainty in the inflow direction affect the ratio of the expected values of the deterministic and OUU solutions for steering strategies in the large wind farm, assuming the directional uncertainty used to reach the OUU solution (this ratio is defined as the value of the stochastic solution, or VSS).
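The expectation at the heart of such an OUU formulation can be sketched with a deliberately simplified wake model; the cosine-cubed power loss and Gaussian wake-deficit terms below are toy assumptions for illustration, not the wake model used in the NREL report:

```python
import numpy as np


def farm_power(yaw, theta):
    """Toy two-turbine farm power for front-turbine yaw offset `yaw` (deg)
    and inflow direction `theta` (deg). Illustrative model only."""
    mis = np.radians(yaw - theta)
    front = np.cos(mis) ** 3                              # yawed-turbine power loss
    deficit = 0.5 * np.exp(-((yaw - theta) / 15.0) ** 2)  # wake deficit on rear turbine
    return front + (1.0 - deficit)


def expected_power(yaw, sigma_deg, n=201):
    """Expected farm power under inflow-direction uncertainty theta ~ N(0, sigma),
    evaluated by brute-force quadrature over a discretized truncated normal."""
    theta = np.linspace(-3.0 * sigma_deg, 3.0 * sigma_deg, n)
    w = np.exp(-0.5 * (theta / sigma_deg) ** 2)
    w /= w.sum()
    return float(np.sum(w * farm_power(yaw, theta)))


# Compare deterministic and OUU optima over a yaw grid.
yaws = np.arange(0.0, 31.0)
det_opt = yaws[np.argmax([farm_power(y, 0.0) for y in yaws])]
ouu_opt = yaws[np.argmax([expected_power(y, 10.0) for y in yaws])]
```

In this toy model, directional uncertainty flattens the payoff of aggressive steering, mirroring the paper's observation that the OUU solution prefers less steering; a polynomial chaos expansion would replace the brute-force quadrature in `expected_power` for efficiency.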

  11. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    Science.gov (United States)

    Liu, Liqun; Fan, Haifeng

    To deal with the uncertainty of the evidence that pervades a pineapple disease identification system, a reasoning model based on evidence credibility factors was established. The uncertainty reasoning method is discussed, including: uncertain representation of knowledge, uncertain representation of rules, uncertain representation of multiple pieces of evidence, and updating of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.
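The abstract does not give the combination rules themselves, so as a hedged stand-in here is the classic MYCIN-style certainty-factor calculus on which such evidence-credibility reasoning is typically built; the rule strengths and evidence values are invented for the example:

```python
def propagate(evidence_cf, rule_cf):
    """Certainty of a conclusion from one rule: CF(H) = max(0, CF(E)) * CF(rule)."""
    return max(0.0, evidence_cf) * rule_cf


def combine(cf1, cf2):
    """Combine certainty factors for the same hypothesis from independent rules
    (MYCIN combination rule, shown as a generic illustration)."""
    if cf1 >= 0.0 and cf2 >= 0.0:
        return cf1 + cf2 * (1.0 - cf1)
    if cf1 <= 0.0 and cf2 <= 0.0:
        return cf1 + cf2 * (1.0 + cf1)
    return (cf1 + cf2) / (1.0 - min(abs(cf1), abs(cf2)))


# Two hypothetical rules pointing at the same disease:
cf_a = propagate(0.8, 0.7)  # leaf-spot evidence, rule strength 0.7
cf_b = propagate(0.6, 0.5)  # wilting evidence, rule strength 0.5
disease_cf = combine(cf_a, cf_b)
```

Combining two supporting pieces of evidence raises the overall credibility above either alone, while conflicting evidence pulls it toward zero, which is how such a model dampens the influence of any single subjective judgment.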

  12. Opportunities to Reduce Air-Conditioning Loads Through Lower Cabin Soak Temperatures

    International Nuclear Information System (INIS)

    Farrington, R.; Cuddy, M.; Keyser, M.; Rugh, J.

    1999-01-01

    Air-conditioning loads can significantly reduce electric vehicle (EV) range and hybrid electric vehicle (HEV) fuel economy. In addition, a new U.S. emissions procedure, called the Supplemental Federal Test Procedure (SFTP), has provided the motivation for reducing the size of vehicle air-conditioning systems in the United States. The SFTP will measure tailpipe emissions with the air-conditioning system operating. If the size of the air-conditioning system is reduced, the cabin soak temperature must also be reduced, with no penalty in terms of passenger thermal comfort. This paper presents the impact of air-conditioning on EV range and HEV fuel economy, and compares the effectiveness of advanced glazing and cabin ventilation. Experimental and modeled results are presented.

  13. Dimensional measurements with submicrometer uncertainty in production environment

    DEFF Research Database (Denmark)

    De Chiffre, L.; Gudnason, M. M.; Madruga, D.

    2015-01-01

    The work concerns a laboratory investigation of a method to achieve dimensional measurements with submicrometer uncertainty under conditions that are typical of a production environment. The method involves the concurrent determination of dimensions and material properties from measurements carried...... gauge blocks along with their uncertainties were estimated directly from the measurements. The length of the two workpieces at the reference temperature of 20 °C was extrapolated from the measurements and compared to certificate values. The investigations have documented that the developed approach...

  14. Uncertainty assessing of measure result of tungsten in U3O8 by ICP-AES

    International Nuclear Information System (INIS)

    Du Guirong; Nie Jie; Tang Lilei

    2011-01-01

    Following the determination method and the assessment criteria, the uncertainty of the measurement result for tungsten in U3O8 by ICP-AES is assessed. With each component assessed in detail, the results show that the uncertainty contributions rank as u_rel(sc) > u_rel(c) > u_rel(F) > u_rel(m). The remaining uncertainty is random and is calculated from repetition. Since u_rel(sc) is the main contribution to the uncertainty, the overall uncertainty is reduced by strict operation aimed at reducing u_rel(sc). (authors)
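The conclusion that reducing the dominant component reduces the overall uncertainty follows from the usual root-sum-of-squares combination of independent relative components; the numerical values below are invented for illustration, not taken from the paper:

```python
import math


def combined_relative_uncertainty(components):
    """GUM-style root-sum-of-squares combination of independent relative
    uncertainty components (dict of name -> relative standard uncertainty)."""
    return math.sqrt(sum(u * u for u in components.values()))


# Hypothetical relative standard uncertainties ordered as in the abstract.
u = {"sc": 0.040, "c": 0.020, "F": 0.010, "m": 0.005}
u_c = combined_relative_uncertainty(u)
u_without_sc = combined_relative_uncertainty({k: v for k, v in u.items() if k != "sc"})
```

Because the components add in quadrature, the largest term dominates the combined value: halving the small components barely moves u_c, while eliminating the dominant sc term cuts it roughly in half.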

  15. A new approach to systematic uncertainties and self-consistency in helium abundance determinations

    International Nuclear Information System (INIS)

    Aver, Erik; Olive, Keith A.; Skillman, Evan D.

    2010-01-01

    Tests of big bang nucleosynthesis and early universe cosmology require precision measurements for helium abundance determinations. However, efforts to determine the primordial helium abundance via observations of metal-poor H II regions have been limited by significant uncertainties (compared with the value inferred from BBN theory using the CMB-determined value of the baryon density). This work builds upon previous work by providing an updated and extended program for evaluating these uncertainties. Procedural consistency is achieved by integrating the hydrogen-based reddening correction with the helium-based abundance calculation, i.e., all physical parameters are solved for simultaneously. We include new atomic data for helium recombination and collisional emission based upon recent work by Porter et al., and wavelength-dependent corrections to underlying absorption are investigated. The set of physical parameters has been expanded here to include the effects of neutral hydrogen collisional emission. It is noted that Hγ and Hδ allow better isolation of the collisional effects from the reddening. Because of a degeneracy between the solutions for density and temperature, the precision of the helium abundance determinations is limited, particularly at lower temperatures. We find Yp = 0.2561 ± 0.0108, in broad agreement with the WMAP result. Alternatively, a simple average of the data yields Yp = 0.2566 ± 0.0028. Tests with synthetic data show a potential for distinct improvement, via removal of underlying absorption, using higher resolution spectra. A small bias in the abundance determination can be reduced significantly and the calculated helium abundance error can be reduced by ∼ 25%

  16. Exploring entropic uncertainty relation in the Heisenberg XX model with inhomogeneous magnetic field

    Science.gov (United States)

    Huang, Ai-Jun; Wang, Dong; Wang, Jia-Ming; Shi, Jia-Dong; Sun, Wen-Yang; Ye, Liu

    2017-08-01

    In this work, we investigate the quantum-memory-assisted entropic uncertainty relation in a two-qubit Heisenberg XX model with an inhomogeneous magnetic field. It has been found that a larger coupling strength J between the two spin-chain qubits can effectively reduce the entropic uncertainty. Besides, we examine the mechanics of how the inhomogeneous field influences the uncertainty, including the role of the inhomogeneous field parameter b. Intriguingly, the entropic uncertainty can shrink to zero when the coupling coefficients are relatively large, while it only reduces to 1 with the increase of the homogeneous magnetic field. Additionally, we examine the purity of the state and Bell non-locality and find that the entropic uncertainty is anticorrelated with both the purity and the Bell non-locality of the evolved state.
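For readers unfamiliar with entropic uncertainty relations, the memoryless Maassen-Uffink bound H(X) + H(Z) >= log2(1/c) for complementary qubit measurements (overlap c = 1/2, so the bound is 1 bit) can be checked numerically; this generic single-qubit sketch is an assumption-level illustration, not the XX-chain calculation of the paper:

```python
import numpy as np


def shannon_bits(probs):
    """Shannon entropy in bits of a probability distribution."""
    p = np.asarray(probs, dtype=float)
    p = p[p > 1e-12]  # drop zero outcomes; 0*log(0) -> 0
    return float(-(p * np.log2(p)).sum())


def measurement_probs(state, basis):
    """Born-rule outcome probabilities for a projective measurement."""
    return [abs(np.vdot(b, state)) ** 2 for b in basis]


# Complementary measurement bases on a qubit: sigma_z and sigma_x eigenbases.
Z = [np.array([1, 0], complex), np.array([0, 1], complex)]
X = [np.array([1, 1], complex) / np.sqrt(2), np.array([1, -1], complex) / np.sqrt(2)]


def entropic_sum(state):
    """H(X) + H(Z) for projective measurements on a pure qubit state."""
    return shannon_bits(measurement_probs(state, X)) + shannon_bits(measurement_probs(state, Z))
```

A quantum memory correlated with the system tightens the bound to log2(1/c) + S(A|B), which is how the entangled two-qubit states of the XX chain can push the measured uncertainty below 1 bit and even to zero.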

  17. Comments on Uncertainty in Groundwater Governance in the Volcanic Canary Islands, Spain

    Directory of Open Access Journals (Sweden)

    Emilio Custodio

    2015-06-01

    The uncertainty associated with natural magnitudes and processes is conspicuous in water resources and groundwater evaluation. This uncertainty has an essential component and a part that can be reduced to some extent by increasing knowledge, improving monitoring coverage, continuously elaborating data with accuracy, and addressing the related economic and social aspects involved. Reducing uncertainty has a cost that may not be justified by the improvement that is obtainable, but that has to be known to make the right decisions. With this idea, this paper contributes general comments on the evaluation of groundwater resources in the semiarid Canary Islands and on some of the main sources of uncertainty; a full treatment is not attempted, nor a prescription for how to reduce it. Although the point of view is local, these comments may help to address similar situations on other islands where similar problems appear. A consequence of physical and hydrological uncertainty is that different hydrogeological and water resource studies and evaluations may yield different results. Understanding and coarsely evaluating uncertainty helps in reducing administrative instability, poor decisions that may harm groundwater property rights, the rise of complaints, and the sub-optimal use of the scarce water resources available in semiarid areas. Transparency and honesty are needed, but especially a clear understanding of what numbers mean and the uncertainty around them, in order to act soundly and avoid conflicting and damaging rigid attitudes. What works well in one place, however, may not always work in another.

  18. Uncertainty governance: an integrated framework for managing and communicating uncertainties

    International Nuclear Information System (INIS)

    Umeki, H.; Naito, M.; Takase, H.

    2004-01-01

    Treatment of uncertainty, or in other words, reasoning with imperfect information, is widely recognised as being of great importance within performance assessment (PA) of geological disposal, mainly because of the time scale of interest and the spatial heterogeneity that the geological environment exhibits. A wide range of formal methods have been proposed for the optimal processing of incomplete information. Many of these methods rely on the use of numerical information, the frequency-based concept of probability in particular, to handle the imperfections. However, taking quantitative information as a base for models that solve the problem of handling imperfect information merely creates another problem: how to provide the quantitative information. In many situations this second problem proves more resistant to solution, and in recent years several authors have looked at particularly ingenious approaches in accordance with the rules of well-founded methods such as Bayesian probability theory, possibility theory, and the Dempster-Shafer theory of evidence. These methods, while drawing inspiration from quantitative methods, do not require the kind of complete numerical information demanded by quantitative methods. Instead they provide information that, though less precise than that provided by quantitative techniques, is often, if not sufficient, the best that could be achieved. Rather than searching for the best method for handling all imperfect information, our strategy for uncertainty management, that is, the recognition and evaluation of uncertainties associated with PA followed by planning and implementation of measures to reduce them, is to use whichever method best fits the problem at hand. Such an eclectic position leads naturally to integration of the different formalisms. While uncertainty management based on the combination of semi-quantitative methods forms an important part of our framework for uncertainty governance, it only solves half of the problem

  19. Dual direction blower system powered by solar energy to reduce car cabin temperature in open parking condition

    Science.gov (United States)

    Hamdan, N. S.; Radzi, M. F. M.; Damanhuri, A. A. M.; Mokhtar, S. N.

    2017-10-01

    The El Niño phenomenon that strikes Malaysia, with temperatures recorded above 35°C, can lead to extreme temperature rises in a car cabin, up to 80°C. Various problems arise from this extreme rise in temperature: occupants are vulnerable to heat stroke, carcinogenic benzene gas can be emitted as high temperatures react with interior components, and components in the car can be damaged. Current solutions for reducing car cabin temperature include window tinting and the portable heat rejection devices available on the market. As an alternative, this project modifies the car's air-conditioning blower motor into a dual-direction blower powered by solar energy and identifies its influence on the temperature inside a car parked under a scorching sun. By reducing the car cabin temperature by up to 10°C, equal to a 14% reduction, this simple proposed system aims to provide comfort to users through its capability to improve the quality of air and moisture in the car cabin.

  20. Perseveration induces dissociative uncertainty in obsessive-compulsive disorder.

    Science.gov (United States)

    Giele, Catharina L; van den Hout, Marcel A; Engelhard, Iris M; Dek, Eliane C P; Toffolo, Marieke B J; Cath, Danielle C

    2016-09-01

    Obsessive-compulsive (OC)-like perseveration paradoxically increases feelings of uncertainty. We studied whether the mechanism underlying the link between perseveration and uncertainty is a reduced accessibility of meaning ('semantic satiation'). OCD patients (n = 24) and matched non-clinical controls (n = 24) repeated words 2 times (non-perseveration) or 20 times (perseveration). They then decided whether the word was related to another target word. Speed of relatedness judgments and feelings of dissociative uncertainty were measured. The effects of real-life perseveration on dissociative uncertainty were tested in a smaller subsample of the OCD group (n = 9). Speed of relatedness judgments was not affected by perseveration. However, both groups reported more dissociative uncertainty after perseveration than after non-perseveration, and this effect was stronger in OCD patients. Patients also reported more dissociative uncertainty after 'clinical' perseveration than after non-perseveration. Both parts of this study are limited by some methodological issues and a small sample size. Although the mechanism behind 'perseveration → uncertainty' is still unclear, the results suggest that the effects of perseveration are counterproductive. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Statistically based uncertainty assessments in nuclear risk analysis

    International Nuclear Information System (INIS)

    Spencer, F.W.; Diegert, K.V.; Easterling, R.G.

    1987-01-01

    Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC- and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of the various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, intended to formalize the use of ''engineering judgment'' or ''expert opinion.'' All sources of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are a major concern in this paper, however

  2. Tightness Entropic Uncertainty Relation in Quantum Markovian-Davies Environment

    Science.gov (United States)

    Zhang, Jun; Liu, Liang; Han, Yan

    2018-05-01

    In this paper, we investigate the tightness of the entropic uncertainty relation in the absence (presence) of quantum memory, where the memory particle is weakly coupled to a decohering Davies-type Markovian environment. The results show that the tightness of the uncertainty relation can be controlled by the energy relaxation time F, the dephasing time G and the rescaled temperature p; perfect tightness is reached when dephasing and energy relaxation satisfy F = 2G and p = 1/2. In addition, the tightness of both the memory-assisted entropic uncertainty relation and the plain entropic uncertainty relation is influenced mainly by the purity. In the memory-assisted model, the purity and quantum correlation also influence the tightness strongly, while quantum entanglement influences it only slightly.

  3. Uncertainty evaluation methods for waste package performance assessment

    International Nuclear Information System (INIS)

    Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.

    1991-01-01

    This report identifies and investigates methodologies for dealing with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (the probability-distribution approach, the bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide ''substantially complete containment'' (SCC) during the containment period. The term ''SCC'' is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report that develops recommendations for regulatory uncertainty reduction in the ''containment'' requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs

  4. Paradoxical effects of compulsive perseveration : Sentence repetition causes semantic uncertainty

    NARCIS (Netherlands)

    Giele, Catharina L.; van den Hout, Marcel A.; Engelhard, Iris M.; Dek, Eliane C P

    2014-01-01

    Many patients with obsessive compulsive disorder (OCD) perform perseverative checking behavior to reduce uncertainty, but studies have shown that this ironically increases uncertainty. Some patients also tend to perseveratively repeat sentences. The aim of this study was to examine whether sentence

  5. Probabilistic estimates of 1.5-degree carbon budgets based on uncertainty in transient climate response and aerosol forcing

    Science.gov (United States)

    Partanen, A. I.; Mengis, N.; Jalbert, J.; Matthews, D.

    2017-12-01

    Nations agreed to limit the increase in global mean surface temperature relative to the preindustrial era to below 2 degrees Celsius and to pursue efforts toward a more ambitious goal of 1.5 degrees Celsius. To achieve these goals, it is necessary to assess the amount of cumulative carbon emissions compatible with these temperature targets, i.e. the so-called carbon budgets. In this work, we use the intermediate-complexity University of Victoria Earth System Climate Model (UVic ESCM) to assess how uncertainty in aerosol forcing and transient climate response transfers into uncertainty in future carbon budgets for burning fossil fuels. We create a perturbed-parameter ensemble of model simulations by scaling aerosol forcing and transient climate response, and assess the likelihood of each simulation by comparing the simulated historical cumulative carbon emissions, CO2 concentration and radiative balance to observations. Weighting the results of each simulation by its likelihood, the preliminary results give a carbon budget of 48 Pg C to reach a 1.5-degree-Celsius temperature increase. The small weighted mean is due to the large fraction of simulations with strong aerosol forcing and transient climate response giving negative carbon budgets for this time period. The probability of the carbon budget exceeding 100 Pg C was 38%, and 23% for a budget exceeding 200 Pg C. The carbon budgets after temperature stabilization at 1.5 degrees are even smaller, with a weighted mean of -100 Pg C until the year 2200. The main reason for the negative carbon budgets after temperature stabilization is an assumed strong decrease in aerosol forcing in the 21st century. Conversely, simulations with weak aerosol forcing and transient climate response give positive carbon budgets. Our results highlight both the importance of reducing uncertainty in aerosol forcing and transient climate response, and of taking non-CO2 forcers into account when estimating carbon budgets.
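The likelihood-weighting step described above amounts to a weighted mean and weighted tail probabilities over the perturbed-parameter ensemble; the toy ensemble below is invented to show the mechanics, not the UVic ESCM results:

```python
import numpy as np


def weighted_budget_stats(budgets, likelihoods, threshold):
    """Likelihood-weighted mean budget and probability of exceeding `threshold`."""
    b = np.asarray(budgets, dtype=float)
    w = np.asarray(likelihoods, dtype=float)
    w = w / w.sum()                        # normalise likelihoods into weights
    mean = float(np.sum(w * b))
    p_exceed = float(np.sum(w[b > threshold]))
    return mean, p_exceed


# Hypothetical ensemble: high-likelihood strong-aerosol members give negative budgets,
# dragging the weighted mean down even though some members allow large budgets.
budgets = [-120.0, -40.0, 30.0, 110.0, 180.0, 260.0]     # Pg C (invented)
likelihoods = [0.30, 0.25, 0.20, 0.12, 0.08, 0.05]       # invented
mean, p100 = weighted_budget_stats(budgets, likelihoods, 100.0)
```

The same mechanism explains the paper's result: a small weighted-mean budget is compatible with a non-trivial probability of a much larger budget, because the mean and the tail probability weight the ensemble differently.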

  6. High Temperature Thermosetting Polyimide Nanocomposites Prepared with Reduced Charge Organoclay

    Science.gov (United States)

    Campbell, Sandi; Liang, Margaret I.

    2005-01-01

    The naturally occurring sodium and calcium cations found in bentonite clay galleries were exchanged with lithium cations. Following the cation exchange, a series of reduced charge clays were prepared by heat treatment of the lithium bentonite at 130 C, 150 C, or 170 C. Inductively coupled plasma (ICP) analysis showed that heating the lithium clay at elevated temperatures reduced its cation exchange capacity. Ion exchange of heat-treated clays with either a protonated alkyl amine or a protonated aromatic diamine resulted in decreasing amounts of the organic modifier incorporated into the lithium clay. The level of silicate dispersion in a thermosetting polyimide matrix was dependent upon the temperature of Li-clay heat treatment as well as the organic modification. In general, clays treated at 150 C or 170 C and exchanged with protonated octadecylamine or protonated 2,2'-dimethylbenzidine (DMBZ) showed a higher degree of dispersion than clays treated at 130 C or exchanged with protonated dodecylamine. Dynamic mechanical analysis showed little change in the storage modulus or Tg of the nanocomposites compared to the base resin. However, long term isothermal aging of the samples showed a significant decrease in the resin oxidative weight loss. Nanocomposite samples aged in air for 1000 hours at 288 C showed a decrease in weight loss compared to that of the base resin. This again was dependent on the temperature at which the Li-clay was heated and the choice of organic modification.

  7. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands the most computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and the uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes to criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of the substantial impact of the administrative margin of subcriticality on the economics and safety of nuclear fuel cycle operations, recent growing interest in reducing this margin makes uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.
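    A minimal sketch of the sampling-based method: perturb the uncertain inputs, re-evaluate k_eff for each sample, and take statistics of the results. The linear surrogate below is a hypothetical stand-in for an actual Monte Carlo transport run (MCNP, KENO, etc.), and all coefficients are invented:

```python
import numpy as np

rng = np.random.default_rng(42)

def keff_surrogate(density, enrichment):
    # Toy linear response standing in for a transport-code evaluation;
    # a real analysis would launch a Monte Carlo criticality calculation here.
    return 0.90 + 0.04 * (density - 1.0) + 0.02 * (enrichment - 4.0)

n = 10_000
density = rng.normal(1.0, 0.02, n)      # uncertain material density (g/cm^3)
enrichment = rng.normal(4.0, 0.1, n)    # uncertain enrichment (wt%)

keff = keff_surrogate(density, enrichment)
print(keff.mean(), keff.std(ddof=1))    # k_eff estimate and its uncertainty
```

    The spread of the sampled k_eff values is the propagated uncertainty; with a linear response and independent normal inputs it should match the quadrature sum of the individual contributions.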

  8. Daphnia fed algal food grown at elevated temperature have reduced fitness

    Directory of Open Access Journals (Sweden)

    Anna B. Sikora

    2014-05-01

    Lake water temperature is negatively correlated with fatty acid content and the P:C ratio in green algae. Hence, elevated temperature may indirectly reduce the fitness of Daphnia through an induced decrease in algal food quality. The aim of this study was to test the hypotheses that the quality of algal food decreases with increasing culture temperature and that large-bodied Daphnia are more vulnerable to the temperature-related deterioration of algal food quality than small-bodied ones. Laboratory life-table experiments were performed at 20°C with large-bodied D. pulicaria and small-bodied D. cucullata fed with the green alga Scenedesmus obliquus that had been grown at 16, 24 or 32°C. The somatic growth rates of both species decreased significantly with increasing algal culture temperature, and this effect was more pronounced in D. pulicaria than in D. cucullata. In the former species, age at first reproduction significantly increased and clutch size significantly decreased with increasing algal culture temperature, while no significant changes in these two parameters were observed in the latter species. The proportion of egg-bearing females decreased with increasing algal culture temperature in both species. The results of this study support the notion that the quality of algal food decreases with increasing water temperature and also suggest that small-bodied Daphnia species might be less vulnerable to temperature-related decreases in algal food quality than large-bodied ones.

  9. Synthesis copolymer use to reduce pour point temperature of diamond crude oil

    Science.gov (United States)

    Than, Dao Viet; Chuong, Thai Hong; Tuy, Dao Quoc

    2017-09-01

    The Diamond oil field is located in Block 01&02 offshore Vietnam. Crude oil from the Diamond Well Head Platform (WHP) is evacuated to the FPSO via a 20 km, 10-inch subsea flexible pipeline. The lowest seabed temperature in the field is 22°C, while the pour point temperature (PPT) of Diamond crude oil is very high (36°C) due to its high paraffin content (25%). Finding a suitable pour point depressant (PPD) for this crude oil is therefore very important: the PPD must be able to reduce the pour point of the crude oil from 36°C to 21°C.

  10. Uncertainty and global climate change research

    Energy Technology Data Exchange (ETDEWEB)

    Tonn, B.E. [Oak Ridge National Lab., TN (United States); Weiher, R. [National Oceanic and Atmospheric Administration, Boulder, CO (United States)

    1994-06-01

    The Workshop on Uncertainty and Global Climate Change Research was held March 22-23, 1994, in Knoxville, Tennessee. This report summarizes the results and recommendations of the workshop. The purpose of the workshop was to examine in depth the concept of uncertainty. From an analytical point of view, uncertainty is a central feature of global climate science, economics and decision making. The magnitude and complexity of the uncertainty surrounding global climate change have made it quite difficult to answer even the simplest and most important of questions: whether potentially costly action is required now to ameliorate adverse consequences of global climate change, or whether delay is warranted to gain better information to reduce uncertainties. A major conclusion of the workshop is that multidisciplinary integrated assessments using decision analytic techniques as a foundation are key to addressing global change policy concerns. First, uncertainty must be dealt with explicitly and rigorously, since it is and will continue to be a key feature of analysis and recommendations on policy questions for years to come. Second, key policy questions and variables need to be explicitly identified and prioritized, and their uncertainty characterized, to guide the entire scientific, modeling, and policy analysis process. Multidisciplinary integrated assessment techniques and value-of-information methodologies are best suited for this task. In terms of the timeliness and relevance of developing and applying decision analytic techniques, the global change research and policy communities are moving rapidly toward integrated approaches to research design and policy analysis.

  11. Reducing the sensitivity of IMPT treatment plans to setup errors and range uncertainties via probabilistic treatment planning

    International Nuclear Information System (INIS)

    Unkelbach, Jan; Bortfeld, Thomas; Martin, Benjamin C.; Soukup, Martin

    2009-01-01

    Treatment plans optimized for intensity modulated proton therapy (IMPT) may be very sensitive to setup errors and range uncertainties. If these errors are not accounted for during treatment planning, the dose distribution realized in the patient may be strongly degraded compared to the planned dose distribution. The authors implemented the probabilistic approach to incorporate uncertainties directly into the optimization of an intensity modulated treatment plan. Following this approach, the dose distribution depends on a set of random variables which parameterize the uncertainty, as does the objective function used to optimize the treatment plan. The authors optimize the expected value of the objective function. They investigate IMPT treatment planning regarding range uncertainties and setup errors. They demonstrate that incorporating these uncertainties into the optimization yields qualitatively different treatment plans compared to conventional plans which do not account for uncertainty. The sensitivity of an IMPT plan depends on the dose contributions of individual beam directions. Roughly speaking, steep dose gradients in the beam direction make treatment plans sensitive to range errors, while steep lateral dose gradients make plans sensitive to setup errors. More robust treatment plans are obtained by redistributing dose among different beam directions, which can be achieved by the probabilistic approach. In contrast, the safety margin approach widely applied in photon therapy fails in IMPT and is suitable for handling neither range variations nor setup errors.

  12. Uncertainty Assessment: What Good Does it Do? (Invited)

    Science.gov (United States)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    The scientific community has devoted considerable time and energy to understanding, quantifying and articulating the uncertainties related to anthropogenic climate change. However, informed decision-making and good public policy arguably rely far more on a central core of understanding of matters that are scientifically well established than on detailed understanding and articulation of all relevant uncertainties. Advocates of vaccination, for example, stress its overall efficacy in preventing morbidity and mortality, not the uncertainties over how long the protective effects last. Advocates of colonoscopy for cancer screening stress its capacity to detect polyps before they become cancerous, with relatively little attention paid to the fact that many, if not most, polyps would not become cancerous even if left unremoved. So why has the climate science community spent so much time focused on uncertainty? One reason, of course, is that articulation of uncertainty is a normal and appropriate part of scientific work. However, we argue that there is another reason that involves the pressure that the scientific community has experienced from individuals and groups promoting doubt about anthropogenic climate change. Specifically, doubt-mongering groups focus public attention on scientific uncertainty as a means to undermine scientific claims, equating uncertainty with untruth. Scientists inadvertently validate these arguments by agreeing that much of the science is uncertain, thus seemingly implying that our knowledge is insecure. The problem goes further: as the scientific community attempts to articulate more clearly, and reduce, those uncertainties, it seemingly further agrees that the knowledge base is insufficient to warrant public and governmental action. 
We refer to this effect as 'seepage,' as the effects of doubt-mongering seep into the scientific community and the scientific agenda, despite the fact that addressing these concerns does little to alter

  13. Sensitivity functions for uncertainty analysis: Sensitivity and uncertainty analysis of reactor performance parameters

    International Nuclear Information System (INIS)

    Greenspan, E.

    1982-01-01

    This chapter presents the mathematical basis for sensitivity functions, discusses their physical meaning and the information they contain, and clarifies a number of issues concerning their application, including the definition of group sensitivities, the selection of sensitivity functions to be included in the analysis, and limitations of sensitivity theory. Examines the theoretical foundation; criticality reset sensitivities; group sensitivities and uncertainties; selection of sensitivities included in the analysis; and other uses and limitations of sensitivity functions. Gives the theoretical formulation of sensitivity functions pertaining to "as-built" designs for performance parameters of the form of ratios of linear flux functionals (such as reaction-rate ratios), linear adjoint functionals, bilinear functions (such as reactivity worth ratios), and for reactor reactivity. Offers a consistent procedure for reducing energy-dependent or fine-group sensitivities and uncertainties to broad-group sensitivities and uncertainties. Provides illustrations of sensitivity functions as well as references to available compilations of such functions and of total sensitivities. Indicates limitations of sensitivity theory originating from the fact that this theory is based on a first-order perturbation theory.

  14. Reducing the ordering temperature of CoPt nanoparticles by B additive

    Energy Technology Data Exchange (ETDEWEB)

    Khemjeen, Yutthaya [Materials Science and Nanotechnology Program, Faculty of Science, Khon Kaen University, Khon Kaen 40002 (Thailand); Pinitsoontorn, Supree, E-mail: psupree@kku.ac.th; Chompoosor, Apiwat [Department of Physics, Faculty of Science, Khon Kaen University, Khon Kaen 40002 (Thailand); Integrated Nanotechnology Research Center, Khon Kaen University, Khon Kaen 40002 (Thailand); Nanotec-KKU Center of Excellence on Advanced Nanomaterials for Energy Production and Storage, Khon Kaen University, Khon Kaen 40002 (Thailand); Maensiri, Santi [School of Physics, Institute of Science, Suranaree University of Technology, Nakhon Ratchasima 30000 (Thailand)

    2014-08-07

    We report the effect of boron addition on the magnetic properties and structure of CoPt nanoparticles prepared by a polyol method. The magnetic property measurements showed that the CoPt-B sample exhibited a much larger coercivity than the sample without the B additive at the same annealing temperature. Transmission electron microscopy and energy dispersive X-ray spectroscopy revealed that the average particle size was about 2 nm for the as-synthesized sample, with the ratio of Co to Pt close to 1:1. After annealing, the particle sizes increased but the composition was maintained. The phase transformation of the nanoparticles versus temperature was investigated using a combination of X-ray diffraction and in-situ X-ray absorption analysis. It was shown that the phase transition at which the nanoparticles change from the disordered A1 phase to the ordered L1₀ phase occurs at 600 °C. We conclude that the boron additive reduces the ordering temperature of CoPt by about 100 °C.

  15. Summary of existing uncertainty methods

    International Nuclear Information System (INIS)

    Glaeser, Horst

    2013-01-01

    A summary of the existing and most used uncertainty methods is presented, and their main features are compared. One of these methods is the order statistics method based on Wilks' formula. It is applied in safety research as well as in licensing. This method was first proposed by GRS for use in deterministic safety analysis, and is now used by many organisations world-wide. Its advantage is that the number of potentially uncertain input and output parameters is not limited to a small number. Such a limitation was necessary for the first demonstration of the Code Scaling, Applicability and Uncertainty (CSAU) method by the United States Nuclear Regulatory Commission (USNRC). They did not apply Wilks' formula in their statistical method propagating input uncertainties to obtain the uncertainty of a single output variable, like peak cladding temperature. A Phenomena Identification and Ranking Table (PIRT) was set up in order to limit the number of uncertain input parameters and, consequently, the number of calculations to be performed. Another purpose of such a PIRT process is to identify the most important physical phenomena which a computer code should be able to calculate. The validation of the code should be focused on the identified phenomena. Response surfaces are used in some applications, replacing the computer code for performing a high number of calculations. The second well-known uncertainty method is the Uncertainty Methodology Based on Accuracy Extrapolation (UMAE) and its follow-up, the Code with the Capability of Internal Assessment of Uncertainty (CIAU), developed by the University of Pisa. Unlike the statistical approaches, the CIAU compares experimental data with calculation results. It does not consider uncertain input parameters; therefore, the CIAU is highly dependent on the experimental database. 
    The accuracy gained from the comparison between experimental data and calculated results is extrapolated to obtain the uncertainty of the system code predictions.
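    The sample size behind Wilks'-formula applications can be sketched as follows for the one-sided, first-order case: the smallest number of code runs n such that the maximum of the n outputs bounds the p-quantile with confidence beta, i.e. 1 - p**n >= beta. The function name is ours, not from the summary:

```python
# One-sided, first-order Wilks sample size: smallest n with 1 - p**n >= beta,
# where p is the coverage (e.g. 95th percentile) and beta the confidence.
def wilks_n(p=0.95, beta=0.95):
    n = 1
    while 1 - p**n < beta:
        n += 1
    return n

print(wilks_n())            # -> 59 runs for the classic 95%/95% statement
print(wilks_n(0.99, 0.95))  # higher coverage requires many more runs
```

    This is why 59 (or, for higher-order statements, 93, 124, ...) code runs appear so often in licensing-grade uncertainty analyses: the count depends only on p and beta, not on the number of uncertain input parameters.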

  16. Uncertainties in Steric Sea Level Change Estimation During the Satellite Altimeter Era: Concepts and Practices

    Science.gov (United States)

    MacIntosh, C. R.; Merchant, C. J.; von Schuckmann, K.

    2017-01-01

    This article presents a review of current practice in estimating steric sea level change, focussed on the treatment of uncertainty. Steric sea level change is the contribution to the change in sea level arising from the dependence of density on temperature and salinity. It is a significant component of sea level rise and a reflection of changing ocean heat content. However, tracking these steric changes still remains a significant challenge for the scientific community. We review the importance of understanding the uncertainty in estimates of steric sea level change. Relevant concepts of uncertainty are discussed and illustrated with the example of observational uncertainty propagation from a single profile of temperature and salinity measurements to steric height. We summarise and discuss the recent literature on methodologies and techniques used to estimate steric sea level in the context of the treatment of uncertainty. Our conclusions are that progress in quantifying steric sea level uncertainty will benefit from: greater clarity and transparency in published discussions of uncertainty, including exploitation of international standards for quantifying and expressing uncertainty in measurement; and the development of community "recipes" for quantifying the error covariances in observations and from sparse sampling and for estimating and propagating uncertainty across spatio-temporal scales.
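    The propagation from a single temperature/salinity profile to steric height can be sketched with a Monte Carlo approach; the toy linear equation of state, profile shapes, coefficients and uncertainties below are illustrative assumptions, not the article's method or data:

```python
import numpy as np

# Perturb a T/S profile within its measurement uncertainty and propagate to
# steric height via rho/rho0 = 1 - alpha*(T - T0) + beta*(S - S0).
rng = np.random.default_rng(1)
z = np.linspace(0.0, 2000.0, 201)          # depth levels (m)
dz = np.gradient(z)                        # layer thicknesses (m)
T = 20.0 * np.exp(-z / 500.0)              # temperature profile (degC)
S = 35.0 + 0.2 * np.exp(-z / 1000.0)       # salinity profile (psu)
sigma_T, sigma_S = 0.01, 0.005             # per-level measurement uncertainties
alpha, beta = 2.0e-4, 7.6e-4               # thermal expansion / haline contraction
T0, S0 = 10.0, 35.0                        # reference state

def steric_height(temp, sal):
    drho_over_rho = -alpha * (temp - T0) + beta * (sal - S0)
    return -np.sum(drho_over_rho * dz)     # steric height anomaly (m)

samples = [steric_height(T + rng.normal(0.0, sigma_T, z.size),
                         S + rng.normal(0.0, sigma_S, z.size))
           for _ in range(2000)]
std_h = float(np.std(samples))
print(std_h)                               # propagated steric-height uncertainty (m)
```

    With independent per-level errors this Monte Carlo spread should match the analytic quadrature sum over levels; correlated errors (the error covariances the article calls for) would change the answer substantially, which is exactly why those "recipes" matter.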

  17. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    the TE model predictions. This analysis highlights the primary measurements that merit further development to reduce the uncertainty associated with their use in TE models. While we develop and apply this mathematical framework to a specific biorefinery scenario here, this analysis can be readily adapted to other types of biorefining processes and provides a general framework for propagating uncertainty due to analytical measurements through a TE model.

  18. Confronting the Uncertainty in Aerosol Forcing Using Comprehensive Observational Data

    Science.gov (United States)

    Johnson, J. S.; Regayre, L. A.; Yoshioka, M.; Pringle, K.; Sexton, D.; Lee, L.; Carslaw, K. S.

    2017-12-01

    The effect of aerosols on cloud droplet concentrations and radiative properties is the largest uncertainty in the overall radiative forcing of climate over the industrial period. In this study, we take advantage of a large perturbed parameter ensemble of simulations from the UK Met Office HadGEM-UKCA model (the aerosol component of the UK Earth System Model) to comprehensively sample uncertainty in aerosol forcing. Uncertain aerosol and atmospheric parameters cause substantial aerosol forcing uncertainty in climatically important regions. As the aerosol radiative forcing itself is unobservable, we investigate the potential for observations of aerosol and radiative properties to act as constraints on the large forcing uncertainty. We test how eight different theoretically perfect aerosol and radiation observations can constrain the forcing uncertainty over Europe. We find that the achievable constraint is weak unless many diverse observations are used simultaneously. This is due to the complex relationships between model output responses and the multiple interacting parameter uncertainties: compensating model errors mean there are many ways to produce the same model output (known as model equifinality), which limits the achievable constraint. However, using all eight observable quantities together, we show that the aerosol forcing uncertainty can potentially be reduced by around 50%. This reduction occurs as we narrow a large sample of model variants (over 1 million) covering the full parametric uncertainty down to the roughly 1% that are observationally plausible. Constraining the forcing uncertainty using real observations is a more complex undertaking, in which we must account for multiple further uncertainties, including measurement uncertainties, structural model uncertainties and the model discrepancy from reality. 
Here, we make a first attempt to determine the true potential constraint on the forcing uncertainty from our model that is achievable using a comprehensive

  19. Estimating uncertainty of inference for validation

    Energy Technology Data Exchange (ETDEWEB)

    Booker, Jane M [Los Alamos National Laboratory; Langenbrunner, James R [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code are an accurate representation of experimental test data. Embedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  20. A taxonomy of endogenous and exogenous uncertainty in high-risk, high-impact contexts.

    Science.gov (United States)

    Alison, Laurence; Power, Nicola; van den Heuvel, Claudia; Waring, Sara

    2015-07-01

    By reference to a live hostage negotiation exercise, this study presents a taxonomy of uncertainty that can be usefully applied to assist in the categorization and application of findings from decision-making research conducted in naturalistic (specifically critical incident) settings. Uncertainty was measured via observational methods (during the exercise and by reference to video footage), decision logs, and postincident simulated recall interviews with trainee police officers. Transcripts were coded and analyzed thematically. Uncertainty was dichotomized as deriving from either endogenous sources (about the problem situation itself) or exogenous sources (about the operating system that is dealing with the incident). Overall, exogenous uncertainty (75%) was more prevalent than endogenous uncertainty (25%), specifically during discussions on plan formulation and execution. It was also qualitatively associated with poor role understanding and trust. Endogenous uncertainty was more prevalent during discussions on situation assessment and plan formulation. The taxonomy provides a useful way for organizational researchers to categorize uncertainty during the naturalistic observations of workplace interactions and decision making. It reduces the complexity associated with observational research to allow organizational psychologists to better tailor their recommendations for reducing uncertainty. Dealing with endogenous uncertainties would entail targeting decision making specific to the problem incident (e.g., introduce training or policy to reduce redundant fixation on rote-repetitive superordinate goals and focus on more short-term actionable goals during situation assessments). Dealing with exogenous uncertainties would entail improving decision making relating to management and team processes across critical incidents (e.g., training to clarify distributed roles in critical incident teams to aid plan formulation and execution). 
Organizational researchers interested

  1. Spatial GHG Inventory: Analysis of Uncertainty Sources. A Case Study for Ukraine

    International Nuclear Information System (INIS)

    Bun, R.; Gusti, M.; Kujii, L.; Tokar, O.; Tsybrivskyy, Y.; Bun, A.

    2007-01-01

    A geoinformation technology for creating spatially distributed greenhouse gas inventories based on a methodology provided by the Intergovernmental Panel on Climate Change and special software linking input data, inventory models, and a means for visualization are proposed. This technology opens up new possibilities for qualitative and quantitative spatially distributed presentations of inventory uncertainty at the regional level. Problems concerning uncertainty and verification of the distributed inventory are discussed. A Monte Carlo analysis of uncertainties in the energy sector at the regional level is performed, and a number of simulations concerning the effectiveness of uncertainty reduction in some regions are carried out. Uncertainties in activity data have a considerable influence on overall inventory uncertainty, for example, the inventory uncertainty in the energy sector declines from 3.2 to 2.0% when the uncertainty of energy-related statistical data on fuels combusted in the energy industries declines from 10 to 5%. Within the energy sector, the 'energy industries' subsector has the greatest impact on inventory uncertainty. The relative uncertainty in the energy sector inventory can be reduced from 2.19 to 1.47% if the uncertainty of specific statistical data on fuel consumption decreases from 10 to 5%. The 'energy industries' subsector has the greatest influence in the Donetsk oblast. Reducing the uncertainty of statistical data on electricity generation in just three regions - the Donetsk, Dnipropetrovsk, and Luhansk oblasts - from 7.5 to 4.0% results in a decline from 2.6 to 1.6% in the uncertainty in the national energy sector inventory
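    The quadrature combination of activity-data and emission-factor uncertainties described here can be sketched as follows (IPCC Approach 1 style); the function names and all numbers are invented for illustration, not the study's actual Ukrainian statistics:

```python
import math

# Per-category relative uncertainty from its two input uncertainties,
# and the emission-weighted quadrature combination across categories
# (assuming independent categories).
def category_uncertainty(u_activity, u_factor):
    return math.hypot(u_activity, u_factor)

def inventory_uncertainty(emissions, uncertainties):
    total = sum(emissions)
    return math.sqrt(sum((e * u) ** 2
                         for e, u in zip(emissions, uncertainties))) / total

u_energy = category_uncertainty(0.10, 0.05)     # 10% activity data, 5% emission factor
emissions = [100.0, 40.0, 20.0]                 # per-category emissions (illustrative)
uncs = [u_energy, 0.20, 0.30]
u_total = inventory_uncertainty(emissions, uncs)
print(u_energy, u_total)
```

    Because categories combine in quadrature weighted by their emissions, shrinking the activity-data uncertainty of the largest category (here the energy one) reduces the overall inventory uncertainty far more than the same improvement in a small category, which matches the regional findings above.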

  2. Simulations of dimensionally reduced effective theories of high temperature QCD

    CERN Document Server

    Hietanen, Ari

    Quantum chromodynamics (QCD) is the theory describing the interaction between quarks and gluons. At low temperatures, quarks are confined, forming hadrons, e.g. protons and neutrons. However, at extremely high temperatures the hadrons break apart and the matter transforms into a plasma of individual quarks and gluons. In this thesis the quark gluon plasma (QGP) phase of QCD is studied using lattice techniques in the framework of the dimensionally reduced effective theories EQCD and MQCD. Two quantities are of particular interest: the pressure (or grand potential) and the quark number susceptibility. At high temperatures the pressure admits a generalised coupling constant expansion, where some coefficients are non-perturbative. We determine the first such contribution, of order g^6, by performing lattice simulations in MQCD. This requires high precision lattice calculations, which we perform with different numbers of colors N_c to obtain the N_c-dependence of the coefficient. The quark number susceptibility is studied by perf...

  3. Ionization balance for Ti and Cr ions: effects of uncertainty in dielectronic recombination rate

    International Nuclear Information System (INIS)

    Seon, Kwang-Il; Nam, Uk-Won; Park, Il H

    2003-01-01

    The available electron-impact ionization cross sections for Ti and Cr ions are reviewed, and calculations of the ionization balance for the ions under coronal equilibrium are presented. The calculated ionic abundance fractions are compared with those of previous works. The effects of modelling uncertainty in dielectronic recombination on isoelectronic line ratios, which are formed using the same spectral line from two elements of slightly different atomic numbers, are discussed, concentrating on high temperature ranges. Also discussed are the effects of modelling uncertainty on inter-ionization-stage line ratios formed from adjacent ionization stages. It is demonstrated that the modelling uncertainty in dielectronic recombination tends to cancel out only when the isoelectronic line ratio of He-like ions is considered, and that the sensitivity of the isoelectronic line ratios to the modelling uncertainty tends to increase for less ionized stages. It is also found that the interstage line ratios are less sensitive to the typical ∼20% uncertainties of dielectronic rates than the isoelectronic line ratios, and that the interstage line ratio of He- to Li-like ions in Ti and Cr plasmas is a better choice for a temperature diagnostic in the temperature range from ∼0.6 to ∼1.5 keV, in which Li-like ions have maximum ionic abundances.

  4. Climate-carbon cycle feedbacks under stabilization: uncertainty and observational constraints

    International Nuclear Information System (INIS)

    Jones, Chris D.; Cox, Peter M.; Huntingford, Chris

    2006-01-01

    Avoiding 'dangerous climate change' by stabilization of atmospheric CO2 concentrations at a desired level requires reducing the rate of anthropogenic carbon emissions so that they are balanced by uptake of carbon by the natural terrestrial and oceanic carbon cycles. Previous calculations of profiles of emissions which lead to stabilized CO2 levels have assumed no impact of climate change on this natural carbon uptake. However, future climate change effects on the land carbon cycle are predicted to reduce its ability to act as a sink for anthropogenic carbon emissions, and so quantification of this feedback is required to determine future permissible emissions. Here, we assess the impact of the climate-carbon cycle feedback and attempt to quantify its uncertainty due to both within-model parameter uncertainty and between-model structural uncertainty. We assess the use of observational constraints to reduce uncertainty in the future permissible emissions for climate stabilization and find that all realistic carbon cycle feedbacks consistent with the observational record give permissible emissions significantly less than previously assumed. However, the observational record proves to be insufficient to tightly constrain carbon cycle processes or future feedback strength, with implications for climate-carbon cycle model evaluation.

  5. Uncertainty in projected climate change arising from uncertain fossil-fuel emission factors

    Science.gov (United States)

    Quilcaille, Y.; Gasser, T.; Ciais, P.; Lecocq, F.; Janssens-Maenhout, G.; Mohr, S.

    2018-04-01

    Emission inventories are widely used by the climate community, but their uncertainties are rarely accounted for. In this study, we evaluate the uncertainty in projected climate change induced by uncertainties in fossil-fuel emissions, accounting for non-CO2 species co-emitted with the combustion of fossil fuels and their use in industrial processes. Using consistent historical reconstructions and three contrasted future projections of fossil-fuel extraction from Mohr et al., we calculate CO2 emissions and their uncertainties stemming from estimates of fuel carbon content, net calorific value and oxidation fraction. Our historical reconstructions of fossil-fuel CO2 emissions are consistent with other inventories in terms of average and range. The uncertainties sum up to a ±15% relative uncertainty in cumulative CO2 emissions by 2300. Uncertainties in the emissions of non-CO2 species associated with the use of fossil fuels are estimated using co-emission ratios varying with time. Using these inputs, we run the compact Earth system model OSCAR v2.2 in a Monte Carlo setup in order to attribute the uncertainty in projected global surface temperature change (ΔT) to three sources: the Earth system's response, fossil-fuel CO2 emissions, and non-CO2 co-emissions. Under the three future fuel extraction scenarios, we simulate the median ΔT to be 1.9, 2.7 or 4.0 °C in 2300, with an associated 90% confidence interval of about 65%, 52% and 42%. We show that virtually all of the total uncertainty is attributable to the uncertainty in the future Earth system's response to the anthropogenic perturbation. We conclude that the uncertainty in emission estimates can be neglected for global temperature projections in the face of the large uncertainty in the Earth system response to the forcing of emissions. We show that this result does not hold for all variables of the climate system, such as the atmospheric partial pressure of CO2 and the
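    The fossil-fuel CO2 term described above follows the standard emission-factor product: fuel mass × net calorific value × carbon content × oxidation fraction × 44/12. A minimal Monte Carlo sketch of how such input uncertainties combine, with illustrative magnitudes rather than the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Illustrative values for a single coal flow (not the paper's data):
fuel_mass = 1.0e9                             # kg of fuel burned
ncv = rng.normal(25.0e6, 0.05 * 25.0e6, n)    # net calorific value, J/kg (±5%)
cc = rng.normal(26.0e-9, 0.05 * 26.0e-9, n)   # carbon content, kg C / J (±5%)
ox = rng.normal(0.98, 0.02, n)                # oxidation fraction (±2%)

# CO2 emissions: carbon released times the CO2/C molar-mass ratio 44/12
e_co2 = fuel_mass * ncv * cc * ox * (44.0 / 12.0)

med = np.median(e_co2)
lo, hi = np.percentile(e_co2, [5, 95])
rel90 = (hi - lo) / (2.0 * med)   # relative half-width of the 90% interval
```

    With independent ±5%, ±5% and ±2% inputs, the relative standard uncertainty of the product adds roughly in quadrature (≈7%), giving a 90% interval half-width of about 12% for this single flow.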

  6. BEMUSE Phase III Report - Uncertainty and Sensitivity Analysis of the LOFT L2-5 Test

    International Nuclear Information System (INIS)

    Bazin, P.; Crecy, A. de; Glaeser, H.; Skorek, T.; Joucla, J.; Probst, P.; Chung, B.; Oh, D.Y.; Kyncl, M.; Pernica, R.; Macek, J.; Meca, R.; Macian, R.; D'Auria, F.; Petruzzi, A.; Perez, M.; Reventos, F.; Fujioka, K.

    2007-02-01

    This report summarises the contributions of ten participants to phase 3 of BEMUSE: uncertainty and sensitivity analyses of the LOFT L2-5 experiment, a Large-Break Loss-of-Coolant Accident (LB-LOCA). For this phase, precise step-by-step requirements were provided to the participants. Four main parts are defined: 1. List and uncertainties of the input uncertain parameters. 2. Uncertainty analysis results. 3. Sensitivity analysis results. 4. Improved methods and assessment of the methods (optional). The 5% and 95% percentiles have to be estimated for six output parameters of two kinds: 1. Scalar output parameters (first peak cladding temperature (PCT), second peak cladding temperature, time of accumulator injection, time of complete quenching); 2. Time-trend output parameters (maximum cladding temperature, upper plenum pressure). The main lessons learnt from phase 3 of the BEMUSE programme are the following: - For uncertainty analysis, all the participants use a probabilistic method associated with Wilks' formula, except UNIPI with its CIAU method (Code with the capability of Internal Assessment of Uncertainty). Use of both methods has been successfully mastered. - Compared with the experiment, the results of the uncertainty analysis are good on the whole. For example, for the cladding-temperature-type output parameters (first PCT, second PCT, time of complete quenching, maximum cladding temperature), 8 participants out of 10 find upper and lower bounds which envelop the experimental data. - Sensitivity analysis has been successfully performed by all the participants using the probabilistic method. All the influence measures used take into account the range of variation of the input parameters. Synthesis tables of the most influential phenomena and parameters have been compiled, and participants will be able to use them in the continuation of the BEMUSE programme.
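    Wilks' formula, which most participants used, fixes the number of code runs needed so that the extreme runs bound a given percentile with a given confidence. A sketch of the first-order sample-size computation (a standard statistical result, not code from the report):

```python
def wilks_n(coverage, confidence, two_sided=False):
    """Smallest N such that the extreme order statistics of N runs bound the
    requested coverage with the requested confidence (first-order Wilks)."""
    n = 1
    while True:
        if two_sided:
            conf = 1.0 - n * coverage**(n - 1) + (n - 1) * coverage**n
        else:
            conf = 1.0 - coverage**n
        if conf >= confidence:
            return n
        n += 1

print(wilks_n(0.95, 0.95))                  # one-sided 95%/95% -> 59 runs
print(wilks_n(0.95, 0.95, two_sided=True))  # two-sided 95%/95% -> 93 runs
```

    The familiar 59-run figure for a one-sided 95%/95% statement (and 93 runs for a two-sided band) falls directly out of this iteration, which is why the probabilistic participants all ran ensembles of that order.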

  7. Entropic uncertainty for spin-1/2 XXX chains in the presence of inhomogeneous magnetic fields and its steering via weak measurement reversals

    Science.gov (United States)

    Wang, Dong; Ming, Fei; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2017-09-01

    The uncertainty principle sets a lower bound on the measurement precision attainable for a pair of non-commuting observables, and is hence highly nontrivial for quantum precision measurement in the field of quantum information theory. In this letter, we consider the entropic uncertainty relation (EUR) in the presence of quantum memory in a two-qubit isotropic Heisenberg spin chain. Specifically, we explore the dynamics of the EUR in a practical scenario in which two associated nodes of a one-dimensional XXX spin chain, subject to an inhomogeneous magnetic field, share a thermal entangled state. We show that the temperature and magnetic field can inflate the measurement uncertainty through the reduction of the system's quantum correlation. Notably, we reveal that, firstly, the uncertainty is not fully dependent on the observed quantum correlation of the system and, secondly, the dynamical behaviours of the measurement uncertainty differ markedly between ferromagnetic and antiferromagnetic chains. Meanwhile, we deduce that the measurement uncertainty is strongly correlated with the mixedness of the system, implying that smaller mixedness tends to reduce the uncertainty. Furthermore, we propose an effective strategy to control the uncertainty of interest by means of quantum weak measurement reversal. Our work may thus shed light on the dynamics of the measurement uncertainty in the Heisenberg spin chain, and be relevant to quantum precision measurement in various solid-state systems.
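    The quantum-memory EUR at issue is the Berta et al. bound S(Q|B) + S(R|B) ≥ log2(1/c) + S(A|B). A self-contained numerical check for Pauli X/Z measurements (where c = 1/2) on an illustrative two-qubit Werner state, standing in for the paper's thermal XXX state:

```python
import numpy as np

def vn_entropy(rho):
    """Von Neumann entropy in bits."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-(ev * np.log2(ev)).sum())

def trace_out_A(rho):
    """Partial trace over the first qubit of a two-qubit density matrix."""
    return np.einsum('ijik->jk', rho.reshape(2, 2, 2, 2))

def cond_entropy_after_meas(rho, basis):
    """S(M|B) after measuring qubit A in the given orthonormal basis (columns)."""
    blocks = []
    for k in range(2):
        v = basis[:, k:k + 1]
        P = np.kron(v @ v.conj().T, np.eye(2))   # projector on A, identity on B
        blocks.append(trace_out_A(P @ rho @ P))
    # The post-measurement state rho_MB is block diagonal, so its entropy is
    # the entropy of the concatenated block spectra.
    ev = np.concatenate([np.linalg.eigvalsh(b) for b in blocks])
    ev = ev[ev > 1e-12]
    s_mb = float(-(ev * np.log2(ev)).sum())
    return s_mb - vn_entropy(trace_out_A(rho))

# Werner state p*|psi-><psi-| + (1-p)*I/4 with p = 0.8 (illustrative):
p = 0.8
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho = p * np.outer(psi, psi) + (1 - p) * np.eye(4) / 4.0

z_basis = np.eye(2)
x_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)

lhs = cond_entropy_after_meas(rho, x_basis) + cond_entropy_after_meas(rho, z_basis)
rhs = 1.0 + (vn_entropy(rho) - vn_entropy(trace_out_A(rho)))  # log2(1/c) = 1
```

    For this state the left-hand side (≈0.938 bits) comfortably exceeds the memory-assisted bound (≈0.848 bits); entanglement makes S(A|B) negative and so tightens the bound below the 1-bit memoryless value.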

  8. Use of extremity insulation during whole body hyperthermia to reduce temperature nonuniformity

    International Nuclear Information System (INIS)

    Thrall, D.E.; Page, R.L.

    1987-01-01

    The author previously documented that, during whole-body hyperthermia in dogs using a radiant heating device, temperature at superficial sites, including tibial bone marrow, falls below systemic arterial temperature during the plateau phase of heating. This may be due to direct heat loss to the environment. Sites where temperature is lower than systemic arterial temperature during the plateau phase may become sanctuary sites in which tumor deposits are spared because they do not receive the prescribed thermal dose. In an attempt to decrease temperature nonuniformity and increase the thermal dose delivered to such superficial sites, extremity insulation was employed during whole-body hyperthermia in dogs. The author measured temperature at cutaneous and subcutaneous sites and within tibial bone marrow in insulated and noninsulated extremities of dogs undergoing whole-body hyperthermia in the radiant heating device, and found that extremity insulation is effective in reducing extremity temperature nonuniformity. Specific results are presented. Extremity insulation may be necessary during whole-body hyperthermia to assure that extremity tumor deposits receive a thermal dose similar to that prescribed for the entire body.

  9. Dealing with rainfall forecast uncertainties in real-time flood control along the Demer river

    Directory of Open Access Journals (Sweden)

    Vermuyten Evert

    2016-01-01

    Real-time Model Predictive Control (MPC) of hydraulic structures strongly reduces flood consequences under ideal circumstances. The performance of such flood control may, however, be significantly affected by uncertainties. This research quantifies the influence of rainfall forecast uncertainties, and of the related uncertainties in the catchment rainfall-runoff discharges, on the control performance for the Herk river case study in Belgium. To limit computational times, a fast conceptual model is applied, calibrated to a full hydrodynamic river model. A Reduced Genetic Algorithm (RGA) is used as the optimization method. Next to the analysis of the impact of rainfall forecast uncertainties on control performance, a Multiple Model Predictive Control (MMPC) approach is tested to reduce this impact. Results show that the deterministic MPC-RGA outperforms the MMPC and that it is inherently robust against rainfall forecast uncertainties thanks to its receding-horizon strategy.

  10. Maximizing the probability of satisfying the clinical goals in radiation therapy treatment planning under setup uncertainty

    International Nuclear Information System (INIS)

    Fredriksson, Albin; Hårdemark, Björn; Forsgren, Anders

    2015-01-01

    Purpose: This paper introduces a method that maximizes the probability of satisfying the clinical goals in intensity-modulated radiation therapy treatments subject to setup uncertainty. Methods: The authors perform robust optimization in which the clinical goals are constrained to be satisfied whenever the setup error falls within an uncertainty set. The shape of the uncertainty set is included as a variable in the optimization. The goal of the optimization is to modify the shape of the uncertainty set in order to maximize the probability that the setup error will fall within the modified set. Because the constraints enforce the clinical goals to be satisfied under all setup errors within the uncertainty set, this is equivalent to maximizing the probability of satisfying the clinical goals. This type of robust optimization is studied with respect to photon and proton therapy applied to a prostate case and compared to robust optimization using an a priori defined uncertainty set. Results: Slight reductions of the uncertainty sets resulted in plans that satisfied a larger number of clinical goals than optimization with respect to a priori defined uncertainty sets, both within the reduced uncertainty sets and within the a priori, nonreduced, uncertainty sets. For the prostate case, the plans taking reduced uncertainty sets into account satisfied 1.4 (photons) and 1.5 (protons) times as many clinical goals over the scenarios as the method taking a priori uncertainty sets into account. Conclusions: Reducing the uncertainty sets enabled the optimization to find better solutions with respect to the errors within the reduced as well as the nonreduced uncertainty sets and thereby achieve higher probability of satisfying the clinical goals. This shows that asking for a little less in the optimization sometimes leads to better overall plan quality.
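    The quantity being maximized above is the probability that the random setup error falls inside the chosen uncertainty set. For an isotropic Gaussian setup error and spherical sets, that probability can be estimated by simple Monte Carlo; the 3 mm standard deviation below is a hypothetical value, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
sigma = 3.0                                   # hypothetical isotropic setup SD, mm
err = rng.normal(0.0, sigma, size=(200_000, 3))
dist = np.linalg.norm(err, axis=1)

# Probability that the setup error lands inside spherical sets of various radii,
# i.e. the objective that grows as the uncertainty set is enlarged:
prob = {r: float((dist <= r).mean()) for r in (3.0, 5.0, 7.0)}
```

    Enlarging the set raises the covered probability but tightens the robust constraints on the plan; the optimization in the paper searches this trade-off by reshaping the set rather than fixing its radius a priori.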

  11. Treatment of uncertainties in atmospheric chemical systems: A combined modeling and experimental approach

    Science.gov (United States)

    Pun, Betty Kong-Ling

    1998-12-01

    Uncertainty is endemic in modeling. This thesis is a two-phase program to understand the uncertainties both in urban air pollution model predictions and in the field data used to validate them. Part I demonstrates how to improve atmospheric models by analyzing the uncertainties in these models and using the results to guide new experimentation. Part II presents an experiment designed to characterize atmospheric fluctuations, which have significant implications for the model validation process. A systematic study was undertaken to investigate the effects of uncertainties in the SAPRC mechanism for gas-phase chemistry in polluted atmospheres. The uncertainties of more than 500 parameters were compiled, including reaction rate constants, product coefficients, organic composition, and initial conditions. Uncertainty propagation using the Deterministic Equivalent Modeling Method (DEMM) revealed that the uncertainties in ozone predictions can be up to 45% based on these parametric uncertainties. The key parameters found to dominate the uncertainties of the predictions include the photolysis rates of NO2, O3, and formaldehyde; the rate constant for nitric acid formation; and the initial amounts of NOx and VOC. Similar uncertainty analyses applied to two other mechanisms used in regional air quality models led to the conclusion that, in the presence of parametric uncertainties, the mechanisms cannot be discriminated. Research efforts should focus on reducing parametric uncertainties in photolysis rates, reaction rate constants, and source terms. A new tunable diode laser (TDL) infrared spectrometer was designed and constructed to measure multiple pollutants simultaneously in the same ambient air parcels. The sensitivities of the one-hertz measurements were 2 ppb for ozone, 1 ppb for NO, and 0.5 ppb for NO2. Meteorological data were also collected for wind, temperature, and UV intensity. 
The field data showed clear correlations between ozone, NO, and NO2 in the one

  12. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to support watershed management decisions and to develop strategies for water quality improvement. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One limitation of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, and land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management, and have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We used a watershed model (the Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality of a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent the various BMPs needed to improve water quality in the watershed. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  13. Strategies for Reduced-Order Models in Uncertainty Quantification of Complex Turbulent Dynamical Systems

    Science.gov (United States)

    Qi, Di

    Turbulent dynamical systems are ubiquitous in science and engineering. Uncertainty quantification (UQ) in turbulent dynamical systems is a grand challenge in which the goal is to obtain statistical estimates for key physical quantities. In developing a proper UQ scheme for systems characterized by both a high-dimensional phase space and a large number of instabilities, significant model errors compared with the true natural signal are unavoidable, due both to imperfect understanding of the underlying physical processes and to the limited computational resources available. One central issue in contemporary research is the development of a systematic methodology for reduced-order models that can recover the crucial features of the system, both with model fidelity in statistical equilibrium and with model sensitivity in response to perturbations. In the first part, we discuss a general mathematical framework for constructing statistically accurate reduced-order models that have skill in capturing the statistical variability in the principal directions of a general class of complex systems with quadratic nonlinearity. A systematic hierarchy of simple statistical closure schemes, built through new global statistical energy conservation principles combined with statistical equilibrium fidelity, is designed and tested for UQ of these problems. Second, the capacity of imperfect low-order stochastic approximations to model extreme events in a passive scalar field advected by turbulent flows is investigated. Effects present in complicated flow systems, including strong nonlinear and non-Gaussian interactions, are considered, and much simpler and cheaper imperfect models with model error are constructed to capture the crucial statistical features of the stationary tracer field. Several mathematical ideas are introduced to improve the prediction skill of the imperfect reduced-order models. 
Most importantly, empirical information theory and statistical linear response theory are

  14. Prenatal temperature shocks reduce cooperation

    NARCIS (Netherlands)

    Duchoslav, Jan

    2017-01-01

    Climate change has not only led to a sustained rise in mean global temperature over the past decades, but also increased the frequency of extreme weather events. This paper explores the effect of temperature shocks in utero on later-life taste for cooperation. Using historical climate data combined

  15. Understanding and quantifying foliar temperature acclimation for Earth System Models

    Science.gov (United States)

    Smith, N. G.; Dukes, J.

    2015-12-01

    Photosynthesis and respiration on land are the two largest carbon fluxes between the atmosphere and Earth's surface. The parameterization of these processes represents a major uncertainty in the terrestrial component of the Earth System Models used to project future climate change. Research has shown that much of this uncertainty is due to the parameterization of the temperature responses of leaf photosynthesis and autotrophic respiration, which are typically based on short-term empirical responses. Here, we show that including longer-term responses to temperature, such as temperature acclimation, can help to reduce this uncertainty and improve model performance, leading to drastic changes in future land-atmosphere carbon feedbacks across multiple models. However, these acclimation formulations have many flaws, including an underrepresentation of many important global flora; in addition, they were derived from multiple studies that employed differing methodologies. We therefore used a consistent methodology to quantify the short- and long-term temperature responses of maximum Rubisco carboxylation (Vcmax), the maximum rate of ribulose-1,5-bisphosphate regeneration (Jmax), and dark respiration (Rd) in multiple species representing each of the plant functional types used in global-scale land surface models. Short-term temperature responses of each process were measured in individuals acclimated for 7 days at one of five temperatures (15-35 °C). Comparison of the short-term curves of plants acclimated to different temperatures was used to evaluate the long-term responses. Our analyses indicated that the instantaneous response of each parameter was highly sensitive to the temperature at which the plants were acclimated, and that this sensitivity was larger in species whose leaves typically experience a greater range of temperatures over the course of their lifespan. 
These data indicate that models using previous acclimation formulations are likely incorrectly
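    A common way to encode such foliar temperature acclimation in models is to let the entropy term of a peaked Arrhenius response depend on growth temperature, as in the Kattge & Knorr (2007) formulation; the sketch below uses that functional form, with the parameter values treated as illustrative rather than as results of this study:

```python
import numpy as np

R = 8.314        # universal gas constant, J mol-1 K-1
Ha = 71513.0     # activation energy for Vcmax, J mol-1
Hd = 200000.0    # deactivation energy, J mol-1

def vcmax_scaler(t_leaf_c, t_growth_c):
    """Peaked-Arrhenius scaler for Vcmax relative to 25 C, with the entropy
    term acclimating linearly to growth temperature (Kattge & Knorr-style)."""
    dS = 668.39 - 1.07 * t_growth_c           # J mol-1 K-1, acclimating term
    tk, tref = t_leaf_c + 273.15, 298.15
    arr = np.exp(Ha * (tk - tref) / (tref * R * tk))
    peak = (1.0 + np.exp((tref * dS - Hd) / (tref * R))) / \
           (1.0 + np.exp((tk * dS - Hd) / (tk * R)))
    return arr * peak

# The optimum temperature shifts upward for warm-acclimated leaves:
t = np.linspace(10.0, 45.0, 351)
topt_cool = t[np.argmax(vcmax_scaler(t, 15.0))]   # grown at 15 C
topt_warm = t[np.argmax(vcmax_scaler(t, 35.0))]   # grown at 35 C
```

    With these numbers the optimum climbs by roughly 10 °C between cool- and warm-grown leaves, which is the kind of long-term shift the consistent-methodology measurements above are designed to constrain.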

  16. Modelling small groundwater systems - the role of targeted field investigations and observational data in reducing model uncertainty

    Science.gov (United States)

    Abesser, Corinna; Hughes, Andrew; Boon, David

    2017-04-01

    the fit between predicted and observed heads and reduction in overall model uncertainty. The impact of the availability of observational data on model calibration was tested as part of this study, confirming that equifinality remains an issue despite improved system characterisation and suggesting that uncertainty relating to the distribution of hydraulic conductivity (K) within the dune system must be reduced further. This study illustrates that groundwater modelling is not linear but should be an iterative process, especially in systems where large geological uncertainties exist. It should be carried out in conjunction with field studies, not as a postscript but as an ongoing interaction. This interaction is required throughout the investigation process and is key to heuristic learning and improved system understanding. Given that the role of modelling is to raise questions as well as answer them, this study demonstrates that this applies even to small systems that are thought to be well understood. This research is funded by the UK Natural Environment Research Council (NERC). The work is distributed under the Creative Commons Attribution 3.0 Unported License together with an author copyright. This licence does not conflict with the regulations of the Crown Copyright.

  17. Uncertainty in predictions of forest carbon dynamics: separating driver error from model error.

    Science.gov (United States)

    Spadavecchia, L; Williams, M; Law, B E

    2011-07-01

    We present an analysis of the relative magnitude and contribution of parameter and driver uncertainty to the confidence intervals on estimates of net carbon fluxes. Model parameters may be difficult or impractical to measure, while driver fields are rarely complete, with data gaps due to sensor failure and sparse observational networks. Parameters are generally derived through some optimization method, while driver fields may be interpolated from available data sources. For this study, we used data from a young ponderosa pine stand at Metolius, Central Oregon, and a simple daily model of coupled carbon and water fluxes (DALEC). An ensemble of acceptable parameterizations was generated using an ensemble Kalman filter and eddy covariance measurements of net C exchange. Geostatistical simulations generated an ensemble of meteorological driving variables for the site, consistent with the spatiotemporal autocorrelations inherent in the observational data from 13 local weather stations. Simulated meteorological data were propagated through the model to derive the uncertainty on the CO2 flux resulting from driver uncertainty typical of spatially extensive modeling studies. Furthermore, the model uncertainty was partitioned between temperature and precipitation. With at least one meteorological station within 25 km of the study site, driver uncertainty was relatively small (~10% of the total net flux), while parameterization uncertainty was larger (~50% of the total net flux). The largest source of driver uncertainty was temperature (8% of the total flux). The combined effect of parameter and driver uncertainty was ~57% of the total net flux. However, when the nearest meteorological station was >100 km from the study site, the uncertainty in net ecosystem exchange (NEE) predictions introduced by meteorological drivers increased by 88%. Precipitation estimates were a larger source of bias in NEE estimates than were temperature estimates, although the biases partly
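    The parameter-versus-driver partition described above can be emulated with two nested ensembles and a crude variance decomposition; the linear toy flux model and all numbers below are hypothetical stand-ins for DALEC and the Metolius ensembles:

```python
import numpy as np

rng = np.random.default_rng(2)
n_par, n_drv = 50, 50

# Toy linear flux model flux = a*T + b*P; all values are hypothetical.
a = rng.normal(0.5, 0.1, n_par)        # parameter ensemble (EnKF stand-in)
b = rng.normal(0.2, 0.02, n_par)
T = rng.normal(15.0, 1.0, n_drv)       # driver ensemble (geostatistical stand-in)
P = rng.normal(800.0, 40.0, n_drv)

# One row per parameter draw, one column per driver realization:
flux = a[:, None] * T[None, :] + b[:, None] * P[None, :]

var_total = flux.var()
var_param = flux.mean(axis=1).var()    # spread attributable to parameters
var_driver = flux.mean(axis=0).var()   # spread attributable to drivers
```

    Averaging over one ensemble axis before taking the variance isolates the other source's contribution, mirroring the study's finding that parameterization dominates when a nearby weather station keeps driver uncertainty small.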

  18. Estimating Sampling Biases and Measurement Uncertainties of AIRS-AMSU-A Temperature and Water Vapor Observations Using MERRA Reanalysis

    Science.gov (United States)

    Hearty, Thomas J.; Savtchenko, Andrey K.; Tian, Baijun; Fetzer, Eric; Yung, Yuk L.; Theobald, Michael; Vollmer, Bruce; Fishbein, Evan; Won, Young-In

    2014-01-01

    We use MERRA (Modern Era Retrospective-Analysis for Research Applications) temperature and water vapor data to estimate the sampling biases of climatologies derived from the AIRS/AMSU-A (Atmospheric Infrared Sounder/Advanced Microwave Sounding Unit-A) suite of instruments. We separate the total sampling bias into temporal and instrumental components. The temporal component is caused by the AIRS/AMSU-A orbit and swath that are not able to sample all of time and space. The instrumental component is caused by scenes that prevent successful retrievals. The temporal sampling biases are generally smaller than the instrumental sampling biases except in regions with large diurnal variations, such as the boundary layer, where the temporal sampling biases of temperature can be +/- 2 K and water vapor can be 10% wet. The instrumental sampling biases are the main contributor to the total sampling biases and are mainly caused by clouds. They are up to 2 K cold and greater than 30% dry over mid-latitude storm tracks and tropical deep convective cloudy regions and up to 20% wet over stratus regions. However, other factors such as surface emissivity and temperature can also influence the instrumental sampling bias over deserts where the biases can be up to 1 K cold and 10% wet. Some instrumental sampling biases can vary seasonally and/or diurnally. We also estimate the combined measurement uncertainties of temperature and water vapor from AIRS/AMSU-A and MERRA by comparing similarly sampled climatologies from both data sets. The measurement differences are often larger than the sampling biases and have longitudinal variations.

  19. CREOLE experiment study on the reactivity temperature coefficient with sensitivity and uncertainty analysis using the MCNP5 code and different neutron cross section evaluations

    International Nuclear Information System (INIS)

    Boulaich, Y.; El Bardouni, T.; Erradi, L.; Chakir, E.; Boukhal, H.; Nacir, B.; El Younoussi, C.; El Bakkari, B.; Merroun, O.; Zoubair, M.

    2011-01-01

    Highlights: → We analyze the CREOLE experiment on the reactivity temperature coefficient (RTC) using the three-dimensional continuous-energy code MCNP5 and the latest nuclear data evaluations. → Calculation-experiment discrepancies of the RTC were analyzed, and the results show that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values. → To identify the source of the relatively large discrepancy in the case of the ENDF/B-VII evaluation, the k_eff discrepancy between ENDF/B-VII and JENDL3.3 was decomposed using a sensitivity and uncertainty analysis technique. - Abstract: In the present work, we analyze the CREOLE experiment on the reactivity temperature coefficient (RTC) using the three-dimensional continuous-energy code MCNP5 and the latest nuclear data evaluations. This experiment, performed in the EOLE critical facility located at CEA/Cadarache, was mainly dedicated to RTC studies for both UO2 and UO2-PuO2 PWR-type lattices covering the whole temperature range from 20 °C to 300 °C. We have developed an accurate 3D model of the EOLE reactor using the MCNP5 Monte Carlo code, which guarantees a high level of fidelity in the description of the different configurations at various temperatures, taking into account their consequences for the neutron cross-section data and all thermal expansion effects. In this case, the remaining discrepancy between calculation and experiment can be attributed mainly to uncertainties in the nuclear data. Our own cross-section library was constructed using the NJOY99.259 code with point-wise nuclear data based on the ENDF/B-VII, JEFF3.1 and JENDL3.3 evaluation files. The MCNP model was validated through axial and radial fission rate measurements at room and hot temperatures. Calculation-experiment discrepancies of the RTC were analyzed and the results have shown that the JENDL3.3 and JEFF3.1 evaluations give the most consistent values; the discrepancy is
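    To first order, such a decomposition sums sensitivity coefficients times the relative cross-section differences between libraries, Δk/k ≈ Σᵢ Sᵢ·(Δσᵢ/σᵢ). A minimal sketch with hypothetical coefficients (not values from this study):

```python
# Hypothetical sensitivity coefficients (dk/k per dsigma/sigma) and relative
# cross-section differences between two libraries for a few key reactions:
sens = {"U235_fission": 0.35, "U238_capture": -0.12, "H1_scatter": 0.08}
rel_diff = {"U235_fission": 0.004, "U238_capture": -0.010, "H1_scatter": 0.002}

delta_k = sum(sens[r] * rel_diff[r] for r in sens)   # first-order Delta-k/k
delta_k_pcm = delta_k * 1.0e5                        # expressed in pcm
```

    Tabulating each reaction's term in this sum is what lets a study pinpoint which evaluation differences drive an observed k_eff discrepancy.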

  20. Estimation of uncertainty in TLD calibration

    International Nuclear Information System (INIS)

    Hasabelrasoul, H. A.

    2013-07-01

    In this study, thermoluminescence dosimeters (TLDs) used as individual monitoring devices were examined to support quality assurance and quality control in individual monitoring. The uncertainty in the reader calibration factor (RCF) was measured for two readers, together with the uncertainty in the radiation dose after irradiation at an SSDL laboratory. Fifty samples were selected for the study and placed in an oven at 400 °C for one hour to reset them to zero (background); the background count was then read with reader 1 and reader 2. The chips were irradiated at the SSDL with cesium-137 to a dose of 5 mGy and annealed again in the oven at 100 °C for 10 minutes, with 10 chips used for calibration and read out with reader 1 and reader 2. The RCF was found to be 1.47 and 1.11 for the two readers, respectively, with associated uncertainties of 0.430629 and 0.431973. The radiation dose was then measured for the fifty samples irradiated to a dose of 5 mGy and read with reader 1 and reader 2; the uncertainty was found to be 0.490446 and 0.587602 for the two readers, respectively. (Author)
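    The RCF arithmetic works out as the mean TL reading divided by the delivered dose, with a type-A uncertainty from the spread over the calibration chips. A sketch using made-up readings chosen so the factor lands near the 1.47 reported for reader 1:

```python
import numpy as np

# Made-up TL readings (nC) for ten calibration chips irradiated to 5 mGy:
readings = np.array([7.1, 7.4, 7.3, 7.6, 7.2, 7.5, 7.3, 7.4, 7.2, 7.5])
delivered_dose = 5.0                         # mGy, reference irradiation

rcf = readings.mean() / delivered_dose       # reader calibration factor, nC/mGy
u_rcf = readings.std(ddof=1) / np.sqrt(readings.size) / delivered_dose  # type-A unc.

# Dose evaluation for a field chip read on the same reader:
dose = 7.35 / rcf                            # mGy
```

    Dividing a field reading by the RCF recovers the dose, so any relative uncertainty in the RCF propagates directly into the reported dose.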

  1. Effects of utility demand-side management programs on uncertainty

    International Nuclear Information System (INIS)

    Hirst, E.

    1994-01-01

    Electric utilities face a variety of uncertainties that complicate their long-term resource planning. These uncertainties include future economic and load growth, fuel prices, environmental and economic regulations, performance of existing power plants, cost and availability of purchased power, and the costs and performance of new demand and supply resources. As utilities increasingly turn to demand-side management (DSM) programs to provide resources, it becomes more important to analyze the interactions between these programs and the uncertainties facing utilities. This paper uses a dynamic planning model to quantify the uncertainty effects of supply-only vs DSM + supply resource portfolios. The analysis considers four sets of uncertainties: economic growth, fuel prices, the costs to build new power plants, and the costs to operate DSM programs. The two types of portfolios are tested against these four sets of uncertainties for the period 1990 to 2010, using sensitivity, scenario, and worst-case analysis methods. The sensitivity analyses show that the DSM + supply resource portfolio is less sensitive to unanticipated changes in economic growth, fuel prices, and power-plant construction costs than is the supply-only portfolio; the supply-only resource mix is better only with respect to uncertainties about the costs of DSM programs. The base-case analysis shows that including DSM programs in the utility's resource portfolio reduces the net present value of revenue requirements (NPV-RR) by $490 million. The scenario-analysis results show an additional $30 million (6%) in benefits associated with reduction in these uncertainties. In the worst-case analysis, the DSM + supply portfolio again reduces the cost penalty associated with guessing wrong in both cases: when the utility plans for high needs and learns it has low needs, and vice versa. 20 refs
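    The worst-case comparison amounts to a minimax rule over the scenario cost matrix: take each portfolio's worst NPV of revenue requirements across scenarios and choose the portfolio whose worst case is smallest. A sketch with hypothetical costs (not the study's figures):

```python
# Hypothetical NPVs of revenue requirements (billion $) for two portfolios
# under two demand scenarios; all numbers are illustrative:
costs = {
    "supply_only": {"high_growth": 9.8, "low_growth": 9.3},
    "dsm_plus_supply": {"high_growth": 9.4, "low_growth": 9.1},
}

worst = {p: max(s.values()) for p, s in costs.items()}   # worst case per portfolio
best_portfolio = min(worst, key=worst.get)               # minimax choice
```

    A portfolio that is merely adequate in every scenario can beat one that is optimal in only one, which is the sense in which the DSM + supply mix hedges against guessing wrong about future needs.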

  2. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    uncertainties in CL and EX estimates were found to be efficiently mitigated by reducing data uncertainty in the critical limit of the chemical criteria - the BC/Al ratio. The distributed CL and EX assessment on local level in Sweden was found to be efficiently improved by enhancing the resolution of the underlying vegetation map 68 refs, 15 figs, 3 tabs

  3. Thickened water-based hydraulic fluid with reduced dependence of viscosity on temperature

    Energy Technology Data Exchange (ETDEWEB)

    Deck, C. F.

    1985-01-01

    Improved hydraulic fluids or metalworking lubricants, utilizing mixtures of water, metal lubricants, metal corrosion inhibitors, and an associative polyether thickener, have a reduced dependence of viscosity on temperature, achieved by incorporating an ethoxylated polyether surfactant.

  4. Regional amplification of projected changes in extreme temperatures strongly controlled by soil moisture-temperature feedbacks

    Science.gov (United States)

    Vogel, Martha Marie; Orth, René; Cheruy, Frederique; Hagemann, Stefan; Lorenz, Ruth; van den Hurk, Bart; Seneviratne, Sonia Isabelle

    2017-04-01

    Regional hot extremes are projected to increase more strongly than global mean temperature, with changes substantially larger than 2 °C even if global warming is limited to this level. We investigate here the role of soil moisture-temperature feedbacks for this response, based on multi-model experiments for the 21st century with either interactive or fixed (late 20th-century mean seasonal cycle) soil moisture. We analyze changes in the hottest days of each year in both sets of experiments, relate them to the global mean temperature increase, and investigate the physical processes leading to these changes. We find that soil moisture-temperature feedbacks significantly contribute to the amplified warming of the hottest days relative to global mean temperature. This contribution reaches more than 70% in Central Europe and Central North America, and between 42% and 52% in Amazonia, Northern Australia and Southern Africa. Soil moisture trends (multi-decadal soil moisture variability) are more important for this response than short-term (e.g. seasonal, interannual) soil moisture variability. These results are relevant for reducing uncertainties in regional temperature projections. Vogel, M.M. et al., 2017. Regional amplification of projected changes in extreme temperatures strongly controlled by soil moisture-temperature feedbacks. Geophysical Research Letters, accepted.

  5. Inflation and Inflation Uncertainty Revisited: Evidence from Egypt

    Directory of Open Access Journals (Sweden)

    Mesbah Fathy Sharaf

    2015-07-01

    Full Text Available The welfare costs of inflation and inflation uncertainty are well documented in the literature, yet empirical evidence on the link between the two is sparse in the case of Egypt. This paper investigates the causal relationship between inflation and inflation uncertainty in Egypt using monthly time series data for the period January 1974–April 2015. To endogenously control for potential structural breaks in the inflation time series, the Zivot and Andrews (2002) and Clemente–Montanes–Reyes (1998) unit root tests are used. The inflation–inflation uncertainty relation is modeled by the standard two-step approach as well as simultaneously using various versions of the GARCH-M model to control for potential feedback effects. The analyses explicitly control for the effect of the Economic Reform and Structural Adjustment Program (ERSAP) undertaken by the Egyptian government in the early 1990s, which affected the inflation rate and its associated volatility. Results show a high degree of inflation-volatility persistence in the response to inflationary shocks. Granger-causality tests along with symmetric and asymmetric GARCH-M models indicate a statistically significant bi-directional positive relationship between inflation and inflation uncertainty, supporting both the Friedman–Ball and the Cukierman–Meltzer hypotheses. The findings are robust to the various estimation methods and model specifications. They support the adoption of an inflation-targeting policy in Egypt, after fulfilling its preconditions, to reduce the welfare cost of inflation and its related uncertainties. Monetary authorities in Egypt should enhance the credibility of monetary policy and attempt to reduce inflation uncertainty, which will help lower inflation rates.
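
    Step one of the two-step approach can be sketched as follows: a GARCH(1,1) recursion generates the conditional variance series h_t that serves as the inflation-uncertainty proxy in the subsequent Granger-causality tests. The parameter values are illustrative only, not the paper's Egypt estimates:

```python
import random

# Sketch of a GARCH(1,1) process: h_t = omega + alpha*e_{t-1}^2 + beta*h_{t-1}.
# The conditional variance h_t is the inflation-uncertainty proxy; the high
# persistence (alpha + beta close to 1) mirrors the volatility persistence
# reported in the abstract. Parameters are illustrative assumptions.
OMEGA, ALPHA, BETA = 0.2, 0.15, 0.80   # persistence ALPHA + BETA = 0.95

def simulate_garch(n, omega, alpha, beta, seed=1):
    """Return (shocks e_t, conditional variances h_t) of a GARCH(1,1)."""
    rng = random.Random(seed)
    h = [omega / (1.0 - alpha - beta)]        # start at unconditional variance
    e = [rng.gauss(0.0, 1.0) * h[0] ** 0.5]
    for _ in range(n - 1):
        h.append(omega + alpha * e[-1] ** 2 + beta * h[-1])
        e.append(rng.gauss(0.0, 1.0) * h[-1] ** 0.5)
    return e, h

e, h = simulate_garch(500, OMEGA, ALPHA, BETA)
print(len(e), min(h) > 0.0)   # variances stay strictly positive
```

    In the second step, h_t (or its square root) is paired with the inflation series itself and the two are tested for Granger causality in both directions.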

  6. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  7. Uncertainty relations and reduced density matrices: Mapping many-body quantum mechanics onto four particles

    Science.gov (United States)

    Mazziotti, David A.; Erdahl, Robert M.

    2001-04-01

    For the description of ground-state correlation phenomena an accurate mapping of many-body quantum mechanics onto four particles is developed. The energy for a quantum system with no more than two-particle interactions may be expressed in terms of a two-particle reduced density matrix (2-RDM), but variational optimization of the 2-RDM requires that it corresponds to an N-particle wave function. We derive N-representability conditions on the 2-RDM that guarantee the validity of the uncertainty relations for all operators with two-particle interactions. One of these conditions is shown to be necessary and sufficient to make the RDM solutions of the dispersion condition equivalent to those from the contracted Schrödinger equation (CSE) [Mazziotti, Phys. Rev. A 57, 4219 (1998)]. In general, the CSE is a stronger N-representability condition than the dispersion condition because the CSE implies the dispersion condition as well as additional N-representability constraints from the Hellmann-Feynman theorem. Energy minimization subject to the representability constraints is performed for a boson model with 10, 30, and 75 particles. Even when traditional wave-function methods fail at large perturbations, the present method yields correlation energies within 2%.

  8. Development of Evaluation Code for MUF Uncertainty

    International Nuclear Information System (INIS)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan

    2015-01-01

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is operating normally. Furthermore, many measurements are made with different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system in pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. MUF uncertainty can be calculated simply and quickly with this code, which provides a user-friendly graphical interface. The code is also expected to make sensitivity analysis of MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development
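
    The balance and its propagated uncertainty can be sketched in a few lines. All quantities below are hypothetical, and independent measurement errors are assumed so that they add in quadrature:

```python
from math import sqrt

# Sketch of a material balance and its propagated 1-sigma uncertainty
# (hypothetical figures, kg of nuclear material):
# MUF = beginning inventory + receipts - shipments - ending inventory
terms = {
    "beginning_inventory": ( 120.0, 0.4),   # (signed value, 1-sigma error)
    "receipts":            (  80.0, 0.3),
    "shipments":           ( -75.0, 0.3),
    "ending_inventory":    (-124.6, 0.4),
}

muf = sum(v for v, _ in terms.values())
# Independent measurement errors at the KMPs add in quadrature:
sigma_muf = sqrt(sum(s * s for _, s in terms.values()))

print(f"MUF = {muf:.1f} kg, sigma_MUF = {sigma_muf:.2f} kg")
```

    A |MUF| well within about two sigma_MUF is consistent with measurement error alone, which is the kind of judgment such an evaluation code automates across many KMPs.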

  9. Development of Evaluation Code for MUF Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Won, Byung Hee; Han, Bo Young; Shin, Hee Sung; Ahn, Seong-Kyu; Park, Geun-Il; Park, Se Hwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    Material Unaccounted For (MUF) is the material balance evaluated from measured nuclear material in a Material Balance Area (MBA). Assuming perfect measurements and no diversion from a facility, one can expect a zero MUF. However, a non-zero MUF always occurs because of measurement uncertainty, even when the facility is operating normally. Furthermore, many measurements are made with different equipment at various Key Measurement Points (KMPs), and the MUF uncertainty is affected by the errors of all of those measurements. Evaluating MUF uncertainty is essential for developing a safeguards system, including the nuclear measurement system in pyroprocessing, which is being developed at the Korea Atomic Energy Research Institute (KAERI) to reduce radioactive waste from spent fuel. An evaluation code for analyzing MUF uncertainty has been developed and verified using a sample problem from the IAEA reference. MUF uncertainty can be calculated simply and quickly with this code, which provides a user-friendly graphical interface. The code is also expected to make sensitivity analysis of MUF uncertainty for various safeguards systems easier and more systematic. It is suitable for users who want to evaluate a conventional safeguards system as well as to develop a new system for facilities under development.

  10. Temperature trends with reduced impact of ocean air temperature

    DEFF Research Database (Denmark)

    Lansner, Frank; Pedersen, Jens Olaf Pepke

    Temperature data 1900-2010 from meteorological stations across the world have been analysed and it has been found that all areas generally have two different valid temperature trends. Coastal stations and hill stations facing dominant ocean winds are normally more warm-trended than the valley sta...

  11. Temperature trends with reduced impact of ocean air temperature

    DEFF Research Database (Denmark)

    Lansner, Frank; Pedersen, Jens Olaf Pepke

    2018-01-01

    Temperature data 1900–2010 from meteorological stations across the world have been analyzed and it has been found that all land areas generally have two different valid temperature trends. Coastal stations and hill stations facing ocean winds are normally more warm-trended than the valley station...

  12. Predicting Comfort Temperature in Indonesia, an Initial Step to Reduce Cooling Energy Consumption

    Directory of Open Access Journals (Sweden)

    Tri Harso Karyono

    2015-07-01

    Full Text Available Indonesia has no reliable thermal comfort standard based on research work. The current national standard (SNI 6390:2011) states only a single comfort temperature of 25.5 °C Ta, with a range of ±1.5 °C Ta. Previous thermal studies in a number of different buildings in Indonesia showed that the neutral (comfort) temperatures of subjects were about 27 to 28 °C, higher than the values stated in the standard. As a large country with varied ambient temperatures, Indonesia needs a better and more reliable thermal comfort predictor that can be applied properly across the country. This study is an attempt to propose an initial Indonesian thermal predictor, in the form of a simple equation, which could predict comfort temperatures properly across the country. Reanalysing the previous comfort studies in Indonesia, a simple regression equation is constructed to be used as the initial Indonesian comfort predictor. Using this predictor, the comfort temperatures in lowland or coastal cities such as Jakarta are found to be higher than the current comfort standard. It is expected that this predictor will help provide a better indoor thermal environment and at the same time reduce the cooling energy of air-conditioned (AC) buildings, thus reducing a building's carbon emissions.
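
    Constructing such a predictor amounts to an ordinary least-squares fit of observed neutral temperature against mean outdoor temperature. The data points below are hypothetical stand-ins for the reanalysed studies, not the paper's data:

```python
# Minimal OLS sketch of a comfort-temperature predictor: pairs of
# (mean outdoor temperature To, observed neutral temperature Tc), both in
# deg C. All values are invented for illustration.
data = [(24.0, 26.2), (26.0, 26.9), (28.0, 27.6), (30.0, 28.3), (32.0, 29.1)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
slope = (sum((x - mx) * (y - my) for x, y in data)
         / sum((x - mx) ** 2 for x, _ in data))
intercept = my - slope * mx

def comfort_temp(outdoor_c):
    """Predicted neutral temperature (deg C) for a given outdoor mean."""
    return intercept + slope * outdoor_c

print(f"Tc = {intercept:.2f} + {slope:.3f} * To")
print(round(comfort_temp(28.0), 1))  # 27.6, inside the 27-28 deg C band
```

    With a fitted equation of this form, a single outdoor-climate input yields a locally appropriate comfort setpoint anywhere in the country.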

  13. Exploring uncertainty of Amazon dieback in a perturbed parameter Earth system ensemble.

    Science.gov (United States)

    Boulton, Chris A; Booth, Ben B B; Good, Peter

    2017-12-01

    The future of the Amazon rainforest is unknown due to uncertainties in projected climate change and in the response of the forest to this change (forest resiliency). Here, we explore the effect of some uncertainties in climate and land surface processes on the future of the forest, using a perturbed physics ensemble of HadCM3C. This is the first time Amazon forest changes have been presented using an ensemble exploring both land vegetation processes and physical climate feedbacks in a fully coupled modelling framework. Under three different emissions scenarios, we measure the change in forest coverage by the end of the 21st century (the transient response) and make a novel adaptation of a previously used method known as "dry-season resilience" to predict the long-term committed response of the forest, should the state of the climate remain constant past 2100. Our analysis of this ensemble suggests that there is a high chance of greater forest loss on longer timescales than is realized by 2100, especially for mid-range and low emissions scenarios. In both the transient and predicted committed responses, there is increasing uncertainty in the outcome of the forest as the strength of the emissions scenario increases. It is important to note, however, that very few of the simulations produce future forest loss of the magnitude previously shown under the standard model configuration. We find that low optimum temperatures for photosynthesis and a high minimum leaf area index needed for the forest to compete for space appear to be precursors for dieback. We then decompose the uncertainty into that associated with future climate change and that associated with forest resiliency, finding that it is important to reduce the uncertainty in both if we are to better determine the Amazon's outcome. © 2017 John Wiley & Sons Ltd.

  14. Energy price uncertainty, energy intensity and firm investment

    International Nuclear Information System (INIS)

    Yoon, Kyung Hwan; Ratti, Ronald A.

    2011-01-01

    This paper examines the effect of energy price uncertainty on firm-level investment. An error correction model of capital stock adjustment is estimated with data on U.S. manufacturing firms. Higher energy price uncertainty is found to make firms more cautious by reducing the responsiveness of investment to sales growth. The result is robust to consideration of energy intensity by industry. The effect is greater for high growth firms. It must be emphasized that the direct effect of uncertainty is not estimated. Conditional variance of energy price is obtained from a GARCH model. Findings suggest that stability in energy prices would be conducive to greater stability in firm-level investment. (author)

  15. Characterization of XR-RV3 GafChromic® films in standard laboratory and in clinical conditions and means to evaluate uncertainties and reduce errors

    Energy Technology Data Exchange (ETDEWEB)

    Farah, J., E-mail: jad.farah@irsn.fr; Clairand, I.; Huet, C. [External Dosimetry Department, Institut de Radioprotection et de Sûreté Nucléaire (IRSN), BP-17, 92260 Fontenay-aux-Roses (France); Trianni, A. [Medical Physics Department, Udine University Hospital S. Maria della Misericordia (AOUD), p.le S. Maria della Misericordia, 15, 33100 Udine (Italy); Ciraj-Bjelac, O. [Vinca Institute of Nuclear Sciences (VINCA), P.O. Box 522, 11001 Belgrade (Serbia); De Angelis, C. [Department of Technology and Health, Istituto Superiore di Sanità (ISS), Viale Regina Elena 299, 00161 Rome (Italy); Delle Canne, S. [Fatebenefratelli San Giovanni Calibita Hospital (FBF), UOC Medical Physics - Isola Tiberina, 00186 Rome (Italy); Hadid, L.; Waryn, M. J. [Radiology Department, Hôpital Jean Verdier (HJV), Avenue du 14 Juillet, 93140 Bondy Cedex (France); Jarvinen, H.; Siiskonen, T. [Radiation and Nuclear Safety Authority (STUK), P.O. Box 14, 00881 Helsinki (Finland); Negri, A. [Veneto Institute of Oncology (IOV), Via Gattamelata 64, 35124 Padova (Italy); Novák, L. [National Radiation Protection Institute (NRPI), Bartoškova 28, 140 00 Prague 4 (Czech Republic); Pinto, M. [Istituto Nazionale di Metrologia delle Radiazioni Ionizzanti (ENEA-INMRI), C.R. Casaccia, Via Anguillarese 301, I-00123 Santa Maria di Galeria (RM) (Italy); Knežević, Ž. [Ruđer Bošković Institute (RBI), Bijenička c. 54, 10000 Zagreb (Croatia)

    2015-07-15

    Purpose: To investigate the optimal use of XR-RV3 GafChromic® films to assess patient skin dose in interventional radiology, while addressing the means to reduce uncertainties in dose assessment. Methods: XR-Type R GafChromic films have been shown to represent the most efficient and suitable solution for determining patient skin dose in interventional procedures. As film dosimetry can be associated with high uncertainty, this paper presents the EURADOS WG 12 initiative to carry out a comprehensive study of film characteristics with a multisite approach. The considered sources of uncertainty include scanner-, film-, and fitting-related errors. The work focused on studying film behavior with clinical high-dose-rate pulsed beams (previously unavailable in the literature) together with reference standard laboratory beams. Results: First, the performance analysis of six different scanner models showed that scan uniformity perpendicular to the lamp motion axis and long-term stability are the main sources of scanner-related uncertainty. These could induce errors of up to 7% in the film readings unless regularly checked and corrected. Typically, scan-uniformity correction matrices should be applied and readings normalized to the scanner-specific daily background reading. In addition, the analysis of multiple film batches showed that XR-RV3 films generally have good uniformity within one batch (<1.5%), require 24 h to stabilize after irradiation, and respond roughly independently of dose rate (<5%). However, XR-RV3 films showed large variations (up to 15%) with radiation quality, both in standard laboratory and in clinical conditions. As such, and prior to conducting patient skin dose measurements, it is mandatory to choose the appropriate calibration beam quality depending on the characteristics of the x-ray systems that will be used clinically. In addition, yellow-side film irradiations should be preferred since they showed a lower

  16. Low threading dislocation density aluminum nitride on silicon carbide through the use of reduced temperature interlayers

    KAUST Repository

    Foronda, Humberto M.

    2017-11-23

    In this work, reduced threading dislocation (TD) density AlN on (0 0 0 1) 6H-SiC was realized through the use of reduced-temperature AlN interlayers during metalorganic chemical vapor deposition growth. We explored the dependence of the AlN crystal quality, defect density, and surface morphology on the interlayer growth temperature. The crystal quality was characterized using omega rocking curve scans, and the threading dislocation density (TDD) was determined by plan-view transmission electron microscopy. The growth resulted in a TDD of 7 × 10⁸ cm⁻², a significant reduction in the defect density of AlN compared to direct growth of AlN on SiC (∼10¹⁰ cm⁻²). Atomic force microscopy images demonstrated a clear step-terrace morphology consistent with step-flow growth at high temperature. Reducing the interlayer growth temperature increased the TD inclination and thus enhanced TD-TD interactions; the TDD was decreased via fusion and annihilation reactions.

  17. Rising Temperatures Reduce Global Wheat Production

    Science.gov (United States)

    Asseng, S.; Ewert, F.; Martre, P.; Rötter, R. P.; Lobell, D. B.; Cammarano, D.; Kimball, B. A.; Ottman, M. J.; Wall, G. W.; White, J. W.; hide

    2015-01-01

    Crop models are essential tools for assessing the threat of climate change to local and global food production. Present models used to predict wheat grain yield are highly uncertain when simulating how crops respond to temperature. Here we systematically tested 30 different wheat crop models of the Agricultural Model Intercomparison and Improvement Project against field experiments in which growing season mean temperatures ranged from 15 °C to 32 °C, including experiments with artificial heating. Many models simulated yields well, but were less accurate at higher temperatures. The model ensemble median was consistently more accurate in simulating the crop temperature response than any single model, regardless of the input information used. Extrapolating the model ensemble temperature response indicates that warming is already slowing yield gains at a majority of wheat-growing locations. Global wheat production is estimated to fall by 6% for each °C of further temperature increase and to become more variable over space and time.
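
    The ensemble-median estimator is simple: at each site or treatment, take the median across the per-model predictions. The yield numbers and model names below are invented for illustration:

```python
from statistics import median

# Sketch of the ensemble-median estimator: hypothetical wheat-yield
# predictions (t/ha) from three models at three temperature treatments.
predictions = {
    "model_a": [6.1, 4.8, 3.2],
    "model_b": [5.7, 4.1, 2.1],
    "model_c": [6.5, 5.0, 3.9],
}

n_sites = len(next(iter(predictions.values())))
ensemble = [median(m[i] for m in predictions.values()) for i in range(n_sites)]
print(ensemble)  # [6.1, 4.8, 3.2]
```

    The median discards each treatment's most extreme model responses, which is why the ensemble median can track the observed temperature response better than any single member.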

  18. Policy Uncertainty and the US Ethanol Industry

    Directory of Open Access Journals (Sweden)

    Jason P. H. Jones

    2017-11-01

    Full Text Available The Renewable Fuel Standard (RFS2), as implemented, has introduced uncertainty for US ethanol producers and the supporting commodity markets. First, the fixed mandate for what is mainly cornstarch-based ethanol has increased feedstock price volatility and exerts a general effect across the agricultural sector. Second, the large discrepancy between the original Energy Independence and Security Act (EISA) intentions and the actual RFS2 implementation for some fuel classes has increased the investment uncertainty facing investors in biofuel production, distribution, and consumption. Here we discuss and analyze the sources of uncertainty and evaluate the effect of potential RFS2 adjustments as they influence these uncertainties, including the use of a flexible, production-dependent mandate on cornstarch ethanol. We find that a flexible mandate on cornstarch ethanol, relaxed during drought, could significantly reduce commodity price spikes and alleviate the decline of livestock production in cases of feedstock production shortfalls, but it would increase the risk for ethanol investors.

  19. Reducing uncertainty in load forecasts and using real options for improving capacity dispatch management through the utilization of weather and hydrologic forecasts

    International Nuclear Information System (INIS)

    Davis, T.

    2004-01-01

    The effect of weather on electricity markets was discussed with particular focus on reducing weather uncertainty by improving short term weather forecasts. The implications of weather for hydroelectric power dispatch and use were also discussed. Although some errors in weather forecasting can result in economic benefits, most errors are associated with more costs than benefits. This presentation described how a real options analysis can make weather a favorable option. Four case studies were presented for exploratory data analysis of regional weather phenomena. These included: (1) the 2001 California electricity crisis, (2) the delta breeze effects on the California ISO, (3) the summer 2002 weather forecast error for ISO New England, and (4) the hydro plant asset valuation using weather uncertainty. It was concluded that there is a need for more economic methodological studies on the effect of weather on energy markets and costs. It was suggested that the real options theory should be applied to weather planning and utility applications. tabs., figs

  20. Simulation and Experimental Study on Effect of Phase Change Material Thickness to Reduce Temperature of Photovoltaic Panel

    Science.gov (United States)

    Indartono, Y. S.; Prakoso, S. D.; Suwono, A.; Zaini, I. N.; Fernaldi, B.

    2015-09-01

    Solar energy is a promising renewable energy source that can be applied in Indonesia, where average solar radiation is 4.8 kWh/day/m². A weakness of silicon-based photovoltaic (PV) panels is the efficiency reduction caused by temperature increases, and many attempts have been made to reduce PV temperature. A previous study showed that palm oil, which is widely available in Indonesia, is suitable as a phase change material (PCM) for reducing PV temperature. In this study, the thickness of an aluminium rectangular tube containing the phase change material is varied. The tube is placed at the back of the PV panel. Numerical and experimental studies were performed to evaluate the effect of tube thickness on the temperature reduction of the PV panel. The tube thicknesses used in the experiment were 50.8 mm, 76.2 mm, and 101.6 mm. Both studies show that increasing the PCM thickness reduces PV temperature, with greater thickness giving a larger reduction. The simulation results show there is an optimum thickness of the PCM applied to the PV panel.

  1. Simulation and Experimental Study on Effect of Phase Change Material Thickness to Reduce Temperature of Photovoltaic Panel

    International Nuclear Information System (INIS)

    Indartono, Y S; Prakoso, S D; Suwono, A; Zaini, I N; Fernaldi, B

    2015-01-01

    Solar energy is a promising renewable energy source that can be applied in Indonesia, where average solar radiation is 4.8 kWh/day/m². A weakness of silicon-based photovoltaic (PV) panels is the efficiency reduction caused by temperature increases, and many attempts have been made to reduce PV temperature. A previous study showed that palm oil, which is widely available in Indonesia, is suitable as a phase change material (PCM) for reducing PV temperature. In this study, the thickness of an aluminium rectangular tube containing the phase change material is varied. The tube is placed at the back of the PV panel. Numerical and experimental studies were performed to evaluate the effect of tube thickness on the temperature reduction of the PV panel. The tube thicknesses used in the experiment were 50.8 mm, 76.2 mm, and 101.6 mm. Both studies show that increasing the PCM thickness reduces PV temperature, with greater thickness giving a larger reduction. The simulation results show there is an optimum thickness of the PCM applied to the PV panel. (paper)

  2. Uncertainty and Sensitivity Analyses of a Pebble Bed HTGR Loss of Cooling Event

    Directory of Open Access Journals (Sweden)

    Gerhard Strydom

    2013-01-01

    Full Text Available The Very High Temperature Reactor Methods Development group at the Idaho National Laboratory identified the need for a defensible and systematic uncertainty and sensitivity approach in 2009. This paper summarizes the results of an uncertainty and sensitivity quantification investigation performed with the SUSA code, utilizing the International Atomic Energy Agency CRP 5 Pebble Bed Modular Reactor benchmark and the INL code suite PEBBED-THERMIX. Eight model input parameters were selected for inclusion in this study, and after the input parameters variations and probability density functions were specified, a total of 800 steady state and depressurized loss of forced cooling (DLOFC transient PEBBED-THERMIX calculations were performed. The six data sets were statistically analyzed to determine the 5% and 95% DLOFC peak fuel temperature tolerance intervals with 95% confidence levels. It was found that the uncertainties in the decay heat and graphite thermal conductivities were the most significant contributors to the propagated DLOFC peak fuel temperature uncertainty. No significant differences were observed between the results of Simple Random Sampling (SRS or Latin Hypercube Sampling (LHS data sets, and use of uniform or normal input parameter distributions also did not lead to any significant differences between these data sets.
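
    Tolerance intervals of this kind rest on Wilks-type order-statistics arguments. As a minimal sketch of the underlying sample-size formula (first-order, one-sided; the study's 800 runs cover a two-sided, higher-order setup):

```python
# Wilks' formula sketch: with n random samples of an output, the sample
# maximum bounds at least a fraction gamma of the output population with
# confidence 1 - gamma**n. We search for the smallest n meeting a target.
def wilks_n(gamma=0.95, confidence=0.95):
    """Smallest n with 1 - gamma**n >= confidence (first-order, one-sided)."""
    n = 1
    while 1.0 - gamma ** n < confidence:
        n += 1
    return n

print(wilks_n())  # 59 samples for a one-sided 95%/95% tolerance bound
```

    This is why nonparametric uncertainty studies of peak fuel temperature need only tens of code runs per data set, independent of how many input parameters are varied.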

  3. Stand-alone core sensitivity and uncertainty analysis of ALFRED from Monte Carlo simulations

    International Nuclear Information System (INIS)

    Pérez-Valseca, A.-D.; Espinosa-Paredes, G.; François, J.L.; Vázquez Rodríguez, A.; Martín-del-Campo, C.

    2017-01-01

    Highlights: • Methodology based on Monte Carlo simulation. • Sensitivity analysis of a Lead Fast Reactor (LFR). • Uncertainty and regression analysis of the LFR. • For a 10% change in the core inlet flow, the thermal-power response is 0.58%. • For a 2.5% change in the inlet lead temperature, the power response is 1.87%. - Abstract: The aim of this paper is the sensitivity and uncertainty analysis of a Lead-Cooled Fast Reactor (LFR) based on Monte Carlo simulations with sample sizes up to 2000. The methodology developed in this work considers the uncertainty of sensitivities and the uncertainty of output variables due to single-input-variable variations. The Advanced Lead Fast Reactor European Demonstrator (ALFRED) is analyzed to determine the behavior of the essential parameters under changes in the mass flow and temperature of the liquid lead. The ALFRED core mathematical model developed in this work is fully transient and accounts for heat transfer in an annular fuel pellet design, the thermo-fluid dynamics in the core, and the neutronic processes, which are modeled with point kinetics including fuel-temperature and expansion feedback effects. The sensitivity, evaluated in terms of the relative standard deviation (RSD), showed that for a 10% change in the core inlet flow the response in thermal power is 0.58%, and for a 2.5% change in the inlet lead temperature it is 1.87%. The regression analysis with mass flow rate as the predictor variable showed statistically valid cubic correlations for the neutron flux, and a linear relationship between the neutron flux and the lead temperature. No statistically valid correlation was observed for the reactivity as a function of either the mass flow rate or the lead temperature. These correlations are useful for the study, analysis, and design of any LFR.
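
    The sensitivity measure itself is straightforward to compute from Monte Carlo output. The sample values below are invented, not ALFRED results:

```python
from statistics import mean, stdev

# Sketch of the relative standard deviation (RSD) sensitivity measure over
# Monte Carlo perturbations of one input variable (values are hypothetical).
def rsd(samples):
    """RSD (%) = 100 * sample standard deviation / |sample mean|."""
    return 100.0 * stdev(samples) / abs(mean(samples))

thermal_power_mw = [299.1, 300.4, 301.0, 298.7, 300.8]  # hypothetical runs
print(round(rsd(thermal_power_mw), 2))  # 0.35
```

    Comparing the RSD of an output for fixed-percentage perturbations of different inputs (inlet flow vs. inlet lead temperature) ranks which input dominates the output uncertainty.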

  4. Interactive uncertainty reduction strategies and verbal affection in computer-mediated communication

    NARCIS (Netherlands)

    Antheunis, M.L.; Schouten, A.P.; Valkenburg, P.M.; Peter, J.

    2012-01-01

    The goal of this study was to investigate the language-based strategies that computer-mediated communication (CMC) users employ to reduce uncertainty in the absence of nonverbal cues. Specifically, this study investigated the prevalence of three interactive uncertainty reduction strategies (i.e.,

  5. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty, the engineer/scientist must specify mathematical models, the physical phenomena of interest, and the theory or framework for the assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with the words 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step in the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  6. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    Science.gov (United States)

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    the normal range. This finding implies that the simplified parameterization of the SSEBop model did not significantly affect the accuracy of the ET estimate while increasing the ease of model setup for operational applications. The sensitivity analysis indicated that the SSEBop model is most sensitive to the input variables land surface temperature (LST) and reference ET (ETo), and to the parameters differential temperature (dT) and maximum ET scalar (Kmax), particularly during the non-growing season and in dry areas. In summary, the uncertainty assessment verifies that the SSEBop model is a reliable and robust method for large-area ET estimation. The SSEBop model estimates can be further improved by reducing errors in two input variables (ETo and LST) and two key parameters (Kmax and dT).

  7. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, W.; Rinke, A.; Moore, J. C.; Cui, X.; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D. M.; McGuire, A. D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-02-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indices). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and from the surface frost index), while permafrost distribution using air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations likely related to soil texture specification, vegetation types, and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperature and its seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future permafrost distribution can be made for
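
For illustration, the air-temperature-based frost index mentioned among the diagnostic methods can be sketched as follows. This is a minimal Python sketch of the air frost number of Nelson and Outcalt, assuming monthly mean air temperatures and a fixed average month length; the 0.5 threshold convention is standard, but the example temperature series below are illustrative, not taken from the study.

```python
import math

def frost_number(monthly_temps_c, days_per_month=30.4):
    """Air frost number: F = sqrt(DDF) / (sqrt(DDF) + sqrt(DDT)),
    where DDF/DDT are annual freezing/thawing degree-day sums."""
    ddf = sum(-t * days_per_month for t in monthly_temps_c if t < 0)  # freezing degree-days
    ddt = sum(t * days_per_month for t in monthly_temps_c if t > 0)   # thawing degree-days
    if ddf == 0 and ddt == 0:
        return 0.0
    return math.sqrt(ddf) / (math.sqrt(ddf) + math.sqrt(ddt))

# Permafrost is commonly diagnosed where F >= 0.5 (freezing dominates thawing).
cold_site = [-25, -22, -15, -5, 0, 4, 8, 6, 1, -6, -15, -22]   # illustrative high-plateau site
warm_site = [-2, -1, 2, 8, 14, 18, 20, 19, 14, 8, 2, -1]        # illustrative low-elevation site
print(frost_number(cold_site) >= 0.5)   # permafrost diagnosed
print(frost_number(warm_site) >= 0.5)   # no permafrost diagnosed
```

The diagnostic only needs air temperature, which is why (as the abstract notes) it sidesteps the biases in simulated ground temperature profiles.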

  8. Uncertainty Regarding Waste Handling in Everyday Life

    Directory of Open Access Journals (Sweden)

    Susanne Ewert

    2010-09-01

    According to our study, based on interviews with households in a residential area in Sweden, uncertainty is a cultural barrier to improved recycling. Four causes of uncertainty are identified. Firstly, professional categories not matching cultural categories: people easily discriminate between certain categories (e.g., materials such as plastic and paper) but not between others (e.g., packaging and “non-packaging”). Thus a frequent cause of uncertainty is that the basic categories of the waste recycling system do not coincide with the basic categories used in everyday life. Secondly, challenged habits: source separation in everyday life is habitual, but when a habit is challenged by a particular element or feature of the waste system, uncertainty can arise. Thirdly, lacking fractions: some kinds of items cannot be left for recycling, which makes waste collection incomplete from the user’s point of view and in turn lowers the credibility of the system. Fourthly, missing or contradictory rules of thumb: the above causes seem to be particularly relevant if no motivating principle or rule of thumb (within the context of use) is successfully conveyed to the user. This paper discusses how reducing uncertainty can improve recycling.

  9. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory of recognizing and responding to uncertainty characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage it. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test, and refine the theory of recognizing and responding to uncertainty and to develop strategies for managing uncertainty. This theory advances the nursing perspective on uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  10. Projecting species' vulnerability to climate change: Which uncertainty sources matter most and extrapolate best?

    Science.gov (United States)

    Steen, Valerie; Sofaer, Helen R; Skagen, Susan K; Ray, Andrea J; Noon, Barry R

    2017-11-01

    Species distribution models (SDMs) are commonly used to assess potential climate change impacts on biodiversity, but several critical methodological decisions are often made arbitrarily. We compare variability arising from these decisions to the uncertainty in future climate change itself. We also test whether certain choices offer improved skill for extrapolating to a changed climate and whether internal cross-validation skill indicates extrapolative skill. We compared projected vulnerability for 29 wetland-dependent bird species breeding in the climatically dynamic Prairie Pothole Region, USA. For each species we built 1,080 SDMs to represent a unique combination of: future climate, class of climate covariates, collinearity level, and thresholding procedure. We examined the variation in projected vulnerability attributed to each uncertainty source. To assess extrapolation skill under a changed climate, we compared model predictions with observations from historic drought years. Uncertainty in projected vulnerability was substantial, and the largest source was that of future climate change. Large uncertainty was also attributed to climate covariate class with hydrological covariates projecting half the range loss of bioclimatic covariates or other summaries of temperature and precipitation. We found that choices based on performance in cross-validation improved skill in extrapolation. Qualitative rankings were also highly uncertain. Given uncertainty in projected vulnerability and resulting uncertainty in rankings used for conservation prioritization, a number of considerations appear critical for using bioclimatic SDMs to inform climate change mitigation strategies. Our results emphasize explicitly selecting climate summaries that most closely represent processes likely to underlie ecological response to climate change. For example, hydrological covariates projected substantially reduced vulnerability, highlighting the importance of considering whether water

  11. Model parameter uncertainty analysis for annual field-scale P loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  12. Breaking through the uncertainty ceiling in LA-ICP-MS U-Pb geochronology

    Science.gov (United States)

    Horstwood, M.

    2016-12-01

    Sources of systematic uncertainty associated with session-to-session bias are the dominant contributor to the 2% (2s) uncertainty ceiling that currently limits the accuracy of LA-ICP-MS U-Pb geochronology. Sources include differential downhole fractionation (LIEF), 'matrix effects', and ablation volume differences, which result in irreproducibility of the same reference material across sessions. Current mitigation methods include correcting for LIEF mathematically, using matrix-matched reference materials, annealing material to reduce or eliminate radiation damage effects, and tuning for robust plasma conditions. Reducing the depth and volume of ablation can also mitigate these problems and should contribute to the reduction of the uncertainty ceiling. Reducing the analysed volume leads to increased detection efficiency, reduces matrix effects, eliminates LIEF, obviates ablation rate differences, and reduces the likelihood of intercepting complex growth zones with depth, thereby apparently improving material homogeneity. High detection efficiencies (% level) and low sampling volumes (20 µm box, 1-2 µm deep) can now be achieved using MC-ICP-MS, such that low-volume ablations should be considered part of the toolbox of methods targeted at improving the reproducibility of LA-ICP-MS U-Pb geochronology. In combination with other strategies these improvements should be feasible on any ICP platform. However, reducing the volume of analysis reduces detected counts and requires a change of analytical approach to mitigate this. Appropriate strategies may include the use of high-efficiency cell and torch technologies and the optimisation of acquisition protocols and data-handling techniques such as condensing signal peaks, using log ratios, and total signal integration.
The tools required to break the 2% (2s) uncertainty ceiling in LA-ICP-MS U-Pb geochronology are likely now known but require a coherent strategy and change of approach to combine their implementation and realise

  13. Exploring Heterogeneous Multicore Architectures for Advanced Embedded Uncertainty Quantification.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric T.; Edwards, Harold C.; Hu, Jonathan J.

    2014-09-01

    We explore rearrangements of classical uncertainty quantification methods with the aim of achieving higher aggregate performance for uncertainty quantification calculations on emerging multicore and manycore architectures. We show that a rearrangement of the stochastic Galerkin method leads to improved performance and scalability on several computational architectures, whereby uncertainty information is propagated at the lowest levels of the simulation code, improving memory access patterns, exposing new dimensions of fine-grained parallelism, and reducing communication. We also develop a general framework for implementing such rearrangements for a diverse set of uncertainty quantification algorithms as well as the computational simulation codes to which they are applied.
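
The rearrangement described above, propagating uncertainty at the lowest level of the simulation rather than in an outer loop, can be illustrated with a toy example. This sketch is not the stochastic Galerkin method itself; it contrasts an outer loop over uncertainty samples with an embedded, vectorized propagation for a scalar decay equation, and all names and numbers are illustrative.

```python
import numpy as np

def decay_outer_loop(k_samples, y0=1.0, dt=0.01, steps=1000):
    """Classical arrangement: one full simulation per uncertainty sample."""
    out = []
    for k in k_samples:
        y = y0
        for _ in range(steps):
            y += -k * y * dt          # explicit Euler step for dy/dt = -k*y
        out.append(y)
    return np.array(out)

def decay_embedded(k_samples, y0=1.0, dt=0.01, steps=1000):
    """Embedded arrangement: all samples advance together through the inner loop,
    improving memory locality and exposing fine-grained parallelism."""
    y = np.full_like(np.asarray(k_samples, dtype=float), y0)
    for _ in range(steps):
        y += -k_samples * y * dt      # vectorized over the uncertainty dimension
    return y

k = np.random.default_rng(0).uniform(0.5, 1.5, 256)   # uncertain decay rates
assert np.allclose(decay_outer_loop(k), decay_embedded(k))
```

Both arrangements perform the same arithmetic; only the loop order over the uncertainty dimension changes, which is the essence of the rearrangement the abstract describes.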

  14. Climate policy uncertainty and investment risk

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-06-21

    Our climate is changing. This is certain. Less certain, however, are the timing and magnitude of climate change, and the cost of the transition to a low-carbon world. Therefore, many policies and programmes are still at a formative stage, and policy uncertainty is very high. This book identifies how climate change policy uncertainty may affect investment behaviour in the power sector. For power companies, where capital stock is intensive and long-lived, those risks rank among the biggest and can create an incentive to delay investment. Our analysis shows that the risk premiums of climate change uncertainty can add 40% to the construction costs of a plant for power investors, and a 10% price surcharge for electricity end-users. This publication tells what can be done in policy design to reduce these costs. Incorporating the results of quantitative analysis, it also shows the sensitivity of different power sector investment decisions to different risks. It compares the effects of climate policy uncertainty with energy market uncertainty, showing the relative importance of these sources of risk for different technologies in different market types. Drawing on extensive consultation with power companies and financial investors, it also assesses the implications for policy makers, allowing the key messages to be transferred into policy designs. This book is a useful tool for governments to improve climate policy mechanisms and create more certainty for power investors.

  15. Impact of heat stress on crop yield—on the importance of considering canopy temperature

    International Nuclear Information System (INIS)

    Siebert, Stefan; Ewert, Frank; Eyshi Rezaei, Ehsan; Kage, Henning; Graß, Rikard

    2014-01-01

    Increasing crop productivity while simultaneously reducing the environmental footprint of crop production is considered a major challenge for the coming decades. Even short episodes of heat stress can reduce crop yield considerably, causing low resource use efficiency. Studies on the impact of heat stress on crop yields over larger regions generally rely on temperatures measured by standard weather stations at 2 m height. Canopy temperatures measured in this study in field plots of rye were up to 7 °C higher than air temperature measured at typical weather station height, with the differences in temperature controlled by soil moisture content. Relationships between heat stress and grain number derived from controlled-environment studies were only confirmed under field conditions when canopy temperature was used to calculate stress thermal time. By using hourly mean temperatures measured by 78 weather stations located across Germany for the period 1994–2009, it is estimated that mean yield declines in wheat due to heat stress during flowering were 0.7% when temperatures are measured at 2 m height, but yield declines increase to 22% for temperatures measured at the ground. These results suggest that canopy temperature should be simulated or estimated to reduce uncertainty in assessing heat stress impacts on crop yield. (letter)
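
Stress thermal time, as used in the study above, accumulates the temperature excess over a heat-stress threshold. A minimal sketch, assuming hourly temperatures and an illustrative 31 °C threshold; the actual thresholds and the fixed 5 °C canopy offset below are assumptions for illustration, not values from the letter.

```python
def stress_thermal_time(hourly_temps_c, threshold_c=31.0):
    """Accumulate degree-hours above a heat-stress threshold, expressed as degree-days."""
    return sum(t - threshold_c for t in hourly_temps_c if t > threshold_c) / 24.0

# Canopy temperature can exceed 2 m air temperature by several degrees (up to 7 degC
# in the study), so the same day accumulates far more stress thermal time at the canopy.
air_2m = [18, 22, 26, 29, 31, 32, 31, 28, 24, 20]       # illustrative daytime hours
canopy = [t + 5 for t in air_2m]                         # assumed constant offset
print(stress_thermal_time(air_2m))
print(stress_thermal_time(canopy))
```

The contrast between the two printed values mirrors the abstract's 0.7% vs 22% yield-decline estimates: the same heat episode looks mild at 2 m height but severe at canopy level.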

  16. Analogy as a strategy for supporting complex problem solving under uncertainty.

    Science.gov (United States)

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  17. Shape-Dependent Activity of Ceria for Hydrogen Electro-Oxidation in Reduced-Temperature Solid Oxide Fuel Cells.

    Science.gov (United States)

    Tong, Xiaofeng; Luo, Ting; Meng, Xie; Wu, Hao; Li, Junliang; Liu, Xuejiao; Ji, Xiaona; Wang, Jianqiang; Chen, Chusheng; Zhan, Zhongliang

    2015-11-04

    Single crystalline ceria nanooctahedra, nanocubes, and nanorods are hydrothermally synthesized, colloidally impregnated into the porous La0.9Sr0.1Ga0.8Mg0.2O3-δ (LSGM) scaffolds, and electrochemically evaluated as the anode catalysts for reduced temperature solid oxide fuel cells (SOFCs). Well-defined surface terminations are confirmed by the high-resolution transmission electron microscopy--(111) for nanooctahedra, (100) for nanocubes, and both (110) and (100) for nanorods. Temperature-programmed reduction in H2 shows the highest reducibility for nanorods, followed sequentially by nanocubes and nanooctahedra. Measurements of the anode polarization resistances and the fuel cell power densities reveal different orders of activity of ceria nanocrystals at high and low temperatures for hydrogen electro-oxidation, i.e., nanorods > nanocubes > nanooctahedra at T ≤ 450 °C and nanooctahedra > nanorods > nanocubes at T ≥ 500 °C. Such shape-dependent activities of these ceria nanocrystals have been correlated to their difference in the local structure distortions and thus in the reducibility. These findings will open up a new strategy for design of advanced catalysts for reduced-temperature SOFCs by elaborately engineering the shape of nanocrystals and thus selectively exposing the crystal facets. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Understanding uncertainties in future Colorado River streamflow

    Science.gov (United States)

    Julie A. Vano,; Bradley Udall,; Cayan, Daniel; Jonathan T Overpeck,; Brekke, Levi D.; Das, Tapash; Hartmann, Holly C.; Hidalgo, Hugo G.; Hoerling, Martin P; McCabe, Gregory J.; Morino, Kiyomi; Webb, Robert S.; Werner, Kevin; Lettenmaier, Dennis P.

    2014-01-01

    The Colorado River is the primary water source for more than 30 million people in the United States and Mexico. Recent studies that project streamflow changes in the Colorado River all project annual declines, but the magnitude of the projected decreases ranges from less than 10% to 45% by the mid-twenty-first century. To understand these differences, we address the questions the management community has raised: Why is there such a wide range of projections of impacts of future climate change on Colorado River streamflow, and how should this uncertainty be interpreted? We identify four major sources of disparities among studies that arise from both methodological and model differences. In order of importance, these are differences in 1) the global climate models (GCMs) and emission scenarios used; 2) the ability of land surface and atmospheric models to simulate properly the high-elevation runoff source areas; 3) the sensitivities of land surface hydrology models to precipitation and temperature changes; and 4) the methods used to statistically downscale GCM scenarios. In accounting for these differences, there is substantial evidence across studies that future Colorado River streamflow will be reduced under the current trajectories of anthropogenic greenhouse gas emissions because of a combination of strong temperature-induced runoff curtailment and reduced annual precipitation. Reconstructions of preinstrumental streamflows provide additional insights; the greatest risk to Colorado River streamflows is a multidecadal drought, like that observed in paleoreconstructions, exacerbated by a steady reduction in flows due to climate change. This could result in decades of sustained streamflows much lower than have been observed in the ~100 years of instrumental record.

  19. Reducing uncertainty in sustainable interpersonal service relationships: the role of aesthetics.

    Science.gov (United States)

    Xenakis, Ioannis

    2018-05-01

    Sustainable interpersonal service relationships (SISRs) are the outcome of a design process that supports situated, meaningful interactions between those being served and those in service. Service design is not directed simply at satisfying the ability to perceive the psychological state of others; more importantly, it should aim at preserving these relationships in relation to the contextual requirements they functionally need in order to be or remain sustainable. However, SISRs are uncertain, since the constructed, situated meanings may ultimately prove unsuccessful for the anticipations and goals of the people engaged in a SISR. The endeavor of this paper is to show that aesthetic behavior plays a crucial role in reducing the uncertainty that characterizes such relationships. Aesthetic behavior, as an organized network of affective and cognitive processes, has an anticipatory evaluative function with a strong influence on perception, providing significance and value to those aspects of SISRs that are most likely to serve goals corresponding to sustainable challenges. Thus, aesthetic behavior plays an important role in the construction of meanings related to both the empathic and the contextual aspects that constitute the situation in which a SISR takes place. Aesthetic behavior strongly influences meaning-making, motivating the selection of actions that contribute to our initial goal of interacting with uncertainty: to make the world a bit less puzzling and thus improve our lives, or in other words, to design.

  20. Effect of uncertainty in pore volumes on the uncertainty in amount adsorbed at high-pressures on activated carbon cloth

    International Nuclear Information System (INIS)

    Pendleton, Ph.; Badalyan, A.

    2005-01-01

    Activated carbon cloth (ACC) is a good adsorbent for high-rate adsorption of volatile organic compounds [1] and as a storage medium for methane [2]. It has been shown [2] that the capacity of ACC to adsorb methane depends, in the first instance, on its micropore volume. One way of increasing this storage capacity is to increase micropore volume [3]. Therefore, the uncertainty in the determination of ACC micropore volume becomes a very important factor, since it affects the uncertainty of the amount adsorbed at the high pressures that usually accompany storage of methane on ACC. Recently, we developed a method for the calculation of experimental uncertainty in micropore volume using low-pressure nitrogen adsorption data at 77 K for FM1/250 ACC (ex. Calgon, USA). We tested several cubic equations of state (EOS) and multiple-parameter EOS to determine the amount of high-pressure nitrogen adsorbed, and compared these data with amounts calculated via interpolated NIST density data. The amounts adsorbed calculated from interpolated NIST density data exhibit the lowest propagated combined uncertainty. Values of relative combined standard uncertainty for FM1/250 calculated using a weighted mean-least-squares method applied to the low-pressure nitrogen adsorption data (Fig. 1) gave 3.52% for the primary micropore volume and 1.63% for the total micropore volume. Our equipment allows the same sample to be exposed to nitrogen (and other gases) at pressures from 10⁻⁴ Pa to 17 MPa in the temperature range from 176 to 252 K. The maximum uptake of nitrogen was 356 mmol/g at 201.92 K and 15.8 MPa (Fig. 2). The delivery capacity of ACC is determined by the amount of adsorbed gas recovered when the pressure is reduced from that for maximum adsorption to 0.1 MPa [2]. In this regard, the total micropore volume becomes an important parameter in determining the amount of gas delivered during desorption. In the present paper we will discuss the effect of uncertainty in micropore volume
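
The propagation of the micropore-volume uncertainty into the amount adsorbed follows the usual root-sum-of-squares combination of relative uncertainties for a product model (GUM, uncorrelated inputs). A minimal sketch: only the 1.63% figure comes from the abstract; the 0.5% density uncertainty and the product model n = ρ·V are assumptions for illustration.

```python
import math

def combined_relative_uncertainty(*relative_uncertainties):
    """Root-sum-of-squares combination for a product/quotient measurement model
    with uncorrelated inputs: u_c(y)/y = sqrt(sum((u_i/x_i)^2))."""
    return math.sqrt(sum(u ** 2 for u in relative_uncertainties))

# Illustration: if the amount adsorbed is modeled as n = rho_ads * V_micro, its
# relative combined standard uncertainty follows from those of the two factors.
u_volume = 0.0163    # 1.63% relative uncertainty in total micropore volume (abstract)
u_density = 0.005    # assumed 0.5% relative uncertainty in interpolated NIST density
print(combined_relative_uncertainty(u_volume, u_density))  # ~ 0.017, dominated by u_volume
```

Because the combination is in quadrature, the larger input (here the micropore volume) dominates the combined uncertainty, which is why the abstract singles it out.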

  1. A Bayesian analysis of sensible heat flux estimation: Quantifying uncertainty in meteorological forcing to improve model prediction

    KAUST Repository

    Ershadi, Ali

    2013-05-01

    The influence of uncertainty in land surface temperature, air temperature, and wind speed on the estimation of sensible heat flux is analyzed using a Bayesian inference technique applied to the Surface Energy Balance System (SEBS) model. The Bayesian approach allows for an explicit quantification of the uncertainties in input variables: a source of error generally ignored in surface heat flux estimation. An application using field measurements from the Soil Moisture Experiment 2002 is presented. The spatial variability of selected input meteorological variables in a multitower site is used to formulate the prior estimates for the sampling uncertainties, and the likelihood function is formulated assuming Gaussian errors in the SEBS model. Land surface temperature, air temperature, and wind speed were estimated by sampling their posterior distribution using a Markov chain Monte Carlo algorithm. Results verify that Bayesian-inferred air temperature and wind speed were generally consistent with those observed at the towers, suggesting that local observations of these variables were spatially representative. Uncertainties in the land surface temperature appear to have the strongest effect on the estimated sensible heat flux, with Bayesian-inferred values differing by up to ±5°C from the observed data. These differences suggest that the footprint of the in situ measured land surface temperature is not representative of the larger-scale variability. As such, these measurements should be used with caution in the calculation of surface heat fluxes and highlight the importance of capturing the spatial variability in the land surface temperature: particularly, for remote sensing retrieval algorithms that use this variable for flux estimation.
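
The sampling of a posterior over uncertain model inputs described above can be illustrated with a toy Metropolis sampler. The linear flux model H = c·(Ts − Ta), the Gaussian prior, and all numbers below are illustrative stand-ins, not the SEBS formulation.

```python
import math
import random

def log_posterior(ts, h_obs, ta=20.0, c=25.0, sigma_h=20.0, prior_mu=30.0, prior_sd=3.0):
    """Gaussian likelihood of an observed flux under H = c*(Ts - Ta), plus a
    Gaussian prior on the uncertain input Ts (land surface temperature)."""
    log_lik = -0.5 * ((h_obs - c * (ts - ta)) / sigma_h) ** 2
    log_prior = -0.5 * ((ts - prior_mu) / prior_sd) ** 2
    return log_lik + log_prior

def metropolis(h_obs, n=20000, step=0.5, seed=1):
    """Random-walk Metropolis sampling of the posterior over Ts."""
    rng = random.Random(seed)
    ts, lp = 30.0, log_posterior(30.0, h_obs)
    samples = []
    for _ in range(n):
        prop = ts + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, h_obs)
        if math.log(rng.random() + 1e-300) < lp_prop - lp:   # accept/reject
            ts, lp = prop, lp_prop
        samples.append(ts)
    return samples[n // 2:]                                   # discard burn-in

samples = metropolis(h_obs=300.0)
mean_ts = sum(samples) / len(samples)    # posterior mean pulled between prior and data
```

With these numbers, the observation alone implies Ts ≈ 32 °C while the prior centers at 30 °C, so the posterior mean lands between the two, mirroring how the Bayesian approach reconciles in situ measurements with model-implied values.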

  2. Uncertainty Analyses and Strategy

    International Nuclear Information System (INIS)

    Kevin Coppersmith

    2001-01-01

    The DOE identified a variety of uncertainties, arising from different sources, during its assessment of the performance of a potential geologic repository at the Yucca Mountain site. In general, the number and detail of process models developed for the Yucca Mountain site, and the complex coupling among those models, make the direct incorporation of all uncertainties difficult. The DOE has addressed these issues in a number of ways using an approach to uncertainties that is focused on producing a defensible evaluation of the performance of a potential repository. The treatment of uncertainties oriented toward defensible assessments has led to analyses and models with so-called "conservative" assumptions and parameter bounds, where conservative implies lower performance than might be demonstrated with a more realistic representation. The varying maturity of the analyses and models, and the uneven level of data availability, result in total system level analyses with a mix of realistic and conservative estimates (for both probabilistic representations and single values). That is, some inputs have realistically represented uncertainties, and others are conservatively estimated or bounded. However, this approach is consistent with the "reasonable assurance" approach to compliance demonstration, which was called for in the U.S. Nuclear Regulatory Commission's (NRC) proposed 10 CFR Part 63 regulation (64 FR 8640 [DIRS 101680]). A risk analysis that includes conservatism in the inputs will result in conservative risk estimates. Therefore, the approach taken for the Total System Performance Assessment for the Site Recommendation (TSPA-SR) provides a reasonable representation of processes and conservatism for purposes of site recommendation. However, mixing unknown degrees of conservatism in models and parameter representations reduces the transparency of the analysis and makes the development of coherent and consistent probability statements about projected repository

  3. Analysis and Reduction of Complex Networks Under Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Knio, Omar M

    2014-04-09

    This is a collaborative proposal that aims at developing new methods for the analysis and reduction of complex multiscale networks under uncertainty. The approach is based on combining methods of computational singular perturbation (CSP) and probabilistic uncertainty quantification. In deterministic settings, CSP yields asymptotic approximations of reduced-dimensionality “slow manifolds” on which a multiscale dynamical system evolves. Introducing uncertainty raises fundamentally new issues, particularly concerning its impact on the topology of slow manifolds, and means to represent and quantify associated variability. To address these challenges, this project uses polynomial chaos (PC) methods to reformulate uncertain network models, and to analyze them using CSP in probabilistic terms. Specific objectives include (1) developing effective algorithms that can be used to illuminate fundamental and unexplored connections among model reduction, multiscale behavior, and uncertainty, and (2) demonstrating the performance of these algorithms through applications to model problems.

  4. Bayesian Chance-Constrained Hydraulic Barrier Design under Geological Structure Uncertainty.

    Science.gov (United States)

    Chitsazan, Nima; Pham, Hai V; Tsai, Frank T-C

    2015-01-01

    The groundwater community has widely recognized geological structure uncertainty as a major source of model structure uncertainty. Previous studies in aquifer remediation design, however, rarely discuss the impact of geological structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) as a BMA-CC framework to assess the impact of geological structure uncertainty in remediation design. To pursue this goal, the BMA-CC method is compared with traditional CC programming that only considers model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from salt water intrusion in the "1500-foot" sand and the "1700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address geological structure uncertainty, three groundwater models based on three different hydrostratigraphic architectures are developed. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve more than 90% design reliability level. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While reducing the injection rate can be achieved by reducing the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station may not be economically attractive. © 2014, National Ground Water Association.
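
The BMA-CC idea, evaluating a chance constraint as a Bayesian-model-averaged probability of constraint satisfaction, can be sketched as follows. The three "models", their posterior weights, and the head constraint are hypothetical placeholders standing in for the alternative hydrostratigraphic architectures.

```python
import random

def model_reliability(simulate, n=5000, seed=0):
    """Fraction of Monte Carlo parameter draws for which the design constraint holds,
    under a single conceptual model."""
    rng = random.Random(seed)
    return sum(simulate(rng) for _ in range(n)) / n

def bma_reliability(models, weights):
    """BMA-CC design reliability: posterior-weight average of per-model reliabilities,
    so structure uncertainty is carried alongside parameter uncertainty."""
    return sum(w * model_reliability(m) for w, m in zip(weights, models))

# Three hypothetical structural models; each draws a head margin and checks the
# constraint (margin > 0, i.e., the hydraulic barrier protects the supply wells).
models = [
    lambda rng: rng.gauss(2.0, 1.0) > 0.0,
    lambda rng: rng.gauss(1.5, 1.2) > 0.0,
    lambda rng: rng.gauss(2.5, 0.8) > 0.0,
]
weights = [0.5, 0.3, 0.2]    # illustrative posterior model probabilities (sum to 1)
print(bma_reliability(models, weights))
```

Evaluating reliability under a single structure (traditional CC programming) would report the reliability of only one of these models, which is how structure uncertainty gets hidden and reliability overestimated, as the abstract argues.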

  5. Shape optimization of an airfoil in a BZT flow with multiple-source uncertainties

    International Nuclear Information System (INIS)

    Congedo, P.M.; Corre, C.; Martinez, J.M.

    2011-01-01

    Bethe-Zel'dovich-Thompson (BZT) fluids are characterized by negative values of the fundamental derivative of gas dynamics over a range of temperatures and pressures in the vapor phase, which leads to non-classical gas dynamic behaviors such as the disintegration of compression shocks. These non-classical phenomena can be exploited, when such fluids are used in Organic Rankine Cycles (ORCs), to increase isentropic efficiency. A predictive numerical simulation of these flows must account for two main sources of physical uncertainty: the BZT fluid properties, often difficult to measure accurately, and the usually fluctuating turbine inlet conditions. To take full advantage of the BZT properties, the turbine geometry must also be specifically designed, keeping in mind that the geometry achieved in practice after machining always differs slightly from the theoretical shape. This paper investigates efficient procedures for shape optimization in a 2D BZT flow with multiple sources of uncertainty (thermodynamic model, operating conditions and geometry). To demonstrate the feasibility of the proposed strategies, a zero-incidence symmetric airfoil wave-drag minimization problem is retained as a case study. This simplified configuration encompasses most of the features of a turbine design problem as far as uncertainty quantification is concerned. A preliminary analysis of the contributions to the variance of the wave-drag makes it possible to select the most significant sources of uncertainty using a reduced number of flow computations. The resulting mean value and variance of the objective are then replaced by metamodels. The optimal Pareto sets corresponding to the minimization of various substitute functions are obtained using a genetic algorithm as optimizer, and their differences are discussed. (authors)
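
    The variance-contribution analysis used to screen the significant uncertainty sources can be sketched with first-order Sobol indices. The quadratic "drag" surrogate and its coefficients below are invented for illustration; the estimator is a standard Saltelli-type pick-freeze scheme, not the procedure of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy wave-drag surrogate in three standardized uncertain inputs: thermodynamic
# model, operating conditions, geometry (coefficients invented for illustration).
def drag(X):
    t, o, g = X[:, 0], X[:, 1], X[:, 2]
    return 0.02 + 0.008 * t + 0.003 * o + 0.001 * g + 0.002 * t * o

def sobol_first_order(f, dim, n=200_000):
    """Pick-freeze Monte Carlo estimate of first-order Sobol indices."""
    A = rng.standard_normal((n, dim))
    B = rng.standard_normal((n, dim))
    fA, fB = f(A), f(B)
    var = fA.var()
    S = []
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]   # ABi shares only the i-th input with sample B
        S.append(float(np.mean(fB * (f(ABi) - fA)) / var))
    return S

S = sobol_first_order(drag, 3)   # thermodynamic input dominates the variance
```

    In this toy the thermodynamic-model input drives most of the variance, so the remaining sources could be frozen in a subsequent optimization, mirroring the screening step the abstract describes.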

  6. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  7. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
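
    Of the propagation options the guide lists (exact calculation, series approximation, Monte Carlo), the latter two can be contrasted on a toy model. The model y = x² + 2x and the numbers below are illustrative assumptions, not from the guide:

```python
import numpy as np

rng = np.random.default_rng(2)

# Propagate x ~ N(mu, sigma^2) through a toy nonlinear model y = x**2 + 2*x
mu, sigma = 3.0, 0.1
f = lambda x: x ** 2 + 2 * x

# (a) first-order series approximation: sigma_y ~ |f'(mu)| * sigma
sigma_series = abs(2 * mu + 2) * sigma

# (b) Monte Carlo propagation of the same input uncertainty
samples = f(rng.normal(mu, sigma, 1_000_000))
sigma_mc = samples.std()
```

    With a small relative input uncertainty the two estimates agree closely; the series approximation degrades as sigma grows and the nonlinearity matters, which is when Monte Carlo (or exact) propagation earns its cost.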

  9. Effect of the potential well on low temperature pressure broadening in CO-He

    Science.gov (United States)

    Palma, A.; Green, S.

    1986-01-01

    Previously reported low-temperature pressure-broadening calculations (Green, 1985) for CO-He interacting via an SCF-CI potential are compared with new calculations in which the attractive part of the potential is either reduced by half or eliminated entirely. Results demonstrate that the attractive well is responsible for low-temperature enhancement of pressure-broadening cross sections and suggest that agreement with recent experimental values at 4 K (Messer and DeLucia, 1984) can be obtained by a modest reduction, probably within the expected uncertainty, in the attractive part of the SCF-CI potential.

  10. Observation of quantum-memory-assisted entropic uncertainty relation under open systems, and its steering

    Science.gov (United States)

    Chen, Peng-Fei; Sun, Wen-Yang; Ming, Fei; Huang, Ai-Jun; Wang, Dong; Ye, Liu

    2018-01-01

    Quantum objects are susceptible to noise from their surrounding environments, interaction with which inevitably gives rise to quantum decoherence or dissipation effects. In this work, we examine how different types of local noise in an open system affect entropic uncertainty relations for two incompatible measurements. Explicitly, we observe the dynamics of the entropic uncertainty in the presence of quantum memory under two canonical categories of noisy environments: unital (phase flip) and nonunital (amplitude damping). Our study shows that the measurement uncertainty exhibits non-monotonic dynamical behavior: the amount of uncertainty first inflates, and subsequently decreases, with the growth of the decoherence strength in the two channels. In contrast, the uncertainty decreases monotonically with the growth of the purity of the initially shared state. In order to reduce the measurement uncertainty in noisy environments, we put forward a remarkably effective strategy to steer the magnitude of the uncertainty by means of a local non-unitary operation (i.e. weak measurement) on the qubit of interest. It turns out that this non-unitary operation can greatly reduce the entropic uncertainty upon tuning the operation strength. Our investigations might thereby offer insight into the dynamics and steering of entropic uncertainty in open systems.
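
    In the memoryless special case, the entropic uncertainty relation studied here reduces to the Maassen-Uffink bound H(X) + H(Z) ≥ -log₂ max |⟨x_i|z_j⟩|². The sketch below evaluates that bound for Pauli X and Z measurements on the state |0⟩; this is a standard textbook check, not a computation from this paper:

```python
import numpy as np

# Maassen-Uffink bound: H(X) + H(Z) >= -log2 max_{i,j} |<x_i|z_j>|^2
X_basis = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)  # Pauli-X eigenbasis
Z_basis = np.eye(2)                                         # Pauli-Z eigenbasis

overlap = max(abs(np.vdot(X_basis[:, i], Z_basis[:, j])) ** 2
              for i in range(2) for j in range(2))
bound = -np.log2(overlap)    # 1 bit for mutually unbiased bases

def shannon(p):
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

psi = np.array([1.0, 0.0])   # the state |0>, an eigenstate of Z
pX = np.abs(X_basis.conj().T @ psi) ** 2
pZ = np.abs(Z_basis.conj().T @ psi) ** 2
total = shannon(pX) + shannon(pZ)   # H(X) + H(Z) for this state
```

    An eigenstate of Z saturates the bound (H(Z) = 0, H(X) = 1 bit); quantum memory, as in the paper, can push the left-hand side below this memoryless bound.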

  11. SunShot solar power reduces costs and uncertainty in future low-carbon electricity systems.

    Science.gov (United States)

    Mileva, Ana; Nelson, James H; Johnston, Josiah; Kammen, Daniel M

    2013-08-20

    The United States Department of Energy's SunShot Initiative has set cost-reduction targets of $1/watt for central-station solar technologies. We use SWITCH, a high-resolution electricity system planning model, to study the implications of achieving these targets for technology deployment and electricity costs in western North America, focusing on scenarios limiting carbon emissions to 80% below 1990 levels by 2050. We find that achieving the SunShot target for solar photovoltaics would allow this technology to provide more than a third of electric power in the region, displacing natural gas in the medium term and reducing the need for nuclear and carbon capture and sequestration (CCS) technologies, which face technological and cost uncertainties, by 2050. We demonstrate that a diverse portfolio of technological options can help integrate high levels of solar generation successfully and cost-effectively. The deployment of GW-scale storage plays a central role in facilitating solar deployment and the availability of flexible loads could increase the solar penetration level further. In the scenarios investigated, achieving the SunShot target can substantially mitigate the cost of implementing a carbon cap, decreasing power costs by up to 14% and saving up to $20 billion ($2010) annually by 2050 relative to scenarios with Reference solar costs.

  12. Determination of a PWR key neutron parameters uncertainties and conformity studies applications

    International Nuclear Information System (INIS)

    Bernard, D.

    2002-01-01

    The aim of this thesis was to evaluate uncertainties in key neutron parameters of slab reactors. The uncertainty sources have many origins: technological, for fabrication parameters, and physical, for nuclear data. First, each contribution to the uncertainty is calculated; a factor of uncertainty is then associated with each key slab parameter, such as reactivity, isothermal reactivity coefficient, control rod efficiency, power form factor before irradiation, and lifetime. These uncertainty factors were computed by Generalized Perturbation Theory at step 0 and by direct calculations for irradiation problems. One application of neutronic conformity concerned the adjustment of precision targets for fabrication and nuclear data. Statistical (uncertainty) and deterministic (deviation) approaches were studied. The uncertainties of the key slab parameters were thereby reduced and the nuclear performance optimised. (author)

  13. Multi-criteria evaluation of wastewater treatment plant control strategies under uncertainty.

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodríguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2008-11-01

    The evaluation of activated sludge control strategies in wastewater treatment plants (WWTP) via mathematical modelling is a complex activity because several objectives (economic, environmental, technical and legal) must be taken into account at the same time; i.e. the evaluation of the alternatives is a multi-criteria problem. Activated sludge models are not perfectly characterized, and some of their parameters, e.g. the influent fractions arriving at the facility and the effect of either temperature or toxic compounds on the kinetic parameters, carry uncertainty that strongly influences the model predictions used during the evaluation of the alternatives and can change the resulting rank of preferences. Using a simplified version of the IWA Benchmark Simulation Model No. 2 as a case study, this article shows how the decision making varies when uncertainty in the activated sludge model (ASM) parameters is or is not included during the evaluation of WWTP control strategies. The paper comprises two main sections. First, six WWTP control strategies are evaluated by multi-criteria decision analysis with the ASM parameters set at their default values. Then uncertainty is introduced: the input uncertainty is characterized by probability distribution functions based on the available process knowledge, and Monte Carlo simulations are run to propagate it through the model to the different outcomes. Thus (i) the variation in the overall degree of satisfaction of the control objectives for the generated WWTP control strategies is quantified, (ii) the contributions of environmental, legal, technical and economic objectives to the existing variance are identified, and finally (iii) the influence of the relative importance of the control objectives on the selection of alternatives is analyzed. The results show that the control strategies with an external carbon source reduce the output uncertainty
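
    The Monte Carlo step, sampling the uncertain inputs and watching how the ranking of alternatives shifts, can be caricatured as follows. The three strategy score functions and the input-uncertainty distributions are hypothetical, invented only to show the mechanism:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical overall scores of three control strategies as functions of two
# standardized uncertain ASM-type inputs (influent fraction f, kinetic
# temperature correction theta); coefficients are invented for illustration.
def scores(f, theta):
    return np.array([
        0.70 + 0.20 * f,                  # strategy A: sensitive to influent
        0.72 + 0.10 * theta,              # strategy B: best at the defaults
        0.71 + 0.05 * f + 0.05 * theta,   # strategy C
    ])

best_default = int(np.argmax(scores(0.0, 0.0)))   # defaults: B wins outright

# Monte Carlo: sample the uncertain inputs, count first-place finishes
n = 10_000
wins = np.zeros(3)
for _ in range(n):
    f, theta = rng.normal(0.0, 0.3, 2)
    wins[np.argmax(scores(f, theta))] += 1
first_place_freq = wins / n
```

    With the parameters at their defaults one strategy wins outright, but under input uncertainty every strategy ranks first in some fraction of the samples, which is exactly the kind of rank instability the abstract quantifies.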

  14. Position-momentum uncertainty relations in the presence of quantum memory

    Energy Technology Data Exchange (ETDEWEB)

    Furrer, Fabian, E-mail: furrer@eve.phys.s.u-tokyo.ac.jp [Department of Physics, Graduate School of Science, University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo 113-0033 (Japan); Berta, Mario [Institute for Quantum Information and Matter, Caltech, Pasadena, California 91125 (United States); Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Tomamichel, Marco [School of Physics, The University of Sydney, Sydney 2006 (Australia); Centre for Quantum Technologies, National University of Singapore, Singapore 117543 (Singapore); Scholz, Volkher B. [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Christandl, Matthias [Institute for Theoretical Physics, ETH Zurich, Wolfgang-Pauli-Str. 27, 8093 Zürich (Switzerland); Department of Mathematical Sciences, University of Copenhagen, Universitetsparken 5, 2100 Copenhagen (Denmark)

    2014-12-15

    A prominent formulation of the uncertainty principle identifies the fundamental quantum feature that no particle may be prepared with certain outcomes for both position and momentum measurements. Often the statistical uncertainties are thereby measured in terms of entropies providing a clear operational interpretation in information theory and cryptography. Recently, entropic uncertainty relations have been used to show that the uncertainty can be reduced in the presence of entanglement and to prove security of quantum cryptographic tasks. However, much of this recent progress has been focused on observables with only a finite number of outcomes not including Heisenberg’s original setting of position and momentum observables. Here, we show entropic uncertainty relations for general observables with discrete but infinite or continuous spectrum that take into account the power of an entangled observer. As an illustration, we evaluate the uncertainty relations for position and momentum measurements, which is operationally significant in that it implies security of a quantum key distribution scheme based on homodyne detection of squeezed Gaussian states.

  15. Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.

    2012-12-01

    Agro-Land Surface Models (agro-LSMs) have been developed by coupling specific crop models with large-scale generic vegetation models. They aim to account for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with particular emphasis on how crop phenology and agricultural management practices influence the turbulent fluxes exchanged with the atmosphere and the underlying water and carbon pools. Part of the uncertainty in these models is related to the many parameters in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main sources of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) are determined through a multi-site screening of the main model parameters, leading to the selection of a subset of the most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected by the screening at a regional scale, using a Monte-Carlo sampling method together with the calculation of Partial Ranked Correlation Coefficients. We quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil, and then quantify the overall uncertainty in the simulation outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root
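
    The partial ranked correlation coefficient (PRCC) computation at the heart of the sensitivity analysis can be sketched directly: rank-transform the Monte Carlo samples, regress out the other inputs, and correlate the residuals. The three-parameter toy response below is invented; it only stands in for a crop-model output dominated by a temperature parameter:

```python
import numpy as np

rng = np.random.default_rng(4)

def rank(v):
    """Rank transform (0..n-1); ties are immaterial for continuous samples."""
    r = np.empty(len(v))
    r[np.argsort(v)] = np.arange(len(v))
    return r

def prcc(X, y):
    """Partial rank correlation of each input column with y: rank-transform,
    regress out the remaining inputs, correlate the residuals."""
    Xr = np.column_stack([rank(X[:, i]) for i in range(X.shape[1])])
    yr = rank(y)
    out = []
    for i in range(X.shape[1]):
        Z = np.column_stack([np.ones(len(yr)), np.delete(Xr, i, axis=1)])
        rx = Xr[:, i] - Z @ np.linalg.lstsq(Z, Xr[:, i], rcond=None)[0]
        ry = yr - Z @ np.linalg.lstsq(Z, yr, rcond=None)[0]
        out.append(float(np.corrcoef(rx, ry)[0, 1]))
    return out

# toy response: output driven mostly by an optimal-temperature-like parameter,
# weakly by a carboxylation-rate-like and an extinction-coefficient-like one
n = 2000
X = rng.uniform(0.0, 1.0, (n, 3))
y = 5.0 * X[:, 0] + 1.0 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0.0, 0.3, n)
coeffs = prcc(X, y)
```

    Because PRCC works on ranks, it stays informative for monotonic but nonlinear responses, which is why it is a common companion to Monte Carlo screening.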

  16. Influences of increasing temperature on Indian wheat: quantifying limits to predictability

    International Nuclear Information System (INIS)

    Koehler, Ann-Kristin; Challinor, Andrew J; Hawkins, Ed; Asseng, Senthold

    2013-01-01

    As climate changes, temperatures will play an increasing role in determining crop yield. Both climate model error and lack of constrained physiological thresholds limit the predictability of yield. We used a perturbed-parameter climate model ensemble with two methods of bias-correction as input to a regional-scale wheat simulation model over India to examine future yields. This model configuration accounted for uncertainty in climate, planting date, optimization, temperature-induced changes in development rate and reproduction. It also accounts for lethal temperatures, which have been somewhat neglected to date. Using uncertainty decomposition, we found that fractional uncertainty due to temperature-driven processes in the crop model was on average larger than climate model uncertainty (0.56 versus 0.44), and that the crop model uncertainty is dominated by crop development. Simulations with the raw compared to the bias-corrected climate data did not agree on the impact on future wheat yield, nor its geographical distribution. However the method of bias-correction was not an important source of uncertainty. We conclude that bias-correction of climate model data and improved constraints on especially crop development are critical for robust impact predictions. (letter)

  17. The effects of annealing temperature on the permittivity and electromagnetic attenuation performance of reduced graphene oxide

    Science.gov (United States)

    Wu, Fan; Zeng, Qiao; Xia, Yilu; Sun, Mengxiao; Xie, Aming

    2018-05-01

    Reduced graphene oxide (RGO) was prepared by thermal reduction at different annealing temperatures to explore the effect of temperature on the permittivity and electromagnetic attenuation performance. The real and imaginary parts of the permittivity increase with the decrease in oxygen functional groups and the increase in the filler loading ratio. A composite loaded with only 1 wt. % of RGO possesses an effective electromagnetic absorption bandwidth of 7.60 GHz when the graphene oxide is reduced at 300 °C for 2 h. When the annealing temperature is increased to 700 °C and the well-reduced RGO loading is raised to 7 wt. %, the electromagnetic interference shielding efficiency exceeds 35 dB from 2 to 18 GHz. This study shows that controlling the oxygen functional groups on the RGO surface can yield ideal electromagnetic attenuation performance without decoration by other nanomaterials.

  18. A new uncertainty reduction method for PWR cores with erbia bearing fuel

    International Nuclear Information System (INIS)

    Takeda, Toshikazu; Sano, Tadafumi; Kitada, Takanori; Kuroishi, Takeshi; Yamasaki, Masatoshi; Unesaki, Hironobu

    2008-01-01

    The concept of a PWR with erbia-bearing high-burnup fuel has been proposed. Erbia is added to all fuel with over 5% 235U enrichment so that the neutronics characteristics remain equivalent to those of fuel within 5% 235U enrichment. The prediction accuracy of the neutronics characteristics of erbia-bearing fuel is a concern, however, because of the shortage of experimental data for such fuel. The purpose of the present work is to reduce this uncertainty. A new method has been proposed that combines the bias factor method and the cross section adjustment method. For the PWR core, the uncertainty reduction factor (the rate of reduction of uncertainty) of k-eff is 0.865 by the present method and 0.801 by the conventional bias factor method. Thus the prediction uncertainties are reduced by the present method compared to the bias factor method. (authors)

  19. Optimization Under Uncertainty for Wake Steering Strategies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-05-01

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements, with significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When yaw angle uncertainty was accounted for in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse, and with more downside risk, than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
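
    The difference between a deterministic and an OUU yaw set-point can be illustrated in one dimension. The wake-gain/cosine-loss power model below is an invented caricature, not the engineering wake model used in the paper; the OUU objective simply averages power over a Gaussian yaw error:

```python
import numpy as np

# Invented two-turbine power model: yawing the upstream turbine deflects its
# wake (saturating gain) but costs cosine-cubed power on the yawed machine.
def farm_power(yaw_deg):
    wake_gain = 0.3 * (1.0 - np.exp(-np.abs(yaw_deg) / 15.0))
    upstream_loss = 1.0 - np.cos(np.radians(yaw_deg)) ** 3
    return 1.0 + wake_gain - upstream_loss

yaws = np.linspace(0.0, 40.0, 401)

# deterministic design: maximize power at the nominal yaw angle
det_best = yaws[np.argmax(farm_power(yaws))]

# OUU design: maximize expected power under Gaussian yaw error (sigma = 5 deg)
sigma = 5.0
eps = np.linspace(-3 * sigma, 3 * sigma, 121)
w = np.exp(-0.5 * (eps / sigma) ** 2)
w /= w.sum()
expected = np.array([np.dot(w, farm_power(y + eps)) for y in yaws])
ouu_best = yaws[np.argmax(expected)]
```

    By construction the OUU set-point maximizes the expected power, so it can only match or beat the deterministic set-point once the yaw error is realized, which is the qualitative result the abstract reports.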

  20. Reducing greenhouses and the temperature history of Earth and Mars

    International Nuclear Information System (INIS)

    Sagan, C.

    1977-01-01

    The modern theory of stellar evolution implies that the Sun has increased in brightness by several tens per cent over geological time. Were all other global parameters held constant, this would imply that the mean temperature of the Earth was below the freezing point of seawater about 2 x 10^9 yr ago. There is, however, excellent geological and palaeontological evidence that there were extensive bodies of liquid water on the Earth between 3 and 4 x 10^9 yr ago. A possible solution to this puzzle is that the Earth's primitive atmosphere contained small quantities of NH3 and other reducing gases which significantly enhanced the global 'greenhouse' effect. Cosmochemical considerations point strongly to a higher abundance of reduced constituents in the primitive than in the contemporary terrestrial atmosphere; and reduced atmospheric components such as NH3 and CH4 are required to understand the accumulation of prebiological organic compounds necessary for the origin of life between 3 and 4 x 10^9 yr ago. Similar arguments may apply to Mars. (author)

  2. Resolving structural uncertainty in natural resources management using POMDP approaches

    Science.gov (United States)

    Williams, B.K.

    2011-01-01

    In recent years there has been a growing focus on the uncertainties of natural resources management, and the importance of accounting for uncertainty in assessing management effectiveness. This paper focuses on uncertainty in resource management in terms of discrete-state Markov decision processes (MDP) under structural uncertainty and partial observability. It describes the treatment of structural uncertainty with approaches developed for partially observable resource systems. In particular, I show how value iteration for partially observable MDPs (POMDP) can be extended to structurally uncertain MDPs. A key difference between these process classes is that structurally uncertain MDPs require the tracking of system state as well as a probability structure for the structural uncertainty, whereas POMDPs require only a probability structure for the observation uncertainty. The added complexity of the optimization problem under structural uncertainty is compensated by reduced dimensionality in the search for optimal strategy. A solution algorithm for structurally uncertain processes is outlined for a simple example in conservation biology. By building on the conceptual framework developed for POMDPs, natural resource analysts and decision makers who confront structural uncertainties in natural resources can take advantage of the rapid growth in POMDP methods and approaches, and thereby produce better conservation strategies over a larger class of resource problems. © 2011.
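
    The extra bookkeeping that structural uncertainty demands, maintaining a belief over candidate model structures alongside the system state, is just a Bayes update on observed transitions. The two "population" models and their transition probabilities below are hypothetical:

```python
# Two hypothetical structural models of a managed population, differing in the
# probability that the resource stays "healthy" after a fixed harvest action.
p_stay = {"model_A": 0.8, "model_B": 0.5}
belief = {"model_A": 0.5, "model_B": 0.5}   # structural uncertainty as a belief

def update(belief, stayed_healthy):
    """Bayes update of the model belief after one observed state transition."""
    post = {m: b * (p_stay[m] if stayed_healthy else 1.0 - p_stay[m])
            for m, b in belief.items()}
    z = sum(post.values())
    return {m: v / z for m, v in post.items()}

# mostly-healthy observations shift the belief toward model A
for stayed in [True, True, True, False, True]:
    belief = update(belief, stayed)
```

    A POMDP-style solver then optimizes over this belief state rather than over a single assumed model, which is the extension the paper develops.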

  3. Uncertainty Evaluation with Multi-Dimensional Model of LBLOCA in OPR1000 Plant

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jieun; Oh, Deog Yeon; Seul, Kwang-Won; Lee, Jin Ho [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    KINS has used KINS-REM (KINS-Realistic Evaluation Methodology), which was developed for best-estimate (BE) calculation and uncertainty quantification in regulatory audits. This methodology has been improved continuously through numerous studies, for example of uncertainty parameters and uncertainty ranges. In this study, to evaluate the applicability of the improved KINS-REM to the OPR1000 plant, an uncertainty evaluation with a multi-dimensional model, confirming multi-dimensional phenomena, was conducted with the MARS-KS code. The reactor vessel was modeled using the MULTID component of MARS-KS, and a total of 29 uncertainty parameters were considered in 124 sampled calculations. Through the 124 calculations, run with the Mosaique program coupled to MARS-KS, the peak cladding temperature was calculated and the final PCT was determined by the 3rd-order Wilks' formula. The uncertainty parameters with the strongest influence were identified by Pearson coefficient analysis; they were mostly related to plant operation and fuel material properties. The results of the 124 calculations and the sensitivity analysis show that the improved KINS-REM can reasonably be applied to uncertainty evaluation with multi-dimensional model calculations of OPR1000 plants.
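
    The 124 sampled calculations correspond to the third-order one-sided Wilks' formula at the 95%/95% level. The sample-size computation can be reproduced directly: find the smallest N for which the order-th largest of N random outputs bounds the 95th percentile with 95% confidence.

```python
from math import comb

def wilks_n(beta=0.95, gamma=0.95, order=3):
    """Smallest N such that the order-th largest of N random samples bounds
    the beta quantile of the output with confidence gamma (one-sided)."""
    n = order
    while True:
        # confidence = P(at least `order` of n samples exceed the beta
        # quantile), where that count is Binomial(n, 1 - beta)
        conf = 1.0 - sum(comb(n, k) * (1 - beta) ** k * beta ** (n - k)
                         for k in range(order))
        if conf >= gamma:
            return n
        n += 1

n_runs = wilks_n()   # third-order 95%/95%
```

    `wilks_n()` returns 124, matching the number of MARS-KS runs in the abstract; the familiar first- and second-order 95%/95% values are 59 and 93.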

  4. Detailed modeling of the statistical uncertainty of Thomson scattering measurements

    International Nuclear Information System (INIS)

    Morton, L A; Parke, E; Hartog, D J Den

    2013-01-01

    The uncertainty of electron density and temperature fluctuation measurements is determined by statistical uncertainty introduced by multiple noise sources. In order to quantify these uncertainties precisely, a simple but comprehensive model was made of the noise sources in the MST Thomson scattering system and of the resulting variance in the integrated scattered signals. The model agrees well with experimental and simulated results. The signal uncertainties are then used by our existing Bayesian analysis routine to find the most likely electron temperature and density, with confidence intervals. In the model, photonic noise from scattered light and plasma background light is multiplied by the noise enhancement factor (F) of the avalanche photodiode (APD). Electronic noise from the amplifier and digitizer is added. The amplifier response function shapes the signal and induces correlation in the noise. The data analysis routine fits a characteristic pulse to the digitized signals from the amplifier, giving the integrated scattered signals. A finite digitization rate loses information and can cause numerical integration error. We find a formula for the variance of the scattered signals in terms of the background and pulse amplitudes, and three calibration constants. The constants are measured easily under operating conditions, resulting in accurate estimation of the scattered signals' uncertainty. We measure F ≈ 3 for our APDs, in agreement with other measurements for similar APDs. This value is wavelength-independent, simplifying analysis. The correlated noise we observe is reproduced well using a Gaussian response function. Numerical integration error can be made negligible by using an interpolated characteristic pulse, allowing digitization rates as low as the detector bandwidth. The effect of background noise is also determined
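
    The variance model described, photon shot noise on scattered and background light enhanced by the APD excess-noise factor F ≈ 3, plus additive electronic noise, can be sketched numerically. The gain, photon counts and electronic-noise figures below are illustrative assumptions, not the MST calibration constants:

```python
# Illustrative noise budget for one APD channel (arbitrary units; gain, photon
# counts and electronic noise are assumed, F ~ 3 as measured for these APDs).
F, gain, sigma_elec = 3.0, 50.0, 20.0

def signal_variance(n_scat, n_bg):
    """Variance of the integrated pulse: photon shot noise on scattered plus
    background light, enhanced by the APD excess-noise factor F, plus
    additive electronic (amplifier + digitizer) noise."""
    return F * gain ** 2 * (n_scat + n_bg) + sigma_elec ** 2

n_scat, n_bg = 400.0, 100.0
sd = signal_variance(n_scat, n_bg) ** 0.5
snr = gain * n_scat / sd          # signal-to-noise of the scattered signal
```

    A per-pulse variance of this form, evaluated from the measured background and pulse amplitudes, is what feeds the downstream Bayesian fit for temperature and density with honest confidence intervals.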

  5. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  6. Degradation and performance evaluation of PV module in desert climate conditions with estimate uncertainty in measuring

    Directory of Open Access Journals (Sweden)

    Fezzani Amor

    2017-01-01

Full Text Available The performance of a photovoltaic (PV) module is affected by outdoor conditions. Outdoor testing consists of installing a module and collecting electrical performance data and climatic data over a certain period of time; it can also include the study of long-term performance under real working conditions. Tests were carried out at URAER, located in the desert region of Ghardaïa (Algeria), which is characterized by high irradiation and temperature levels. The degradation of a PV module with temperature and time of exposure to sunlight contributes significantly to the final output of the module, as the output decreases each year. This paper presents a comparative study of different methods for evaluating the degradation of PV modules after long-term exposure of more than 12 years in a desert region, together with an estimate of the measurement uncertainties. The evaluation uses three methods: visual inspection, data given by the Solmetric PVA-600 Analyzer translated to Standard Test Conditions (STC), and the translation equations of IEC 60891. The degradation rates were then calculated for all methods, and the rates given by the Solmetric PVA-600 analyzer were compared with those obtained from a simulation model and from the two IEC 60891 translation procedures (procedures 1 and 2). A detailed uncertainty study was carried out in order to improve the procedure and the measurement instrumentation.
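IEC 60891 procedure 1 translates a measured I-V point to target conditions using the module's temperature coefficients and series resistance. A minimal sketch of that translation, with placeholder coefficient values (real values must come from the module datasheet or from measurement):

```python
def iec60891_proc1(V1, I1, Isc, G1, T1, G2=1000.0, T2=25.0,
                   alpha=0.002, beta=-0.08, Rs=0.5, kappa=0.0):
    """Translate one measured I-V point (V1, I1) from conditions
    (G1 W/m^2, T1 degC) to target conditions (STC by default).

    alpha, beta: current and voltage temperature coefficients (A/K, V/K);
    Rs: series resistance (ohm); kappa: curve correction factor.
    All coefficient values here are placeholders, not measured data.
    """
    I2 = I1 + Isc * (G2 / G1 - 1.0) + alpha * (T2 - T1)
    V2 = V1 - Rs * (I2 - I1) - kappa * I2 * (T2 - T1) + beta * (T2 - T1)
    return V2, I2
```

Comparing curves translated this way from different years gives the degradation rate directly, since all points refer to the same reference conditions.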

  7. Effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model output

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    This study analyses the effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model's discharge estimates. Prediction uncertainty bounds are derived using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation (at a single station within the catchment) and a precipitation factor FPi. Thus, these factors provide a simplified representation of the spatial variation of precipitation, specifically the shape of the functional relationship between precipitation and height. In the absence of information about appropriate values of the precipitation factors FPi, these are estimated through standard calibration procedures. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. Monte Carlo samples of the model output are obtained by randomly varying the model parameters within their feasible ranges. In the first experiment, the precipitation factors FPi are considered unknown and thus included in the sampling process. The total number of unknown parameters in this case is 16. In the second experiment, precipitation factors FPi are estimated a priori, by means of a long term water balance between observed discharge at the catchment outlet, evapotranspiration estimates and observed precipitation. In this case, the number of unknown parameters reduces to 11. The feasible ranges assigned to the precipitation factors in the first experiment are slightly wider than the range of fixed precipitation factors used in the second experiment. The mean squared error of the Box-Cox transformed discharge during the calibration period is used for the evaluation of the
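The GLUE sampling loop described above can be sketched with a toy stand-in for the watershed model; the likelihood measure, the behavioural threshold, and the parameter ranges are illustrative choices, not the study's actual setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def nse(obs, sim):
    """Nash-Sutcliffe efficiency (used here as the GLUE likelihood measure)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def toy_model(p, x):
    """Stand-in for the monthly watershed model: two free parameters."""
    return p[0] * x + p[1]

# Synthetic "observations" from known parameters plus noise.
x = np.linspace(0.0, 10.0, 50)
obs = toy_model([2.0, 1.0], x) + rng.normal(0.0, 0.5, x.size)

# Monte Carlo sampling of parameters within their feasible ranges.
samples = rng.uniform([0.0, -5.0], [5.0, 5.0], size=(5000, 2))
scores = np.array([nse(obs, toy_model(p, x)) for p in samples])

# Behavioural sets exceed the likelihood threshold; their spread
# defines the prediction uncertainty bounds.
behavioural = samples[scores > 0.7]
lo, hi = np.percentile([toy_model(p, x) for p in behavioural], [5, 95], axis=0)
```

Fixing some parameters a priori, as done with the precipitation factors in the second experiment, simply shrinks the sampled parameter space before this loop runs.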

  8. When will we be committed to crossing 1.5 and 2 °C temperature thresholds?

    Science.gov (United States)

    Armour, K.; Proistosescu, C.; Roe, G.; Huybers, P. J.

    2017-12-01

    The zero-emissions climate commitment is a key metric for science and policy. It is the future warming we face given only to-date emissions, independent of future human influence on climate. Following a cessation of emissions, future global temperature change depends on (i) the atmospheric lifetimes of aerosols and greenhouse gases (GHGs), and (ii) the physical climate response to radiative forcing (Armour and Roe 2011). The cooling effect of aerosols diminishes within weeks; GHG concentrations get drawn down on timescales ranging from months to millennia; and ocean heat uptake diminishes as climate equilibrates with the residual CO2 forcing. Whether global temperature increases, stays stable, or declines following emission cessation depends on these competing factors. There is substantial uncertainty in the zero-emissions commitment due to a combination of (i) correlated uncertainties in aerosol radiative forcing and climate sensitivity, (ii) uncertainty in the atmospheric lifetime of CO2, and (iii) uncertainty in how climate sensitivity will evolve in the future. Here we quantify climate commitment in a Bayesian framework of an idealized model constrained by observations of global warming and energy imbalance, combined with estimates of global radiative forcing. At present, our committed warming is 1.2°C (median), with a 25% chance that it already exceeds 1.5°C and a 5% chance that it exceeds 2°C; the range comes primarily from uncertainty in the degree to which aerosols currently mask GHG forcing. We further quantify how climate commitment, and its uncertainty, changes with emissions scenario and over time. Under high emissions (RCP8.5), we will reach a >50% risk of a 2°C zero-emission climate commitment by the year 2035, about two decades before that temperature would be reached if emissions continued unabated. Committed warming is substantially reduced for lower-emissions scenarios, depending on the mix of aerosol and GHG mitigation. For the next few
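The core of the argument, committed equilibrium warming set by the residual GHG forcing divided by the climate feedback parameter once aerosol masking vanishes, can be illustrated with a toy Monte Carlo. All distributions below are made-up placeholders, not the paper's observationally constrained posteriors, and the correlation between aerosol forcing and climate sensitivity is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Illustrative distributions (placeholders, not the paper's posteriors):
F_ghg = rng.normal(3.0, 0.2, n)            # W/m^2, GHG forcing to date
lam = rng.lognormal(np.log(1.2), 0.25, n)  # W/m^2/K, climate feedback

# After emissions cease, aerosol masking vanishes within weeks, so the
# committed equilibrium warming is set by the GHG forcing alone:
T_committed = F_ghg / lam                  # K

p_exceed_1p5 = np.mean(T_committed > 1.5)  # chance commitment exceeds 1.5 K
```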

  9. Uncertainty and stress: Why it causes diseases and how it is mastered by the brain.

    Science.gov (United States)

    Peters, Achim; McEwen, Bruce S; Friston, Karl

    2017-09-01

The term 'stress' - coined in 1936 - has many definitions, but until now has lacked a theoretical foundation. Here we present an information-theoretic approach - based on the 'free energy principle' - defining the essence of stress; namely, uncertainty. We address three questions: What is uncertainty? What does it do to us? What are our resources to master it? Mathematically speaking, uncertainty is entropy or 'expected surprise'. The 'free energy principle' rests upon the fact that self-organizing biological agents resist a tendency to disorder and must therefore minimize the entropy of their sensory states. Applied to our everyday life, this means that we feel uncertain when we anticipate that outcomes will turn out to be something other than expected - and that we are unable to avoid surprise. As all cognitive systems strive to reduce their uncertainty about future outcomes, they face a critical constraint: Reducing uncertainty requires cerebral energy. The characteristic of the vertebrate brain to prioritize its own high energy is captured by the notion of the 'selfish brain'. Accordingly, in times of uncertainty, the selfish brain demands extra energy from the body. If, despite all this, the brain cannot reduce uncertainty, a persistent cerebral energy crisis may develop, burdening the individual by 'allostatic load' that contributes to systemic and brain malfunction (impaired memory, atherogenesis, diabetes and subsequent cardio- and cerebrovascular events). Based on the basic tenet that stress originates from uncertainty, we discuss the strategies our brain uses to avoid surprise and thereby resolve uncertainty. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
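The identification of uncertainty with entropy, or 'expected surprise', is concrete: a fully predictable outcome carries zero entropy, while an even split between outcomes maximizes it. A minimal illustration:

```python
import math

def entropy(p):
    """Shannon entropy H = E[-log p]: the 'expected surprise' of a
    probability distribution p."""
    return -sum(q * math.log(q) for q in p if q > 0.0)

h_certain = entropy([1.0, 0.0])  # fully predictable outcome
h_even = entropy([0.5, 0.5])     # maximal surprise for two outcomes
```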

  10. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. This deficiency is often due to a lack of information regarding the problem or the models considered, which could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology – incorporating aspects of Dempster–Shafer Theory and Bayesian model averaging – to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two degree of freedom non-linear spring mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems

  11. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Science.gov (United States)

Phosphorus (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  12. Assessing measurement uncertainty in meteorology in urban environments

    International Nuclear Information System (INIS)

    Curci, S; Lavecchia, C; Frustaci, G; Pilati, S; Paganelli, C; Paolini, R

    2017-01-01

Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer. (paper)

  13. Assessing measurement uncertainty in meteorology in urban environments

    Science.gov (United States)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.

  14. Implications of nuclear data uncertainties to reactor design

    International Nuclear Information System (INIS)

    Greebler, P.; Hutchins, B.A.; Cowan, C.L.

    1970-01-01

Uncertainties in nuclear data require significant allowances to be made in the design and the operating conditions of reactor cores and of shielded-reactor-plant and fuel-processing systems. These allowances result in direct cost increases due to overdesign of components and equipment and reduced core and fuel operating performance. Compromising the allowances for data uncertainties has indirect cost implications due to increased risks of failure to meet plant and fuel performance objectives, with warranties involved in some cases, and to satisfy licensed safety requirements. Fast breeders are the power reactors most sensitive to uncertainties in nuclear data over the neutron energy range of interest for fission reactors, and this paper focuses on the implications of the data uncertainties for the design and operation of fast breeder reactors and fuel-processing systems. The current status of uncertainty in predicted physics parameters due to data uncertainties is reviewed and compared with the situation in 1966 and that projected for within the next two years due to anticipated data improvements. Implications of the uncertainties in the predicted physics parameters for design and operation are discussed for both a near-term prototype or demonstration breeder plant (∼300 MW(e)) and a longer-term large (∼1000 MW(e)) plant. Significant improvements in the nuclear data have been made during the past three years, the most important of these for fast power reactors being the 239Pu alpha below 15 keV. The most important remaining specific data uncertainties are illustrated by their individual contributions to the computational uncertainty of selected physics parameters, and recommended priorities and accuracy requirements for improved data are presented

  15. Tailored complex degree of mutual coherence for plane-of-interest interferometry with reduced measurement uncertainty

    Science.gov (United States)

    Fütterer, G.

    2017-10-01

A problem of interferometers is the elimination of parasitic reflections. Parasitic reflections, and modulated intensity signals that are not directly related to the reference surface (REF) or the surface under test (SUT), can increase the measurement uncertainty significantly. In some situations standard methods might be used to eliminate reflections from the back side of the optical element under test. For instance, matching the test object to an absorber, taking the complex refractive index into account, can cancel out back reflections completely, but this causes additional setup time and chemical contamination. In other situations an angular offset might be combined with an aperture stop; this reduces spatial resolution and does not work if the disturbing wave field propagates in the same direction as the wave field propagating from the SUT. A stack of surfaces remains a problem, however. An increased spectral bandwidth might be used in order to separate the plane of interest from other planes. Depending on the interferometer used, this might require an optical path difference of zero, or it might cause a reduction of the visibility V. An embodiment of a modified interferometer will be discussed.

  16. Influence of DAD-TA temperature-reducing additive on physical and mechanical properties of bitumen and compaction of asphalt concrete.

    Science.gov (United States)

    Yadykina, V. V.; Akimov, A. E.; Trautvain, A. I.; Kholopov, V. S.

    2018-03-01

The paper is devoted to the use of the DAD-TA temperature-reducing additive for the preparation and placement of asphalt concrete mixes at reduced temperatures. It also shows the positive influence of the modified bitumen on the efficiency of organo-mineral composite compaction at reduced temperatures. Physical and mechanical properties of asphalt concrete made with bitumen modified by the DAD-TA additive, including indicators characterizing road-surfacing life, are presented. Arguments for using this material from the point of view of its production technology and environmental impact are given.

  17. Reducing greenhouses and the temperature history of earth and Mars

    Science.gov (United States)

    Sagan, C.

    1977-01-01

It has been suggested that NH3 and other reducing gases were present in the earth's primitive atmosphere, enhancing the global greenhouse effect; data obtained through isotopic archeothermometry support this hypothesis. Computations have been applied to the evolution of surface temperatures on Mars, considering various bolometric albedos and compositions. The results are of interest for the study of Martian sinuous channels, which may have been created by aqueous fluvial erosion, and imply that clement conditions may have occurred on Mars in the past and may occur again in the future.

  18. Analysis on Calibration and Uncertainty for TD-LTE Radio Test System

    Directory of Open Access Journals (Sweden)

    Zhang Weipeng

    2014-06-01

Full Text Available The TD-LTE base station radio test system measures radio signals with a required accuracy, so the transmission path between the base station and the measurement instruments needs to be calibrated before testing. For the Transmitter OFF Power measurement within the OFF period, the modulated signal generator and spectrum analyzer inside the test system are used for the calibration, to obtain accurate transmission parameters of the paths and to reduce test cost without additional instruments. The paper describes the uncertainty of the test system, analyzes the uncertainty contribution of interface mismatch, and calculates the uncertainty for the Transmitter OFF Power measurement; the resulting uncertainty is 1.193 dB, within the requirement of the 3GPP specification.

  19. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty and which factors have been considered in the vendor's assignment of uncertainty are critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described along with uncertainty calculations.
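For a certified concentration computed from purity, mass, and solution amount, the uncorrelated relative uncertainties combine in quadrature (the standard GUM root-sum-of-squares rule for multiplicative models). A sketch with made-up numbers:

```python
import math

def combined_rel_uncertainty(rel_components):
    """Root-sum-of-squares combination of relative standard
    uncertainties (GUM rule for uncorrelated multiplicative inputs)."""
    return math.sqrt(sum(r * r for r in rel_components))

# Made-up example: purity 0.995 +/- 0.003, mass 10.00 +/- 0.02 mg,
# solution mass 10.000 +/- 0.005 g.
u_rel = combined_rel_uncertainty([0.003 / 0.995, 0.02 / 10.0, 0.005 / 10.0])
U_rel = 2.0 * u_rel  # expanded uncertainty, coverage factor k = 2
```

Whether a vendor's certificate reports u_rel, U_rel, or only some of the components is exactly the kind of difference the abstract warns about.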

  20. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous and different interpretations are used in the literature. Recently a renewed interest has appeared to reinterpret and reformulate the precise meaning of Heisenberg's principle and to find adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
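For reference, the mathematical "uncertainty theorem" usually contrasted with Heisenberg's verbal principle is the Robertson relation, whose position-momentum special case is:

```latex
\sigma_A\,\sigma_B \;\ge\; \tfrac{1}{2}\,\bigl|\langle[\hat{A},\hat{B}]\rangle\bigr|,
\qquad
\sigma_x\,\sigma_p \;\ge\; \frac{\hbar}{2}.
```

The non-equivalence discussed in the abstract arises because the standard deviations σ here need not capture the intuitive "disturbance" meaning of the verbal principle.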

  1. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for deciding whether a measurement result is fit for purpose. It also helps in deciding whether a specification limit is exceeded or not. Estimation of measurement uncertainty is often not trivial. Several strategies have been developed for this purpose and are described briefly in this chapter. In addition, the different possibilities for taking the uncertainty into account in compliance assessment are explained.

  2. Estimation of Uncertainty in Aerosol Concentration Measured by Aerosol Sampling System

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Chan; Song, Yong Jae; Jung, Woo Young; Lee, Hyun Chul; Kim, Gyu Tae; Lee, Doo Yong [FNC Technology Co., Yongin (Korea, Republic of)

    2016-10-15

FNC Technology Co., Ltd has developed test facilities for aerosol generation, mixing, sampling and measurement under high-pressure and high-temperature conditions. The aerosol generation system is connected to the aerosol mixing system, which injects a SiO₂/ethanol mixture. In the sampling system, a glass fiber membrane filter is used to measure the average mass concentration. Based on the experimental results obtained with a main carrier gas of a steam-air mixture, the uncertainty of the sampled aerosol concentration was estimated by applying the Gaussian error propagation law. The purpose of the tests is to develop a commercial test module for aerosol generation, mixing and sampling applicable to the environmental industry and to safety-related systems in nuclear power plants. For the uncertainty calculation, the sampled aerosol concentration is not measured directly but must be calculated from other quantities. Its uncertainty is a function of the flow rates of air and steam, the sampled mass, the sampling time, the condensed steam mass, and their absolute errors, which propagate through the combination of variables in the function. Using the operating parameters and their individual errors from the aerosol test cases performed at FNC, the uncertainty of the aerosol concentration evaluated by the Gaussian error propagation law is less than 1%. The results of the uncertainty estimation in the aerosol sampling system will be utilized as system performance data.
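If, for illustration, the sampled concentration is taken as C = m/(Q·t) (collected mass over total flow rate times sampling time, a simplification of the abstract's full variable list), Gaussian error propagation for uncorrelated inputs gives:

```python
import math

def rel_uncertainty_C(m, u_m, Q, u_Q, t, u_t):
    """Gaussian error propagation for C = m / (Q * t):
    (u_C / C)^2 = (u_m / m)^2 + (u_Q / Q)^2 + (u_t / t)^2."""
    return math.sqrt((u_m / m) ** 2 + (u_Q / Q) ** 2 + (u_t / t) ** 2)

# Made-up operating values: 2.5 mg collected, 10 L/min total flow,
# 60 min sampling time, with small absolute errors on each.
u_rel = rel_uncertainty_C(2.5, 0.01, 10.0, 0.05, 60.0, 0.1)
```

With these illustrative errors the combined relative uncertainty comes out below 1%, consistent with the magnitude quoted in the abstract.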

  3. Effect of Baseflow Separation on Uncertainty of Hydrological Modeling in the Xinanjiang Model

    Directory of Open Access Journals (Sweden)

    Kairong Lin

    2014-01-01

Full Text Available Based on the idea that feeding more available useful information into the evaluation yields less uncertainty, this study focuses on how much the uncertainty can be reduced by considering baseflow estimation information obtained from the smoothed minima method (SMM). The Xinanjiang model and the generalized likelihood uncertainty estimation (GLUE) method with the shuffled complex evolution Metropolis (SCEM-UA) sampling algorithm were used for hydrological modeling and uncertainty analysis, respectively. The Jiangkou basin, located in the upper Hanjiang River, was selected as the case study. It was found that the number and standard deviation of behavioral parameter sets both decreased as the threshold value for the baseflow efficiency index increased, and that high Nash-Sutcliffe efficiency coefficients corresponded well with high baseflow efficiency coefficients. The results also showed that, when a threshold for the baseflow efficiency index was taken into consideration, the uncertainty interval width decreased significantly while the containing ratio decreased only slightly, and the simulated runoff with the behavioral parameter sets fit the observed runoff better. This implies that using baseflow estimation information can reduce the uncertainty in hydrological modeling to some degree and yield more reasonable prediction bounds.
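A simplified sketch of the smoothed minima separation the study relies on; the 5-day block length and 0.9 turning-point factor follow the usual UKIH convention, but the implementation details below are assumptions, not the study's code:

```python
import numpy as np

def smoothed_minima_baseflow(q, block=5, k=0.9):
    """Simplified smoothed-minima (UKIH-style) baseflow separation.

    q: daily streamflow series.  Block minima are scanned for turning
    points (k times the central minimum not exceeding either neighbour),
    which are then linearly interpolated to form the baseflow line.
    """
    q = np.asarray(q, dtype=float)
    n_blocks = len(q) // block
    mins = np.array([q[i*block:(i+1)*block].min() for i in range(n_blocks)])
    idx = np.array([i*block + int(np.argmin(q[i*block:(i+1)*block]))
                    for i in range(n_blocks)])
    # Turning points: central minimum small relative to its neighbours;
    # the end blocks are kept so the interpolation spans the series.
    tp = [0] + [j for j in range(1, n_blocks - 1)
                if k * mins[j] <= mins[j-1] and k * mins[j] <= mins[j+1]] \
         + [n_blocks - 1]
    baseflow = np.interp(np.arange(len(q)), idx[tp], mins[tp])
    return np.minimum(baseflow, q)  # baseflow cannot exceed total flow
```

The resulting baseflow series is what supplies the extra evaluation information (the baseflow efficiency index) used to screen behavioural parameter sets.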

  4. Development of a reduced model of formation reactions in Zr-Al nanolaminates

    KAUST Repository

    Vohra, Manav

    2014-12-15

    A computational model of anaerobic reactions in metallic multilayered systems with an equimolar composition of zirconium and aluminum is developed. The reduced reaction formalism of M. Salloum and O. M. Knio, Combust. Flame 157(2): 288–295 (2010) is adopted. Attention is focused on quantifying intermixing rates based on experimental measurements of uniform ignition as well as measurements of self-propagating front velocities. Estimates of atomic diffusivity are first obtained based on a regression analysis. A more elaborate Bayesian inference formalism is then applied in order to assess the impact of uncertainties in the measurements, potential discrepancies between predictions and observations, as well as the sensitivity of predictions to inferred parameters. Intermixing rates are correlated in terms of a composite Arrhenius law, which exhibits a discontinuity around the Al melting temperature. Analysis of the predictions indicates that Arrhenius parameters inferred for the low-temperature branch lie within a tight range, whereas the parameters of the high-temperature branch are characterized by higher uncertainty. The latter is affected by scatter in the experimental measurements, and the limited range of bilayers where observations are available. For both branches, the predictions exhibit higher sensitivity to the activation energy than the pre-exponent, whose posteriors are highly correlated.
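The composite Arrhenius law with a branch switch at the Al melting temperature can be sketched as follows; the pre-exponents and activation energies are illustrative placeholders, not the inferred posterior values:

```python
import math

T_MELT_AL = 933.47  # K, melting point of aluminium

def diffusivity(T, A_lo=1e-6, Ea_lo=120e3, A_hi=1e-4, Ea_hi=150e3):
    """Composite Arrhenius law for the intermixing diffusivity, with a
    branch switch at the Al melting temperature.  The pre-exponents
    (m^2/s) and activation energies (J/mol) are placeholders."""
    R = 8.314  # J/(mol K)
    if T < T_MELT_AL:
        return A_lo * math.exp(-Ea_lo / (R * T))
    return A_hi * math.exp(-Ea_hi / (R * T))
```

The discontinuity at T_MELT_AL reproduces the jump in intermixing rate once the aluminium layers melt.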

  5. Development of a reduced model of formation reactions in Zr-Al nanolaminates

    KAUST Repository

    Vohra, Manav; Winokur, Justin; Overdeep, Kyle R.; Marcello, Paul; Weihs, Timothy P.; Knio, Omar

    2014-01-01

    A computational model of anaerobic reactions in metallic multilayered systems with an equimolar composition of zirconium and aluminum is developed. The reduced reaction formalism of M. Salloum and O. M. Knio, Combust. Flame 157(2): 288–295 (2010) is adopted. Attention is focused on quantifying intermixing rates based on experimental measurements of uniform ignition as well as measurements of self-propagating front velocities. Estimates of atomic diffusivity are first obtained based on a regression analysis. A more elaborate Bayesian inference formalism is then applied in order to assess the impact of uncertainties in the measurements, potential discrepancies between predictions and observations, as well as the sensitivity of predictions to inferred parameters. Intermixing rates are correlated in terms of a composite Arrhenius law, which exhibits a discontinuity around the Al melting temperature. Analysis of the predictions indicates that Arrhenius parameters inferred for the low-temperature branch lie within a tight range, whereas the parameters of the high-temperature branch are characterized by higher uncertainty. The latter is affected by scatter in the experimental measurements, and the limited range of bilayers where observations are available. For both branches, the predictions exhibit higher sensitivity to the activation energy than the pre-exponent, whose posteriors are highly correlated.

  6. Uncertainty analysis of time-dependent nonlinear systems: theory and application to transient thermal hydraulics

    International Nuclear Information System (INIS)

    Barhen, J.; Bjerke, M.A.; Cacuci, D.G.; Mullins, C.B.; Wagschal, G.G.

    1982-01-01

An advanced methodology for performing systematic uncertainty analysis of time-dependent nonlinear systems is presented. This methodology includes a capability for reducing uncertainties in system parameters and responses by using Bayesian inference techniques to consistently combine prior knowledge with additional experimental information. The determination of best estimates for the system parameters, for the responses, and for their respective covariances is treated as a time-dependent constrained minimization problem. Three alternative formalisms for solving this problem are developed. The two "off-line" formalisms, with and without "foresight" characteristics, require the generation of a complete sensitivity data base prior to performing the uncertainty analysis. The "online" formalism, in which uncertainty analysis is performed interactively with the system analysis code, is best suited for treatment of large-scale highly nonlinear time-dependent problems. This methodology is applied to the uncertainty analysis of a transient upflow of a high pressure water heat transfer experiment. For comparison, an uncertainty analysis using sensitivities computed by standard response surface techniques is also performed. The results of the analysis indicate the following. Major reduction of the discrepancies in the calculation/experiment ratios is achieved by using the new methodology. Incorporation of in-bundle measurements in the uncertainty analysis significantly reduces system uncertainties. Accuracy of sensitivities generated by response-surface techniques should be carefully assessed prior to using them as a basis for uncertainty analyses of transient reactor safety problems
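For a linearised response model with Gaussian uncertainties, the Bayesian combination of prior parameter knowledge with experimental information reduces to the standard generalised-least-squares adjustment update. A minimal sketch (not the paper's constrained-minimization formalism, which also handles time dependence):

```python
import numpy as np

def bayesian_update(x0, C0, y, Cy, H):
    """Combine a prior parameter estimate (mean x0, covariance C0) with
    measurements (y, covariance Cy) through a linearised sensitivity
    model y ~ H x.  Returns the best-estimate parameters and their
    reduced covariance."""
    S = H @ C0 @ H.T + Cy            # innovation covariance
    K = C0 @ H.T @ np.linalg.inv(S)  # gain
    x1 = x0 + K @ (y - H @ x0)       # updated best estimate
    C1 = C0 - K @ H @ C0             # updated (smaller) covariance
    return x1, C1
```

The shrinking of C1 relative to C0 is the mechanism by which incorporating in-bundle measurements reduces the system uncertainties.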

  7. Estimation of uncertainty in pKa values determined by potentiometric titration.

    Science.gov (United States)

    Koort, Eve; Herodes, Koit; Pihl, Viljar; Leito, Ivo

    2004-06-01

    A procedure is presented for estimating the uncertainty in the measurement of the pKa of a weak acid by potentiometric titration. The procedure is based on the ISO GUM. The core of the procedure is a mathematical model that involves 40 input parameters. A novel approach is used for taking into account the purity of the acid: the impurities are not treated merely as inert compounds; their possible acidic dissociation is also taken into account. Application to an example of practical pKa determination is presented. Altogether 67 different sources of uncertainty are identified and quantified within the example. The relative importance of the different uncertainty sources is discussed. The most important source of uncertainty (with the experimental set-up of the example) is the uncertainty of the pH measurement, followed by the accuracy of the burette and the uncertainty of weighing. The procedure gives the uncertainty separately for each point of the titration curve. The uncertainty depends on the amount of titrant added, being lowest in the central part of the titration curve. The possibilities of reducing the uncertainty and of interpreting the drift of the pKa values obtained from the same curve are discussed.
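A GUM-style uncertainty budget can also be propagated numerically, in the spirit of the GUM's Monte Carlo supplement. The sketch below uses only three hypothetical dominant inputs and invented sensitivity coefficients, standing in for the paper's 40-parameter model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Hypothetical dominant inputs (value, standard uncertainty); the paper's model
# has 40 inputs -- only the three the abstract names as dominant are sketched.
pH     = rng.normal(4.756, 0.020, N)   # pH reading at half-neutralisation
v_corr = rng.normal(0.0,   0.010, N)   # burette volume error (mL)
w_corr = rng.normal(0.0,   0.005, N)   # weighing error (g)

# Assumed sensitivity coefficients translating each input error into pKa units.
pKa = pH + 0.10 * v_corr + 0.05 * w_corr

pKa_mean = pKa.mean()          # measurement result
u_combined = pKa.std(ddof=1)   # combined standard uncertainty, Monte Carlo style
```

With these numbers the pH term dominates the combined uncertainty, matching the ranking the abstract reports.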

  8. Efficient climate policies under technology and climate uncertainty

    International Nuclear Information System (INIS)

    Held, Hermann; Kriegler, Elmar; Lessmann, Kai; Edenhofer, Ottmar

    2009-01-01

    This article explores efficient climate policies in terms of investment streams into fossil and renewable energy technologies. The investment decisions maximise social welfare while observing a probabilistic guardrail for global mean temperature rise under uncertain technology and climate parameters. Such a guardrail constitutes a chance constraint, and the resulting optimisation problem is an instance of chance constrained programming, not stochastic programming as often employed. Our analysis of a model of economic growth and endogenous technological change, MIND, suggests that stringent mitigation strategies cannot guarantee a very high probability of limiting warming to 2 °C since preindustrial times under current uncertainty about climate sensitivity and climate response time scale. Achieving the 2 °C temperature target with a probability P* of 75% requires drastic carbon dioxide emission cuts. This holds true even though we have assumed an aggressive mitigation policy on other greenhouse gases from, e.g., the agricultural sector. The emission cuts are deeper than estimated from a deterministic calculation with climate sensitivity fixed at the P* quantile of its marginal probability distribution (3.6 °C). We show that earlier and cumulatively larger investments into the renewable sector are triggered by including uncertainty in the technology and climate response time scale parameters. This comes at an additional GWP loss of 0.3%, resulting in a total loss of 0.8% GWP for observing the chance constraint. We obtained those results with a new numerical scheme to implement constrained welfare optimisation under uncertainty as a chance constrained programming problem in standard optimisation software such as GAMS. The scheme is able to incorporate multivariate non-factorial probability measures such as given by the joint distribution of climate sensitivity and response time. We demonstrate the scheme for the case of a four-dimensional parameter space capturing
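A chance constraint of the kind described, P(warming ≤ 2 °C) ≥ 75%, can be checked by brute-force sampling. The toy response function and climate-sensitivity prior below are invented for illustration; they are not the MIND model.

```python
import numpy as np

rng = np.random.default_rng(0)

def warming_2100(emission_cut, climate_sensitivity):
    """Toy warming response: deeper emission cuts lower end-of-century warming."""
    return climate_sensitivity * (1.2 - emission_cut)

def satisfies_chance_constraint(emission_cut, p_star=0.75, n=50_000):
    """Monte Carlo check of P(warming <= 2.0 K) >= p_star under uncertain sensitivity."""
    cs = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n)  # assumed prior, K
    return np.mean(warming_2100(emission_cut, cs) <= 2.0) >= p_star
```

A chance-constrained optimiser would then search for the cheapest emission cut for which this check holds, instead of optimising an expectation as in stochastic programming.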

  9. Sensitivity of modeled ozone concentrations to uncertainties in biogenic emissions

    International Nuclear Information System (INIS)

    Roselle, S.J.

    1992-06-01

    The study examines the sensitivity of regional ozone (O3) modeling to uncertainties in biogenic emissions estimates. The United States Environmental Protection Agency's (EPA) Regional Oxidant Model (ROM) was used to simulate the photochemistry of the northeastern United States for the period July 2-17, 1988. An operational model evaluation showed that ROM had a tendency to underpredict O3 when observed concentrations were above 70-80 ppb and to overpredict O3 when observed values were below this level. On average, the model underpredicted daily maximum O3 by 14 ppb. Spatial patterns of O3, however, were reproduced favorably by the model. Several simulations were performed to analyze the effects of uncertainties in biogenic emissions on predicted O3 and to study the effectiveness of two strategies of controlling anthropogenic emissions for reducing high O3 concentrations. Biogenic hydrocarbon emissions were adjusted by a factor of 3 to account for the existing range of uncertainty in these emissions. The impact of biogenic emission uncertainties on O3 predictions depended upon the availability of NOx. In some extremely NOx-limited areas, increasing the amount of biogenic emissions decreased O3 concentrations. Two control strategies were compared in the simulations: (1) reduced anthropogenic hydrocarbon emissions, and (2) reduced anthropogenic hydrocarbon and NOx emissions. The simulations showed that hydrocarbon emission controls were more beneficial to the New York City area, but that combined NOx and hydrocarbon controls were more beneficial to other areas of the Northeast. Hydrocarbon controls were more effective as biogenic hydrocarbon emissions were reduced, whereas combined NOx and hydrocarbon controls were more effective as biogenic hydrocarbon emissions were increased.

  10. Damage behavior in helium-irradiated reduced-activation martensitic steels at elevated temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Fengfeng [Key Laboratory of Artificial Micro- and Nano-Structures of Ministry of Education, Hubei Nuclear Solid Physics Key Laboratory and School of Physics and Technology, Wuhan University, Wuhan 430072 (China); Guo, Liping, E-mail: guolp@whu.edu.cn [Key Laboratory of Artificial Micro- and Nano-Structures of Ministry of Education, Hubei Nuclear Solid Physics Key Laboratory and School of Physics and Technology, Wuhan University, Wuhan 430072 (China); Chen, Jihong; Li, Tiecheng; Zheng, Zhongcheng [Key Laboratory of Artificial Micro- and Nano-Structures of Ministry of Education, Hubei Nuclear Solid Physics Key Laboratory and School of Physics and Technology, Wuhan University, Wuhan 430072 (China); Yao, Z. [Department of Mechanical and Materials Engineering, Queen’s University, Kingston K7L 3N6, ON (Canada); Suo, Jinping [State Key Laboratory of Mould Technology, Institute of Materials Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2014-12-15

    Dislocation loops induced by helium irradiation at elevated temperatures in reduced-activation martensitic steels were investigated using transmission electron microscopy. Steels were irradiated with 100 keV helium ions to 0.8 dpa between 300 K and 723 K. At irradiation temperatures T_irr ⩽ 573 K, small defects with both Burgers vectors b = 1/2〈1 1 1〉 and b = 〈1 0 0〉 were observed, while at T_irr ⩾ 623 K, the microstructure was dominated by large convoluted interstitial dislocation loops with b = 〈1 0 0〉. Only small cavities were found in the steels irradiated at 723 K.

  11. Country-Level Climate Uncertainty for Risk Assessments: Volume 1.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas for temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture, and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  12. Constraining Parameter Uncertainty in Simulations of Water and Heat Dynamics in Seasonally Frozen Soil Using Limited Observed Data

    Directory of Open Access Journals (Sweden)

    Mousong Wu

    2016-02-01

    Full Text Available Water and energy processes in frozen soils are important for better understanding hydrologic processes and water resources management in cold regions. To investigate the water and energy balance in seasonally frozen soils, CoupModel combined with the generalized likelihood uncertainty estimation (GLUE) method was used. Water and heat processes in frozen soil in northern China during the 2012/2013 winter were simulated. Ensemble simulations generated through Monte Carlo sampling were used for uncertainty analysis. Behavioral simulations were selected based on combinations of multiple model performance index criteria with respect to simulated soil water and temperature at four depths (5 cm, 15 cm, 25 cm, and 35 cm). Posterior distributions for parameters related to soil hydraulic properties, radiation processes, and heat transport indicated that uncertainties in both the input and the model structure could influence model performance in modeling water and heat processes in seasonally frozen soils. Seasonal courses in water and energy partitioning were obvious during the winter. Within the daily cycle, soil evaporation/condensation and energy distributions were well captured and identified as an important phenomenon in the dynamics of the energy balance system. The combination of CoupModel simulations with the uncertainty-based calibration method provides a way of understanding the seasonal courses of hydrology and energy processes in cold regions with limited data. Additional measurements may be used to further reduce the uncertainty of regulating factors during the different stages of freezing–thawing.
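The GLUE step described above (Monte Carlo sampling, a performance index, a behavioural threshold) can be sketched with synthetic data. The sinusoidal "observations", the single amplitude parameter, and the 0.8 threshold are illustrative assumptions, not the CoupModel setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency, a common GLUE performance index."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

# Synthetic "observed" soil temperature at one depth (illustrative stand-in).
t = np.linspace(0.0, 1.0, 50)
obs = 5.0 * np.sin(2 * np.pi * t)

# Monte Carlo ensemble: each run perturbs a single amplitude parameter.
amplitudes = rng.uniform(2.0, 8.0, size=500)
sims = amplitudes[:, None] * np.sin(2 * np.pi * t)[None, :]

# Behavioural runs exceed the efficiency threshold; their parameter values
# form the (empirical) posterior distribution used for uncertainty bounds.
nse = np.array([nash_sutcliffe(obs, s) for s in sims])
behavioral = amplitudes[nse > 0.8]
```

The spread of `behavioral` around the true amplitude plays the role of the posterior parameter distributions discussed in the abstract.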

  13. Study on Increasing High Temperature pH(t) to Reduce Iron Corrosion Products

    International Nuclear Information System (INIS)

    Shin, Dong Man; Hur, Nam Yong; Kim, Waang Bae

    2011-01-01

    The transportation and deposition of iron corrosion products are important elements that affect both steam generator (SG) integrity and the secondary system in pressurized water reactor (PWR) nuclear power plants. Most iron corrosion products are generated on carbon steel materials due to flow accelerated corrosion (FAC). Several parameters, such as water chemistry, temperature, hydrodynamics, and steel composition, affect FAC. Nuclear industry research has well established that the at-temperature pH of a deaerated water system has a first-order effect on the FAC rate of carbon steels. In order to reduce the transportation and deposition of iron corrosion products, increased-pH(t) tests were applied to the secondary systems of units A and B. Increasing pH(t) successfully reduced flow accelerated corrosion. The effect of increasing pH(t) in inhibiting FAC was identified through the experiments and the pH(t) evaluation in this paper.

  14. The Effect of Annealing Temperature on Nickel on Reduced Graphene Oxide Catalysts on Urea Electrooxidation

    International Nuclear Information System (INIS)

    Glass, Dean E.; Galvan, Vicente; Prakash, G.K. Surya

    2017-01-01

    Highlights: •Nickel was reduced on graphene oxide and annealed under argon from 300 to 700 °C. •Nickel was oxidized by the removal of oxygen groups on the graphene oxide. •Catalysts annealed at higher temperatures displayed decreased urea electrooxidation currents. •Micro direct urea/hydrogen peroxide fuel cells were employed for the first time. •Ni/rGO catalysts displayed enhanced fuel cell performance compared with bare nickel. -- Abstract: The annealing temperature effects on nickel on reduced graphene oxide (Ni/rGO) catalysts for urea electrooxidation were investigated. Nickel chloride was directly reduced in an aqueous solution of graphene oxide (GO), followed by annealing under argon at 300, 400, 500, 600, and 700 °C, respectively. X-ray diffraction (XRD) patterns revealed an increase in the crystallite size of the nickel nanoparticles, while the Raman spectra displayed an increase in the graphitic disorder of the reduced graphene oxide at higher annealing temperatures due to the removal of oxygen functional groups. The Ni/rGO catalysts annealed at higher temperatures displayed oxidized nickel surface characteristics in the Ni 2p X-ray photoelectron spectra (XPS) due to the oxidation of the nickel by the oxygen functional groups in the graphitic lattice. In the half-cell testing, both the onset potential and the currents of urea electrooxidation decreased as the annealing temperature was increased. In the micro direct urea/hydrogen peroxide fuel cell tests, the nickel catalyst annealed at 700 °C displayed a 31% decrease in peak power density, while the catalyst annealed at 300 °C displayed a 13% increase, compared with the unannealed Ni/rGO catalyst.

  15. Measuring centimeter-resolution air temperature profiles above land and water using fiber-optic Distributed Temperature Sensing

    Science.gov (United States)

    Sigmund, Armin; Pfister, Lena; Olesch, Johannes; Thomas, Christoph K.

    2016-04-01

    The precise determination of near-surface air temperature profiles is of special importance for the characterization of airflows (e.g. cold air) and the quantification of sensible heat fluxes according to the flux-gradient similarity approach. In contrast to conventional multi-sensor techniques, measuring temperature profiles using fiber-optic Distributed Temperature Sensing (DTS) provides thousands of measurements referenced to a single calibration standard at much reduced cost. The aim of this work was to enhance the vertical resolution of Raman-scatter DTS measurements up to the centimeter scale using a novel approach for atmospheric applications: the optical fiber was helically coiled around a meshed fabric. In addition to testing the new fiber geometry, we quantified the measurement uncertainty and demonstrated the benefits of the enhanced-resolution profiles. The fiber-optic cable was coiled around a hollow column consisting of white reinforcing fabric supported by plexiglass rings every meter. Data from two columns of this type were collected for 47 days to measure air temperature profiles over 3.0 m above a gently inclined meadow and over 5.1 m above and within a small lake, respectively. Both profiles had a vertical resolution of 1 cm in the lower section near the surface and 5 cm in the upper section, with an along-fiber instrument-specific averaging of 1.0 m and a temporal resolution of 30 s. Measurement uncertainties, especially from conduction between the reinforcing fabric and the fiber-optic cable, were estimated by modeling the fiber temperature via a detailed energy balance approach. Air temperature, wind velocity, and radiation components were needed as input data and measured separately. The temperature profiles revealed valuable details, especially in the lowest 1 m above the surface. This was best demonstrated for nighttime observations, when artefacts due to solar heating did not occur. For example, the dynamics of a cold-air layer were detected in a clear night.

  16. Low LET radiolysis escape yields for reducing radicals and H2 in pressurized high temperature water

    Science.gov (United States)

    Sterniczuk, Marcin; Yakabuskie, Pamela A.; Wren, J. Clara; Jacob, Jasmine A.; Bartels, David M.

    2016-04-01

    Low Linear Energy Transfer (LET) radiolysis escape yields (G values) are reported for the sum (G(H•) + G(e−aq)) and for G(H2) in subcritical water up to 350 °C. The scavenger system 1–10 mM acetate/0.001 M hydroxide/0.00048 M N2O was used with simultaneous mass spectrometric detection of the H2 and N2 products. Temperature-dependent measurements were carried out with 2.5 MeV electrons from a Van de Graaff accelerator, while room-temperature calibration measurements were done with a 60Co gamma source. The concentrations and dose range were carefully chosen so that the initial spur chemistry is not perturbed and the N2 product yield corresponds to those reducing radicals that escape recombination in pure water. In comparison with a recent review recommendation of Elliot and Bartels (AECL report 153-127160-450-001, 2009), the measured reducing-radical yield is seven percent smaller at room temperature but in fairly good agreement above 150 °C. The H2 escape yield is in good agreement throughout the temperature range with several previous studies that used much larger radical scavenging rates. Previous analysis of earlier high-temperature measurements of Gesc(•OH) is shown to be flawed, although the actual G values may be nearly correct. The methodology used in the present report greatly reduces the range of possible error and puts the high-temperature escape yields for low-LET radiation on a much firmer quantitative foundation than was previously available.

  17. Reducing uncertainty of Monte Carlo estimated fatigue damage in offshore wind turbines using FORM

    DEFF Research Database (Denmark)

    H. Horn, Jan-Tore; Jensen, Jørgen Juncher

    2016-01-01

    Uncertainties related to fatigue damage estimation of non-linear systems are highly dependent on the tail behaviour and extreme values of the stress range distribution. By using a combination of the First Order Reliability Method (FORM) and Monte Carlo simulations (MCS), the accuracy of the fatigue...
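The Monte Carlo half of the FORM/MCS combination can be sketched as a Miner-sum damage accumulation under an assumed S-N curve and Weibull stress-range distribution. All parameters here are hypothetical; the cubic weighting of stress range makes the sum tail-dominated, which is the sensitivity the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical S-N curve N(S) = K * S**(-M): damage per cycle is S**M / K.
M, K = 3.0, 1e12

def mc_damage(n_cycles=200_000):
    """Crude Monte Carlo Miner-sum fatigue damage for Weibull stress ranges (MPa)."""
    s = 40.0 * rng.weibull(1.5, n_cycles)   # assumed stress-range distribution
    return float(np.sum(s ** M / K))

damage = mc_damage()   # rare large-range cycles dominate this sum
```

FORM would replace the raw sampling of the distribution tail with a first-order reliability estimate, cutting the variance of exactly this tail contribution.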

  18. Determination of internal series resistance of PV devices: repeatability and uncertainty

    International Nuclear Information System (INIS)

    Trentadue, Germana; Pavanello, Diego; Salis, Elena; Field, Mike; Müllejans, Harald

    2016-01-01

    The calibration of photovoltaic devices requires the measurement of their current–voltage characteristics at standard test conditions (STC). As the latter can only be reached approximately, a curve translation is necessary, requiring among others the internal series resistance of the photovoltaic device as an input parameter. Therefore accurate and reliable determination of the series resistance is important in measurement and test laboratories. This work follows standard IEC 60891 ed. 2 (2009) for the determination of the internal series resistance and investigates repeatability and uncertainty of the result in three aspects for a number of typical photovoltaic technologies. Firstly the effect of varying device temperature on the determined series resistance is determined experimentally and compared to a theoretical derivation showing agreement. It is found that the series resistance can be determined with an uncertainty of better than 5% if the device temperature is stable within ±0.1 °C, whereas the temperature range of ±2 °C allowed by the standard leads to much larger variations. Secondly the repeatability of the series resistance determination with respect to noise in current–voltage measurement is examined, yielding typical values of ±5%. Thirdly the determination of the series resistance using three different experimental set-ups (solar simulators) shows agreement on the level of ±5% for crystalline Silicon photovoltaic devices and deviations up to 15% for thin-film devices. It is concluded that the internal series resistance of photovoltaic devices could be determined with an uncertainty of better than 10%. The influence of this uncertainty in series resistance on the electrical performance parameters of photovoltaic devices was estimated and showed a contribution of 0.05% for open-circuit voltage and 0.1% for maximum power. Furthermore it is concluded that the range of device temperatures allowed during determination of series

  19. Synchronous Sounds Enhance Visual Sensitivity without Reducing Target Uncertainty

    Directory of Open Access Journals (Sweden)

    Yi-Chuan Chen

    2011-10-01

    Full Text Available We examined the crossmodal effect of the presentation of a simultaneous sound on visual detection and discrimination sensitivity using the equivalent noise paradigm (Dosher & Lu, 1998). In each trial, a tilted Gabor patch was presented in either the first or second of two intervals consisting of dynamic 2D white noise with one of seven possible contrast levels. The results revealed that participants' visual detection and discrimination sensitivities were both enhanced by the presentation of a simultaneous sound, though only close to the noise level at which participants' target contrast thresholds started to increase with increasing noise contrast. A further analysis of the psychometric function at this noise level revealed that the increase in sensitivity could not be explained by a reduction in participants' uncertainty regarding the onset time of the visual target. We suggest that this crossmodal facilitatory effect may be accounted for by perceptual enhancement elicited by a simultaneously presented sound, and that the crossmodal facilitation was easier to observe when the visual system encountered a level of noise close to the level of internal noise embedded within the system.

  20. Accounting for Epistemic Uncertainty in Mission Supportability Assessment: A Necessary Step in Understanding Risk and Logistics Requirements

    Science.gov (United States)

    Owens, Andrew; De Weck, Olivier L.; Stromgren, Chel; Goodliff, Kandyce; Cirillo, William

    2017-01-01

    Future crewed missions to Mars present a maintenance logistics challenge that is unprecedented in human spaceflight. Mission endurance – defined as the time between resupply opportunities – will be significantly longer than previous missions, and therefore logistics planning horizons are longer and the impact of uncertainty is magnified. Maintenance logistics forecasting typically assumes that component failure rates are deterministically known and uses them to represent aleatory uncertainty, or uncertainty that is inherent to the process being examined. However, failure rates cannot be directly measured; rather, they are estimated based on similarity to other components or statistical analysis of observed failures. As a result, epistemic uncertainty – that is, uncertainty in knowledge of the process – exists in failure rate estimates that must be accounted for. Analyses that neglect epistemic uncertainty tend to significantly underestimate risk. Epistemic uncertainty can be reduced via operational experience; for example, the International Space Station (ISS) failure rate estimates are refined using a Bayesian update process. However, design changes may re-introduce epistemic uncertainty. Thus, there is a tradeoff between changing a design to reduce failure rates and operating a fixed design to reduce uncertainty. This paper examines the impact of epistemic uncertainty on maintenance logistics requirements for future Mars missions, using data from the ISS Environmental Control and Life Support System (ECLS) as a baseline for a case study. Sensitivity analyses are performed to investigate the impact of variations in failure rate estimates and epistemic uncertainty on spares mass. The results of these analyses and their implications for future system design and mission planning are discussed.
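The Bayesian refinement of failure-rate estimates mentioned above can be sketched with a conjugate Gamma-Poisson update; the prior and observation numbers below are hypothetical, not ISS data.

```python
# Gamma-Poisson conjugate update of a failure-rate estimate: a Gamma(alpha, beta)
# prior on the rate (per hour) combined with a Poisson count of observed failures.
def update_failure_rate(alpha, beta, failures, exposure_hours):
    """Return the posterior Gamma parameters after observing `failures` in `exposure_hours`."""
    return alpha + failures, beta + exposure_hours

# Diffuse prior encoding large epistemic uncertainty (mean rate 1e-4 per hour).
a0, b0 = 0.5, 5_000.0
a1, b1 = update_failure_rate(a0, b0, failures=2, exposure_hours=30_000.0)

post_mean = a1 / b1          # refined point estimate of the failure rate
rel_sd = 1.0 / a1 ** 0.5     # relative std. dev. of a Gamma: shrinks with evidence
```

The shrinking relative standard deviation is the mechanism by which operational experience reduces epistemic uncertainty; a design change would effectively reset `a0`, `b0` toward the diffuse prior.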

  1. Socioeconomic Implications of Achieving 2.0 °C and 1.5 °C Climate Targets under Scientific Uncertainties

    Science.gov (United States)

    Su, X.; Takahashi, K.; Fujimori, S.; Hasegawa, T.; Tanaka, K.; Shiogama, H.; Emori, S.; LIU, J.; Hanasaki, N.; Hijioka, Y.; Masui, T.

    2017-12-01

    Large uncertainty exists in temperature projections, including contributions from the carbon cycle, the climate system, and aerosols. For integrated assessment models (IAMs) such as DICE, FUND, and PAGE, however, the scientific uncertainties mainly rest on the distribution of (equilibrium) climate sensitivity. This study aims at evaluating emission pathways that limit temperature increase below 2.0 °C or 1.5 °C after 2100 considering scientific uncertainties, and at exploring how socioeconomic indicators are affected by such scientific uncertainties. We use a stochastic version of SCM4OPT, with an uncertainty measurement obtained by considering alternative ranges of key parameters. Three climate cases, namely, i) the base case of SSP2, ii) limiting temperature increase below 2.0 °C after 2100, and iii) limiting temperature increase below 1.5 °C after 2100, and three types of probabilities, i) >66% probability or likely, ii) >50% probability or more likely than not, and iii) the mean of the probability distribution, are considered in the study. The results show that: i) for the 2.0 °C case, the likely CO2 reduction rate in 2100 ranges from 75.5% to 102.4%, with a mean value of 88.1%, and from 93.0% to 113.1% (mean 102.5%) for the 1.5 °C case; ii) a likely range of forcing effect is found for the 2.0 °C case (2.7-3.9 W m-2) due to scientific uncertainty, and 1.9-3.1 W m-2 for the 1.5 °C case; iii) the carbon prices within the 50% confidence interval may differ by a factor of 3 for both the 2.0 °C case and the 1.5 °C case; iv) the abatement costs within the 50% confidence interval may differ by a factor of 4 for both the 2.0 °C case and the 1.5 °C case. Nine C4MIP carbon cycle models and nineteen CMIP3 AOGCMs are used to account for the scientific uncertainties, following MAGICC 6.0. These uncertainties result in a likely radiative forcing range of 6.1-7.5 W m-2 and a likely temperature increase of 3.1-4.5 °C in 2100 for the base case of SSP2. If we evaluate the 2 °C target by limiting the

  2. Regional scaling of annual mean precipitation and water availability with global temperature change

    Science.gov (United States)

    Greve, Peter; Gudmundsson, Lukas; Seneviratne, Sonia I.

    2018-03-01

    Changes in regional water availability belong to the most crucial potential impacts of anthropogenic climate change, but are highly uncertain. It is thus of key importance for stakeholders to assess the possible implications of different global temperature thresholds on these quantities. Using a subset of climate model simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5), we derive here the sensitivity of regional changes in precipitation and in precipitation minus evapotranspiration to global temperature changes. The simulations span the full range of available emission scenarios, and the sensitivities are derived using a modified pattern scaling approach. The applied approach assumes linear relationships with global temperature change while thoroughly addressing associated uncertainties via resampling methods. This allows us to assess the full distribution of the simulations in a probabilistic sense. Northern high-latitude regions display robust responses towards wetting, while subtropical regions display a tendency towards drying but with a large range of responses. Even though both internal variability and the scenario choice play an important role in the overall spread of the simulations, the uncertainty stemming from the climate model choice usually accounts for about half of the total uncertainty in most regions. We additionally assess the implications of limiting global mean temperature warming to values below (i) 2 K or (ii) 1.5 K (as stated within the 2015 Paris Agreement). We show that opting for the 1.5 K target might just slightly influence the mean response, but could substantially reduce the risk of experiencing extreme changes in regional water availability.
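A minimal version of the scaling-plus-resampling idea, with a synthetic ensemble standing in for CMIP5 output; the slope, noise level, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in ensemble: regional precipitation change (%) assumed to
# scale linearly with global mean temperature change (K), plus internal noise.
n = 120
global_dT = rng.uniform(0.5, 4.5, size=n)
regional_dP = 3.5 * global_dT + rng.normal(0.0, 1.0, size=n)  # true slope 3.5 %/K

def scaling_slope(x, y):
    """Least-squares pattern-scaling coefficient dP/dT."""
    xc = x - x.mean()
    return float(np.sum(xc * (y - y.mean())) / np.sum(xc ** 2))

# Bootstrap resampling to express the scaling coefficient's uncertainty, a
# simplified analogue of the paper's resampling-based probabilistic assessment.
idx = rng.integers(0, n, size=(2000, n))
boots = np.array([scaling_slope(global_dT[i], regional_dP[i]) for i in idx])
lo, hi = np.percentile(boots, [2.5, 97.5])
```

The interval `[lo, hi]` is the bootstrap analogue of the per-region sensitivity ranges reported in the abstract.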

  3. The Bertlmann-Martin Inequalities and the Uncertainty Principle

    International Nuclear Information System (INIS)

    Ighezou, F.Z.; Kerris, A.T.; Lombard, R.J.

    2008-01-01

    A lower bound to ⟨r⟩_1s is established from the Thomas-Reiche-Kuhn sum rule applied to the reduced equation for the s-states. It is linked to the average value of ⟨r²⟩_1s. We discuss, with a few examples, how using approximate values of ⟨r²⟩_1s, derived from the generalized Bertlmann and Martin inequalities, preserves the lower-bound character of ⟨r⟩_1s. Finally, by using the uncertainty principle and the uncertainty in the radial position, we derive a lower bound to the ground-state kinetic energy.
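The kinetic-energy bound from the radial-position uncertainty presumably follows the standard Heisenberg argument; a sketch of that step (not necessarily the paper's exact derivation):

```latex
% With \Delta r the radial-position uncertainty and \Delta p\,\Delta r \ge \hbar/2:
\langle T \rangle \;=\; \frac{\langle p^2 \rangle}{2m}
 \;\ge\; \frac{(\Delta p)^2}{2m}
 \;\ge\; \frac{\hbar^2}{8\,m\,(\Delta r)^2}
```

The first inequality uses $\langle p^2 \rangle \ge (\Delta p)^2$; the second substitutes the uncertainty relation, so any upper bound on $(\Delta r)^2$ (e.g. via $\langle r^2 \rangle$) yields a lower bound on the kinetic energy.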

  4. Insights into water managers' perception and handling of uncertainties - a study of the role of uncertainty in practitioners' planning and decision-making

    Science.gov (United States)

    Höllermann, Britta; Evers, Mariele

    2017-04-01

    Planning and decision-making under uncertainty is common in water management due to climate variability, simplified models, societal developments, and planning restrictions, to name just a few factors. Dealing with uncertainty can be approached from two sides, each affecting the process and form of communication: either improve the knowledge base by reducing uncertainties, or apply risk-based approaches to acknowledge uncertainties throughout the management process. Current understanding is that science focuses more strongly on the former approach, while policy and practice more actively apply a risk-based approach to handle incomplete and/or ambiguous information. The focus of this study is on how water managers perceive and handle uncertainties at the knowledge/decision interface in their daily planning and decision-making routines, how they evaluate the role of uncertainties for their decisions, and how they integrate this information into the decision-making process. Expert interviews and questionnaires among practitioners and scientists provided an insight into their perspectives on uncertainty handling, allowing a comparison of diverse strategies between science and practice as well as between different types of practitioners. Our results confirmed the practitioners' bottom-up approach, working from potential measures upwards instead of from impact assessment downwards as is common in science-based approaches. This science-practice gap may hinder effective uncertainty integration and acknowledgement in final decisions. Additionally, the implementation of an adaptive and flexible management approach acknowledging uncertainties is often stalled by rigid regulations favouring a predict-and-control attitude. However, the study showed that practitioners' level of uncertainty recognition varies with their employer type and business unit, hence affecting the degree of the science-practice gap with respect to uncertainty recognition. The level of working

  5. Kofi Annan, Syria and the Uses of Uncertainty in Mediation

    Directory of Open Access Journals (Sweden)

    Richard Gowan

    2013-03-01

    Full Text Available One year after Kofi Annan presented his six-point plan for ending the Syrian civil war, it can only be called a failure. But it is necessary to recall the situation facing the UN-Arab League envoy and his team in early 2012. The Syrian conflict had created serious tensions between the major powers. A Western military intervention appeared unlikely but could not be ruled out with absolute certainty. This commentary contends that Annan’s initial priority was to reduce the level of uncertainty inside and outside Syria, thereby creating a framework for political talks.  However, in lowering the level of uncertainty, Annan reduced his own leverage as the Syrian government correctly concluded that it would not be punished for failing to cooperate in good faith.  The commentary concludes that there are occasions where it is advisable for international mediators to maintain and exploit a degree of uncertainty about how a conflict may develop.

  6. Uncertainty vs. learning in climate policy: Some classical results and new directions

    Energy Technology Data Exchange (ETDEWEB)

    Lange, A. [Univ. of Maryland (United States); Treich, N. [Univ. of Toulouse (France)

    2007-07-01

    Climate policy decisions today have to be made under substantial uncertainty: the impact of accumulating greenhouse gases in the atmosphere is not perfectly known, and the future economic and social consequences of climate change, in particular the valuation of possible damages, are uncertain. However, learning will change the basis for future decisions on abatement policies. These important issues of uncertainty and learning are often presented in a colloquial sense. Two opposing effects are typically put forward: first, uncertainty about future climate damage, often associated with the possibility of a catastrophic scenario, is said to give a premium to slowing down global warming and therefore to increase abatement efforts today. Second, learning opportunities will reduce scientific uncertainty about climate damage over time. This is often used as an argument to postpone abatement efforts until new information is received. The effects of uncertainty and learning on the optimal design of current climate policy are still much debated in both the academic and the political arena. In this paper, the authors study and contrast the effects of uncertainty and learning in a two-decision model that encompasses most existing microeconomic models of climate change. They first consider the common expected utility framework: while uncertainty generally has no effect or a negative effect on welfare, learning always has a positive, and thus opposite, effect. The effects of both uncertainty and learning on decisions are less clear. Neither uncertainty nor learning can be used as an argument to increase or reduce emissions today, independently of the degree of risk aversion of the decision-maker and of the nature of irreversibility constraints. The authors then deviate from the expected utility framework and consider a model with ambiguity aversion. The model accounts well for situations of imprecise or multiple probability distributions, as present in the context of climate
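    The two-decision (act-then-learn) structure discussed above can be made concrete with a toy numerical sketch. All quantities below (baseline emissions, cost and damage coefficients, scenario probabilities) are invented for illustration and are not taken from the paper:

```python
# Toy two-period abatement problem: quadratic abatement costs, linear
# damages d * residual emissions, with the marginal damage d equally
# likely to be LOW or HIGH. Compare choosing both abatement levels
# before d is known against choosing the second level after learning d.
E = 10.0                               # baseline emissions (illustrative)
SCENARIOS = [(0.5, 0.2), (0.5, 2.0)]   # (probability, marginal damage d)
GRID = [i * 0.05 for i in range(201)]  # candidate abatement levels 0..10

def cost(a1, a2, d):
    residual = max(0.0, E - a1 - a2)
    return a1**2 + a2**2 + d * residual

def best_no_learning():
    # Both abatement decisions fixed before d is revealed.
    return min(sum(p * cost(a1, a2, d) for p, d in SCENARIOS)
               for a1 in GRID for a2 in GRID)

def best_with_learning():
    # a1 chosen now; a2 chosen after d is revealed (act-then-learn).
    return min(sum(p * min(cost(a1, a2, d) for a2 in GRID)
                   for p, d in SCENARIOS)
               for a1 in GRID)

c_nl = best_no_learning()
c_l = best_with_learning()
print(f"expected cost without learning: {c_nl:.3f}")
print(f"expected cost with learning:    {c_l:.3f}")
```

    Because second-period abatement can adapt once the damage parameter is revealed, the expected cost with learning is never higher than without it, which is the "learning has a positive effect on welfare" result in its simplest form.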

  7. Analysis of impact of mixing flow on the pebble bed high temperature reactor

    International Nuclear Information System (INIS)

    Hao Chen; Li Fu; Guo Jiong

    2014-01-01

    The impact of mixing flow in the pebble flow on the pebble bed high temperature gas cooled reactor (HTR) is analyzed in this paper. A new code package, MFVSOP, which can simulate the mixing flow, was developed. The equilibrium core of HTR-PM was selected as the reference case; the impact of the mixing flow on core parameters such as the power peak factor and the power distribution was analyzed for different degrees of mixing flow, and an uncertainty analysis was carried out. Numerical results showed that the mixing flow has little impact on the key parameters of a pebble bed HTR, and that the multiple-pass operation mode in pebble bed HTRs can reduce the uncertainty arising from the mixing flow. (authors)

  8. Verification of the thermal module in the ELESIM code and the associated uncertainty analysis

    International Nuclear Information System (INIS)

    Arimescu, V.I.; Williams, A.F.; Klein, M.E.; Richmond, W.R.; Couture, M.

    1997-09-01

    Temperature is a critical parameter in fuel modelling because most of the physical processes that occur in fuel elements during irradiation are thermally activated. The focus of this paper is the temperature distribution calculation used in the computer code ELESIM, developed at AECL to model the steady-state behaviour of CANDU fuel. A validation procedure for fuel codes is described and applied to ELESIM's thermal calculation. The effects of uncertainties in model parameters, such as uranium dioxide thermal conductivity, and in input variables, such as fuel element linear power, are accounted for through an uncertainty analysis using Response Surface and Monte Carlo techniques.
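    The Monte Carlo part of such an uncertainty analysis can be sketched as follows. The constant-conductivity centreline temperature-rise formula and all numerical values (conductivity, linear power, their standard deviations, surface temperature) are illustrative assumptions, not ELESIM's actual models or CANDU data:

```python
import math
import random
import statistics

random.seed(42)

# Monte Carlo propagation of parameter and input uncertainty through a
# simplified steady-state relation: for constant conductivity k, the
# centre-to-surface temperature rise of a cylindrical fuel pellet is
# dT = q / (4 * pi * k), where q is the element linear power in W/m.
N = 20_000
T_SURFACE = 400.0  # assumed pellet surface temperature, deg C
samples = []
for _ in range(N):
    k = random.gauss(3.0, 0.3)            # conductivity, W/(m K), ~10 % std dev
    q = random.gauss(40_000.0, 2_000.0)   # linear power, W/m, ~5 % std dev
    samples.append(T_SURFACE + q / (4 * math.pi * k))

samples.sort()
mean = statistics.fmean(samples)
lo, hi = samples[int(0.025 * N)], samples[int(0.975 * N)]
print(f"centreline temperature: mean {mean:.0f} C, "
      f"95 % interval [{lo:.0f}, {hi:.0f}] C")
```

    The sampled 95 % interval is the quantity of interest here: it shows directly how input and parameter uncertainties widen the predicted temperature band.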

  9. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
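    One crude way to see how a learner might cope with unexpected uncertainty is a forgetting Beta-Bernoulli tracker. The hazard rate and payoff probabilities below are invented, and the decay rule is only a rough approximation to exact Bayesian change-point inference, not the model used in the study:

```python
import random

random.seed(0)

# A Beta-Bernoulli learner whose pseudo-counts decay towards the prior on
# every trial. The decay keeps estimation uncertainty from vanishing, so the
# estimate can re-adapt after a sudden jump in the payoff probability
# ("unexpected uncertainty").
A0, B0 = 1.0, 1.0   # Beta prior pseudo-counts
HAZARD = 0.05       # assumed per-trial probability of a jump

def update(a, b, outcome):
    # Leak towards the prior, then condition on the new observation.
    a = (1 - HAZARD) * a + HAZARD * A0
    b = (1 - HAZARD) * b + HAZARD * B0
    return a + outcome, b + (1 - outcome)

a, b = A0, B0
for t in range(200):
    p_true = 0.8 if t < 100 else 0.2   # unexpected jump at trial 100
    outcome = 1 if random.random() < p_true else 0
    a, b = update(a, b, outcome)

estimate = a / (a + b)
print(f"posterior mean after the jump: {estimate:.2f}")
```

    Without the leak, the pseudo-counts would grow without bound and the learning rate would fall to zero, leaving the learner unable to track the post-jump payoff probability.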

  10. Observed Decrease of North American Winter Temperature Variability

    Science.gov (United States)

    Rhines, A. N.; Tingley, M.; McKinnon, K. A.; Huybers, P. J.

    2015-12-01

    There is considerable interest in determining whether temperature variability has changed in recent decades. Model ensembles project that extratropical land temperature variance will detectably decrease by 2070. We use quantile regression of station observations to show that decreasing variability is already robustly detectable for North American winters during 1979–2014. Pointwise trends from GHCND stations are mapped into a continuous spatial field using thin-plate spline regression, resolving small scales while providing uncertainties that account for spatial covariance and varying station density. We find that the variability of daily temperatures, as measured by the difference between the 95th and 5th percentiles, has decreased markedly in winter for both daily minima and maxima. Composites indicate that the reduced spread of winter temperatures primarily results from Arctic amplification decreasing the meridional temperature gradient. Greater observed warming in the 5th relative to the 95th percentile stems from asymmetric effects of advection during cold versus warm days; cold air advection is generally from northerly regions that have experienced greater warming than the western or southwestern regions that are generally sourced during warm days.
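    As a simplified stand-in for the diagnostic described above (no thin-plate splines or formal quantile regression here), the 95th-minus-5th-percentile spread and its linear trend can be computed on synthetic winters whose variance is made to shrink by construction:

```python
import random
import statistics

random.seed(1)

# Synthetic daily winter temperatures, 90 days per winter, with the
# standard deviation imposed to narrow over 1979-2014. Real analyses
# would use GHCND station records instead.
def winter(year):
    sd = 10.0 - 0.05 * (year - 1979)   # imposed narrowing of variability
    return [random.gauss(-5.0, sd) for _ in range(90)]

years = list(range(1979, 2015))
spreads = []
for y in years:
    days = sorted(winter(y))
    q05 = days[int(0.05 * len(days))]  # crude empirical 5th percentile
    q95 = days[int(0.95 * len(days))]  # crude empirical 95th percentile
    spreads.append(q95 - q05)

# Ordinary least-squares slope of the spread against year.
my, ms = statistics.fmean(years), statistics.fmean(spreads)
slope = (sum((y - my) * (s - ms) for y, s in zip(years, spreads))
         / sum((y - my) ** 2 for y in years))
print(f"trend in 95th-5th spread: {slope:.3f} degC per year")
```

    A negative slope recovers the built-in narrowing; on real data the same statistic is what quantile-regression trends summarise more rigorously.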

  11. Diagnostic and model dependent uncertainty of simulated Tibetan permafrost area

    Science.gov (United States)

    Wang, A.; Moore, J.C.; Cui, Xingquan; Ji, D.; Li, Q.; Zhang, N.; Wang, C.; Zhang, S.; Lawrence, D.M.; McGuire, A.D.; Zhang, W.; Delire, C.; Koven, C.; Saito, K.; MacDougall, A.; Burke, E.; Decharme, B.

    2016-01-01

    We perform a land-surface model intercomparison to investigate how the simulation of permafrost area on the Tibetan Plateau (TP) varies among six modern stand-alone land-surface models (CLM4.5, CoLM, ISBA, JULES, LPJ-GUESS, UVic). We also examine the variability in simulated permafrost area and distribution introduced by five different methods of diagnosing permafrost (from modeled monthly ground temperature, mean annual ground and air temperatures, and air and surface frost indexes). There is good agreement (99 to 135 × 10⁴ km²) between the two diagnostic methods based on air temperature, which are also consistent with the observation-based estimate of actual permafrost area (101 × 10⁴ km²). However, the uncertainty (1 to 128 × 10⁴ km²) using the three methods that require simulation of ground temperature is much greater. Moreover, simulated permafrost distribution on the TP is generally only fair to poor for these three methods (diagnosis of permafrost from monthly and mean annual ground temperature, and the surface frost index), while permafrost distribution using the air-temperature-based methods is generally good. Model evaluation at field sites highlights specific problems in process simulations, likely related to soil texture specification, vegetation types and snow cover. Models are particularly poor at simulating permafrost distribution using the definition that soil temperature remains at or below 0 °C for 24 consecutive months, which requires reliable simulation of both mean annual ground temperatures and the seasonal cycle, and hence is relatively demanding. Although models can produce better permafrost maps using mean annual ground temperature and the surface frost index, analysis of simulated soil temperature profiles reveals substantial biases. The current generation of land-surface models needs to reduce biases in simulated soil temperature profiles before reliable contemporary permafrost maps and predictions of changes in future
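    Of the diagnostics compared above, the air frost index is the simplest to reproduce. A minimal sketch follows, with invented monthly mean temperatures and the common rule of thumb that F ≥ 0.5 indicates permafrost presence (threshold conventions vary between studies):

```python
import math

# Air frost index: F = sqrt(DDF) / (sqrt(DDF) + sqrt(DDT)), where DDF and
# DDT are the annual freezing and thawing degree-day sums computed here
# from monthly mean air temperatures (equal-length months for simplicity).
monthly_mean_t = [-20, -18, -12, -4, 2, 8, 12, 10, 4, -4, -12, -18]  # deg C
days = 365.25 / 12  # crude equal-length months

ddf = sum(-t * days for t in monthly_mean_t if t < 0)  # freezing degree-days
ddt = sum(t * days for t in monthly_mean_t if t > 0)   # thawing degree-days
f_index = math.sqrt(ddf) / (math.sqrt(ddf) + math.sqrt(ddt))

print(f"DDF={ddf:.0f}, DDT={ddt:.0f}, F={f_index:.2f}")
print("permafrost likely" if f_index >= 0.5 else "permafrost unlikely")
```

    Because this diagnostic needs only air temperature, it sidesteps the ground-temperature biases that the intercomparison identifies as the main source of disagreement between models.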

  12. Assessing flood forecast uncertainty with fuzzy arithmetic

    Directory of Open Access Journals (Sweden)

    de Bruyn Bertrand

    2016-01-01

    Full Text Available Forecasts of flow rates and water levels during floods have to be accompanied by uncertainty estimates. The sources of forecast uncertainty are manifold. For hydrological (rainfall-runoff) forecasts performed using a deterministic hydrological model with basic physics, two main sources can be identified. The first, obvious source is the forcing data: rainfall forecast data are supplied in real time by meteorological forecasting services to the Flood Forecasting Service within a range between a lowest and a highest predicted value. These two values define an uncertainty interval for the rainfall variable provided on a given watershed. The second source of uncertainty is related to the complexity of the modeled system (the catchment impacted by the hydro-meteorological phenomenon), the number of variables that may describe the problem and their spatial and temporal variability. The model simplifies the system by reducing the number of variables to a few parameters, and thus contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed flow rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and water levels), making a forecast based on the possible rainfalls provided by the forcing and on the model uncertainty. The model uncertainty is here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic. This method allows the prediction uncertainty range to be evaluated. The Flood Forecasting Service of the Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed's area is 310 km²; its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. This method has the advantage of being easily implemented; moreover, it can be carried out
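    The interval combination at the core of the fuzzy-arithmetic method can be sketched with triangular fuzzy numbers and alpha-cuts. The rainfall and model-error numbers below are invented, and the "model" is reduced to a single multiplicative factor, so this is only a structural illustration of the approach:

```python
# A triangular fuzzy number (a, m, b) is handled through its alpha-cuts,
# and the cuts are combined with interval arithmetic. Here a forecast
# discharge proxy is formed as fuzzy rainfall times a fuzzy model-error
# factor (both made up for illustration).

def alpha_cut(tfn, alpha):
    a, m, b = tfn
    return (a + alpha * (m - a), b - alpha * (b - m))

def interval_mul(x, y):
    products = [x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1]]
    return (min(products), max(products))

rain = (20.0, 30.0, 45.0)   # forecast rainfall, mm: (low, modal, high)
factor = (0.8, 1.0, 1.3)    # multiplicative model-error factor

for alpha in (0.0, 0.5, 1.0):
    q = interval_mul(alpha_cut(rain, alpha), alpha_cut(factor, alpha))
    print(f"alpha={alpha:.1f}: discharge proxy in [{q[0]:.1f}, {q[1]:.1f}]")
```

    At alpha = 1 only the modal values survive, while alpha = 0 gives the widest interval: the full family of cuts is the fuzzy forecast range.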

  13. Critical mid-term uncertainties in long-term decarbonisation pathways

    International Nuclear Information System (INIS)

    Usher, Will; Strachan, Neil

    2012-01-01

    Over the next decade, large energy investments are required in the UK to meet growing energy service demands and legally binding emission targets under a pioneering policy agenda. These are necessary despite deep mid-term (2025–2030) uncertainties over which national policy makers have little control. We investigate the effect of two critical mid-term uncertainties on optimal near-term investment decisions using a two-stage stochastic energy system model. The results show that where future fossil fuel prices are uncertain: (i) the near-term hedging strategy to 2030 differs from any one deterministic fuel price scenario and is structurally dissimilar to a simple ‘average’ of the deterministic scenarios, and (ii) multiple recourse strategies from 2030 are perturbed by path dependencies caused by hedging investments. Evaluating the uncertainty under a decarbonisation agenda shows that fossil fuel price uncertainty is very expensive, at around £20 billion. The addition of novel mitigation options reduces the value of fossil fuel price uncertainty to £11 billion. Uncertain biomass import availability shows a much lower value of uncertainty, at £300 million. This paper reveals the complex relationship between the flexibility of the energy system and mitigating the costs of uncertainty due to the path dependencies caused by the long lifetimes of both infrastructure and generation technologies. - Highlights: ► Critical mid-term uncertainties affect near-term investments in the UK energy system. ► Deterministic scenarios give conflicting near-term actions. ► Stochastic scenarios give one near-term hedging strategy. ► Technologies exhibit path dependency or flexibility. ► Fossil fuel price uncertainty is very expensive; biomass availability uncertainty is not.
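    The monetary value that a two-stage stochastic model attaches to resolving fuel-price uncertainty can be illustrated with a deliberately tiny capacity-planning sketch; all costs, demand and scenarios are invented and bear no relation to the UK figures above:

```python
# Stage 1 (here and now): build x units of low-carbon capacity at 3/unit.
# Stage 2 (recourse): once the fuel price p is known, residual demand is
# served by the cheaper of fuel (price p) or late-built capacity (5/unit).
D = 10.0
SCENARIOS = [(0.5, 2.0), (0.5, 6.0)]   # (probability, fuel price)
GRID = [i * 0.5 for i in range(21)]    # candidate near-term builds 0..10

def cost(x, p):
    return 3.0 * x + min(p, 5.0) * max(0.0, D - x)

# Hedging plan: one near-term x must work across both scenarios.
hedge_cost = min(sum(prob * cost(x, p) for prob, p in SCENARIOS)
                 for x in GRID)

# Perfect-information benchmark: x may be tailored to each scenario.
pi_cost = sum(prob * min(cost(x, p) for x in GRID)
              for prob, p in SCENARIOS)

evpi = hedge_cost - pi_cost
print(f"expected cost hedging: {hedge_cost}, with perfect info: {pi_cost}, "
      f"value of resolving fuel-price uncertainty: {evpi}")
```

    The gap between the hedging cost and the perfect-information cost is the expected value of perfect information, the toy analogue of the paper's £20 billion "cost of fossil fuel price uncertainty".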

  14. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
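    The root-sum-of-squares propagation that underpins such an analysis can be sketched in GUM style for a PV-flavoured example, P = V · I. The voltages, currents and standard uncertainties below are illustrative, not from any actual module test:

```python
import math

# Combined standard uncertainty of an indirectly measured quantity P = V * I.
# Sensitivity coefficients are the partial derivatives dP/dV = I and
# dP/dI = V; uncorrelated contributions add in quadrature.
V, u_V = 20.0, 0.10   # volts and its standard uncertainty
I, u_I = 5.0, 0.05    # amperes and its standard uncertainty

P = V * I
u_P = math.hypot(I * u_V, V * u_I)   # sqrt((I*u_V)^2 + (V*u_I)^2)
U_P = 2.0 * u_P                      # expanded uncertainty, coverage k = 2

print(f"P = {P:.1f} W, u(P) = {u_P:.2f} W, U(k=2) = {U_P:.2f} W")
```

    The expanded uncertainty U(k=2) is the half-width of the roughly 95 % interval that the abstract describes: the range about the result within which the true value is believed to lie.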

  15. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis on its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  17. Quantum-memory-assisted entropic uncertainty relation in a Heisenberg XYZ chain with an inhomogeneous magnetic field

    Science.gov (United States)

    Wang, Dong; Huang, Aijun; Ming, Fei; Sun, Wenyang; Lu, Heping; Liu, Chengcheng; Ye, Liu

    2017-06-01

    The uncertainty principle provides a nontrivial bound on the precision of the outcomes of measurements of a pair of incompatible observables in a quantum system. It is therefore of essential importance for quantum precision measurement in the area of quantum information processing. Herein, we investigate the quantum-memory-assisted entropic uncertainty relation (QMA-EUR) in a two-qubit Heisenberg XYZ spin chain. Specifically, we observe the dynamics of the QMA-EUR in a realistic model in which two correlated sites are linked by thermal entanglement in a spin chain with an inhomogeneous magnetic field. It turns out that the temperature, the external inhomogeneous magnetic field and the field inhomogeneity can lift the uncertainty of the measurement through the reduction of the thermal entanglement; explicitly, higher temperature, a stronger magnetic field or larger inhomogeneity of the field results in inflation of the uncertainty. Besides, it is found that there exist distinct dynamical behaviors of the uncertainty for ferromagnetic (J < 0) chains. Moreover, we also verify that the measurement uncertainty is dramatically anti-correlated with the purity of the bipartite spin system: greater purity results in a reduction of the measurement uncertainty, and vice versa. Therefore, our observations might provide a better understanding of the dynamics of the entropic uncertainty in the Heisenberg spin chain, and thus shed light on quantum precision measurement in the framework of versatile systems, particularly solid-state systems.
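    The QMA-EUR bound itself, H(X|B) + H(Z|B) ≥ log₂(1/c) + S(A|B), is easy to evaluate on a simple state. Below, a two-qubit Werner state stands in for the thermal state of the chain (an illustrative assumption, not the paper's actual XYZ thermal state), with the mixing weight w playing the role of the degree of thermal entanglement; for Pauli X and Z on a qubit the maximal overlap is c = 1/2:

```python
import math

# Lower bound log2(1/c) + S(A|B) of the QMA-EUR, evaluated on a Werner
# state rho = w |Phi+><Phi+| + (1 - w) I/4. Its eigenvalues are (1 + 3w)/4
# (once) and (1 - w)/4 (three times); both marginals are maximally mixed,
# so S(B) = 1 bit.
def shannon(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

def qmaeur_bound(w):
    eigs = [(1 + 3 * w) / 4] + 3 * [(1 - w) / 4]
    s_ab = shannon(eigs)          # S(AB) in bits
    s_b = 1.0                     # S(B) of the maximally mixed marginal
    return math.log2(2) + (s_ab - s_b)   # log2(1/c) = 1 for c = 1/2

for w in (1.0, 0.8, 0.4, 0.0):
    print(f"w={w:.1f}: uncertainty bound >= {qmaeur_bound(w):.3f} bits")
```

    The bound grows as w (the entanglement) falls, mirroring the observation above that whatever degrades the thermal entanglement inflates the measurement uncertainty.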

  18. Evaluation of uncertainty associated with parameters for long-term safety assessments of geological disposal

    International Nuclear Information System (INIS)

    Yamaguchi, Tetsuji; Minase, Naofumi; Iida, Yoshihisa; Tanaka, Tadao; Nakayama, Shinichi

    2005-01-01

    This paper describes the current status of our data acquisition for quantifying uncertainties associated with parameters used in safety assessments of groundwater scenarios for the geological disposal of radioactive wastes. First, the sources of uncertainty and the resulting priorities in data acquisition are briefly reviewed. Then, the current status of data acquisition for quantifying the uncertainties in assessing solubility, diffusivity in the bentonite buffer and distribution coefficients on rocks is presented. The uncertainty in the solubility estimation is quantified from the uncertainty associated with thermodynamic data and that in estimating groundwater chemistry. The uncertainty associated with the diffusivity in the bentonite buffer is composed of variations in relevant factors such as the porosity of the bentonite buffer, the montmorillonite content, the chemical composition of the pore water and the temperature. Uncertainties in factors such as the specific surface area of the rock, pH, ionic strength and carbonate concentration in groundwater compose the uncertainty of the distribution coefficient of radionuclides on rocks. Based on these investigations, problems to be solved in future studies are summarized. (author)

  19. Statistical uncertainty analysis of radon transport in nonisothermal, unsaturated soils

    International Nuclear Information System (INIS)

    Holford, D.J.; Owczarski, P.C.; Gee, G.W.; Freeman, H.D.

    1990-10-01

    To accurately predict radon fluxes from soils to the atmosphere, we must know more than the radium content of the soil. Radon flux from soil is affected not only by soil properties, but also by meteorological factors such as air pressure and temperature changes at the soil surface, as well as the infiltration of rainwater. Natural variations in meteorological factors and soil properties contribute to uncertainty in subsurface model predictions of radon flux, which, when coupled with a building transport model, will also add uncertainty to predictions of radon concentrations in homes. A statistical uncertainty analysis using our Rn3D finite-element numerical model was conducted to assess the relative importance of the meteorological factors and the soil properties affecting radon transport. 10 refs., 10 figs., 3 tabs

  20. Uncertainties in radioecological assessment models-Their nature and approaches to reduce them

    International Nuclear Information System (INIS)

    Kirchner, G.; Steiner, M.

    2008-01-01

    Radioecological assessment models are necessary tools for estimating the radiation exposure of humans and non-human biota. This paper focuses on factors affecting their predictive accuracy, discusses the origin and nature of the different contributions to uncertainty and variability and presents approaches to separate and quantify them. The key role of the conceptual model, notably in relation to its structure and complexity, as well as the influence of the number and type of input parameters, are highlighted. Guidelines are provided to improve the degree of reliability of radioecological models

  1. Causal uncertainty, claimed and behavioural self-handicapping.

    Science.gov (United States)

    Thompson, Ted; Hepburn, Jonathan

    2003-06-01

    Causal uncertainty beliefs involve doubts about the causes of events, and arise as a consequence of non-contingent evaluative feedback: feedback that leaves the individual uncertain about the causes of his or her achievement outcomes. Individuals high in causal uncertainty are frequently unable to confidently attribute their achievement outcomes, experience anxiety in achievement situations and as a consequence are likely to engage in self-handicapping behaviour. Accordingly, we sought to establish links between trait causal uncertainty, claimed and behavioural self-handicapping. Participants were N=72 undergraduate students divided equally between high and low causally uncertain groups. We used a 2 (causal uncertainty status: high, low) × 3 (performance feedback condition: success, non-contingent success, non-contingent failure) between-subjects factorial design to examine the effects of causal uncertainty on achievement behaviour. Following performance feedback, participants completed 20 single-solution anagrams and 12 remote associate tasks serving as performance measures, and 16 unicursal tasks to assess practice effort. Participants also completed measures of claimed handicaps, state anxiety and attributions. Relative to low causally uncertain participants, high causally uncertain participants claimed more handicaps prior to performance on the anagrams and remote associates, reported higher anxiety, attributed their failure to internal, stable factors, and reduced practice effort on the unicursal tasks, evident in fewer unicursal tasks solved. These findings confirm links between trait causal uncertainty and claimed and behavioural self-handicapping, highlighting the need for educators to facilitate means by which students can achieve surety in the manner in which they attribute the causes of their achievement outcomes.

  2. Evaluation and uncertainties of global climate models as simulated in East Asia and China

    International Nuclear Information System (INIS)

    Zhao, Z.C.

    1994-01-01

    The assessments and uncertainties of the general circulation models (GCMs) as simulated in East Asia and China (15–60°N, 70–140°E) have been investigated using seven GCMs. Four methods of assessment were chosen. The variables for the validation of the GCMs include the annual, seasonal and monthly mean temperatures and precipitation. The assessments indicated that: (1) the simulations of the seven GCMs for temperature are much better than those for precipitation; (2) the simulations in winter are much better than those in summer; (3) the simulations in eastern parts are much better than those in western parts for both temperature and precipitation; (4) the best GCM for simulated temperature is the GISS model, and the best GCM for simulated precipitation is the UKMO-H model. The means of the seven GCMs for both simulated temperature and precipitation provided good results. The range of uncertainties in East Asia and China due to human activities is presented. The differences between the GCMs for temperature and precipitation before the year 2050 are much smaller than those after the year 2050

  3. EMPRESS: A European Project to Enhance Process Control Through Improved Temperature Measurement

    Science.gov (United States)

    Pearce, J. V.; Edler, F.; Elliott, C. J.; Rosso, L.; Sutton, G.; Andreu, A.; Machin, G.

    2017-08-01

    A new European project called EMPRESS, funded by the EURAMET 'European Metrology Programme for Innovation and Research', is described. The three-year project, which started in the summer of 2015, is intended to substantially augment the efficiency of high-value manufacturing processes by improving temperature measurement techniques at the point of use. The project consortium has 18 partners and 5 external collaborators, drawn from the metrology sector, high-value manufacturing, sensor manufacturing and academia. Accurate control of temperature is key to ensuring process efficiency and product consistency, and is often not achieved to the level required for modern processes. Enhanced process efficiency may take several forms, including reduced product rejection/waste; improved energy efficiency; increased intervals between sensor recalibration/maintenance; and increased sensor reliability, i.e., a reduced amount of operator intervention. Traceability of temperature measurements to the International Temperature Scale of 1990 (ITS-90) is a critical factor in establishing low measurement uncertainty and reproducible, consistent process control. Introducing such traceability in situ (i.e., within the industrial process) is a theme running through this project.

  4. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    International Nuclear Information System (INIS)

    Xiao, H.; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-01-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach has
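    The ensemble Kalman analysis step at the heart of such a framework can be sketched in its simplest scalar form with perturbed observations; the prior spread, the observation and its error variance are invented, and the real framework applies this machinery iteratively to high-dimensional Reynolds-stress fields:

```python
import random
import statistics

random.seed(3)

# One analysis step of a scalar ensemble Kalman update with perturbed
# observations: the gain is computed from the ensemble variance, and each
# member is nudged towards its own perturbed copy of the observation.
N = 100
prior = [random.gauss(0.0, 1.0) for _ in range(N)]   # prior ensemble
y_obs, r = 2.0, 0.5                                  # observation, error variance

p = statistics.variance(prior)   # ensemble (prior) variance
k = p / (p + r)                  # Kalman gain for a direct observation (H = 1)
posterior = [x + k * (y_obs + random.gauss(0.0, r ** 0.5) - x)
             for x in prior]

prior_mean = statistics.fmean(prior)
post_mean = statistics.fmean(posterior)
print(f"gain {k:.2f}: mean {prior_mean:.2f} -> {post_mean:.2f}, "
      f"spread {p ** 0.5:.2f} -> {statistics.variance(posterior) ** 0.5:.2f}")
```

    The update pulls the ensemble mean towards the data and shrinks the spread, which is exactly the mechanism by which sparse observations tighten the posterior distributions of the quantities of interest.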

  5. Quantifying and reducing model-form uncertainties in Reynolds-averaged Navier–Stokes simulations: A data-driven, physics-informed Bayesian approach

    Energy Technology Data Exchange (ETDEWEB)

    Xiao, H., E-mail: hengxiao@vt.edu; Wu, J.-L.; Wang, J.-X.; Sun, R.; Roy, C.J.

    2016-11-01

    Despite their well-known limitations, Reynolds-Averaged Navier–Stokes (RANS) models are still the workhorse tools for turbulent flow simulations in today's engineering analysis, design and optimization. While the predictive capability of RANS models depends on many factors, for many practical flows the turbulence models are by far the largest source of uncertainty. As RANS models are used in the design and safety evaluation of many mission-critical systems such as airplanes and nuclear power plants, quantifying their model-form uncertainties has significant implications in enabling risk-informed decision-making. In this work we develop a data-driven, physics-informed Bayesian framework for quantifying model-form uncertainties in RANS simulations. Uncertainties are introduced directly to the Reynolds stresses and are represented with compact parameterization accounting for empirical prior knowledge and physical constraints (e.g., realizability, smoothness, and symmetry). An iterative ensemble Kalman method is used to assimilate the prior knowledge and observation data in a Bayesian framework, and to propagate them to posterior distributions of velocities and other Quantities of Interest (QoIs). We use two representative cases, the flow over periodic hills and the flow in a square duct, to evaluate the performance of the proposed framework. Both cases are challenging for standard RANS turbulence models. Simulation results suggest that, even with very sparse observations, the obtained posterior mean velocities and other QoIs have significantly better agreement with the benchmark data compared to the baseline results. At most locations the posterior distribution adequately captures the true model error within the developed model form uncertainty bounds. The framework is a major improvement over existing black-box, physics-neutral methods for model-form uncertainty quantification, where prior knowledge and details of the models are not exploited. This approach
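The iterative ensemble Kalman update at the core of the framework described above can be sketched for a single scalar parameter. The linear forward model, the noise level and all numbers below are illustrative stand-ins, not the paper's RANS setup, where the forward map is a full flow solve:

```python
import random
import statistics

random.seed(0)

def forward(theta):
    # Hypothetical forward model standing in for a RANS solve: maps an
    # uncertain Reynolds-stress parameter to an observed velocity.
    return 2.0 * theta + 1.0

theta_true = 1.5
obs_noise = 0.1
y_obs = forward(theta_true)  # noise-free synthetic observation

# Prior ensemble: broad uncertainty around a biased baseline value
ensemble = [random.gauss(0.5, 0.5) for _ in range(200)]

for _ in range(5):  # iterative ensemble Kalman updates
    preds = [forward(t) for t in ensemble]
    t_bar = statistics.fmean(ensemble)
    p_bar = statistics.fmean(preds)
    cov_tp = statistics.fmean((t - t_bar) * (p - p_bar)
                              for t, p in zip(ensemble, preds))
    var_p = statistics.fmean((p - p_bar) ** 2 for p in preds)
    gain = cov_tp / (var_p + obs_noise ** 2)
    # shift each member toward a perturbed observation
    ensemble = [t + gain * (y_obs + random.gauss(0, obs_noise) - p)
                for t, p in zip(ensemble, preds)]

posterior_mean = statistics.fmean(ensemble)
print(round(posterior_mean, 2))
```

The posterior ensemble concentrates near the true parameter value even though the prior was biased, which is the mechanism by which sparse observations improve the posterior mean velocities in the paper.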

  6. Reducing uncertainty in Climate Response Time Scale by Bayesian Analysis of the 8.2 ka event

    Science.gov (United States)

    Lorenz, A.; Held, H.; Bauer, E.; Schneider von Deimling, T.

    2009-04-01

We analyze the possibility of reducing uncertainty in the Climate Response Time Scale by utilizing Greenland ice-core data that contain the 8.2 ka event, within a Bayesian model–data intercomparison with the Earth system model of intermediate complexity CLIMBER-2.3. Within a stochastic version of the model it has been possible to mimic the 8.2 ka event in a plausible experimental setting and with relatively good accuracy with respect to the timing of the event, in comparison to other modeling exercises [1]. The simulation of the centennial cold event is effectively determined by the oceanic cooling rate, which depends largely on the ocean diffusivity, described by diffusion coefficients with relatively wide uncertainty ranges. The idea is to discriminate between different values of the diffusivities according to their likelihood of correctly representing the duration of the 8.2 ka event, and thus to exploit the paleo data to constrain uncertainty in model parameters, analogously to [2]. In implementing this inverse Bayesian analysis with the model, the technical difficulty arises of establishing the likelihood numerically in addition to handling the uncertain model parameters: while mainstream uncertainty analyses can assume a quasi-Gaussian shape of the likelihood, with weather fluctuating around a long-term mean, the 8.2 ka event, as a highly nonlinear effect, precludes such an a priori assumption. As a result of this study [3], the Bayesian analysis showed a reduction of uncertainty in the vertical ocean diffusivity parameters by a factor of 2 compared to prior knowledge. This learning effect on the model parameters is propagated to other model outputs of interest, e.g. the inverse ocean heat capacity, which is important for the dominant time scale of the climate response to anthropogenic forcing and which, in combination with climate sensitivity, strongly influences the climate system's reaction in the near- and medium-term future. References: [1] E. Bauer, A. Ganopolski, M. Montoya: Simulation of the
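The Bayesian constraint on diffusivity described in this record can be illustrated with a discrete toy update. The mapping from diffusivity to event duration, the observed duration, and the dating uncertainty below are all invented for the sketch:

```python
import math

# Hypothetical toy: cold-event duration (years) as a function of vertical
# ocean diffusivity kappa; the mapping and numbers are illustrative only.
def simulated_duration(kappa):
    return 300.0 / kappa  # stronger mixing -> shorter cold event

observed_duration = 160.0   # assumed proxy-derived duration of the event
sigma = 20.0                # assumed proxy dating uncertainty

# Discrete prior over diffusivity (flat over a wide uncertainty range)
kappas = [1.0 + 0.1 * i for i in range(21)]          # 1.0 .. 3.0
prior = [1.0 / len(kappas)] * len(kappas)

# Likelihood: how well each kappa reproduces the observed duration
like = [math.exp(-0.5 * ((simulated_duration(k) - observed_duration)
                         / sigma) ** 2) for k in kappas]

post_unnorm = [p * l for p, l in zip(prior, like)]
z = sum(post_unnorm)
posterior = [w / z for w in post_unnorm]

# The posterior concentrates near kappa = 300/160: uncertainty is reduced
best = kappas[max(range(len(kappas)), key=posterior.__getitem__)]
print(round(best, 1))
```

In the paper the likelihood cannot be assumed Gaussian and must be established numerically; here a Gaussian form is used purely to keep the sketch short.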

  7. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    Science.gov (United States)

    Esposito, Gaetano

Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamics calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models carry significant levels of uncertainty, due to the limited experimental data available and to poor understanding of the interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species and 784 reactions is demonstrated. The resulting reduced skeletal model of 37-38 species shows that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and
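The screening step behind skeletal reduction can be sketched as follows. The species names and sensitivity values are invented, and a simple root-sum-square over cases stands in for the Principal Component Analysis the paper actually applies; the point is only that all canonical configurations are considered simultaneously before a species is dropped:

```python
import math

# Illustrative local sensitivity coefficients S[case][species]: rows are
# canonical combustion cases, columns are candidate species. Made-up values.
species = ["H", "OH", "C2H4", "CH3", "N2O", "C6H6"]
S = [
    [0.9, 0.8, 0.7, 0.3, 0.01, 0.02],   # ignition-delay sensitivities
    [0.7, 0.9, 0.5, 0.4, 0.02, 0.01],   # flame-speed sensitivities
    [0.8, 0.6, 0.9, 0.2, 0.01, 0.03],   # extinction strain-rate sensitivities
]

# Aggregate importance across all cases at once (the role PCA plays in the
# paper); a root-sum-square over cases stands in for it here.
importance = {sp: math.sqrt(sum(S[c][j] ** 2 for c in range(len(S))))
              for j, sp in enumerate(species)}

cutoff = 0.1
skeletal = [sp for sp in species if importance[sp] >= cutoff]
print(skeletal)
```

Species that matter in any of the cases survive the cut, which is what protects extinction behavior from being lost during reduction.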

  8. Costs of travel time uncertainty and benefits of travel time information: Conceptual model and numerical examples

    NARCIS (Netherlands)

    Ettema, D.F.; Timmermans, H.J.P.

    2006-01-01

    A negative effect of congestion that tends to be overlooked is travel time uncertainty. Travel time uncertainty causes scheduling costs due to early or late arrival. The negative effects of travel time uncertainty can be reduced by providing travellers with travel time information, which improves

  9. Key uncertainties in climate change policy: Results from ICAM-2

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.

    1995-12-31

A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to: inform decision makers about the likely outcome of policy initiatives; and help set priorities for research so that the outcome ambiguities faced by decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.0. This model includes demographics, economic activity, emissions, atmospheric chemistry, climate change, sea level rise and other impact modules, and the numerous associated feedbacks. The model has over 700 objects, of which over one third are uncertain. These have been grouped into seven different classes of uncertain items. The impact of uncertainties in each of these items can be considered individually or in combination with the others. In this paper we demonstrate the relative contribution of various sources of uncertainty to different outcomes in the model. The analysis shows that climatic uncertainties are most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. Extreme uncertainties in indirect aerosol forcing and in the behavioral response to climate change (adaptation) were characterized using bounding analyses; the results suggest that these extreme uncertainties can dominate the choice of policy outcomes.
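Ranking uncertainty sources by their contribution to outcome variance, as done for ICAM, can be illustrated with a Monte Carlo freeze-one-source experiment. The outcome function and weights are invented for the sketch:

```python
import random
import statistics

random.seed(1)
N = 5000

# Toy outcome combining three uncertain inputs (illustrative weights only)
def outcome(climate, damage, econ):
    return 3.0 * climate + 2.0 * damage + 1.0 * econ

def outcome_variance(frozen=None):
    draws = []
    for _ in range(N):
        c = 0.0 if frozen == "climate" else random.gauss(0, 1)
        d = 0.0 if frozen == "damage" else random.gauss(0, 1)
        e = 0.0 if frozen == "econ" else random.gauss(0, 1)
        draws.append(outcome(c, d, e))
    return statistics.pvariance(draws)

total = outcome_variance()
# Variance reduction when a source is fixed ~ that source's contribution
contrib = {src: total - outcome_variance(frozen=src)
           for src in ("climate", "damage", "econ")}
ranked = sorted(contrib, key=contrib.get, reverse=True)
print(ranked)
```

With the assumed weights, freezing the "climate" input removes the most variance, mirroring the paper's finding that climatic uncertainties dominate.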

  10. Kalman filter approach for uncertainty quantification in time-resolved laser-induced incandescence.

    Science.gov (United States)

    Hadwin, Paul J; Sipkens, Timothy A; Thomson, Kevin A; Liu, Fengshan; Daun, Kyle J

    2018-03-01

    Time-resolved laser-induced incandescence (TiRe-LII) data can be used to infer spatially and temporally resolved volume fractions and primary particle size distributions of soot-laden aerosols, but these estimates are corrupted by measurement noise as well as uncertainties in the spectroscopic and heat transfer submodels used to interpret the data. Estimates of the temperature, concentration, and size distribution of soot primary particles within a sample aerosol are typically made by nonlinear regression of modeled spectral incandescence decay, or effective temperature decay, to experimental data. In this work, we employ nonstationary Bayesian estimation techniques to infer aerosol properties from simulated and experimental LII signals, specifically the extended Kalman filter and Schmidt-Kalman filter. These techniques exploit the time-varying nature of both the measurements and the models, and they reveal how uncertainty in the estimates computed from TiRe-LII data evolves over time. Both techniques perform better when compared with standard deterministic estimates; however, we demonstrate that the Schmidt-Kalman filter produces more realistic uncertainty estimates.
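The time-sequential estimation idea can be shown with a plain linear Kalman filter tracking a decaying effective temperature; this is a simplification of the extended and Schmidt-Kalman filters used in the paper, and every constant below (decay rate, noise levels, initial state) is illustrative:

```python
import random

random.seed(2)

# Toy TiRe-LII-like signal: effective temperature decaying exponentially
decay = 0.95
R = 40.0 ** 2     # measurement-noise variance (K^2), assumed
Q = 5.0 ** 2      # process-noise variance (K^2), assumed

truth, meas = [], []
T = 3000.0
for _ in range(60):
    T = decay * T + random.gauss(0, 5.0)
    truth.append(T)
    meas.append(T + random.gauss(0, 40.0))

# Scalar Kalman filter using the known linear decay model
x, P = 2800.0, 200.0 ** 2   # deliberately poor initial state, wide prior
est = []
for z in meas:
    # predict
    x = decay * x
    P = decay * P * decay + Q
    # update: P shrinks, so the uncertainty estimate evolves over time
    K = P / (P + R)
    x = x + K * (z - x)
    P = (1 - K) * P
    est.append(x)

rmse_filter = (sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)) ** 0.5
rmse_raw = (sum((z - t) ** 2 for z, t in zip(meas, truth)) / len(truth)) ** 0.5
print(rmse_filter < rmse_raw)
```

The filtered estimate beats the raw noisy signal, and the state covariance P provides the time-evolving uncertainty that the paper emphasizes.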

  11. Did European temperatures in 1540 exceed present-day records?

    Science.gov (United States)

    Orth, Rene; Vogel, Martha M.; Luterbacher, Jürg; Pfister, Christian; Seneviratne, Sonia I.

    2017-04-01

There is strong evidence that the year 1540 was exceptionally dry and warm in Central Europe. Here we infer 1540 summer temperatures from the number of dry days (NDDs) in spring (March-May) and summer (June-August) 1540, derived from historical documentary evidence published elsewhere, and compare our estimates with present-day temperatures. We translate the NDD values into temperature distributions using a linear relationship between modeled temperature and NDD from a 3000 year pre-industrial control simulation with the Community Earth System Model (CESM). Our results show medium confidence that summer mean temperatures (TJJA) and maximum temperatures (TXx) in Central Europe in 1540 were warmer than the respective present-day mean summer temperatures (assessed over 1966-2015). The model-based reconstruction further suggests that, with a probability of 40%-70%, the highest daily temperatures in 1540 were even warmer than in 2003, while there is at most a 20% probability that the 1540 mean summer temperature was warmer than that of 2003 in Central Europe. As with other state-of-the-art analyses, the uncertainty of the reconstructed 1540 summer weather in this study is considerable, for instance because extrapolation is required: 1540-like events are not captured by the employed Earth system model (ESM), nor by other ESMs. However, in addition to paleoclimatological approaches we introduce here an independent methodology to estimate 1540 temperatures, and consequently contribute to a reduced overall uncertainty in the analysis of this event. The characterization of such events and of the related climate system functioning is particularly relevant in the context of global warming and the corresponding increase in extreme heat wave magnitude and occurrence frequency. Orth, R., M.M. Vogel, J. Luterbacher, C. Pfister, and S.I. Seneviratne (2016): Did European temperatures in 1540 exceed present-day records? Environ. Res. Lett., 11, 114021, doi: 10.1088/1748-9326/11/11/114021
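The NDD-to-temperature translation can be sketched as an ordinary least-squares fit to pseudo control-run data followed by extrapolation. The slope, intercept, noise and the 1540 NDD value are all assumptions for the sketch, not values from the paper:

```python
import random
import statistics

random.seed(3)

# Pseudo "control simulation": summer temperature vs number of dry days,
# with a built-in linear relation (slope/intercept are illustrative).
ndd_ctrl = [random.uniform(20, 60) for _ in range(3000)]
t_ctrl = [14.0 + 0.10 * n + random.gauss(0, 0.8) for n in ndd_ctrl]

# Ordinary least squares fit T = a + b * NDD
nbar = statistics.fmean(ndd_ctrl)
tbar = statistics.fmean(t_ctrl)
b = (sum((n - nbar) * (t - tbar) for n, t in zip(ndd_ctrl, t_ctrl))
     / sum((n - nbar) ** 2 for n in ndd_ctrl))
a = tbar - b * nbar

# Documentary evidence gives an (assumed) NDD value for summer 1540;
# it lies outside the control range, so this is extrapolation, as in the paper
ndd_1540 = 75.0
t_1540 = a + b * ndd_1540
print(round(t_1540, 1))
```

The need to extrapolate beyond the model's simulated range is exactly the source of the "considerable" uncertainty the abstract acknowledges.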

  12. High Temperature Transducers for Online Monitoring of Microstructure Evolution

    Energy Technology Data Exchange (ETDEWEB)

    Lissenden, Cliff [Pennsylvania State Univ., State College, PA (United States); Tittmann, Bernhard [Battelle Energy Alliance, LLC, Idaho Falls, ID (United States)

    2015-03-30

A critical technology gap exists in online condition monitoring (CM) of advanced nuclear plant components for damage accumulation: capable sensors and supporting infrastructure for the high-temperature environment are not available. The sensory system, monitoring methodology, data acquisition, and damage characterization algorithm that comprise a CM system are investigated here. This work thus supports the DOE mission to develop a fundamental understanding of advanced sensors to improve physical measurement accuracy and reduce uncertainty. The research involves a concept viability assessment, a detailed technology gap analysis, and a technology development roadmap.

  13. The role of uncertainty in climate change adaptation strategies — A Danish water management example

    DEFF Research Database (Denmark)

    Refsgaard, J.C.; Arnbjerg-Nielsen, Karsten; Drews, Martin

    2013-01-01

    We propose a generic framework to characterize climate change adaptation uncertainty according to three dimensions: level, source and nature. Our framework is different, and in this respect more comprehensive, than the present UN Intergovernmental Panel on Climate Change (IPCC) approach and could...... are epistemic (reducible) by nature but uncertainties on adaptation measures are complex, with ambiguity often being added to impact uncertainties. Strategies to deal with uncertainty in climate change adaptation should reflect the nature of the uncertainty sources and how they interact with risk level...

  14. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation, and the uncertainty term reflects a combination of instrument errors, modeling limitations, and deficiencies in phenomenological understanding. This report provides several methodologies for estimating an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pretest and post-test uncertainty.
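Combining independent instrument-error components by root-sum-square is the standard way to build such an uncertainty estimate; the component names and values below are illustrative (a thermocouple chain, in kelvin), not taken from the report:

```python
import math

# Independent standard-uncertainty components for one instrument channel
components = {
    "sensor calibration": 0.5,
    "reference junction": 0.3,
    "data acquisition": 0.2,
    "installation effects": 0.4,
}

# Root-sum-square combination, then an expanded uncertainty with a
# coverage factor k = 2 (~95% confidence)
combined = math.sqrt(sum(u ** 2 for u in components.values()))
expanded = 2.0 * combined
print(round(combined, 3), round(expanded, 3))
```

A pretest prediction uses estimated component values like these; a post-test estimate replaces them with values observed during the experiment.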

  15. A new optimization framework using genetic algorithm and artificial neural network to reduce uncertainties in petroleum reservoir models

    Science.gov (United States)

    Maschio, Célio; José Schiozer, Denis

    2015-01-01

    In this article, a new optimization framework to reduce uncertainties in petroleum reservoir attributes using artificial intelligence techniques (neural network and genetic algorithm) is proposed. Instead of using the deterministic values of the reservoir properties, as in a conventional process, the parameters of the probability density function of each uncertain attribute are set as design variables in an optimization process using a genetic algorithm. The objective function (OF) is based on the misfit of a set of models, sampled from the probability density function, and a symmetry factor (which represents the distribution of curves around the history) is used as weight in the OF. Artificial neural networks are trained to represent the production curves of each well and the proxy models generated are used to evaluate the OF in the optimization process. The proposed method was applied to a reservoir with 16 uncertain attributes and promising results were obtained.
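The loop described above, a genetic algorithm tuning the parameters of an attribute's probability density function while a cheap proxy replaces the reservoir simulator, can be sketched minimally. The quadratic "misfit" proxy stands in for the trained neural network, and all numbers are invented:

```python
import random
import statistics

random.seed(6)

# History-matching toy: find the mean of an uncertain attribute's PDF so
# that models sampled from it reproduce "history".
history = 5.0

def proxy_misfit(attr):
    # stand-in for the ANN-evaluated production-curve misfit
    return (attr - history) ** 2

def objective(mu):
    # sample models from the candidate PDF (fixed spread assumed)
    samples = [random.gauss(mu, 0.5) for _ in range(50)]
    return statistics.fmean(proxy_misfit(s) for s in samples)

# Minimal genetic algorithm over the design variable mu
pop = [random.uniform(0.0, 10.0) for _ in range(30)]
for _ in range(40):
    pop.sort(key=objective)          # selection by fitness
    parents = pop[:10]
    pop = parents + [random.gauss(random.choice(parents), 0.3)  # mutation
                     for _ in range(20)]

best = min(pop, key=objective)
print(round(best, 1))
```

The design variable converges toward the PDF mean that best reproduces the history, which is the essence of reducing the attribute's uncertainty rather than fixing a deterministic value.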

  16. Isotopic techniques in radioactive waste disposal site evaluation: a method for reducing uncertainties I. T, T/3He, 4He, 14C, 36Cl

    International Nuclear Information System (INIS)

    Muller, A.B.

    1981-01-01

    This paper introduces five of the isotopic techniques which can help reduce uncertainties associated with the assessment of radioactive waste disposal sites. The basic principles and practical considerations of these best known techniques have been presented, showing how much additional site specific information can be acquired at little cost or consequence to containment efficiency. These methods, and the more experimental methods appearing in the figure but not discussed here, should be considered in any detailed site characterization, data collection and analysis

  17. Effect of uncertainty components such as recalibration on the performance of quality control charts

    DEFF Research Database (Denmark)

    Winkel, P; Zhang, Nevin

    2005-01-01

Uncertainty components (recalibration, new reagent lots, etc.) may be the source of random changes in the level of quality control (QC) values, thus causing false alarms. We propose a method for reducing false alarms.
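The false-alarm mechanism can be demonstrated with a small simulation: if control limits are set from within-run variation only, level shifts introduced at each recalibration trigger spurious alarms; widening the limits to include that uncertainty component reduces them. All magnitudes below are illustrative, and this is a generic fix, not necessarily the specific method the paper proposes:

```python
import random

random.seed(4)

# Simulate in-control QC values whose level shifts slightly at each
# recalibration (the uncertainty component); numbers are illustrative.
within_sd, recal_sd, runs_per_lot = 1.0, 0.8, 50
values = []
for _ in range(40):                      # 40 recalibration cycles
    shift = random.gauss(0, recal_sd)
    values += [10.0 + shift + random.gauss(0, within_sd)
               for _ in range(runs_per_lot)]

def alarm_rate(sd):
    lo, hi = 10.0 - 3 * sd, 10.0 + 3 * sd   # 3-sigma Shewhart limits
    return sum(v < lo or v > hi for v in values) / len(values)

naive = alarm_rate(within_sd)                                # ignores recalibration
total = alarm_rate((within_sd ** 2 + recal_sd ** 2) ** 0.5)  # includes it
print(naive > total)
```

The naive limits alarm far more often even though every simulated value is in control.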

  18. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures, which can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitudes and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
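A Monte Carlo treatment of signature uncertainty of the kind proposed here can be sketched for one signature, the runoff ratio. The synthetic records and the assumed 10% discharge and 5% areal-rainfall error magnitudes are illustrative only:

```python
import random
import statistics

random.seed(5)

# Toy daily flow record and catchment-average rainfall (both mm/day)
flow = [0.5 + 0.4 * abs(random.gauss(0, 1)) for _ in range(365)]
rain = [2.0 + abs(random.gauss(0, 1.5)) for _ in range(365)]

def runoff_ratio(f, r):
    return sum(f) / sum(r)

# Monte Carlo: perturb each record by multiplicative errors representing
# rating-curve and rainfall-interpolation uncertainty (magnitudes assumed)
samples = []
for _ in range(2000):
    ef = random.gauss(1.0, 0.10)   # 10% discharge uncertainty
    er = random.gauss(1.0, 0.05)   # 5% areal-rainfall uncertainty
    samples.append(runoff_ratio([ef * x for x in flow],
                                [er * x for x in rain]))

samples.sort()
lo, hi = samples[50], samples[-51]     # ~95% interval from the sample
print(round(statistics.fmean(samples), 3), round(hi - lo, 3))
```

The width of the resulting interval is the signature uncertainty; repeating this for each signature and catchment allows the cross-comparison the study performs.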

  19. Status of uncertainty assessment in k0-NAA measurement. Anything still missing?

    International Nuclear Information System (INIS)

    Borut Smodis; Tinkara Bucar

    2014-01-01

Several approaches to quantifying measurement uncertainty in k0-based neutron activation analysis (k0-NAA) are reviewed, comprising the original approach, the spreadsheet approach, a dedicated computer program involving analytical calculations, and the two k0-NAA programs available on the market. Two imperfections in the dedicated programs are identified, their impact is assessed, and possible improvements are presented for a concrete experimental situation. The status of uncertainty assessment in k0-NAA is discussed and steps for improvement are recommended. It is concluded that the present magnitude of measurement uncertainty should be further improved by additional efforts to reduce the uncertainties of the relevant nuclear constants used. (author)

  20. Reducing the ordering temperature of FePt nanoparticles by Cu additive and alternate reduction method

    Directory of Open Access Journals (Sweden)

    Fang Wang

    2017-12-01

(FePt)85Cu15 nanoparticles were successfully prepared by alternate reduction of metal salts in aqueous medium. Detailed investigations of the correlation between the magnetic and structural properties of these nanoparticles are presented as a function of annealing temperature. Both the X-ray diffraction patterns and the magnetic hysteresis loop measurements show the existence of the L10-FePt phase at a relatively low annealing temperature. This demonstrates that the Cu additive and alternate reduction are very effective methods for reducing the ordering temperature of FePt nanoparticles.

  1. Two-color spatial and temporal temperature measurements using a streaked soft x-ray imager

    Energy Technology Data Exchange (ETDEWEB)

    Moore, A. S., E-mail: alastair.moore@physics.org; Ahmed, M. F.; Soufli, R.; Pardini, T.; Hibbard, R. L.; Bailey, C. G.; Bell, P. M.; Hau-Riege, S. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States); Benstead, J.; Morton, J.; Guymer, T. M.; Garbett, W. J.; Rubery, M. S.; Skidmore, J. W. [Directorate Science and Technology, AWE Aldermaston, Reading RG7 4PR (United Kingdom); Bedzyk, M.; Shoup, M. J.; Regan, S. P.; Agliata, T.; Jungquist, R. [Laboratory for Laser Energetics, University of Rochester, Rochester, New York 14623 (United States); Schmidt, D. W. [Los Alamos National Laboratory, Los Alamos, New Mexico 87545 (United States); and others

    2016-11-15

A dual-channel streaked soft x-ray imager has been designed and used in high energy-density physics experiments at the National Ignition Facility. This streaked imager creates two images of the same x-ray source using two slit apertures and a single shallow-angle reflection from a nickel mirror. Thin filters are used to create narrow band-pass images at 510 eV and 360 eV. When measuring a Planckian spectrum, the brightness ratio of the two images can be translated into a color temperature, provided that the spectral sensitivity of the two channels is well known. To reduce uncertainty and remove spectral features of the streak camera photocathode from this photon energy range, a thin photocathode of 100 nm CsI on 50 nm Al was implemented. Provided that the spectral shape is well known, uncertainties in the spectral sensitivity limit the accuracy of the temperature measurement to approximately 4.5% at 100 eV.
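The ratio-to-temperature inversion for a Planckian source can be sketched directly: for two fixed photon energies the brightness ratio is a monotonic function of temperature, so a measured ratio can be inverted by bisection. The band-pass energies match the abstract; treating each channel as monochromatic and ignoring the instrument response are simplifications:

```python
import math

# Planckian spectral brightness at photon energy E (eV) and temperature
# T (eV), up to constants that cancel in the ratio
def planck(E, T):
    return E ** 3 / math.expm1(E / T)

E1, E2 = 510.0, 360.0   # band-pass photon energies from the paper

def ratio(T):
    return planck(E1, T) / planck(E2, T)

def invert(measured, lo=10.0, hi=1000.0):
    # ratio(T) increases monotonically with T, so bisection suffices
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if ratio(mid) < measured:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

measured = ratio(100.0)          # synthetic measurement at 100 eV
print(round(invert(measured), 1))
```

Any error in the assumed channel sensitivities enters through `measured`, which is why the spectral-sensitivity uncertainty bounds the temperature accuracy.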

  2. GRS Method for Uncertainty and Sensitivity Evaluation of Code Results and Applications

    International Nuclear Information System (INIS)

    Glaeser, H.

    2008-01-01

In recent years there has been increasing interest in computational reactor safety analysis in replacing conservative evaluation-model calculations with best-estimate calculations supplemented by uncertainty analysis of the code results. The evaluation of the margin to acceptance criteria, for example the maximum fuel rod clad temperature, should be based on the upper limit of the calculated uncertainty range. Uncertainty analysis is needed if useful conclusions are to be obtained from best-estimate thermal-hydraulic code calculations; otherwise, single values of unknown accuracy would be presented for comparison with regulatory acceptance limits. Methods have been developed and presented to quantify the uncertainty of computer code results. The basic techniques proposed by GRS are presented, together with applications to a large-break loss-of-coolant accident on a reference reactor as well as to an experiment simulating containment behaviour.
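The GRS methodology is commonly associated with Wilks' formula, which gives the number of code runs needed so that the largest computed value is a one-sided statistical tolerance limit, independent of the number of uncertain inputs. A minimal implementation:

```python
# Wilks' formula for a one-sided tolerance limit: the smallest number of
# code runs n such that the maximum of n runs bounds the gamma-quantile
# of the output with confidence beta, i.e. 1 - gamma**n >= beta.
def wilks_one_sided(gamma=0.95, beta=0.95):
    n = 1
    while 1 - gamma ** n < beta:
        n += 1
    return n

print(wilks_one_sided())   # classic 95%/95% result: 59 runs
```

This is why a 95%/95% statement on, say, peak clad temperature can be made from 59 randomly sampled code runs regardless of how many input parameters are uncertain.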

  3. Honesty-humility under threat: Self-uncertainty destroys trust among the nice guys.

    Science.gov (United States)

    Pfattheicher, Stefan; Böhm, Robert

    2018-01-01

    Recent research on humans' prosociality has highlighted the crucial role of Honesty-Humility, a basic trait in the HEXACO personality model. There is overwhelming evidence that Honesty-Humility predicts prosocial behavior across a vast variety of situations. In the present contribution, we cloud this rosy picture, examining a condition under which individuals high in Honesty-Humility reduce prosocial behavior. Specifically, we propose that under self-uncertainty, it is particularly those individuals high in Honesty-Humility who reduce trust in unknown others and become less prosocial. In 5 studies, we assessed Honesty-Humility, manipulated self-uncertainty, and measured interpersonal trust or trust in social institutions using behavioral or questionnaire measures. In Study 1, individuals high (vs. low) in Honesty-Humility showed higher levels of trust. This relation was mediated by their positive social expectations about the trustworthiness of others. Inducing self-uncertainty decreased trust, particularly in individuals high in Honesty-Humility (Studies 2-5). Making use of measuring the mediator (Studies 2 and 3) and applying a causal chain design (Studies 4a and 4b), it is shown that individuals high in Honesty-Humility reduced trust because self-uncertainty decreased positive social expectations about others. We end with an applied perspective, showing that Honesty-Humility is predictive of trust in social institutions (e.g., trust in the police; Study 5a), and that self-uncertainty undermined trust in the police especially for individuals high in Honesty-Humility (Study 5b). By these means, the present research shows that individuals high in Honesty-Humility are not unconditionally prosocial. Further implications for Honesty-Humility as well as for research on self-uncertainty and trust are discussed. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  4. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  5. UNCERTAINTY IN THE PROCESS INTEGRATION FOR THE BIOREFINERIES DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Meilyn González Cortés

    2015-07-01

This paper presents how design approaches with a high level of flexibility can reduce the additional costs of strategies that apply overdesign factors to account for uncertain parameters affecting the economic feasibility of a project. The elements with associated uncertainties that are important in process-integration configurations under a biorefinery scheme are: raw material, raw-material conversion technologies, and the variety of products that can be obtained. The analysis shows that raw materials and products with potential in a biorefinery scheme are subject to external uncertainties such as availability, demand and market prices. These external uncertainties determine their impact on the biorefinery; for product prices, minimum and maximum limits can be identified as intervals that should be considered in the project's economic evaluation and in the sensitivity analysis under varied conditions.

  6. Experiences of Uncertainty in Men With an Elevated PSA.

    Science.gov (United States)

    Biddle, Caitlin; Brasel, Alicia; Underwood, Willie; Orom, Heather

    2015-05-15

A significant proportion of men aged 50 to 70 years have received, and continue to receive, prostate-specific antigen (PSA) tests to screen for prostate cancer (PCa). Approximately 70% of men with an elevated PSA level will not subsequently be diagnosed with PCa. Semistructured interviews were conducted with 13 men with an elevated PSA level who had not been diagnosed with PCa. Uncertainty was prominent in men's reactions to the PSA results, stemming from unanswered questions about the PSA test, PCa risk, and confusion about their management plan. Uncertainty was exacerbated or reduced depending on whether health care providers communicated in lay and empathetic ways and provided opportunities for question asking. To manage uncertainty, men engaged in information and health care seeking, self-monitoring, and defensive cognition. Results inform strategies for meeting the informational needs of men with an elevated PSA and confirm the primary importance of physician communication behavior for open information exchange and uncertainty reduction. © The Author(s) 2015.

  7. The role of sensitivity analysis in assessing uncertainty

    International Nuclear Information System (INIS)

    Crick, M.J.; Hill, M.D.

    1987-01-01

Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then review approaches to sensitivity analysis within the context of assessing uncertainty, and outline the types of test available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice.
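One of the simplest tests for identifying sensitive parameters of the kind surveyed above is one-at-a-time (OAT) screening: vary each parameter across its range while holding the others at base values and rank by the induced output swing. The model and ranges below are invented for the sketch:

```python
# One-at-a-time (OAT) sensitivity screening on a toy assessment model
def model(p):
    # illustrative response: strongly linear in "a", weakly quadratic in "b"
    return 2.0 * p["a"] + 0.1 * p["b"] ** 2 + 5.0

base = {"a": 1.0, "b": 1.0}
ranges = {"a": (0.5, 1.5), "b": (0.5, 1.5)}

swing = {}
for name, (lo, hi) in ranges.items():
    plo, phi = dict(base), dict(base)
    plo[name], phi[name] = lo, hi
    swing[name] = abs(model(phi) - model(plo))  # output change over the range

most_sensitive = max(swing, key=swing.get)
print(most_sensitive)
```

OAT is cheap but misses parameter interactions, which is one of the disadvantages such a review weighs against global methods.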

  8. Measurement, simulation and uncertainty assessment of implant heating during MRI

    International Nuclear Information System (INIS)

    Neufeld, E; Kuehn, S; Kuster, N; Szekely, G

    2009-01-01

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.

  9. Measurement, simulation and uncertainty assessment of implant heating during MRI

    Energy Technology Data Exchange (ETDEWEB)

    Neufeld, E; Kuehn, S; Kuster, N [Foundation for Research on Information Technologies in Society (IT' IS), Zeughausstr. 43, 8004 Zurich (Switzerland); Szekely, G [Computer Vision Laboratory, Swiss Federal Institute of Technology (ETHZ), Sternwartstr 7, ETH Zentrum, 8092 Zurich (Switzerland)], E-mail: neufeld@itis.ethz.ch

    2009-07-07

    The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.

  10. Comparison of Different Fuel Temperature Models

    Energy Technology Data Exchange (ETDEWEB)

    Weddig, Beatrice

    2003-02-01

    The purpose of this work is to improve the performance of the core calculation system used in Ringhals for in-core fuel management. It has been observed that, whereas the codes yield results that are in good agreement with measurements when the core operates at full nominal power, this agreement deteriorates noticeably when the reactor is running at reduced power. This deficiency of the code system was observed by comparing the calculated and measured boron concentrations in the moderator of the PWR. From the neutronic point of view, the difference between full power and reduced power in the same core is the different temperature of the fuel and the moderator. Whereas the coolant temperature can be measured and is thus relatively well known, the fuel temperature is only inferred from the moderator temperature as well as neutron physics and heat transfer calculations. The most likely reason for the above mentioned discrepancy is therefore the uncertainty of the fuel temperature at low power, and hence the incorrect calculation of the fuel temperature reactivity feedback through the so called Doppler effect. To obtain the fuel temperature at low power, usually some semi-empirical relations, sometimes called correlations, are used. The above-mentioned inaccuracy of the core calculation procedures can thus be tracked down to the insufficiency of these correlations. Therefore, the suggestion is that the above mentioned deficiency of the core calculation codes can be eliminated or reduced if the fuel temperature correlations are improved. An improved model, called the 30% model, is implemented in SIMULATE-3, the core calculation code used at Ringhals. The accuracy of the 30% model was compared to that of the present model by considering a number of cases, where measured values of the boron concentration at low power were available, and comparing them with calculated values using both the present and the new model. 
It was found that on the whole, the new fuel temperature

  11. Comparison of Different Fuel Temperature Models

    International Nuclear Information System (INIS)

    Weddig, Beatrice

    2003-02-01

    The purpose of this work is to improve the performance of the core calculation system used in Ringhals for in-core fuel management. It has been observed that, whereas the codes yield results that are in good agreement with measurements when the core operates at full nominal power, this agreement deteriorates noticeably when the reactor is running at reduced power. This deficiency of the code system was observed by comparing the calculated and measured boron concentrations in the moderator of the PWR. From the neutronic point of view, the difference between full power and reduced power in the same core is the different temperature of the fuel and the moderator. Whereas the coolant temperature can be measured and is thus relatively well known, the fuel temperature is only inferred from the moderator temperature as well as neutron physics and heat transfer calculations. The most likely reason for the above mentioned discrepancy is therefore the uncertainty of the fuel temperature at low power, and hence the incorrect calculation of the fuel temperature reactivity feedback through the so called Doppler effect. To obtain the fuel temperature at low power, usually some semi-empirical relations, sometimes called correlations, are used. The above-mentioned inaccuracy of the core calculation procedures can thus be tracked down to the insufficiency of these correlations. Therefore, the suggestion is that the above mentioned deficiency of the core calculation codes can be eliminated or reduced if the fuel temperature correlations are improved. An improved model, called the 30% model, is implemented in SIMULATE-3, the core calculation code used at Ringhals. The accuracy of the 30% model was compared to that of the present model by considering a number of cases, where measured values of the boron concentration at low power were available, and comparing them with calculated values using both the present and the new model. 
It was found that on the whole, the new fuel temperature

  12. Technical note: Design flood under hydrological uncertainty

    Science.gov (United States)

    Botto, Anna; Ganora, Daniele; Claps, Pierluigi; Laio, Francesco

    2017-07-01

    Planning and verification of hydraulic infrastructures require a design estimate of hydrologic variables, usually provided by frequency analysis, and neglecting hydrologic uncertainty. However, when hydrologic uncertainty is accounted for, the design flood value for a specific return period is no longer a unique value, but is represented by a distribution of values. As a consequence, the design flood is no longer univocally defined, making the design process undetermined. The Uncertainty Compliant Design Flood Estimation (UNCODE) procedure is a novel approach that, starting from a range of possible design flood estimates obtained in uncertain conditions, converges to a single design value. This is obtained through a cost-benefit criterion with additional constraints that is numerically solved in a simulation framework. This paper contributes to promoting a practical use of the UNCODE procedure without resorting to numerical computation. A modified procedure is proposed by using a correction coefficient that modifies the standard (i.e., uncertainty-free) design value on the basis of sample length and return period only. The procedure is robust and parsimonious, as it does not require additional parameters with respect to the traditional uncertainty-free analysis. Simple equations to compute the correction term are provided for a number of probability distributions commonly used to represent the flood frequency curve. The UNCODE procedure, when coupled with this simple correction factor, provides a robust way to manage the hydrologic uncertainty and to go beyond the use of traditional safety factors. With all the other parameters being equal, an increase in the sample length reduces the correction factor, and thus the construction costs, while still keeping the same safety level.

  13. Impacts of generalized uncertainty principle on black hole thermodynamics and Salecker-Wigner inequalities

    International Nuclear Information System (INIS)

    Tawfik, A.

    2013-01-01

    We investigate the impacts of the Generalized Uncertainty Principle (GUP), proposed by some approaches to quantum gravity such as String Theory and Doubly Special Relativity, on black hole thermodynamics and the Salecker-Wigner inequalities. Utilizing the Heisenberg uncertainty principle, the Hawking temperature, Bekenstein entropy, specific heat, emission rate and decay time are calculated. As the evaporation entirely eats up the black hole mass, the specific heat vanishes and the temperature approaches infinity with an infinite radiation rate. It is found that the GUP approach prevents the black hole from evaporating entirely. It implies the existence of remnants at which the specific heat vanishes. The same role is played by the Heisenberg uncertainty principle in constructing the hydrogen atom. We discuss how the linear GUP approach solves the entire-evaporation problem. Furthermore, the black hole lifetime can be estimated using another approach: the Salecker-Wigner inequalities. Assuming that the quantum position uncertainty is limited to the minimum wavelength of the measuring signal, Wigner's second inequality can be obtained. If the spread of the quantum clock is limited to some minimum value, then the modified black hole lifetime can be deduced. Based on the linear GUP approach, the resulting lifetime difference depends on the black hole's relative mass, and the difference between the black hole mass with and without GUP is not negligible.
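
For orientation, the uncorrected (Heisenberg-limit) quantities that the GUP terms modify are the textbook Hawking temperature and Bekenstein-Hawking entropy; these standard formulas are reproduced here for reference and are not taken from the paper itself:

```latex
T_H = \frac{\hbar c^{3}}{8\pi G M k_B}, \qquad
S_{BH} = \frac{k_B c^{3} A}{4 G \hbar} = \frac{4\pi k_B G M^{2}}{\hbar c},
\qquad A = \frac{16\pi G^{2} M^{2}}{c^{4}}.
```

Since T_H is proportional to 1/M, the temperature diverges as the hole evaporates (M approaching 0); this is the runaway behaviour that the GUP-induced remnant removes in the abstract's account.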

  14. Improvements to the RELAP5/MOD3 reflood model and uncertainty quantification of reflood peak clad temperature

    International Nuclear Information System (INIS)

    Lee, Sang Yong; Chung, Bob Dong; Lee, Young Jin; Park, Chan Eok; Lee, Guy Hyung; Choi, Chul Jin

    1994-06-01

    This research aims to develop a reliable, advanced system thermal-hydraulic computer code and to quantify its uncertainties in order to introduce a best-estimate methodology of ECCS analysis for LBLOCA. Although RELAP5/MOD3.1, one of the best-estimate codes, was introduced from the USNRC, several deficiencies were found in its reflood model and some improvements have been made. The improvements consist of modification of the reflood wall heat transfer package and adjustment of the drop size in the dispersed flow regime. Time smoothing of wall vaporization and a level tracking model are also added to eliminate the pressure spike and level oscillation. For the verification of the improved model and quantification of the associated uncertainty, the FLECHT-SEASET data were used and the upper limit of uncertainty at the 95% confidence level is evaluated. (Author) 30 refs., 49 figs., 2 tabs

  15. Improvements to the RELAP5/MOD3 reflood model and uncertainty quantification of reflood peak clad temperature

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sang Yong; Chung, Bob Dong; Lee, Young Jin; Park, Chan Eok; Lee, Guy Hyung; Choi, Chul Jin [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    This research aims to develop a reliable, advanced system thermal-hydraulic computer code and to quantify its uncertainties in order to introduce a best-estimate methodology of ECCS analysis for LBLOCA. Although RELAP5/MOD3.1, one of the best-estimate codes, was introduced from the USNRC, several deficiencies were found in its reflood model and some improvements have been made. The improvements consist of modification of the reflood wall heat transfer package and adjustment of the drop size in the dispersed flow regime. Time smoothing of wall vaporization and a level tracking model are also added to eliminate the pressure spike and level oscillation. For the verification of the improved model and quantification of the associated uncertainty, the FLECHT-SEASET data were used and the upper limit of uncertainty at the 95% confidence level is evaluated. (Author) 30 refs., 49 figs., 2 tabs.

  16. Gaussian process regression for sensor networks under localization uncertainty

    Science.gov (United States)

    Jadaliha, M.; Xu, Yunfei; Choi, Jongeun; Johnson, N.S.; Li, Weiming

    2013-01-01

    In this paper, we formulate Gaussian process regression with observations under the localization uncertainty that arises in resource-constrained sensor networks. In our formulation, the effects of observations, measurement noise, localization uncertainty, and prior distributions are all correctly incorporated in the posterior predictive statistics. The analytically intractable posterior predictive statistics are approximated by two techniques, viz., Monte Carlo sampling and Laplace's method. These approximation techniques have been carefully tailored to our problems, and their approximation error and complexity are analyzed. A simulation study demonstrates that the proposed approaches perform much better than approaches that do not properly consider the localization uncertainty. Finally, we have applied the proposed approaches to experimentally collected real data from a dye concentration field over a section of a river and a temperature field of an outdoor swimming pool, to provide proof-of-concept tests and to evaluate the proposed schemes in real situations. In both simulation and experimental results, the proposed methods outperform the quick-and-dirty solutions often used in practice.
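
The Monte Carlo idea in this abstract can be sketched as follows: when sensor positions are only known up to noise, the posterior predictive mean can be approximated by averaging ordinary GP predictions over locations sampled from the position distribution. This is a minimal illustration, not the paper's implementation; the RBF kernel, its hyperparameters, and the Gaussian location-noise model are assumptions.

```python
import numpy as np

def gp_predict(X, y, Xstar, ell=1.0, sf2=1.0, noise=0.1):
    """Standard GP regression predictive mean with an RBF kernel."""
    def k(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return sf2 * np.exp(-0.5 * d2 / ell**2)
    K = k(X, X) + noise * np.eye(len(X))        # noisy training covariance
    return k(Xstar, X) @ np.linalg.solve(K, y)  # predictive mean at Xstar

def gp_predict_uncertain_loc(X_nominal, y, Xstar, loc_std=0.05, n_mc=200, seed=0):
    """Monte Carlo approximation: marginalize over sensor-location noise."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_mc):
        # Draw one plausible set of true sensor positions and predict with it.
        X_s = X_nominal + rng.normal(0.0, loc_std, X_nominal.shape)
        preds.append(gp_predict(X_s, y, Xstar))
    return np.mean(preds, axis=0)

# Toy 1-D field: 10 sensors observing a sine wave, predicting at x = 0.5.
X = np.linspace(0, 1, 10)[:, None]
y = np.sin(2 * np.pi * X[:, 0])
Xstar = np.array([[0.5]])
m = gp_predict_uncertain_loc(X, y, Xstar)
```

Laplace's method (the paper's second technique) would instead expand the predictive integral around the mode of the location posterior; the Monte Carlo variant above is the simpler of the two to write down.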

  17. Analysis of uncertainty propagation in nuclear fuel cycle scenarios

    International Nuclear Information System (INIS)

    Krivtchik, Guillaume

    2014-01-01

    Nuclear scenario studies model a nuclear fleet over a given period. They enable the comparison of different options for the evolution of the reactor fleet and the management of future fuel cycle materials, from mining to disposal, based on criteria such as installed capacity per reactor technology and mass inventories and flows in the fuel cycle and in the waste. Uncertainties associated with nuclear data and scenario parameters (fuel, reactor and facility characteristics) propagate along the isotopic chains in depletion calculations and throughout the scenario history, which reduces the precision of the results. The aim of this work is to develop, implement and use a stochastic uncertainty propagation methodology adapted to scenario studies. The chosen method is based on the development of depletion computation surrogate models, which reduce the computation time of scenario studies and whose parameters include perturbations of the depletion model, and on the fabrication of an equivalence model which takes cross-section perturbations into account in the computation of fresh fuel enrichment. The uncertainty propagation methodology is then applied to different scenarios of interest, considering different evolution options for the French PWR fleet with SFR deployment. (author) [fr

  18. Effects of creep and oxidation on reduced modulus in high-temperature nanoindentation

    International Nuclear Information System (INIS)

    Li, Yan; Fang, Xufei; Lu, Siyuan; Yu, Qingmin; Hou, Guohui; Feng, Xue

    2016-01-01

    Nanoindentation tests were performed on single crystal Ni-based superalloy at temperatures ranging from 20 °C to 800 °C in inert environment. Load-displacement curves at temperatures higher than 500 °C exhibit obvious creep inferred by increasing displacements at load-holding segments. Load-displacement curves obtained at 800 °C also display negative unloading stiffness. Examination of the microstructure beneath the indented area using Transmission Electron Microscope (TEM) reveals abundant dislocation piling up as well as oxide formation on the substrate. A method considering the creep effect is proposed to calculate the reduced modulus. In addition, a dimensionless ratio relating indentation depth and oxide film thickness is introduced to explain the oxidation effect on the mechanical properties derived from the load-displacement curves.

  19. Concept of uncertainty in relation to the foresight research

    Directory of Open Access Journals (Sweden)

    Magruk Andrzej

    2017-03-01

    Full Text Available Uncertainty is one of the most important features of many areas of social and economic life, especially in the forward-looking context. On the one hand, the degree of uncertainty is associated with the objective randomness of the phenomenon, and on the other, with the subjective perspective of a man. Future-oriented perception of human activities is burdened by the incomplete specification of the analysed phenomena, their volatility, and their lack of continuity. A man is unable to determine, with complete certainty, the further course of these phenomena. According to the author of this article, in order to significantly reduce the uncertainty while making strategic decisions in a complex environment, we should focus our actions on the future through systemic foresight research. This article attempts to answer the following research questions: (1) What is the relationship between foresight studies, in the system perspective, and studies of uncertainty? (2) What classes of foresight methods enable the study of uncertainty in the process of systemic inquiry into the future? The study is based on deductive reasoning supported by the analysis and criticism of the literature.

  20. Information Theory for Correlation Analysis and Estimation of Uncertainty Reduction in Maps and Models

    Directory of Open Access Journals (Sweden)

    J. Florian Wellmann

    2013-04-01

    Full Text Available The quantification and analysis of uncertainties is important in all cases where maps and models of uncertain properties are the basis for further decisions. Once these uncertainties are identified, the logical next step is to determine how they can be reduced. Information theory provides a framework for the analysis of spatial uncertainties when different subregions are considered as random variables. In the work presented here, joint entropy, conditional entropy, and mutual information are applied for a detailed analysis of spatial uncertainty correlations. The aim is to determine (i which areas in a spatial analysis share information, and (ii where, and by how much, additional information would reduce uncertainties. As an illustration, a typical geological example is evaluated: the case of a subsurface layer with uncertain depth, shape and thickness. Mutual information and multivariate conditional entropies are determined based on multiple simulated model realisations. Even for this simple case, the measures not only provide a clear picture of uncertainties and their correlations but also give detailed insights into the potential reduction of uncertainties at each position, given additional information at a different location. The methods are directly applicable to other types of spatial uncertainty evaluations, especially where multiple realisations of a model simulation are analysed. In summary, the application of information theoretic measures opens up the path to a better understanding of spatial uncertainties, and their relationship to information and prior knowledge, for cases where uncertain property distributions are spatially analysed and visualised in maps and models.
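
The measures named in this abstract (joint entropy, conditional entropy, mutual information) can be estimated directly from an ensemble of model realisations by discretising the uncertain property at two locations and counting joint frequencies. The sketch below is a generic plug-in estimator under that assumption, not the authors' code; the bin count and the toy "two correlated cells" example are illustrative choices.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def info_measures(a, b, bins=4):
    """Return H(A), H(B), H(A,B), H(A|B), I(A;B) from paired samples."""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pj = joint / joint.sum()          # joint distribution of the two cells
    pa, pb = pj.sum(1), pj.sum(0)     # marginals
    H_a, H_b, H_ab = entropy(pa), entropy(pb), entropy(pj.ravel())
    # H(A|B) = H(A,B) - H(B);  I(A;B) = H(A) + H(B) - H(A,B)
    return H_a, H_b, H_ab, H_ab - H_b, H_a + H_b - H_ab

# Toy ensemble: layer depth at two nearby positions, strongly correlated.
rng = np.random.default_rng(1)
depth_at_x1 = rng.normal(size=5000)
depth_at_x2 = depth_at_x1 + 0.3 * rng.normal(size=5000)
Ha, Hb, Hab, H_a_given_b, mi = info_measures(depth_at_x1, depth_at_x2)
```

In the paper's reading, a large I(A;B) means an observation at location B would substantially reduce the remaining uncertainty H(A|B) at location A.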

  1. Epistemic Uncertainty in Evaluation of Evapotranspiration and Net Infiltration Using Analogue Meteorological Data

    Energy Technology Data Exchange (ETDEWEB)

    B. Faybishenko

    2006-09-01

    Uncertainty is typically defined as a potential deficiency in the modeling of a physical process, owing to a lack of knowledge. Uncertainty can be categorized as aleatoric (inherent uncertainty caused by the intrinsic randomness of the system) or epistemic (uncertainty caused by using various model simplifications and their parameters). One of the main reasons for model simplifications is a limited amount of meteorological data. This paper is devoted to the quantification of epistemic uncertainty in two components of the hydrologic balance: evapotranspiration and net infiltration, for the interglacial (present-day) climate and the future monsoon, glacial transition, and glacial climates at Yucca Mountain, using data from analogue meteorological stations. In particular, the author analyzes semi-empirical models used for evaluating (1) reference-surface potential evapotranspiration, including temperature-based models (Hargreaves-Samani, Thornthwaite, Hamon, Jensen-Haise, and Turc) and radiation-based models (Priestley-Taylor and Penman), and (2) surface-dependent potential evapotranspiration (Penman-Monteith and Shuttleworth-Wallace models). Evapotranspiration predictions are then used as inputs for the evaluation of net infiltration using the semi-empirical models of Budyko, Fu, Milly, Turc-Pike, and Zhang. Results show that net infiltration ranges are expected to generally increase from the present-day climate to the monsoon climate, to the glacial transition climate, and then to the glacial climate. The propagation of uncertainties through model predictions for different climates is characterized using statistical measures. 
Predicted evapotranspiration ranges are reasonably corroborated against the data from Class A pan evaporometers (taking into account evaporation-pan adjustment coefficients), and ranges of net infiltration predictions are corroborated against the geochemical and temperature-based estimates of groundwater recharge and percolation rates through the unsaturated
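
Of the temperature-based models this record names, the Hargreaves-Samani equation is simple enough to state in full: ET0 [mm/day] = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin), where Ra is extraterrestrial radiation expressed as an evaporation equivalent in mm/day. The sketch below implements that standard published form; the numerical inputs are illustrative assumptions, not Yucca Mountain data.

```python
import math

def hargreaves_samani(t_mean_c, t_max_c, t_min_c, ra_mm_day):
    """Hargreaves-Samani reference-surface potential ET in mm/day.

    t_mean_c, t_max_c, t_min_c: daily mean/max/min air temperature [deg C]
    ra_mm_day: extraterrestrial radiation as evaporation equivalent [mm/day]
    """
    if t_max_c < t_min_c:
        raise ValueError("t_max_c must be >= t_min_c")
    return 0.0023 * ra_mm_day * (t_mean_c + 17.8) * math.sqrt(t_max_c - t_min_c)

# Illustrative inputs only (hypothetical station day).
et0 = hargreaves_samani(t_mean_c=20.0, t_max_c=28.0, t_min_c=12.0, ra_mm_day=12.0)
```

Comparing such a temperature-based estimate against a radiation-based one (e.g. Priestley-Taylor) for the same inputs is one concrete way the spread the paper calls epistemic uncertainty shows up.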

  2. Epistemic Uncertainty in Evaluation of Evapotranspiration and Net Infiltration Using Analogue Meteorological Data

    International Nuclear Information System (INIS)

    B. Faybishenko

    2006-01-01

    Uncertainty is typically defined as a potential deficiency in the modeling of a physical process, owing to a lack of knowledge. Uncertainty can be categorized as aleatoric (inherent uncertainty caused by the intrinsic randomness of the system) or epistemic (uncertainty caused by using various model simplifications and their parameters). One of the main reasons for model simplifications is a limited amount of meteorological data. This paper is devoted to the quantification of epistemic uncertainty in two components of the hydrologic balance: evapotranspiration and net infiltration, for the interglacial (present-day) climate and the future monsoon, glacial transition, and glacial climates at Yucca Mountain, using data from analogue meteorological stations. In particular, the author analyzes semi-empirical models used for evaluating (1) reference-surface potential evapotranspiration, including temperature-based models (Hargreaves-Samani, Thornthwaite, Hamon, Jensen-Haise, and Turc) and radiation-based models (Priestley-Taylor and Penman), and (2) surface-dependent potential evapotranspiration (Penman-Monteith and Shuttleworth-Wallace models). Evapotranspiration predictions are then used as inputs for the evaluation of net infiltration using the semi-empirical models of Budyko, Fu, Milly, Turc-Pike, and Zhang. Results show that net infiltration ranges are expected to generally increase from the present-day climate to the monsoon climate, to the glacial transition climate, and then to the glacial climate. The propagation of uncertainties through model predictions for different climates is characterized using statistical measures. 
Predicted evapotranspiration ranges are reasonably corroborated against the data from Class A pan evaporometers (taking into account evaporation-pan adjustment coefficients), and ranges of net infiltration predictions are corroborated against the geochemical and temperature-based estimates of groundwater recharge and percolation rates through the unsaturated

  3. Quantification of the impact of precipitation spatial distribution uncertainty on predictive uncertainty of a snowmelt runoff model

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product of observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. 
Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed
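
The first-order and total effects described above can be estimated with the standard pick-and-freeze (Saltelli/Jansen) estimators. The sketch below applies them to a toy linear model rather than the snowmelt runoff model, so the analytic indices are known and the estimates can be checked; the sample size and coefficients are illustrative choices.

```python
import numpy as np

def sobol_indices(model, d, n=2**14, seed=0):
    """Estimate first-order (S1) and total (ST) Sobol indices.

    model: vectorized function mapping an (n, d) sample array to n outputs,
    with inputs assumed independent and uniform on [0, 1].
    """
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n, d))
    B = rng.uniform(size=(n, d))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S1, ST = np.empty(d), np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]            # "freeze" all inputs except column i
        fABi = model(ABi)
        S1[i] = np.mean(fB * (fABi - fA)) / var        # first-order effect
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var  # total effect (Jansen)
    return S1, ST

# Toy additive model: analytic S1_i = c_i^2 / sum(c^2) = 16/21, 4/21, 1/21.
coef = np.array([4.0, 2.0, 1.0])
S1, ST = sobol_indices(lambda X: X @ coef, d=3)
```

For the additive toy model S1 and ST coincide; in the runoff model a gap between them would indicate interactions among the precipitation factors FPi.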

  4. Assessing climate adaptation options and uncertainties for cereal systems in West Africa

    Science.gov (United States)

    Guan, K.; Sultan, B.; Biasutti, M.; Lobell, D. B.

    2015-12-01

    The already fragile agricultural production system in West Africa faces further challenges in meeting food security in the coming decades, primarily due to a fast-increasing population and the risks of climate change. Successful adaptation of agriculture should not only provide benefits under the current climate but should also reduce negative (or enhance positive) impacts of climate change. Assessment of various possible adaptation options and their uncertainties provides key information for prioritizing adaptation investments. Here, based on several robust aspects of climate projections in this region (i.e. temperature increases and rainfall pattern shifts), we use two well-validated crop models (i.e. APSIM and SARRA-H) and an ensemble of downscaled climate forcing to assess five possible and realistic adaptation options (late sowing, intensification, thermal time increase, water harvesting and increased resilience to heat stress) in West Africa for production of the staple crop sorghum. We adopt a new assessment framework to account for both the impacts of adaptation options in the current climate and their ability to reduce the impacts of future climate change, and also consider changes in both mean yield and its variability. Our results reveal that most proposed "adaptation options" are no more beneficial in the future than in the current climate, i.e. they do not actually reduce the impacts of climate change. Increased temperature resilience during the grain number formation period is the main adaptation that emerges. We also find that changing from the traditional to a modern cultivar, and later sowing in the West Sahel, appear to be robust adaptations.

  5. Real Options Effect of Uncertainty and Labor Demand Shocks on the Housing Market

    OpenAIRE

    Lee, Gabriel; Nguyen Thanh, Binh; Strobel, Johannes

    2016-01-01

    This paper shows that uncertainty affects the housing market in two significant ways. First, uncertainty shocks adversely affect housing prices but not the quantities that are traded. Controlling for a broad set of variables in fixed-effects regressions, we find that uncertainty shocks reduce housing prices and median sales prices by 1.4% and 1.8%, respectively, but the effect is not statistically significant for the percentage changes of all homes sold. Second, when...

  6. Towards constraining extreme temperature projections of the CMIP5 ensemble

    Science.gov (United States)

    Vogel, Martha-Marie; Orth, René; Seneviratne, Sonia Isabelle

    2016-04-01

    The frequency and intensity of heat waves is expected to change in future in response to global warming. Given the severe impacts of heat waves on ecosystems and society it is important to understand how and where they will intensify. Projections of extreme hot temperatures in the IPCC AR5 model ensemble show large uncertainties for projected changes of extreme temperatures in particular in Central Europe. In this region land-atmosphere coupling can contribute substantially to the development of heat waves. This coupling is also subject to change in future, while model projections display considerable spread. In this work we link projections of changes in extreme temperatures and of changes in land-atmosphere interactions with a particular focus on Central Europe. Uncertainties in projected extreme temperatures can be partly explained by different projected changes of the interplay between latent heat and temperature as well as soil moisture. Given the considerable uncertainty in land-atmosphere coupling representation already in the current climate, we furthermore employ observational data sets to constrain the model ensemble, and consequently the extreme temperature projections.

  7. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.

  8. Evaluation method for uncertainty of effective delayed neutron fraction βeff

    International Nuclear Information System (INIS)

    Zukeran, Atsushi

    1999-01-01

    Uncertainty of the effective delayed neutron fraction βeff is evaluated in terms of three quantities: the uncertainties of the basic delayed neutron constants, the energy dependence of the delayed neutron yield νdm, and the uncertainties of the fission cross sections of the fuel elements. The uncertainty of βeff due to the delayed neutron yield is expressed by a linearized formula assuming that the delayed neutron yield does not depend on the incident energy, and the energy dependence is supplemented by using the detailed energy dependence proposed by D'Angelo and Filip. The third quantity, the uncertainties of the fission cross sections, is evaluated on the basis of the generalized perturbation theory in relation to reaction rate ratios such as central spectral indices or average reaction rate ratios. The resultant uncertainty of βeff is about 4 to 5%, in which the primary factor is the delayed neutron yield and the secondary one is the fission cross section uncertainty, especially for 238U. The energy dependence of νdm systematically reduces the magnitude of βeff by about 1.4% to 1.7%, depending on the model of the energy vs. νdm correlation curve. (author)

  9. Self-Uncertainty and the Influence of Alternative Goals on Self-Regulation.

    Science.gov (United States)

    Light, Alysson E; Rios, Kimberly; DeMarree, Kenneth G

    2018-01-01

    The current research examines factors that facilitate or undermine goal pursuit. Past research indicates that attempts to reduce self-uncertainty can result in increased goal motivation. We explore a critical boundary condition of this effect-the presence of alternative goals. Though self-regulatory processes usually keep interest in alternative goals in check, uncertainty reduction may undermine these self-regulatory efforts by (a) reducing conflict monitoring and (b) increasing valuation of alternative goals. As such, reminders of alternative goals will draw effort away from focal goals for self-uncertain (but not self-certain) participants. Across four studies and eight supplemental studies, using different focal goals (e.g., academic achievement, healthy eating) and alternative goals (e.g., social/emotional goals, attractiveness, indulgence), we found that alternative goal salience does not negatively influence goal-directed behavior among participants primed with self-certainty, but that reminders of alternative goals undermine goal pursuit among participants primed with self-uncertainty.

  10. Trapped between two tails: trading off scientific uncertainties via climate targets

    International Nuclear Information System (INIS)

    Lemoine, Derek; McJeon, Haewon C

    2013-01-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming. (letter)

  11. Trapped between two tails: trading off scientific uncertainties via climate targets

    Science.gov (United States)

    Lemoine, Derek; McJeon, Haewon C.

    2013-09-01

    Climate change policies must trade off uncertainties about future warming, about the social and ecological impacts of warming, and about the cost of reducing greenhouse gas emissions. We show that laxer carbon targets produce broader distributions for climate damages, skewed towards severe outcomes. However, if potential low-carbon technologies fill overlapping niches, then more stringent carbon targets produce broader distributions for the cost of reducing emissions, skewed towards high-cost outcomes. We use the technology-rich GCAM integrated assessment model to assess the robustness of 450 and 500 ppm carbon targets to each uncertain factor. The 500 ppm target provides net benefits across a broad range of futures. The 450 ppm target provides net benefits only when impacts are greater than conventionally assumed, when multiple technological breakthroughs lower the cost of abatement, or when evaluated with a low discount rate. Policy evaluations are more sensitive to uncertainty about abatement technology and impacts than to uncertainty about warming.

  12. Scientific uncertainties and climate risks

    International Nuclear Information System (INIS)

    Petit, M.

    2005-01-01

    Human activities have induced a significant change in the Earth's atmospheric composition and, most likely, this trend will intensify throughout the coming decades. During recent decades, the mean temperature has in fact increased by the expected amount. Moreover, the geographical distribution of the warming and the day-to-night temperature variation have evolved as predicted. The magnitude of these changes is relatively small for the time being, but is expected to increase alarmingly during the coming decades. Greenhouse warming is a representative example of the problems of sustainable development: long-term risks can be estimated on a rational basis from scientific laws alone, but the non-specialist is generally not prepared to follow the steps required. Nevertheless, the non-specialist obviously has the right to decide about his way of life and the inheritance he would like to leave his children, and it is preferable that he be fully informed before making those decisions. Dialogue, mutual understanding and confidence must prevail between science and society to avoid irrational actions. Controversy among experts is quite frequent; in the case of greenhouse warming, a commendable collective expertise has drastically reduced possible confusion. The Intergovernmental Panel on Climate Change (IPCC) was created jointly by the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP). Its reports evaluate the state of knowledge on past and future global climate changes, their impacts, and the possibility of controlling anthropogenic emissions. The main targeted readers are, nevertheless, non-specialists, who should be made aware of results deduced from approaches that they may not be able to follow step by step. Moreover, these results, in particular future projections, are, and will remain, subject to some uncertainty, which a fair description of the state of knowledge must include. Many misunderstandings between writers and readers can

  13. Probabilistic Modeling of High-Temperature Material Properties of a 5-Harness 0/90 Sylramic Fiber/ CVI-SiC/ MI-SiC Woven Composite

    Science.gov (United States)

    Nagpal, Vinod K.; Tong, Michael; Murthy, P. L. N.; Mital, Subodh

    1998-01-01

    An integrated probabilistic approach has been developed to assess composites for high temperature applications. This approach was used to determine the thermal and mechanical properties, and their probabilistic distributions, of a 5-harness 0/90 Sylramic fiber/CVI-SiC/MI-SiC woven Ceramic Matrix Composite (CMC) at high temperatures. The purpose of developing this approach was to generate quantitative probabilistic information on this CMC to help complete the evaluation of its potential application for the HSCT combustor liner. The approach quantified the influences of uncertainties inherent in constituent properties, called primitive variables, on selected key response variables of the CMC at 2200 °F. The quantitative information is presented in the form of Cumulative Distribution Functions (CDFs), Probability Density Functions (PDFs), and primitive variable sensitivities of the response. Results indicate that the scatter in the response variables was reduced by 30-50% when the uncertainties in the most influential primitive variables were reduced by 50%.
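
    The core of such a probabilistic assessment can be sketched as a Monte Carlo loop: sample the primitive variables from assumed distributions, push them through a response model, and examine the resulting scatter. The rule-of-mixtures response function and all parameter values below are toy stand-ins, not the model used in the report:

```python
import random
import statistics

def response(e_fiber, e_matrix):
    # Toy rule-of-mixtures stand-in for a composite property model.
    return 0.6 * e_fiber + 0.4 * e_matrix

def response_scatter(sd_fiber, sd_matrix, n=20000, seed=0):
    """Monte Carlo: sample primitive variables, return the response scatter."""
    rng = random.Random(seed)
    vals = [response(rng.gauss(300.0, sd_fiber), rng.gauss(200.0, sd_matrix))
            for _ in range(n)]
    return statistics.stdev(vals)

full = response_scatter(30.0, 20.0)
halved = response_scatter(15.0, 20.0)  # halve the dominant input uncertainty
print(halved < full)  # narrowing the influential input narrows the response
```

    Halving the uncertainty of the most influential primitive variable visibly narrows the response distribution, which is the qualitative effect the abstract reports.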

  14. Benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X

    International Nuclear Information System (INIS)

    Aures, A.; Bostelmann, F.; Hursin, M.; Leray, O.

    2017-01-01

    Highlights: • Application of the uncertainty analysis methods XSUSA and SHARK-X. • Propagation of nuclear data uncertainty through a PWR pin cell depletion calculation. • Uncertainty quantification of the eigenvalue, nuclide densities and Doppler coefficient. • Top contributors to overall output uncertainty identified by sensitivity analysis. • Comparison with the SAMPLER and TSUNAMI modules of the SCALE code package. - Abstract: This study presents collaborative work performed by GRS and PSI on benchmarking and application of the state-of-the-art uncertainty analysis methods XSUSA and SHARK-X. Applied to a PWR pin cell depletion calculation, both methods propagate input uncertainty from nuclear data to output uncertainty. The uncertainties of the multiplication factor, nuclide densities, and fuel temperature coefficients derived by both methods are compared at various burnup steps. These quantities are furthermore compared against the SAMPLER module of SCALE 6.2. The perturbation-theory-based TSUNAMI module of both SCALE 6.1 and SCALE 6.2 is additionally applied for comparisons of the reactivity coefficient.

  15. Uncertainty in social dilemmas

    OpenAIRE

    Kwaadsteniet, Erik Willem de

    2007-01-01

    This dissertation focuses on social dilemmas and, more specifically, on environmental uncertainty in these dilemmas. Real-life social dilemma situations are often characterized by uncertainty. For example, fishermen mostly do not know the exact size of the fish population (i.e., resource size uncertainty). Several researchers have therefore asked how such uncertainty influences people's choice behavior. These researchers have repeatedly concluded that uncertainty...

  16. Uncertainty theory

    CERN Document Server

    Liu, Baoding

    2015-01-01

    When no samples are available to estimate a probability distribution, we have to invite some domain experts to evaluate the belief degree that each event will happen. Perhaps some people think that the belief degree should be modeled by subjective probability or fuzzy set theory. However, it is usually inappropriate because both of them may lead to counterintuitive results in this case. In order to rationally deal with belief degrees, uncertainty theory was founded in 2007 and subsequently studied by many researchers. Nowadays, uncertainty theory has become a branch of axiomatic mathematics for modeling belief degrees. This is an introductory textbook on uncertainty theory, uncertain programming, uncertain statistics, uncertain risk analysis, uncertain reliability analysis, uncertain set, uncertain logic, uncertain inference, uncertain process, uncertain calculus, and uncertain differential equation. This textbook also shows applications of uncertainty theory to scheduling, logistics, networks, data mining, c...

  17. Assessment of uncertainties associated with characterization of geological environment in the Tono area. Japanese fiscal year, 2006 (Contract research)

    International Nuclear Information System (INIS)

    Toida, Masaru; Suyama, Yasuhiro; Seno, Shoji; Atsumi, Hiroyuki; Ogata, Nobuhisa

    2008-03-01

    'Geoscientific research' performed at the Tono Geoscience Center develops site investigation, characterization and assessment techniques for understanding the geological environment. Two important themes are to establish a methodology for analyzing uncertainties in a heterogeneous geological environment, and to develop investigation techniques for reducing those uncertainties efficiently. This study proposes a new approach in which all the possible options in the models and data-sets that cannot be excluded in the light of the available evidence are identified. This approach enables the uncertainties associated with the understanding at a given stage of site characterization to be made explicit using an uncertainty analysis technique based on fuzzy geostatistics. This, in turn, supports the design of the following investigation stage so as to reduce the uncertainties efficiently. In the study, current knowledge was compiled, and the technique was advanced through geological modeling and groundwater analyses in the Tono area. This report systematizes the uncertainty analysis methodology associated with characterization of the geological environment, and organizes the procedure of the methodology with application examples from the study. It also deals with investigation techniques for reducing the uncertainties efficiently, and with underground facility design options for handling geological uncertainties based on the characterization of the geological environment. (author)

  18. Liquidus Temperature Data for DWPF Glass

    International Nuclear Information System (INIS)

    Piepel, G.F.; Vienna, J.D.; Crum, J.V.; Mika, M.; Hrma, P.

    1999-01-01

    This report provides new liquidus temperature (T_L) versus composition data that can be used to reduce uncertainty in T_L calculation for DWPF glass. According to the test plan and test matrix design, PNNL measured T_L for 53 glasses within and just outside of the current DWPF processing composition window. The T_L database generated under this task will directly support developing and enhancing the current T_L process-control model. Preliminary calculations have shown a high probability of increasing HLW loading in glass produced at the SRS and Hanford. This increase in waste loading will decrease life-cycle tank cleanup costs by decreasing process time and the volume of waste glass produced.

  19. Uncertainty Propagation in Hypersonic Vehicle Aerothermoelastic Analysis

    Science.gov (United States)

    Lamorte, Nicolas Etienne

    Hypersonic vehicles face a challenging flight environment. The aerothermoelastic analysis of their components requires numerous simplifying approximations. Identifying and quantifying the effect of uncertainties pushes the limits of existing deterministic models, and is pursued in this work. An uncertainty quantification framework is used to propagate the effects of identified uncertainties on the stability margins and performance of the different systems considered. First, the aeroelastic stability of a typical section representative of a control surface on a hypersonic vehicle is examined. Variability in the uncoupled natural frequencies of the system is modeled to mimic the effect of aerodynamic heating. Next, the stability of an aerodynamically heated panel representing a component of the skin of a generic hypersonic vehicle is considered. Uncertainty in the location of the transition from laminar to turbulent flow and in the heat flux prediction is quantified using CFD. In both cases, significant reductions of the stability margins are observed. A loosely coupled airframe-integrated scramjet engine is considered next. The elongated body and cowl of the engine flow path are subject to harsh aerothermodynamic loading, which causes them to deform. The uncertainty associated with deformation prediction is propagated to the engine performance analysis. The cowl deformation is the main contributor to the sensitivity of the propulsion system performance. Finally, a framework for aerothermoelastic stability boundary calculation for hypersonic vehicles using CFD is developed. The use of CFD makes it possible to consider different turbulence conditions, laminar or turbulent, and different models of the air mixture, in particular a real gas model that accounts for the dissociation of molecules at high temperature. The system is found to be sensitive to turbulence modeling as well as to the location of the transition from laminar to turbulent flow. Real gas effects play a minor role in the

  20. Can slow-diffusing solute atoms reduce vacancy diffusion in advanced high-temperature alloys?

    International Nuclear Information System (INIS)

    Goswami, Kamal Nayan; Mottura, Alessandro

    2014-01-01

    The high-temperature mechanical properties of precipitate-strengthened advanced alloys can be heavily influenced by adjusting chemical composition. The widely-accepted argument within the community is that, under certain temperature and loading conditions, plasticity occurs only in the matrix, and dislocations have to rely on thermally-activated climb mechanisms to overcome the barriers to glide posed by the hard precipitates. This is the case for γ′-strengthened Ni-based superalloys. The presence of dilute amounts of slow-diffusing solute atoms, such as Re and W, in the softer matrix phase is thought to reduce plasticity by retarding the climb of dislocations at the interface with the hard precipitate phase. One hypothesis is that the presence of these solutes must hinder the flow of vacancies, which are essential to the climb process. In this work, density functional theory calculations are used to inform two analytical models to describe the effect of solute atoms on the diffusion of vacancies. Results suggest that slow-diffusing solute atoms are not effective at reducing the diffusion of vacancies in these systems

  1. Smoke flow temperature beneath tunnel ceiling for train fire at subway station: Reduced-scale experiments and correlations

    International Nuclear Information System (INIS)

    Meng, Na; Wang, Qiang; Liu, Zhaoxia; Li, Xiao; Yang, He

    2017-01-01

    Highlights: • Reduced-scale experiments on train fire at subway station. • Smoke flow temperature beneath tunnel ceiling measured and correlated. • Effect of platform-tunnel conjunction door type on smoke temperature is clarified. - Abstract: This paper investigates the smoke flow temperature beneath the tunnel ceiling for a train on fire stopped beside a subway station. Experiments were carried out in a reduced-scale (1:10) subway station model to study the maximum smoke temperature and the longitudinal temperature distribution beneath the tunnel ceiling, considering platform-tunnel conjunction doors of two types: the full-seal platform screen door (PSD) and the full-height safety door. The maximum temperature beneath the tunnel ceiling is found to be well correlated non-dimensionally with heat release rate by a 3.65 and a 2.92 power-law function for the full-seal platform screen door and the full-height safety door, respectively. The longitudinal temperature distribution along the tunnel ceiling can be well correlated by an exponential function for both types of platform-tunnel conjunction doors. Concerning the effect of the door type, the maximum temperature is lower and the longitudinal temperature decays faster for the full-height safety door than for the full-seal PSD. This is because, with the full-height safety door, the effective width of the tunnel ceiling is widened, which results in more heat loss from the smoke flow to the ceiling.
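
    A correlation of this power-law form can be recovered from measurements by linear regression in log-log space. The sketch below uses synthetic noiseless data in place of the experimental values; the 3.65 exponent comes from the abstract, while the prefactor 2.0 and the data points are arbitrary:

```python
import math

def fit_power_law(x, y):
    """Least-squares fit of y = a * x**b via linear regression on log-log data."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((u - mx) * (v - my) for u, v in zip(lx, ly)) \
        / sum((u - mx) ** 2 for u in lx)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data generated from T = 2.0 * Q**3.65 (no noise, so the fit is exact).
Q = [0.5, 1.0, 1.5, 2.0]
T = [2.0 * q ** 3.65 for q in Q]
a, b = fit_power_law(Q, T)
print(round(a, 2), round(b, 2))  # -> 2.0 3.65
```

    On real (noisy) data the same regression gives the best-fit prefactor and exponent rather than recovering them exactly.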

  2. Blockchain to Rule the Waves - Nascent Design Principles for Reducing Risk and Uncertainty in Decentralized Environments

    DEFF Research Database (Denmark)

    Nærland, Kristoffer; Müller-Bloch, Christoph; Beck, Roman

    2017-01-01

    Many decentralized, inter-organizational environments such as supply chains are characterized by high transactional uncertainty and risk. At the same time, blockchain technology promises to mitigate these issues by introducing certainty into economic transactions. This paper discusses the findings of a Design Science Research project involving the construction and evaluation of an information technology artifact in collaboration with Maersk, a leading international shipping company, in which central documents in shipping, such as the Bill of Lading, are turned into a smart contract on blockchain. Based on our insights from the project, we provide first evidence for preliminary design principles for applications that aim to mitigate transactional risk and uncertainty in decentralized environments using blockchain. Both the artifact and the first evidence for emerging design principles are novel...

  3. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    Science.gov (United States)

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction, specifically the variability between ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18-member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using the individual ensemble member standard deviation, the inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in these uncertainty metrics. A coordinated view of ribbon- and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and thereby avoiding the WRF parameterizations that lead to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues.
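
    The bootstrapping step can be sketched as a percentile confidence interval for an ensemble statistic, which avoids any normality assumption. The six-member "ensemble" below is a toy stand-in for one grid point's values across WRF members; none of the numbers are from the study:

```python
import random
import statistics

def bootstrap_ci_width(members, n_boot=2000, alpha=0.05, seed=42):
    """Width of a bootstrap percentile confidence interval for the
    ensemble mean -- no normality assumption required."""
    rng = random.Random(seed)
    means = sorted(
        statistics.fmean(rng.choice(members) for _ in members)
        for _ in range(n_boot)
    )
    lo = means[int((alpha / 2) * n_boot)]
    hi = means[int((1 - alpha / 2) * n_boot) - 1]
    return hi - lo

# Toy stand-in for one grid point's temperature (K) across ensemble members.
ensemble = [291.2, 290.8, 292.1, 291.7, 290.5, 291.9]
width = bootstrap_ci_width(ensemble)
print(0 < width < max(ensemble) - min(ensemble))
```

    In the visualization tool, a metric like this would be computed per grid point and mapped to ribbon width or glyph size.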

  4. A cross-coupled-structure-based temperature sensor with reduced process variation sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Tie Meng; Cheng Xu, E-mail: tiemeng@mprc.pku.edu.c [Microprocessor Research and Development Center, Peking University, Beijing 100871 (China)

    2009-04-15

    An innovative, thermally insensitive phenomenon of cascaded cross-coupled structures is reported, and a novel CMOS temperature sensor based on a cross-coupled structure is proposed. This sensor consists of two different ring oscillators. The first ring oscillator generates pulses whose period changes linearly with temperature. Instead of using the system clock as in traditional sensors, the second oscillator utilizes a cascaded cross-coupled structure to generate temperature-independent pulses to capture the result from the first oscillator. Due to the compensation between the two ring oscillators, errors caused by supply voltage variations and systematic process variations are reduced. The layout design of the sensor is based on the TSMC13G process standard cell library. Only three inverters are modified for proper channel width tuning, without any other custom design. This allows for easy integration of the sensor into cell-based chips. Post-layout simulation results show that an error lower than ±1.1 °C can be achieved over the full temperature range from -40 to 120 °C. As shown by SPICE simulations, the thermal insensitivity of the cross-coupled inverters can be realized for various TSMC technologies: 0.25 μm, 0.18 μm, 0.13 μm, and 65 nm.

  5. High Temperature Gas-Cooled Reactor Projected Markets and Preliminary Economics

    Energy Technology Data Exchange (ETDEWEB)

    Larry Demick

    2011-08-01

    This paper summarizes the potential market for process heat produced by a high temperature gas-cooled reactor (HTGR), the environmental benefits that reduced CO2 emissions would bring to these markets, and the typical economics of projects using these applications. It gives examples of HTGR applications to industrial processes in the typical co-generation supply of process heat and electricity, the conversion of coal to transportation fuels and chemical process feedstock, and the production of ammonia as a feedstock for ammonia derivatives, including fertilizer. It also demonstrates how uncertainties in capital costs and financial factors affect the economics of HTGR technology by analyzing the coupled use of HTGR and high temperature steam electrolysis processes to produce hydrogen.

  6. Intense air-sea exchanges and heavy orographic precipitation over Italy: The role of Adriatic sea surface temperature uncertainty

    Science.gov (United States)

    Stocchi, Paolo; Davolio, Silvio

    2017-11-01

    Strong and persistent low-level winds blowing over the Adriatic basin are often associated with intense precipitation events over Italy. Typically, in the case of moist southeasterly wind (Sirocco), rainfall affects northeastern Italy and the Alpine chain, while with cold northeasterly currents (Bora), precipitation is localized along the eastern slopes of the Apennines and the coastal areas of central Italy. These events are favoured by intense air-sea interactions, and it is reasonable to hypothesize that the Adriatic sea surface temperature (SST) can affect the amount and location of precipitation. High-resolution simulations of different Bora and Sirocco events leading to severe precipitation are performed using a convection-permitting model (MOLOCH). Sensitivity experiments varying the SST initialization field are performed with the aim of evaluating the impact of SST uncertainty on precipitation forecasts, a relevant topic for operational weather prediction, especially at local scales. Moreover, diagnostic tools to compute water vapour fluxes across the Italian coast and the atmospheric water budget over the Adriatic Sea have been developed and applied in order to characterize the air mass that feeds the precipitating systems. Finally, investigation of the processes through which the SST influences the location and intensity of heavy precipitation provides a better understanding of the mechanisms conducive to severe weather in the Mediterranean area, and in the Adriatic basin in particular. Results show that the effect of Adriatic SST (uncertainty) on precipitation is complex and can vary considerably among events. For both Bora and Sirocco events, SST does not markedly influence the atmospheric water budget or the degree of moistening of air that flows over the Adriatic Sea. SST mainly affects the stability of the atmospheric boundary layer, thus influencing the flow dynamics and the orographic flow regime, and in turn the precipitation pattern.

  7. Moving Beyond 2% Uncertainty: A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer F.; Clifton, Andrew

    2017-03-08

    Remote sensing of wind using lidar is revolutionizing wind energy. However, current generations of wind lidar are ascribed a climatic value of uncertainty, which is based on a poor description of lidar sensitivity to external conditions. In this presentation, we show how it is important to consider the complete lidar measurement process to define the measurement uncertainty, which in turn offers the ability to define a much more granular and dynamic measurement uncertainty. This approach is a progression from the 'white box' lidar uncertainty method.

  8. Impact of inherent meteorology uncertainty on air quality ...

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainty in environmental modeling systems. Air quality models rely on two key inputs, namely meteorology and emissions. When using air quality models for decision making, it is important to understand how uncertainties in these inputs affect the simulated concentrations. Ensembles are one method to explore how uncertainty in meteorology affects air pollution concentrations. Most studies explore this uncertainty by running different meteorological models, or the same model with different physics options, and in some cases combinations of different meteorological and air quality models. While these have been shown to be useful techniques in some cases, we present a technique that leverages the initial-condition perturbations of a weather forecast ensemble, namely the Short-Range Ensemble Forecast system, to drive four-dimensional data assimilation in the Weather Research and Forecasting (WRF)-Community Multiscale Air Quality (CMAQ) model, with a key focus on the response of ozone chemistry and transport. Results confirm that a sizable spread in WRF solutions, including common weather variables of temperature, wind, boundary layer depth, clouds, and radiation, can cause a relatively large range of ozone mixing ratios. Pollutant transport can be altered by hundreds of kilometers over several days. Ozone mixing ratios of the ensemble can vary by as much as 10–20 ppb

  9. Factoring uncertainty into restoration modeling of in-situ leach uranium mines

    Science.gov (United States)

    Johnson, Raymond H.; Friedel, Michael J.

    2009-01-01

    Postmining restoration is one of the greatest concerns for uranium in-situ leach (ISL) mining operations. The ISL-affected aquifer needs to be returned to the conditions specified in the mining permit (either premining or other specified conditions). When uranium ISL operations are completed, postmining restoration is usually achieved by injecting reducing agents into the mined zone. The objective of this process is to restore the aquifer to premining conditions by reducing the solubility of uranium and other metals in the ground water. Reactive transport modeling is a potentially useful method for simulating the effectiveness of proposed restoration techniques. While reactive transport models can be useful, they are a simplification of reality that introduces uncertainty through the model conceptualization, parameterization, and calibration processes. For this reason, quantifying the uncertainty in simulated temporal and spatial hydrogeochemistry is important for postremedial risk evaluation of metal concentrations and mobility. Quantifying the range of uncertainty in key predictions (such as uranium concentrations at a specific location) can be achieved using forward Monte Carlo or other inverse modeling techniques (trial-and-error parameter sensitivity, calibration-constrained Monte Carlo). These techniques provide simulated values of metal concentrations at specified locations that can be presented as nonlinear uncertainty limits or probability density functions. Decision makers can use these results to better evaluate environmental risk by viewing future metal concentrations as a limited range of possibilities based on a scientific evaluation of uncertainty.
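
    A forward Monte Carlo run of the kind described can be sketched as follows: sample uncertain model inputs, run the model for each sample, and report percentile-based uncertainty limits on the prediction. The retardation-style relation and its parameter distributions below are invented for illustration and have no connection to any actual site geochemistry:

```python
import random
import statistics

def mc_uncertainty_limits(n=10000, seed=1):
    """Forward Monte Carlo: sample uncertain inputs, evaluate a (toy) model,
    and return 5th/50th/95th percentile limits on the predicted concentration."""
    rng = random.Random(seed)

    def predicted_conc(kd, source):
        # Invented retardation-style relation: higher sorption -> lower
        # dissolved concentration. Not a real geochemical model.
        return source / (1.0 + kd)

    samples = sorted(predicted_conc(rng.lognormvariate(0.0, 0.5),
                                    rng.gauss(1.0, 0.1))
                     for _ in range(n))
    return samples[int(0.05 * n)], statistics.median(samples), samples[int(0.95 * n)]

p05, p50, p95 = mc_uncertainty_limits()
print(p05 < p50 < p95)
```

    The (p05, p95) pair is one simple form of the "nonlinear uncertainty limits" the abstract refers to; the full sorted sample set gives the empirical probability density.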

  10. Characterization of geometrical random uncertainty distribution for a group of patients in radiotherapy

    International Nuclear Information System (INIS)

    Munoz Montplet, C.; Jurado Bruggeman, D.

    2010-01-01

    Geometrical random uncertainty in radiotherapy is usually characterized by a single value for each group of patients. We propose a novel approach based on a statistically accurate characterization of the uncertainty distribution, thus reducing the risk of obtaining potentially unsafe results in CTV-PTV margins or in the selection of correction protocols.

  11. Quantifying the sources of uncertainty in an ensemble of hydrological climate-impact projections

    Science.gov (United States)

    Aryal, Anil; Shrestha, Sangam; Babel, Mukand S.

    2018-01-01

    The objective of this paper is to quantify the various sources of uncertainty in the assessment of climate change impacts on hydrology in the Tamakoshi River Basin, located in the north-eastern part of Nepal. Multiple climate and hydrological models were used to simulate future climate conditions and discharge in the basin. The simulated results for future climate and river discharge were analysed to quantify the sources of uncertainty using two-way and three-way ANOVA. The results showed that temperature and precipitation in the study area are projected to change in the near- (2010-2039), mid- (2040-2069) and far-future (2070-2099) periods. The maximum temperature is likely to rise by 1.75 °C under Representative Concentration Pathway (RCP) 4.5 and by 3.52 °C under RCP 8.5. Similarly, the minimum temperature is expected to rise by 2.10 °C under RCP 4.5 and by 3.73 °C under RCP 8.5 by the end of the twenty-first century. Precipitation in the study area is expected to change by -2.15% under RCP 4.5 and -2.44% under RCP 8.5. The future discharge in the study area was projected using two hydrological models, viz. the Soil and Water Assessment Tool (SWAT) and the Hydrologic Engineering Center's Hydrologic Modelling System (HEC-HMS). The SWAT-projected discharge is expected to change by only a small amount, whereas the HEC-HMS model projects considerably lower discharge in the future compared with the baseline period. The results also show that the future climate variables and river hydrology contain uncertainty due to the choice of climate models, RCP scenarios, bias correction methods and hydrological models. During wet days, more uncertainty is observed due to the use of different climate models, whereas during dry days the use of different hydrological models has a greater effect on uncertainty. Inter-comparison of the impacts of different climate models reveals that the REMO climate model shows higher uncertainty in the prediction of precipitation and
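
    The ANOVA-style partitioning of uncertainty sources can be sketched on a toy 2x2 grid of projected discharge changes: the main-effect sum of squares for each factor measures how much of the spread that factor explains. All numbers below are invented, and the real study uses more factors (climate model, RCP, bias correction, hydrological model) and levels:

```python
import statistics

# Toy grid: projected discharge change (%) by climate model x RCP scenario.
data = {
    ("REMO", "rcp45"): -2.0, ("REMO", "rcp85"): -3.5,
    ("RCA4", "rcp45"): -1.0, ("RCA4", "rcp85"): -1.8,
}
models = ["REMO", "RCA4"]
rcps = ["rcp45", "rcp85"]
grand = statistics.fmean(data.values())

def main_effect_ss(levels, axis):
    """Main-effect sum of squares for one factor (axis 0 = model, 1 = RCP)."""
    ss = 0.0
    for lv in levels:
        vals = [v for key, v in data.items() if key[axis] == lv]
        ss += len(vals) * (statistics.fmean(vals) - grand) ** 2
    return ss

ss_model = main_effect_ss(models, 0)
ss_rcp = main_effect_ss(rcps, 1)
print(ss_model > ss_rcp)  # in this toy grid, the climate model choice dominates
```

    Dividing each sum of squares by the total gives the fractional contribution of that uncertainty source, which is how such ANOVA decompositions are usually reported.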

  12. Uncertainty quantification and validation of combined hydrological and macroeconomic analyses.

    Energy Technology Data Exchange (ETDEWEB)

    Hernandez, Jacquelynne; Parks, Mancel Jordan; Jennings, Barbara Joan; Kaplan, Paul Garry; Brown, Theresa Jean; Conrad, Stephen Hamilton

    2010-09-01

    Changes in climate can lead to instabilities in physical and economic systems, particularly in regions with marginal resources. Global climate models indicate increasing global mean temperatures over the decades to come and uncertainty in the local to national impacts means perceived risks will drive planning decisions. Agent-based models provide one of the few ways to evaluate the potential changes in behavior in coupled social-physical systems and to quantify and compare risks. The current generation of climate impact analyses provides estimates of the economic cost of climate change for a limited set of climate scenarios that account for a small subset of the dynamics and uncertainties. To better understand the risk to national security, the next generation of risk assessment models must represent global stresses, population vulnerability to those stresses, and the uncertainty in population responses and outcomes that could have a significant impact on U.S. national security.

  13. Use of 2D/3D data for peak cladding temperature uncertainty studies

    International Nuclear Information System (INIS)

    Boyack, B.E.

    1988-01-01

    In August 1988, the Nuclear Regulatory Commission (NRC) approved the final version of a revised rule on the acceptance of emergency core cooling systems. The revised rule allows emergency core cooling system analysis based on best-estimate methods, provided uncertainties in the prediction of prescribed acceptance limits are quantified and reported. To support the revised rule, the NRC developed the Code Scaling, Applicability, and Uncertainty (CSAU) evaluation methodology. Data from the 2D/3D program have been used in a demonstration of the CSAU methodology in two ways. First, the data were used to identify and quantify biases that are related to the implementation of selected correlations and models in the thermal-hydraulic systems code TRAC-PF1/MOD1 as it is used to calculate the demonstration transient, a large-break loss-of-coolant accident. Second, the data were used in a supportive role to provide insight into the accuracy of code calculations and to confirm conclusions that are drawn regarding specific CSAU studies. Examples are provided illustrating each of these two uses of 2D/3D data. 9 refs., 7 figs

  14. Reducing Uncertainty in the Daycent Model of Heterotrophic Respiration with a More Mechanistic Representation of Microbial Processes.

    Science.gov (United States)

    Berardi, D.; Gomez-Casanovas, N.; Hudiburg, T. W.

    2017-12-01

    Improving the certainty of ecosystem models is essential to ensuring their legitimacy, value, and ability to inform management and policy decisions. With more than a century of research exploring the variables controlling soil respiration, a high level of uncertainty remains in the ability of ecosystem models to accurately estimate respiration under changing climatic conditions. Refining model estimates of soil carbon fluxes is a high priority for climate change scientists to determine whether soils will be carbon sources or sinks in the future. We found that DayCent underestimates heterotrophic respiration by several orders of magnitude for our temperate mixed conifer forest site. While traditional ecosystem models simulate decomposition through first-order kinetics, recent research has found that including microbial mechanisms explains 20 percent more spatial heterogeneity. We modified the DayCent heterotrophic respiration model to include a more mechanistic representation of microbial dynamics and compared the new model with continuous and survey observations from our experimental forest site in the Northern Rockies ecoregion. We also calibrated the model's sensitivity to soil moisture and temperature against our experimental data. We expect to improve the accuracy of the model by 20-30 percent. By using a more representative and calibrated model of soil carbon dynamics, we can better predict feedbacks between climate and soil carbon pools.
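    The contrast between first-order decomposition and a microbial-explicit formulation can be illustrated with a minimal sketch. The functions, parameter values and pool sizes below are hypothetical placeholders (the microbial form follows the common Michaelis-Menten style), not the DayCent equations or the authors' modified model.

```python
def first_order_resp(soil_c, k=0.002):
    """First-order kinetics: respiration is a fixed fraction of the C pool per day."""
    return k * soil_c

def microbial_resp(soil_c, mic_c, vmax=0.05, km=500.0):
    """Microbial-explicit kinetics: respiration scales with microbial biomass
    and saturates in substrate (Michaelis-Menten form)."""
    return vmax * mic_c * soil_c / (km + soil_c)

soil_c = 2000.0  # g C m^-2, hypothetical soil carbon pool
mic_c = 40.0     # g C m^-2, hypothetical microbial biomass pool

print(first_order_resp(soil_c))        # 4.0 g C m^-2 day^-1
print(microbial_resp(soil_c, mic_c))   # 1.6 g C m^-2 day^-1
```

    Unlike the first-order form, the microbial form responds to changes in microbial biomass, which is what lets a dynamic microbial pool alter respiration as temperature and moisture change.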

  15. Socializing Identity Through Practice: A Mixed Methods Approach to Family Medicine Resident Perspectives on Uncertainty.

    Science.gov (United States)

    Ledford, Christy J W; Cafferty, Lauren A; Seehusen, Dean A

    2015-01-01

    Uncertainty is a central theme in the practice of medicine and particularly primary care. This study explored how family medicine resident physicians react to uncertainty in their practice. The study incorporated a two-phase mixed methods approach, including semi-structured personal interviews (n=21) and longitudinal self-report surveys (n=21) with family medicine residents. Qualitative analysis showed that though residents described uncertainty as an implicit part of their identity, they still developed tactics to minimize or manage uncertainty in their practice. Residents described increasing comfort with uncertainty the longer they practiced and anticipated that growth continuing throughout their careers. Quantitative surveys showed that reactions to uncertainty were more positive over time; however, the difference was not statistically significant. Qualitative and quantitative results show that as family medicine residents practice medicine, their perception of uncertainty changes. To reduce uncertainty, residents use relational information-seeking strategies. From a broader view of practice, residents describe uncertainty neutrally, asserting that uncertainty is simply part of the practice of family medicine.

  16. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed, and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid for both "base case" model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  17. An algorithm to improve sampling efficiency for uncertainty propagation using sampling based method

    International Nuclear Information System (INIS)

    Campolina, Daniel; Lima, Paulo Rubens I.; Pereira, Claubia; Veloso, Maria Auxiliadora F.

    2015-01-01

    Sample size and computational uncertainty were varied in order to investigate sampling efficiency and convergence of the sampling-based method for uncertainty propagation. The transport code MCNPX was used to simulate an LWR model and allow the mapping from uncertain inputs of the benchmark experiment to uncertain outputs. Random sampling efficiency was improved through the use of an algorithm for selecting distributions. The mean range, standard deviation range and skewness were verified in order to obtain a better representation of the uncertainty figures. A standard deviation of 5 pcm in the propagated uncertainties over 10 n-sample replicates was adopted as the convergence criterion for the method. An uncertainty of 75 pcm on the reactor k_eff was estimated by using a sample of size 93 and a computational uncertainty of 28 pcm to propagate the 1σ uncertainty of the burnable poison radius. For a fixed computational time, in order to reduce the variance of the propagated uncertainty, it was found, for the example under investigation, that it is preferable to double the sample size rather than to double the number of particles followed by the Monte Carlo process in the MCNPX code. (author)
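    The sampling-based propagation scheme can be sketched as follows. The linear k_eff response, the sensitivity coefficient and the input 1σ below are invented stand-ins for the MCNPX transport calculation; only the structure mirrors the method: sample the uncertain input, run the "code", take the spread of the outputs, then replicate the whole estimate to judge convergence.

```python
import random
import statistics

def fake_keff(radius_cm, comp_sigma_pcm=28.0):
    """Stand-in for an MCNPX run: a made-up linear k_eff response to the
    burnable-poison radius plus Gaussian computational (statistical) noise."""
    return 1.0 - 0.02 * (radius_cm - 0.5) + random.gauss(0.0, comp_sigma_pcm * 1e-5)

def propagated_sigma_pcm(n_samples, radius_mean=0.5, radius_sigma=0.01):
    """One estimate of the propagated k_eff uncertainty, in pcm."""
    keffs = [fake_keff(random.gauss(radius_mean, radius_sigma))
             for _ in range(n_samples)]
    return statistics.stdev(keffs) * 1e5

random.seed(1)
# Replicate the whole estimate, as the paper does with its 5 pcm
# convergence criterion over 10 replicates.
reps = [propagated_sigma_pcm(93) for _ in range(10)]
print(f"propagated sigma: {statistics.mean(reps):.0f} pcm, "
      f"replicate spread: {statistics.stdev(reps):.0f} pcm")
```

    The replicate spread plays the role of the convergence criterion: if it exceeds the target, either the sample size or the per-run particle count can be increased, and the abstract's point is that for a fixed budget the sample size was the better investment in this example.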

  18. Effect of temperature on sulphate reduction, growth rate and growth yield in five psychrophilic sulphate-reducing bacteria from Arctic sediments

    DEFF Research Database (Denmark)

    Knoblauch, C.; Jørgensen, BB

    1999-01-01

    Five psychrophilic sulphate-reducing bacteria (strains ASv26, LSv21, PSv29, LSv54 and LSv514) isolated from Arctic sediments were examined for their adaptation to permanently low temperatures. All strains grew at -1.8 degrees C, the freezing point of sea water, but their optimum temperature...... and T(opt). For strains LSv21 and LSv514, however, growth yields were highest at the lowest temperatures, around 0 degrees C. The results indicate that psychrophilic sulphate-reducing bacteria are specially adapted to permanently low temperatures by high relative growth rates and high growth yields...... at in situ conditions....

  19. Economic Value of Narrowing the Uncertainty in Climate Sensitivity: Decadal Change in Shortwave Cloud Radiative Forcing and Low Cloud Feedback

    Science.gov (United States)

    Wielicki, B. A.; Cooke, R. M.; Golub, A. A.; Mlynczak, M. G.; Young, D. F.; Baize, R. R.

    2016-12-01

    Several previous studies have been published on the economic value of narrowing the uncertainty in climate sensitivity (Cooke et al. 2015, Cooke et al. 2016, Hope 2015). All three of these studies estimated roughly 10 trillion U.S. dollars for the Net Present Value and Real Option Value at a discount rate of 3%. This discount rate is the nominal discount rate used in the U.S. Social Cost of Carbon Memo (2010). The Cooke et al. studies approached this problem by examining advances in the accuracy of global temperature measurements, while the Hope 2015 study did not address the type of observations required. While temperature change is related to climate sensitivity, large uncertainties of a factor of 3 in current anthropogenic radiative forcing (IPCC, 2013) would need to be resolved before advanced decadal temperature-change observations could assist the challenge of narrowing climate sensitivity. The present study takes a new approach by extending the Cooke et al. 2015, 2016 papers to replace observations of temperature change with observations of decadal change in the effect of changing clouds on the Earth's radiative energy balance, a measurement known as Cloud Radiative Forcing, or Cloud Radiative Effect. Decadal change in this observation is directly related to the largest uncertainty in climate sensitivity, which is cloud feedback from the changing amount of low clouds, primarily low clouds over the world's oceans. As a result, decadal changes in shortwave cloud radiative forcing are more directly related to cloud feedback uncertainty, which is the dominant uncertainty in climate sensitivity. This paper will show results for the new approach, and allow an examination of the sensitivity of economic-value results to the different observations used as a constraint on uncertainty in climate sensitivity. The analysis suggests roughly a doubling of economic value to 20 trillion Net Present Value or Real Option Value at a 3% discount rate.
The higher economic value results from two changes: a
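    The discounting convention behind these Net Present Value figures can be made concrete with a minimal sketch; the cash-flow stream and horizon below are illustrative, not the benefit streams used in the Cooke et al. or Hope analyses.

```python
def npv(cash_flows, rate=0.03):
    """Net present value of annual benefits (year 0 first) at a fixed discount rate."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# A constant benefit of 1 unit per year for 50 years, discounted at the
# 3% rate used in the studies above.
flows = [1.0] * 50
print(round(npv(flows), 2))  # ≈ 26.5
```

    Because the benefits of better observations accrue over many decades, the chosen discount rate strongly affects such long-horizon values, which is why the studies state it explicitly.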

  20. Reducing the uncertainty of the primary damage production in Fe

    International Nuclear Information System (INIS)

    Bjorkas, C.; Nordlund, K.

    2007-01-01

    Full text of publication follows: One of the key questions for understanding neutron irradiation damage buildup in fission and fusion reactor steels is knowing the primary damage state produced by neutron-induced atomic recoils in Fe. Supporting this is our recent study revealing that the initial damage in Fe0.9Cr0.1 is essentially the same as in pure Fe [1]. In spite of decades of study, the question of the amount and distribution of defects in Fe has remained highly unclear. Different computer simulation models have given a good qualitative understanding of the cascade development [1,2]. However, quantitative differences of more than a factor of three have remained in the predicted clustered defect production numbers [2]. The disagreements between the potentials pose problems for finding a reliable predictive model for the behavior of Fe under irradiation. In this study we analyze the initial damage as predicted by three recent interatomic potentials for Fe. These are well suited for a comparison because they have very different physical motivations and functional forms, but are comparable in overall quality and, in particular, reproduce the energetics of interstitials in different configurations well. The potentials are those by Ackland and Mendelev et al. (AMS) [3], the 'magnetic' potential by Dudarev and Derlet (DD) [4] and the Tersoff-like analytical potential by Mueller, Erhart and Albe (MEA) [5]. The DD and MEA potentials were modified by us to describe high-energy repulsive interactions well. All potentials were then used in recoil collision cascade simulations carried out and analyzed in exactly the same manner for all potentials. Analysis of the resulting damage showed a much smaller uncertainty regarding the damage production than that of previous potentials. The total defect production numbers essentially agree within the statistical uncertainty for the three potentials.
Some differences remain regarding the clustered defect fractions, but