WorldWideScience

Sample records for large uncertainties due

  1. Response of ENSO amplitude to global warming in CESM large ensemble: uncertainty due to internal variability

    Science.gov (United States)

    Zheng, Xiao-Tong; Hui, Chang; Yeh, Sang-Wook

    2018-06-01

    El Niño-Southern Oscillation (ENSO) is the dominant mode of variability in the coupled ocean-atmospheric system. Future projections of ENSO change under global warming are highly uncertain among models. In this study, the effect of internal variability on ENSO amplitude change in future climate projections is investigated based on a 40-member ensemble from the Community Earth System Model Large Ensemble (CESM-LE) project. A large uncertainty is identified among ensemble members due to internal variability. The inter-member diversity is associated with a zonal dipole pattern of sea surface temperature (SST) change in the mean along the equator, which is similar to the second empirical orthogonal function (EOF) mode of tropical Pacific decadal variability (TPDV) in the unforced control simulation. The uncertainty in CESM-LE is comparable in magnitude to that among models of the Coupled Model Intercomparison Project phase 5 (CMIP5), suggesting the contribution of internal variability to the intermodel uncertainty in ENSO amplitude change. However, the causal relationships between changes in ENSO amplitude and the mean state are distinct between the CESM-LE and CMIP5 ensembles. The CESM-LE results indicate that an ensemble of at least 15 members is needed to separate the relative contributions of the forced response and internal variability to ENSO amplitude change over the twenty-first century.
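
    The ensemble-decomposition idea behind this record can be sketched in a few lines: in a single-model initial-condition ensemble, the ensemble mean estimates the forced response and the across-member spread estimates the contribution of internal variability. All numbers below are synthetic stand-ins, not CESM-LE output.

```python
import numpy as np

# Synthetic sketch: 40 members, 85 "years" of an ENSO-amplitude-like index.
rng = np.random.default_rng(0)
n_members, n_years = 40, 85

forced_trend = np.linspace(0.0, 0.5, n_years)  # hypothetical forced change
members = forced_trend + 0.2 * rng.standard_normal((n_members, n_years))

forced_estimate = members.mean(axis=0)          # forced response ~ ensemble mean
internal_spread = members.std(axis=0, ddof=1)   # internal variability ~ member spread

print(forced_estimate[-1], internal_spread[-1])
```

    With only a handful of members the spread estimate itself is noisy, which is one way to read the paper's finding that roughly 15 members are needed to separate the two contributions.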

  2. Quantifying uncertainty in NDSHA estimates due to earthquake catalogue

    Science.gov (United States)

    Magrin, Andrea; Peresan, Antonella; Vaccari, Franco; Panza, Giuliano

    2014-05-01

    The procedure for the neo-deterministic seismic zoning, NDSHA, is based on the calculation of synthetic seismograms by the modal summation technique. This approach makes use of information about the space distribution of large magnitude earthquakes, which can be defined based on seismic history and seismotectonics, as well as incorporating information from a wide set of geological and geophysical data (e.g., morphostructural features and ongoing deformation processes identified by earth observations). Hence the method does not make use of attenuation models (GMPEs), which may be unable to account for the complexity of the product between the seismic source tensor and the medium Green function and are often poorly constrained by the available observations. NDSHA defines the hazard from the envelope of the values of ground motion parameters determined considering a wide set of scenario earthquakes; accordingly, the simplest outcome of this method is a map where the maximum of a given seismic parameter is associated with each site. In NDSHA, uncertainties are not statistically treated as in PSHA, where aleatory uncertainty is traditionally handled with probability density functions (e.g., for magnitude and distance random variables) and epistemic uncertainty is considered by applying logic trees that allow the use of alternative models and alternative parameter values of each model; instead, the treatment of uncertainties is performed by sensitivity analyses for key modelling parameters. Fixing the uncertainty related to a particular input parameter is an important component of the procedure. The input parameters must account for the uncertainty in the prediction of fault radiation and in the use of Green functions for a given medium. A key parameter is the magnitude of sources used in the simulation, which is based on catalogue information, seismogenic zones and seismogenic nodes.
Because the largest part of the existing catalogues is based on macroseismic intensity, a rough estimate

  3. Large-uncertainty intelligent states for angular momentum and angle

    International Nuclear Information System (INIS)

    Goette, Joerg B; Zambrini, Roberta; Franke-Arnold, Sonja; Barnett, Stephen M

    2005-01-01

    The equality in the uncertainty principle for linear momentum and position is obtained for states which also minimize the uncertainty product. However, in the uncertainty relation for angular momentum and angular position both sides of the inequality are state dependent and therefore the intelligent states, which satisfy the equality, do not necessarily give a minimum for the uncertainty product. In this paper, we highlight the difference between intelligent states and minimum uncertainty states by investigating a class of intelligent states which obey the equality in the angular uncertainty relation while having an arbitrarily large uncertainty product. To develop an understanding of the uncertainties of angle and angular momentum for the large-uncertainty intelligent states we compare exact solutions with analytical approximations in two limiting cases.
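
    The state-dependent angular uncertainty relation at issue here is usually written in the following form (following Barnett and Pegg's treatment of the angle operator on a 2π window; θ₀ denotes the chosen window boundary):

```latex
\Delta L_z \,\Delta \phi \;\ge\; \frac{\hbar}{2}\,\bigl|\,1 - 2\pi P(\theta_0)\,\bigr|
```

    Because the boundary probability density P(θ₀) depends on the state, the right-hand side is state dependent, which is why the intelligent states that saturate the equality need not minimize the product ΔL_z Δφ.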

  4. Uncertainty in techno-economic estimates of cellulosic ethanol production due to experimental measurement uncertainty

    Directory of Open Access Journals (Sweden)

    Vicari Kristin J

    2012-04-01

    Background: Cost-effective production of lignocellulosic biofuels remains a major financial and technical challenge at the industrial scale. A critical tool in biofuels process development is the techno-economic (TE) model, which calculates biofuel production costs using a process model and an economic model. The process model solves mass and energy balances for each unit, and the economic model estimates capital and operating costs from the process model based on economic assumptions. The process model inputs include experimental data on the feedstock composition and intermediate product yields for each unit. These experimental yield data are calculated from primary measurements. Uncertainty in these primary measurements is propagated to the calculated yields, to the process model, and ultimately to the economic model. Thus, outputs of the TE model have a minimum uncertainty associated with the uncertainty in the primary measurements. Results: We calculate the uncertainty in the Minimum Ethanol Selling Price (MESP) estimate for lignocellulosic ethanol production via a biochemical conversion process: dilute sulfuric acid pretreatment of corn stover followed by enzymatic hydrolysis and co-fermentation of the resulting sugars to ethanol. We perform a sensitivity analysis on the TE model and identify the feedstock composition and conversion yields from three unit operations (xylose from pretreatment, glucose from enzymatic hydrolysis, and ethanol from fermentation) as the most important variables. The uncertainty in the pretreatment xylose yield arises from multiple measurements, whereas the glucose and ethanol yields from enzymatic hydrolysis and fermentation, respectively, are dominated by a single measurement: the fraction of insoluble solids (fIS) in the biomass slurries. Conclusions: We calculate a $0.15/gal uncertainty in MESP from the TE model due to uncertainties in primary measurements. 
This result sets a lower bound on the error bars of
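
    The propagation of primary-measurement uncertainty to a techno-economic output can be illustrated with a toy Monte Carlo sketch. The yields, standard deviations, and price relation below are hypothetical placeholders, not the paper's TE model or NREL numbers.

```python
import numpy as np

# Toy sketch: sample uncertain conversion yields and propagate them through a
# stand-in "selling price" that falls as overall conversion rises.
rng = np.random.default_rng(1)
n = 100_000

xylose_yield  = rng.normal(0.75, 0.02, n)   # pretreatment xylose yield
glucose_yield = rng.normal(0.85, 0.02, n)   # enzymatic-hydrolysis glucose yield
ethanol_yield = rng.normal(0.90, 0.01, n)   # fermentation ethanol yield

# Hypothetical price model, normalized so nominal yields give ~$2.00/gal.
mesp = 2.0 / (xylose_yield * glucose_yield * ethanol_yield / 0.574)

print(mesp.mean(), mesp.std())              # output uncertainty from inputs
```

    The standard deviation of the sampled output is the Monte Carlo estimate of the minimum price uncertainty implied by the measurement uncertainties.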

  5. Quantifying uncertainty due to internal variability using high-resolution regional climate model simulations

    Science.gov (United States)

    Gutmann, E. D.; Ikeda, K.; Deser, C.; Rasmussen, R.; Clark, M. P.; Arnold, J. R.

    2015-12-01

    The uncertainty in future climate predictions is as large as or larger than the mean climate change signal. As such, any predictions of future climate need to incorporate and quantify the sources of this uncertainty. One of the largest sources comes from the internal, chaotic variability within the climate system itself. This variability has been approximated using the 30 ensemble members of the Community Earth System Model (CESM) large ensemble. Here we examine the wet and dry end members of this ensemble for cool-season precipitation in the Colorado Rocky Mountains with a set of high-resolution regional climate model simulations. We have used the Weather Research and Forecasting model (WRF) to simulate the periods 1990-2000, 2025-2035, and 2070-2080 on a 4 km grid. These simulations show that the broad patterns of change depicted in CESM are inherited by the high-resolution simulations; however, the differences in the height and location of the mountains in the WRF simulation, relative to the CESM simulation, mean that the location and magnitude of the precipitation changes are very different. We further show that high-resolution simulations with the Intermediate Complexity Atmospheric Research model (ICAR) predict a similar spatial pattern in the change signal as WRF for these ensemble members. We then use ICAR to examine the rest of the CESM Large Ensemble as well as the uncertainty in the regional climate model due to the choice of physics parameterizations.

  6. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    International Nuclear Information System (INIS)

    Muelaner, J E; Wang, Z; Keogh, P S; Brownell, J; Fisher, D

    2016-01-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered to be negligible. Before optimization the dominant source of uncertainty was the tooling design; after optimization the dominant source was thermal expansion of the engine, meaning that no further improvement can be made without measurement in a temperature controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as the improved reliability of these products. (paper)
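
    The conclusion that thermal expansion dominates is easy to reproduce at back-of-envelope level: linear expansion gives a length uncertainty of alpha * L * dT. The expansion coefficient, characteristic size, and temperature uncertainty below are illustrative assumptions, not values from the paper.

```python
# Back-of-envelope thermal-expansion uncertainty for a large assembly.
alpha = 12e-6   # 1/K, typical coefficient for steel (assumed)
L = 3.0         # m, hypothetical characteristic engine dimension
dT = 1.0        # K, assumed temperature uncertainty of the environment

u_thermal = alpha * L * dT       # length uncertainty from temperature alone
print(u_thermal * 1e6, "micrometres")
```

    Even a 1 K temperature uncertainty over a few metres yields tens of micrometres, which is easily larger than the instrument uncertainty of a modern laser tracker over the same range.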

  7. Uncertainty of Doppler reactivity worth due to uncertainties of JENDL-3.2 resonance parameters

    Energy Technology Data Exchange (ETDEWEB)

    Zukeran, Atsushi [Hitachi Ltd., Hitachi, Ibaraki (Japan). Power and Industrial System R and D Div.; Hanaki, Hiroshi; Nakagawa, Tuneo; Shibata, Keiichi; Ishikawa, Makoto

    1998-03-01

    An analytical formula for the resonance self-shielding factor (f-factor) is derived from the resonance integral (J-function) based on the NR approximation, and an analytical expression for the Doppler reactivity worth (ρ) is also obtained by using the result. Uncertainties of the f-factor and Doppler reactivity worth are evaluated on the basis of sensitivity coefficients to the resonance parameters. The uncertainty of the Doppler reactivity worth at 487 K is about 4% for the PNC Large Fast Breeder Reactor. (author)

  8. Uncertainty Evaluation of Reactivity Coefficients for a large advanced SFR Core Design

    International Nuclear Information System (INIS)

    Khamakhem, Wassim; Rimpault, Gerald

    2008-01-01

    Sodium Cooled Fast Reactors are currently being reshaped in order to meet Generation IV goals on economics, safety and reliability, sustainability and proliferation resistance. Recent studies have led to large SFR cores for 3600 MWth power plants, cores which exhibit interesting features. The designs have had to balance between competing aspects such as sustainability and safety characteristics. Sustainability in neutronic terms is translated into a positive breeding gain, and safety into rather low Na void reactivity effects. The studies have been done on two SFR concepts using oxide and carbide fuels. Sensitivity theory in the ERANOS deterministic code system has been used. Calculations have been performed with different sodium evaluations, JEF2.2, ERALIB-1 and the most recent JEFF3.1 and ENDF/B-VII, in order to make a broad comparison. Values for the Na void reactivity effect exhibit differences as large as 14% when using the different sodium libraries. Uncertainties due to nuclear data on the reactivity coefficients were evaluated with BOLNA variance-covariance data; the Na void effect uncertainties are close to 12% at 1σ. Since the uncertainties are far beyond the target accuracy for a design achieving high performance, two directions are envisaged: the first is to perform new differential measurements; the second is to use integral experiments to improve effectively the nuclear data set and its uncertainties, as performed in the past with ERALIB1. (authors)

  9. Large break LOCA uncertainty evaluation and comparison with conservative calculation

    International Nuclear Information System (INIS)

    Glaeser, H.G.

    2004-01-01

    The first formulation of the USA Code of Federal Regulations (CFR) 10CFR50 with applicable sections specific to NPP licensing requirements was released in 1976. Over a decade later, 10CFR 50.46 allowed the use of BE codes instead of conservative code models, but uncertainties have to be identified and quantified. Guidelines were released that described interpretations developed over the intervening years that are applicable. Other countries established similar conservative procedures and acceptance criteria. Because conservative methods were used to calculate the peak values of key parameters, such as peak clad temperature (PCT), it was always acknowledged that a large margin existed between the 'conservative' calculated value and the 'true' value. Besides the USA, regulations in other countries, such as Germany, allowed the state of science and technology to be applied in licensing, i.e. the growing experimental evidence and progress in code development over time could be used. There was no requirement to apply a pure evaluation methodology with licensed assumptions and frozen codes. The thermal-hydraulic system codes became more and more best-estimate codes based on comprehensive validation. This development was and is possible because the rules and guidelines provide the necessary latitude to consider further development of safety technology. Best-estimate codes are allowed to be used in licensing in combination with conservative initial and boundary conditions. However, uncertainty quantification is not required. Since some of the initial and boundary conditions are more conservative compared with those used internationally (e.g. 106% reactor power instead of 102%, a single failure plus a non-availability due to preventive maintenance is assumed, etc.), it is claimed that the uncertainties of code models are covered. Since many utilities apply for power increases, calculation results come closer to some licensing criteria. 
The situation in German licensing

  10. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    Science.gov (United States)

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and the details of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in expected flooding damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly

  11. Estimation of Peaking Factor Uncertainty due to Manufacturing Tolerance using Statistical Sampling Method

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kyung Hoon; Park, Ho Jin; Lee, Chung Chan; Cho, Jin Young [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The purpose of this paper is to study the effect on output parameters in the lattice physics calculation due to input uncertainties such as manufacturing deviations from nominal values for material compositions and geometric dimensions. In a nuclear design and analysis, the lattice physics calculations are usually employed to generate lattice parameters for the nodal core simulation and pin power reconstruction. These lattice parameters, which consist of homogenized few-group cross-sections, assembly discontinuity factors, and form-functions, can be affected by input uncertainties which arise from three different sources: 1) multi-group cross-section uncertainties, 2) the uncertainties associated with methods and modeling approximations utilized in lattice physics codes, and 3) fuel/assembly manufacturing uncertainties. In this paper, data provided by the light water reactor (LWR) uncertainty analysis in modeling (UAM) benchmark has been used as the manufacturing uncertainties. First, the effect of each input parameter has been investigated through sensitivity calculations at the fuel assembly level. Then, uncertainty in the prediction of the peaking factor due to the most sensitive input parameter has been estimated using the statistical sampling method, often called the brute force method. For our analysis, the two-dimensional transport lattice code DeCART2D and its ENDF/B-VII.1 based 47-group library were used to perform the lattice physics calculation. Sensitivity calculations have been performed in order to study the influence of manufacturing tolerances on the lattice parameters. The manufacturing tolerance that has the largest influence on the k-inf is the fuel density. The second most sensitive parameter is the outer clad diameter.
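
    The "brute force" statistical sampling method can be sketched generically: perturb a manufacturing parameter within its tolerance, re-evaluate the model for each sample, and take statistics of the output. The linear response below is a hypothetical stand-in for a lattice-physics calculation such as DeCART2D; the density and tolerance values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples = 10_000

nominal_density = 10.4    # g/cm^3, hypothetical fuel density
tolerance_sigma = 0.05    # g/cm^3, hypothetical 1-sigma manufacturing deviation
density = rng.normal(nominal_density, tolerance_sigma, n_samples)

def peaking_factor(rho):
    # Stand-in response: output varies linearly with the perturbed input.
    return 1.45 + 0.8 * (rho - nominal_density) / nominal_density

pf = peaking_factor(density)
print(pf.mean(), pf.std())   # output uncertainty induced by the tolerance
```

    The sample standard deviation of the output is the brute-force estimate of the peaking-factor uncertainty due to that single tolerance.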

  12. Measurements of fusion neutron yields by neutron activation technique: Uncertainty due to the uncertainty on activation cross-sections

    Energy Technology Data Exchange (ETDEWEB)

    Stankunas, Gediminas, E-mail: gediminas.stankunas@lei.lt [Lithuanian Energy Institute, Laboratory of Nuclear Installation Safety, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Batistoni, Paola [ENEA, Via E. Fermi, 45, 00044 Frascati, Rome (Italy); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Sjöstrand, Henrik; Conroy, Sean [Department of Physics and Astronomy, Uppsala University, PO Box 516, SE-75120 Uppsala (Sweden); EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom)

    2015-07-11

    The neutron activation technique is routinely used in fusion experiments to measure the neutron yields. This paper investigates the uncertainty on these measurements due to the uncertainties on dosimetry and activation reactions. For this purpose, activation cross-sections were taken from the International Reactor Dosimetry and Fusion File (IRDFF-v1.05) in 640-group ENDF-6 format for several reactions of interest for both 2.5 and 14 MeV neutrons. Activation coefficients (reaction rates) have been calculated using the neutron flux spectra at the JET vacuum vessel, both for DD and DT plasmas, calculated by MCNP in the required 640-energy-group format. The related uncertainties for the JET neutron spectra are evaluated as well, using the covariance data available in the library. These uncertainties are in general small, but not negligible when high accuracy is required in the determination of the fusion neutron yields.
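
    A group-wise activation calculation of this kind reduces to a sum over energy groups of group flux times group cross-section, with the cross-section covariance propagated to the reaction rate. The 3-group numbers below are illustrative, not IRDFF or JET values, and the covariance is assumed diagonal for simplicity.

```python
import numpy as np

flux  = np.array([1.0e10, 5.0e9, 2.0e9])      # group fluxes (n/cm^2/s), assumed
sigma = np.array([0.10, 0.50, 2.00]) * 1e-24  # group cross-sections (cm^2), assumed
cov   = np.diag((0.05 * sigma) ** 2)          # 5% uncorrelated uncertainty per group

rate = flux @ sigma       # reaction rate per target atom
var  = flux @ cov @ flux  # variance propagated from cross-section covariance
rel_unc = np.sqrt(var) / rate
print(rate, rel_unc)
```

    With full covariance data (as in IRDFF), the diagonal matrix above would be replaced by the library's group-wise covariance matrix; the sandwich product is unchanged.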

  13. Alignment measurements uncertainties for large assemblies using probabilistic analysis techniques

    CERN Document Server

    AUTHOR|(CDS)2090816; Almond, Heather

    Big science and ambitious industrial projects continually push forward with technical requirements beyond the grasp of conventional engineering techniques. Examples of those are ultra-high precision requirements in the field of celestial telescopes, particle accelerators and the aerospace industry. Such extreme requirements are limited largely by the capability of the metrology used, namely, its uncertainty in relation to the alignment tolerance required. The current work was initiated as part of a Marie Curie European research project held at CERN, Geneva, aiming to answer those challenges as related to future accelerators requiring alignment of 2 m large assemblies to tolerances in the 10 µm range. The thesis has found several gaps in current knowledge limiting such capability. Among those was the lack of application of state of the art uncertainty propagation methods in alignment measurements metrology. Another major limiting factor found was the lack of uncertainty statements in the thermal errors compensatio...

  14. Uncertainty budget in internal monostandard NAA for small and large size samples analysis

    International Nuclear Information System (INIS)

    Dasari, K.B.; Acharya, R.

    2014-01-01

    Total uncertainty budget evaluation on the determined concentration value is important under a quality assurance programme. Concentration calculation in NAA is carried out by relative NAA or by the k0-based internal monostandard NAA (IM-NAA) method. The IM-NAA method has been used for small and large sample analysis of clay potteries. An attempt was made to identify the uncertainty components in IM-NAA, and the uncertainty budget for La in both small and large size samples has been evaluated and compared. (author)

  15. Sensitivity/uncertainty analysis for free-in-air tissue kerma due to initial radiation at Hiroshima and Nagasaki

    International Nuclear Information System (INIS)

    Lillie, R.A.; Broadhead, B.L.; Pace, J.V. III

    1988-01-01

    Uncertainty estimates and cross correlations by range/survivor have been calculated for the Hiroshima and Nagasaki free-in-air (FIA) tissue kerma obtained from two-dimensional air/ground transport calculations. The uncertainties due to modeling parameter and basic nuclear transport data uncertainties were calculated for 700-, 1000-, and 1500-m ground ranges. Only the FIA tissue kerma due to initial radiation was treated in the analysis; the uncertainties associated with terrain and building shielding and phantom attenuation were not considered in this study. Uncertainties of ~20% were obtained for the prompt neutron and secondary gamma kerma and 30% for the prompt gamma kerma at both cities. The uncertainties on the total prompt kerma at Hiroshima and Nagasaki are ~18% and 15%, respectively. The estimated uncertainties vary only slightly by ground range and are fairly highly correlated. The total prompt kerma uncertainties are dominated by the secondary gamma uncertainties, which in turn are dominated by the modeling parameter uncertainties, particularly those associated with the weapon yield and radiation sources

  16. Attributing uncertainty in streamflow simulations due to variable inputs via the Quantile Flow Deviation metric

    Science.gov (United States)

    Shoaib, Syed Abu; Marshall, Lucy; Sharma, Ashish

    2018-06-01

    Every model used to characterise a real-world process is affected by uncertainty. Selecting a suitable model is a vital aspect of engineering planning and design. Observation or input errors make the prediction of modelled responses more uncertain. Using a recently developed attribution metric, this study develops a method for analysing variability in model inputs together with model structure variability to quantify their relative contributions in typical hydrological modelling applications. The Quantile Flow Deviation (QFD) metric is used to assess these alternate sources of uncertainty. The Australian Water Availability Project (AWAP) precipitation data for four different Australian catchments are used to analyse the impact of spatial rainfall variability on simulated streamflow variability via the QFD. The QFD metric attributes the variability in flow ensembles to uncertainty associated with the selection of a model structure and input time series. For the case study catchments, the relative contribution of input uncertainty due to rainfall is higher than that due to potential evapotranspiration, and overall input uncertainty is significant compared to model structure and parameter uncertainty. Overall, this study investigates the propagation of input uncertainty in a daily streamflow modelling scenario and demonstrates how input errors manifest across different streamflow magnitudes.
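
    The abstract does not give the QFD formula, so the following is only a rough illustration of the general idea of a quantile-based spread measure over a flow ensemble: for each flow quantile, measure how far ensemble members deviate from a reference series. The flow model, noise, and deviation definition are all assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(3)
n_members, n_days = 20, 365

reference = rng.gamma(2.0, 5.0, n_days)  # synthetic reference daily flow
# Ensemble members: reference flow with multiplicative (input-like) noise.
ensemble = reference * rng.lognormal(0.0, 0.3, (n_members, n_days))

quantiles = np.linspace(0.05, 0.95, 19)
ref_q = np.quantile(reference, quantiles)
member_q = np.quantile(ensemble, quantiles, axis=1)  # shape (19, n_members)

deviation = np.abs(member_q - ref_q[:, None]).mean(axis=1)  # spread per quantile
print(deviation)
```

    Reporting the spread per quantile, rather than a single aggregate, is what lets this kind of metric show how input errors manifest differently across streamflow magnitudes.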

  17. Risk Management and Uncertainty in Large Complex Public Projects

    DEFF Research Database (Denmark)

    Neerup Themsen, Tim; Harty, Chris; Tryggestad, Kjell

    Governmental actors worldwide are promoting risk management as a rational approach to manage uncertainty and improve the abilities to deliver large complex projects according to budget, time plans, and pre-set project specifications: But what do we know about the effects of risk management...... on the abilities to meet such objectives? Using Callon’s (1998) twin notions of framing and overflowing we examine the implementation of risk management within the Danish public sector and the effects this generated for the management of two large complex projects. We show how the rational framing of risk...... management have generated unexpected costly outcomes such as: the undermining of the longer-term value and societal relevance of the built asset, the negligence of the wider range of uncertainties emerging during project processes, and constraining forms of knowledge. We also show how expert accountants play...

  18. Model uncertainties in top-quark physics

    CERN Document Server

    Seidel, Markus

    2014-01-01

    The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.

  19. Extensive neutronic sensitivity-uncertainty analysis of a fusion reactor shielding blanket

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-01-01

    In this paper the results are presented of an extensive neutronic sensitivity-uncertainty study performed for the design of a shielding blanket for a next-step fusion reactor, such as ITER. A code system was used, which was developed at ECN Petten. The uncertainty in an important response parameter, the neutron heating in the inboard superconducting coils, was evaluated. Neutron transport calculations in the 100 neutron group GAM-II structure were performed using the code ANISN. For the sensitivity and uncertainty calculations the code SUSD was used. Uncertainties due to cross-section uncertainties were taken into account as well as uncertainties due to uncertainties in energy and angular distributions of scattered neutrons (SED and SAD uncertainties, respectively). The subject of direct-term uncertainties (i.e. uncertainties due to uncertainties in the kerma factors of the superconducting coils) is briefly touched upon. It is shown that SAD uncertainties, which have been largely neglected until now, contribute significantly to the total uncertainty. Moreover, the contribution of direct-term uncertainties may be large. The total uncertainty in the neutron heating, only due to Fe cross-sections, amounts to approximately 25%, which is rather large. However, uncertainty data are scarce and the data may very well be conservative. It is shown in this paper that with the code system used, sensitivity and uncertainty calculations can be performed in a straightforward way. Therefore, it is suggested that emphasis is now put on the generation of realistic, reliable covariance data for cross-sections as well as for angular and energy distributions. ((orig.))
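
    The core arithmetic of a sensitivity-uncertainty study like this is the standard "sandwich rule": the relative variance of a response equals S^T C S, where S is the vector of sensitivities of the response to the cross-sections and C is their relative covariance matrix. The 3-group numbers below are illustrative, not the ECN/SUSD data.

```python
import numpy as np

S = np.array([0.6, -0.3, 0.1])     # relative sensitivities per group (assumed)
C = np.array([[0.04, 0.01, 0.00],  # relative covariance matrix (assumed)
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

rel_var = S @ C @ S                # relative variance of the response
rel_unc = np.sqrt(rel_var)         # relative 1-sigma uncertainty
print(rel_unc)
```

    SED and SAD contributions enter the same way, as additional sensitivity and covariance blocks, which is why missing angular-distribution covariance data directly limits how complete the total uncertainty estimate can be.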

  20. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab
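
    The deterministic uncertainty analysis (DUA) idea can be sketched in miniature: obtain model derivatives (here by finite differences, standing in for GRESS/ADGEN-style computed derivatives) and combine them with parameter variances to first order. The model and parameter values are hypothetical.

```python
import numpy as np

def model(p):
    # Hypothetical scalar response of three parameters.
    return p[0] ** 2 + 3.0 * p[1] + np.exp(0.1 * p[2])

p0 = np.array([2.0, 1.0, 0.5])
sigma = np.array([0.1, 0.2, 0.05])   # assumed parameter standard deviations

# Central finite differences approximate the sensitivities dy/dp_i.
h = 1e-6
grad = np.array([
    (model(p0 + h * np.eye(3)[i]) - model(p0 - h * np.eye(3)[i])) / (2 * h)
    for i in range(3)
])

# First-order (uncorrelated) variance propagation: var(y) = sum (dy/dp_i)^2 var(p_i).
var_y = np.sum((grad * sigma) ** 2)
print(grad, np.sqrt(var_y))
```

    Systems like GRESS/ADGEN compute the gradient analytically from the model source code, so the propagation step stays this cheap even for models with thousands of parameters.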

  1. Sensitivity of Process Design due to Uncertainties in Property Estimates

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Jones, Mark Nicholas; Sarup, Bent

    2012-01-01

    The objective of this paper is to present a systematic methodology for performing analysis of sensitivity of process design due to uncertainties in property estimates. The methodology provides the following results: a) list of properties with critical importance on design; b) acceptable levels of...... in chemical processes. Among others vapour pressure accuracy for azeotropic mixtures is critical and needs to be measured or estimated with a ±0.25% accuracy to satisfy acceptable safety levels in design....

  2. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, by modelling complex systems transport models have an inherent uncertainty which increases over time. As a consequence, the longer...... the period forecasted the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature only few studies analyze uncertainty propagation patterns over...

  3. Uncertainty on PIV mean and fluctuating velocity due to bias and random errors

    International Nuclear Information System (INIS)

    Wilson, Brandon M; Smith, Barton L

    2013-01-01

    Particle image velocimetry is a powerful and flexible fluid velocity measurement tool. In spite of its widespread use, the uncertainty of PIV measurements has not been sufficiently addressed to date. The calculation and propagation of local, instantaneous uncertainties on PIV results into the measured mean and Reynolds stresses are demonstrated for four PIV error sources that impact uncertainty through the vector computation: particle image density, diameter, displacement and velocity gradients. For the purpose of this demonstration, velocity data are acquired in a rectangular jet. Hot-wire measurements are compared to PIV measurements with velocity fields computed using two PIV algorithms. Local uncertainties on the velocity mean and Reynolds stress for these algorithms are automatically estimated using a previously published method. Previous work has shown that PIV measurements can become ‘noisy’ in regions of high shear as well as regions of small displacement. This paper also demonstrates the impact of these effects by comparing PIV data to data acquired using hot-wire anemometry, which does not suffer from the same issues. It is confirmed that flow gradients, large particle images and insufficient particle image displacements can result in elevated measurements of turbulence levels. The uncertainty surface method accurately estimates the difference between hot-wire and PIV measurements for most cases. The uncertainty based on each algorithm is found to be unique, motivating the use of algorithm-specific uncertainty estimates. (paper)
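
    The propagation described in this record can be pictured with a small numerical sketch (not the authors' published formulation; the noise model, numbers, and variable names below are invented for illustration): per-vector uncertainty estimates feed the uncertainty of the sample mean, and the mean noise variance is subtracted from the measured variance to obtain a noise-corrected Reynolds stress.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "instantaneous" PIV velocities at one point: true mean 5 m/s,
# true turbulence (Reynolds normal stress) 0.25 m^2/s^2, plus measurement
# noise whose per-sample standard uncertainty sigma_i is assumed known
# (e.g. from an uncertainty-surface method).
N = 5000
true_mean, true_var = 5.0, 0.25
sigma_i = np.full(N, 0.2)                      # per-vector uncertainty estimate
u = true_mean + rng.normal(0.0, np.sqrt(true_var), N) + rng.normal(0.0, sigma_i)

# Uncertainty of the sample mean, assuming uncorrelated samples: the sample
# variance already contains both turbulent scatter and instrument noise.
u_mean = u.mean()
sigma_mean = np.sqrt(u.var(ddof=1) / N)

# The measured variance is inflated by noise: var_meas ~ var_true + mean(sigma_i^2),
# so subtracting the mean noise variance gives a noise-corrected Reynolds stress.
var_meas = u.var(ddof=1)
var_corrected = var_meas - np.mean(sigma_i**2)

print(f"mean = {u_mean:.3f} +/- {sigma_mean:.3f}")
print(f"raw stress = {var_meas:.3f}, corrected = {var_corrected:.3f}")
```

    The corrected stress lands much closer to the true 0.25 m^2/s^2 than the raw estimate, which is the "elevated turbulence level" effect the abstract describes.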

  4. Thermodynamic Temperatures of High-Temperature Fixed Points: Uncertainties Due to Temperature Drop and Emissivity

    Science.gov (United States)

    Castro, P.; Machin, G.; Bloembergen, P.; Lowe, D.; Whittam, A.

    2014-07-01

    This study forms part of the European Metrology Research Programme project 'Implementing the New Kelvin' to assign thermodynamic temperatures to a selected set of high-temperature fixed points (HTFPs): Cu, Co-C, Pt-C, and Re-C. A realistic thermal model of these HTFPs, developed in the finite-volume software ANSYS FLUENT, was constructed to quantify the uncertainty associated with the temperature drop across the back wall of the cell. In addition, the widely applied software package STEEP3 was used to investigate the influence of cell emissivity. The temperature drop, ΔT, relates to the temperature difference, due to the net loss of heat from the aperture of the cavity, between the back wall of the cavity, viewed by the thermometer, defining the radiance temperature, and the solid-liquid interface of the alloy, defining the transition temperature of the HTFP. The actual value of ΔT can be used either as a correction (with associated uncertainty) to thermodynamic temperature evaluations of HTFPs, or as an uncertainty contribution to the overall estimated uncertainty. In addition, the effect of a range of furnace temperature profiles on the temperature drop was calculated and found to be negligible for Cu, Co-C, and Pt-C and small only for Re-C. The effective isothermal emissivity is calculated over the wavelength range from 450 nm to 850 nm for different assumed values of surface emissivity. Even when furnace temperature profiles are taken into account, the estimated emissivities change only slightly from the effective isothermal emissivity of the bare cell. These emissivity calculations are used to estimate the uncertainty in the temperature assignment due to the uncertainty in the emissivity of the blackbody.

  5. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    Science.gov (United States)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and uncertainties have to be accurately stated and traceable to international standards, for metre-long assemblies, at the level of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as the thermal drift error due to variation in machine operation heat load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level) respectively.

  6. Scalable multi-objective control for large scale water resources systems under uncertainty

    Science.gov (United States)

    Giuliani, Matteo; Quinn, Julianne; Herman, Jonathan; Castelletti, Andrea; Reed, Patrick

    2016-04-01

    The use of mathematical models to support the optimal management of environmental systems has been expanding rapidly in recent years due to advances in scientific knowledge of natural processes, the efficiency of optimization techniques, and the availability of computational resources. However, ongoing changes in climate and society introduce additional challenges for controlling these systems, ultimately motivating the emergence of complex models to explore key causal relationships and dependencies on uncontrolled sources of variability. In this work, we contribute a novel implementation of the evolutionary multi-objective direct policy search (EMODPS) method for controlling environmental systems under uncertainty. The proposed approach combines direct policy search (DPS) with hierarchical parallelization of multi-objective evolutionary algorithms (MOEAs) and offers a threefold advantage: first, the DPS simulation-based optimization can be combined with any simulation model and does not add any constraint on modeled information, allowing the use of exogenous information in conditioning the decisions. Moreover, the combination of DPS and MOEAs prompts the generation of Pareto-approximate sets of solutions for up to 10 objectives, thus overcoming the decision biases produced by cognitive myopia, where narrow or restrictive definitions of optimality strongly limit the discovery of decision-relevant alternatives. Finally, the use of large-scale MOEA parallelization improves the ability of the designed solutions to handle the uncertainty due to severe natural variability. The proposed approach is demonstrated on a challenging water resources management problem represented by the optimal control of a network of four multipurpose water reservoirs in the Red River basin (Vietnam). As part of the medium-long term energy and food security national strategy, four large reservoirs have been constructed on the Red River tributaries, which are mainly operated for hydropower

  7. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Science.gov (United States)

    Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub

    2016-05-01

    Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimate ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located at 50°03′N, 12°40′E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (one-way coupled to PIHM) and the fixed-seasonal LAI method. From these two approaches, simulation scenarios were developed by combining the estimated spatial forest age maps with the two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and a small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty, owing to its plant-physiology-based approach. The implication of this research is that the overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.

  8. Sensitivity/uncertainty analysis for the Hiroshima dosimetry reevaluation effort

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Lillie, R.A.; Pace, J.V. III; Cacuci, D.G.

    1987-01-01

    Uncertainty estimates and cross correlations by range/survivor location have been obtained for the free-in-air (FIA) tissue kerma for the Hiroshima atomic event. These uncertainties in the FIA kerma include contributions due to various modeling parameters and the basic cross section data and are given at three ground ranges, 700, 1000 and 1500 m. The estimated uncertainties are nearly constant over the given ground ranges and are approximately 27% for the prompt neutron kerma and secondary gamma kerma and 35% for the prompt gamma kerma. The total kerma uncertainty is dominated by the secondary gamma kerma uncertainties which are in turn largely due to the modeling parameter uncertainties

  9. Sustainability Risk Evaluation for Large-Scale Hydropower Projects with Hybrid Uncertainty

    Directory of Open Access Journals (Sweden)

    Weiyao Tang

    2018-01-01

    As large-scale hydropower projects are influenced by many factors, risk evaluations are complex. This paper considers a hydropower project as a complex system from the perspective of sustainability risk, and divides it into three subsystems: the natural environment subsystem, the eco-environment subsystem and the socioeconomic subsystem. Risk-related factors and quantitative dimensions of each subsystem are comprehensively analyzed, with the uncertainty of some quantitative dimensions handled by hybrid uncertainty methods: fuzzy (e.g., the national health degree, the national happiness degree, the protection of cultural heritage), random (e.g., underground water levels, river width), and fuzzy-random (e.g., runoff volumes, precipitation). By calculating the sustainability risk-related degree of each of the risk-related factors, a sustainability risk-evaluation model is built. Based on the calculation results, the critical sustainability risk-related factors are identified and targeted to reduce the losses caused by sustainability risk factors of the hydropower project. A case study at the under-construction Baihetan hydropower station is presented to demonstrate the viability of the risk-evaluation model and to provide a reference for the sustainable risk evaluation of other large-scale hydropower projects.

  10. Calculation of design uncertainties for the development of fusion reactor blankets, taking into account uncertainties in nuclear data

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The use of the newly developed ECN-SUSD sensitivity/uncertainty code system is demonstrated. With ECN-SUSD it is possible to calculate uncertainties in response parameters in fixed-source calculations due to cross-section uncertainties (using MF33) as well as to uncertainties in angular distributions (using MF34). It is shown that the latter contribution, which is generally neglected because of the lack of MF34 data in modern evaluations (except for EFF), is large in fusion reactor shielding calculations. (orig.)

  11. Aleatoric and epistemic uncertainties in sampling based nuclear data uncertainty and sensitivity analyses

    International Nuclear Information System (INIS)

    Zwermann, W.; Krzykacz-Hausmann, B.; Gallner, L.; Klein, M.; Pautz, A.; Velkov, K.

    2012-01-01

    Sampling-based uncertainty and sensitivity analyses due to epistemic input uncertainties, i.e. to an incomplete knowledge of uncertain input parameters, can be performed with arbitrary application programs to solve the physical problem under consideration. For the description of steady-state particle transport, direct simulations of the microscopic processes with Monte Carlo codes are often used. This introduces an additional source of uncertainty, the aleatoric sampling uncertainty, which is due to the randomness of the simulation process performed by sampling, and which adds to the total combined output sampling uncertainty. So far, this aleatoric part of the uncertainty has been minimized by running a sufficiently large number of Monte Carlo histories for each sample calculation, thus making its impact negligible compared with the impact from sampling the epistemic uncertainties. Obviously, this process may cause high computational costs. The present paper shows that in many applications reliable epistemic uncertainty results can also be obtained with substantially lower computational effort by performing and analyzing two appropriately generated series of samples with a much smaller number of Monte Carlo histories each. The method is applied along with the nuclear data uncertainty and sensitivity code package XSUSA in combination with the Monte Carlo transport code KENO-Va to various critical assemblies and a full-scale reactor calculation. It is shown that the proposed method yields output uncertainties and sensitivities equivalent to the traditional approach, with a reduction of computing time by factors on the order of 100. (authors)
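
    The variance-separation idea behind a two-series approach can be illustrated with a toy model (a sketch only, not the actual XSUSA/KENO procedure; the response function, slope, and noise levels are invented): two series share the same epistemic samples but carry independent aleatoric noise, so the covariance between the paired series retains only the epistemic part.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: the response depends on an epistemically uncertain input x
# (e.g. a sampled cross section), and each evaluation adds independent
# aleatoric noise standing in for a short Monte Carlo transport run.
n_samples = 2000
sigma_epistemic = 0.05          # spread of the uncertain input
sigma_aleatoric = 0.10          # statistical noise of a cheap MC run

x = rng.normal(1.0, sigma_epistemic, n_samples)      # epistemic samples
response = lambda v: 2.0 * v                          # toy response, slope 2

# Two series: identical epistemic samples, independent aleatoric noise.
y1 = response(x) + rng.normal(0.0, sigma_aleatoric, n_samples)
y2 = response(x) + rng.normal(0.0, sigma_aleatoric, n_samples)

# The covariance between the paired series keeps the common (epistemic)
# part; the independent aleatoric noise averages out.
epistemic_var = np.cov(y1, y2)[0, 1]
total_var = np.var(y1, ddof=1)

print(f"estimated epistemic std: {np.sqrt(epistemic_var):.3f}")
print(f"total std of one series: {np.sqrt(total_var):.3f}")
```

    With slope 2 and an input spread of 0.05, the recovered epistemic standard deviation is close to 0.1 even though each individual evaluation is dominated by aleatoric noise of the same magnitude.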

  12. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    The fields of sensitivity and uncertainty analysis have traditionally been dominated by statistical techniques when large-scale modeling codes are being analyzed. These methods are able to estimate sensitivities, generate response surfaces, and estimate response probability distributions given the input parameter probability distributions. Because the statistical methods are computationally costly, they are usually applied only to problems with relatively small parameter sets. Deterministic methods, on the other hand, are very efficient and can handle large data sets, but generally require simpler models because of the considerable programming effort required for their implementation. The first part of this paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of deterministic sensitivity analysis capability into existing computer models. This automation removes the traditional limitation of deterministic sensitivity methods. The second part of the paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions to obtain result probability distributions. The approach is applicable to low-level radioactive waste disposal system performance assessment

  13. Value of Uncertainty: The Lost Opportunities in Large Projects

    Directory of Open Access Journals (Sweden)

    Agnar Johansen

    2016-08-01

    The uncertainty management theory has become well established over the last 20–30 years. However, the authors suggest that it does not fully address why opportunities often remain unexploited. Empirical studies show a stronger focus on mitigating risks than on exploiting opportunities. This paper therefore addresses why so few opportunities are explored in large projects. The theory claims that risks and opportunities should be managed equally, in the same process. In two surveys, conducted in six (private and public) companies over a four-year period, project managers stated that uncertainty management is about managing risks and opportunities. However, two case studies of 12 projects from the same companies revealed that all of them focused mainly on risks, and most of the opportunities were left unexploited. We have developed a theoretical explanation model to shed light on this phenomenon. The model reflects findings from our empirical data, considered against the current project management, uncertainty, risk and stakeholder literature. It shows that the threshold for pursuing a potential opportunity is high: to be considered, a potential opportunity must be extremely interesting, since it may require contract changes, and the project must abandon an earlier-accepted best solution.

  14. Large uncertainty in carbon uptake potential of land-based climate-change mitigation efforts.

    Science.gov (United States)

    Krause, Andreas; Pugh, Thomas A M; Bayer, Anita D; Li, Wei; Leung, Felix; Bondeau, Alberte; Doelman, Jonathan C; Humpenöder, Florian; Anthoni, Peter; Bodirsky, Benjamin L; Ciais, Philippe; Müller, Christoph; Murray-Tortarolo, Guillermo; Olin, Stefan; Popp, Alexander; Sitch, Stephen; Stehfest, Elke; Arneth, Almut

    2018-07-01

    Most climate mitigation scenarios involve negative emissions, especially those that aim to limit global temperature increase to 2°C or less. However, the carbon uptake potential in land-based climate change mitigation efforts is highly uncertain. Here, we address this uncertainty by using two land-based mitigation scenarios from two land-use models (IMAGE and MAgPIE) as input to four dynamic global vegetation models (DGVMs; LPJ-GUESS, ORCHIDEE, JULES, LPJmL). Each of the four combinations of land-use models and mitigation scenarios aimed for a cumulative carbon uptake of ~130 GtC by the end of the century, achieved either via the cultivation of bioenergy crops combined with carbon capture and storage (BECCS) or avoided deforestation and afforestation (ADAFF). Results suggest large uncertainty in simulated future land demand and carbon uptake rates, depending on the assumptions related to land use and land management in the models. Total cumulative carbon uptake in the DGVMs is highly variable across mitigation scenarios, ranging between 19 and 130 GtC by year 2099. Only one out of the 16 combinations of mitigation scenarios and DGVMs achieves an equivalent or higher carbon uptake than achieved in the land-use models. The large differences in carbon uptake between the DGVMs and their discrepancy against the carbon uptake in IMAGE and MAgPIE are mainly due to different model assumptions regarding bioenergy crop yields and due to the simulation of soil carbon response to land-use change. Differences between land-use models and DGVMs regarding forest biomass and the rate of forest regrowth also have an impact, albeit smaller, on the results. Given the low confidence in simulated carbon uptake for a given land-based mitigation scenario, and that negative emissions simulated by the DGVMs are typically lower than assumed in scenarios consistent with the 2°C target, relying on negative emissions to mitigate climate change is a highly uncertain strategy.

  15. Uncertainty quantification of surface-water/groundwater exchange estimates in large wetland systems using Python

    Science.gov (United States)

    Hughes, J. D.; Metz, P. A.

    2014-12-01

    Most watershed studies include observation-based water budget analyses to develop first-order estimates of significant flow terms. Surface-water/groundwater (SWGW) exchange is typically assumed to be equal to the residual of the sum of inflows and outflows in a watershed. These estimates of SWGW exchange, however, are highly uncertain as a result of the propagation of uncertainty inherent in the calculation or processing of the other terms of the water budget, such as stage-area-volume relations, and uncertainties associated with land-cover based evapotranspiration (ET) rate estimates. Furthermore, the uncertainty of estimated SWGW exchanges can be magnified in large wetland systems that transition from dry to wet during wet periods. Although it is well understood that observation-based estimates of SWGW exchange are uncertain it is uncommon for the uncertainty of these estimates to be directly quantified. High-level programming languages like Python can greatly reduce the effort required to (1) quantify the uncertainty of estimated SWGW exchange in large wetland systems and (2) evaluate how different approaches for partitioning land-cover data in a watershed may affect the water-budget uncertainty. We have used Python with the Numpy, Scipy.stats, and pyDOE packages to implement an unconstrained Monte Carlo approach with Latin Hypercube sampling to quantify the uncertainty of monthly estimates of SWGW exchange in the Floral City watershed of the Tsala Apopka wetland system in west-central Florida, USA. Possible sources of uncertainty in the water budget analysis include rainfall, ET, canal discharge, and land/bathymetric surface elevations. Each of these input variables was assigned a probability distribution based on observation error or spanning the range of probable values. The Monte Carlo integration process exposes the uncertainties in land-cover based ET rate estimates as the dominant contributor to the uncertainty in SWGW exchange estimates. 
We will discuss
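
    The record above names Python with Numpy, Scipy.stats, and pyDOE for Latin Hypercube sampling. A minimal sketch of such an unconstrained Monte Carlo water budget follows (using scipy.stats.qmc in place of pyDOE; the distributions, units, and magnitudes below are invented placeholders, not the Floral City data): the surface-water/groundwater (SWGW) exchange is the budget residual, and its spread is dominated by the widest input distribution.

```python
import numpy as np
from scipy.stats import qmc, norm

# Hypothetical monthly water-budget terms (mm over the watershed); the
# means and standard deviations are illustrative only.
n = 10_000
sampler = qmc.LatinHypercube(d=3, seed=2)
u = sampler.random(n)                      # uniform [0,1)^3 LHS design

rain = norm(120.0, 10.0).ppf(u[:, 0])      # rainfall, gauge error
et   = norm(90.0, 25.0).ppf(u[:, 1])       # land-cover-based ET, wide spread
qout = norm(15.0, 3.0).ppf(u[:, 2])        # canal discharge

# SWGW exchange as the water-budget residual (positive = net groundwater
# inflow needed to close the budget); storage change omitted for brevity.
swgw = et + qout - rain

mean, sd = swgw.mean(), swgw.std(ddof=1)
print(f"SWGW exchange: {mean:.1f} +/- {sd:.1f} mm/month")

# Crude variance decomposition: the ET term dominates the residual spread,
# mirroring the record's finding that ET uncertainty is the main contributor.
for name, term in [("rain", rain), ("ET", et), ("Q", qout)]:
    print(name, f"{term.var(ddof=1) / swgw.var(ddof=1):.2f}")
```

    Because the inputs are independent here, the fractions printed at the end are just each term's variance share of the residual variance; with correlated inputs a Sobol-style decomposition would be needed instead.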

  16. Propagation of nuclear data uncertainties for fusion power measurements

    Directory of Open Access Journals (Sweden)

    Sjöstrand Henrik

    2017-01-01

    Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.

  17. Systematic uncertainties in long-baseline neutrino oscillations for large θ₁₃

    Energy Technology Data Exchange (ETDEWEB)

    Coloma, Pilar; Huber, Patrick; Kopp, Joachim; Winter, Walter

    2013-02-01

    We study the physics potential of future long-baseline neutrino oscillation experiments at large θ₁₃, focusing especially on systematic uncertainties. We discuss superbeams, beta-beams, and neutrino factories, and for the first time compare these experiments on an equal footing with respect to systematic errors. We explicitly simulate near detectors for all experiments, we use the same implementation of systematic uncertainties for all experiments, and we fully correlate the uncertainties among detectors, oscillation channels, and beam polarizations as appropriate. As our primary performance indicator, we use the achievable precision in the measurement of the CP-violating phase δ_CP. We find that a neutrino factory is the only instrument that can measure δ_CP with a precision similar to that of its quark sector counterpart. All neutrino beams operating at peak energies ≳2 GeV are quite robust with respect to systematic uncertainties, whereas especially beta-beams and T2HK suffer from large cross section uncertainties in the quasi-elastic regime, combined with their inability to measure the appearance signal cross sections at the near detector. A noteworthy exception is the combination of a γ = 100 beta-beam with an SPL-based superbeam, in which all relevant cross sections can be measured in a self-consistent way. This provides a performance second only to the neutrino factory. For other superbeam experiments such as LBNO and the setups studied in the context of the LBNE reconfiguration effort, statistics turns out to be the bottleneck. In almost all cases, the near detector is not critical to control systematics, since the combined fit of appearance and disappearance data already constrains the impact of systematics to be small, provided that the three active flavor oscillation framework is valid.

  18. Warning and prevention based on estimates with large uncertainties: the case of low-frequency and large-impact events like tsunamis

    Science.gov (United States)

    Tinti, Stefano; Armigliato, Alberto; Pagnoni, Gianluca; Zaniboni, Filippo

    2013-04-01

    Geoscientists often deal with hazardous processes like earthquakes, volcanic eruptions, tsunamis, and hurricanes, and their research is aimed not only at a better understanding of the physical processes, but also at assessing the spatial and temporal evolution of a given individual event (i.e. short-term prediction) and the expected evolution of a group of events (i.e. statistical estimates referred to a given return period and a given geographical area). One of the main issues of any scientific method is how to cope with measurement errors, a topic which, in the case of forecasts of ongoing or future events, translates into how to deal with forecast uncertainties. In general, the more data are available and processed to make a prediction, the more accurate the prediction is expected to be if the scientific approach is sound, and the smaller the associated uncertainties are. However, there are several important cases where the assessment must be made with insufficient data or insufficient time for processing, which leads to large uncertainties. Two examples can be given from tsunami science, since tsunamis are rare events that may have destructive power and very large impact. One example is the case of warning for a tsunami generated by a near-coast earthquake, an issue at the focus of the European-funded project NearToWarn. The warning has to be launched before the tsunami hits the coast, that is, within a few minutes of its generation. This may imply that the data collected in such a short time are not yet enough for an accurate evaluation, also because the implemented monitoring system (if any) could be inadequate (e.g. one reason for inadequacy could be that implementing a dense instrumental network is judged too expensive for rare events). The second case is long-term prevention of tsunami strikes.
Tsunami infrequency may imply that the historical record for a given piece of coast is too short to capture a statistical

  19. A new robust adaptive controller for vibration control of active engine mount subjected to large uncertainties

    International Nuclear Information System (INIS)

    Fakhari, Vahid; Choi, Seung-Bok; Cho, Chang-Hyun

    2015-01-01

    This work presents a new robust model reference adaptive control (MRAC) for the control of vibration caused by the vehicle engine, using an electromagnetic type of active engine mount. The vibration isolation performance of the active mount with the robust controller is evaluated in the presence of large uncertainties. As a first step, an active mount with a linear solenoid actuator is prepared and its dynamic model is identified via experimental tests. Subsequently, a new robust MRAC based on the gradient method with σ-modification is designed by selecting a proper reference model. In designing the robust adaptive control, structured (parametric) uncertainties in the stiffness of the passive part of the mount and in the damping ratio of the active part of the mount are considered to investigate the robustness of the proposed controller. Experimental and simulation results are presented to evaluate performance, focusing on the robustness of the controller in the face of large uncertainties. The obtained results show that the proposed controller can provide robust vibration control performance even in the presence of large uncertainties, yielding effective vibration isolation. (paper)

  20. Evaluating Sources of Risks in Large Engineering Projects: The Roles of Equivocality and Uncertainty

    Directory of Open Access Journals (Sweden)

    Leena Pekkinen

    2015-11-01

    Contemporary project risk management literature introduces uncertainty, i.e., the lack of information, as a fundamental basis of project risks. In this study the authors assert that equivocality, i.e., the existence of multiple and conflicting interpretations, can also serve as a basis of risks. Through an in-depth empirical investigation of a large, complex engineering project, the authors identified risk sources rooted in situations where uncertainty or equivocality was the predominant attribute. Information processing theory proposes different managerial practices for risk management depending on whether the risks originate in uncertainty or in equivocality.

  1. Quantification of Back-End Nuclear Fuel Cycle Metrics Uncertainties Due to Cross Sections

    International Nuclear Information System (INIS)

    Tracy E. Stover Jr.

    2007-01-01

    This work examines uncertainties in the back-end fuel cycle metrics of isotopic composition, decay heat, radioactivity, and radiotoxicity. Most advanced fuel cycle scenarios, including the ones represented in this work, are limited by one or more of these metrics, so their quantification becomes of great importance in order to optimize or select one of these scenarios. Uncertainty quantification, in this work, is performed by propagating cross-section covariance data, and later number density covariance data, through a reactor physics and depletion code sequence. Propagation of uncertainty is performed primarily via the Efficient Subspace Method (ESM). ESM decomposes the covariance data into singular pairs and perturbs the input data along independent directions of the uncertainty, and only for the most significant values of that uncertainty. Once the results of these perturbations are collected, ESM directly calculates the covariance of the observed outputs a posteriori. By exploiting the rank-deficient nature of the uncertainty data, ESM works more efficiently than traditional stochastic sampling, but is shown to produce equivalent results. ESM is beneficial for very detailed models with large amounts of input data that make stochastic sampling impractical. In this study various fuel cycle scenarios are examined. Simplified, representative models of pressurized water reactor (PWR) and boiling water reactor (BWR) fuels composed of both uranium oxide and mixed oxides are examined. These simple models are intended to give a representation of the uncertainty that can be associated with open uranium oxide fuel cycles and closed mixed oxide fuel cycles. The simplified models also serve as a demonstration that ESM and stochastic sampling produce equivalent results, because these models require minimal computer resources and have amounts of input data small enough that either method can be quickly implemented and a numerical experiment performed. The simplified
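
    The core idea the record describes, perturbing inputs only along the significant singular directions of a rank-deficient covariance, can be sketched for a linear toy model (dimensions and data invented; this is not the thesis's actual code sequence). For a linear map the rank-r reconstruction of the output covariance is exact, which makes the equivalence easy to check.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy stand-in for a depletion calculation: a linear map from 50 "cross
# sections" to 3 "metrics" (say decay heat, activity, radiotoxicity).
n_in, n_out, rank = 50, 3, 4
A = rng.normal(size=(n_out, n_in))
model = lambda v: A @ v
x0 = np.ones(n_in)                         # nominal inputs

# Rank-deficient input covariance, as is typical of evaluated covariance data.
B = rng.normal(size=(n_in, rank))
C = B @ B.T

# ESM-style step: decompose C and keep only its significant singular pairs,
# then perturb the inputs along those few independent directions.
w, V = np.linalg.eigh(C)
keep = w > 1e-10 * w.max()
w, V = w[keep], V[:, keep]                 # only `rank` directions survive

y0 = model(x0)
S = np.empty((n_out, len(w)))              # response to each scaled direction
for k in range(len(w)):
    dx = np.sqrt(w[k]) * V[:, k]           # one-sigma perturbation
    S[:, k] = model(x0 + dx) - y0          # finite-difference response

cov_esm = S @ S.T                          # output covariance from r model runs
cov_exact = A @ C @ A.T                    # sandwich rule for the linear model

print("max abs difference:", np.abs(cov_esm - cov_exact).max())
```

    Only 4 model runs are needed instead of the hundreds a stochastic-sampling estimate of the same 3x3 output covariance would take, which is the efficiency argument the record makes.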

  2. Managing the continuum: certainty, uncertainty, unpredictability in large engineering projects

    CERN Document Server

    Caron, Franco

    2013-01-01

    The brief will describe how to develop a risk analysis applied to a project, through a sequence of steps: risk management planning, risk identification, risk classification, risk assessment, risk quantification, risk response planning, risk monitoring and control, process close-out and lessons learned. The project risk analysis and management process will be applied to large engineering projects, in particular those related to the oil and gas industry. The brief will address the overall range of possible events affecting the project, moving from certainty (project issues) through uncertainty (project risks) to unpredictability (unforeseeable events), considering both negative and positive events. Some quantitative techniques (simulation, event trees, Bayesian inference, etc.) will be used to develop the risk quantification. The brief addresses a typical subject in the area of project management, with reference to large engineering projects concerning the realization of large plants and infrastructures. These projects a...

  3. Deterministic methods for sensitivity and uncertainty analysis in large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Oblow, E.M.; Pin, F.G.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.; Lucius, J.L.

    1987-01-01

    The fields of sensitivity and uncertainty analysis are dominated by statistical techniques when large-scale modeling codes are being analyzed. This paper reports on the development and availability of two systems, GRESS and ADGEN, that make use of computer calculus compilers to automate the implementation of a deterministic sensitivity analysis capability in existing computer models. This automation removes the traditional implementation limitation of deterministic sensitivity methods. The paper describes a deterministic uncertainty analysis method (DUA) that uses derivative information as a basis to propagate parameter probability distributions and obtain result probability distributions. The paper demonstrates the deterministic approach to sensitivity and uncertainty analysis as applied to a sample problem that models the flow of water through a borehole. The sample problem is used as a basis to compare the cumulative distribution function of the flow rate as calculated by standard statistical methods and by the DUA method. The DUA method gives a more accurate result based upon only two model executions, compared to fifty executions in the statistical case.
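The derivative-based propagation step of a DUA-style analysis can be sketched as follows; the model, nominal values, and standard deviations are invented stand-ins (the actual GRESS/ADGEN systems obtain derivatives by automated computer calculus rather than finite differences):

```python
import math
import random

random.seed(1)

# Stand-in model (not the paper's borehole equations): a flow-like response
def model(k, h, r):
    return k * h / math.log(r)

x0 = {"k": 2.0, "h": 10.0, "r": 100.0}   # nominal values (illustrative)
sd = {"k": 0.1, "h": 0.5, "r": 5.0}      # parameter standard deviations

# DUA idea: derivatives at the nominal point propagate input variances
# directly, var(y) ~ sum_i (dy/dx_i)^2 var(x_i) for independent inputs.
def deriv(name, eps=1e-6):
    xp, xm = dict(x0), dict(x0)
    xp[name] += eps
    xm[name] -= eps
    return (model(**xp) - model(**xm)) / (2 * eps)

var_dua = sum(deriv(p) ** 2 * sd[p] ** 2 for p in x0)

# Reference: brute-force Monte Carlo with many model executions
samples = [model(*(random.gauss(x0[p], sd[p]) for p in ("k", "h", "r")))
           for _ in range(100_000)]
mean = sum(samples) / len(samples)
var_mc = sum((v - mean) ** 2 for v in samples) / (len(samples) - 1)

assert abs(var_dua - var_mc) / var_mc < 0.05   # first order agrees closely
```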

  4. Implications of nuclear data uncertainties to reactor design

    International Nuclear Information System (INIS)

    Greebler, P.; Hutchins, B.A.; Cowan, C.L.

    1970-01-01

    Uncertainties in nuclear data require significant allowances to be made in the design and operating conditions of reactor cores and of shielded-reactor-plant and fuel-processing systems. These allowances result in direct cost increases due to overdesign of components and equipment and reduced core and fuel operating performance. Compromising the allowances for data uncertainties has indirect cost implications due to increased risks of failure to meet plant and fuel performance objectives, with warranties involved in some cases, and to satisfy licensed safety requirements. Fast breeders are the power reactors most sensitive to uncertainties in nuclear data over the neutron energy range of interest for fission reactors, and this paper focuses on the implications of the data uncertainties for the design and operation of fast breeder reactors and fuel-processing systems. The current status of uncertainty in predicted physics parameters due to data uncertainties is reviewed and compared with the situation in 1966 and that projected for within the next two years due to anticipated data improvements. Implications of the uncertainties in the predicted physics parameters for design and operation are discussed for both a near-term prototype or demonstration breeder plant (∼300 MW(e)) and a longer-term large (∼1000 MW(e)) plant. Significant improvements in the nuclear data have been made during the past three years, the most important of these to fast power reactors being the 239Pu alpha below 15 keV. The most important remaining specific data uncertainties are illustrated by their individual contributions to the computational uncertainty of selected physics parameters, and recommended priorities and accuracy requirements for improved data are presented.

  5. Uncertainty in Earthquake Source Imaging Due to Variations in Source Time Function and Earth Structure

    KAUST Repository

    Razafindrakoto, H. N. T.; Mai, Paul Martin

    2014-01-01

    One way to improve the accuracy and reliability of kinematic earthquake source imaging is to investigate the origins of uncertainty and to minimize their effects. The difficulties in kinematic source inversion arise from the nonlinearity of the problem, nonunique choices in the parameterization, and observational errors. We analyze in particular the uncertainty related to the choice of the source time function (STF) and the variability in Earth structure. We consider a synthetic data set generated from a spontaneous dynamic rupture calculation. Using Bayesian inference, we map the solution space of peak slip rate, rupture time, and rise time to characterize the kinematic rupture in terms of posterior density functions. Our test of the effect of the choice of STF reveals that all three tested STFs (isosceles triangle, regularized Yoffe with acceleration times of 0.1 and 0.3 s) retrieve the patch of high slip and slip rate around the hypocenter. However, the use of an isosceles triangle as STF artificially accelerates the rupture to propagate faster than the target solution. It additionally generates an artificial linear correlation between rupture onset time and rise time. These appear to compensate for the dynamic source effects that are not included in the symmetric triangular STF. The exact rise time for the tested STFs is difficult to resolve due to the small amount of seismic moment radiated in the tail of the STF. To highlight the effect of Earth structure variability, we perform inversions including uncertainty in the wavespeed only, and variability in both wavespeed and layer depth. We find that little difference is noticeable between the resulting rupture model uncertainties from these two parameterizations. Both significantly broaden the posterior densities and cause faster rupture propagation, particularly near the hypocenter, due to the major velocity change at the depth where the fault is located.

  7. Sampling based uncertainty analysis of 10% hot leg break LOCA in large scale test facility

    International Nuclear Information System (INIS)

    Sengupta, Samiran; Kraina, V.; Dubey, S. K.; Rao, R. S.; Gupta, S. K.

    2010-01-01

    Sampling-based uncertainty analysis was carried out to quantify the uncertainty in predictions of the best-estimate code RELAP5/MOD3.2 for a thermal hydraulic test (10% hot leg break LOCA) performed in the Large Scale Test Facility (LSTF) as a part of an IAEA coordinated research project. The nodalisation of the test facility was qualified at both steady-state and transient levels by systematically applying the procedures of the uncertainty methodology based on accuracy extrapolation (UMAE); uncertainty analysis was carried out using the Latin hypercube sampling (LHS) method to evaluate the uncertainty for ten input parameters. Sixteen output parameters were selected for uncertainty evaluation, and the uncertainty band between the 5th and 95th percentiles of the output parameters was evaluated. It was observed that the uncertainty band for the primary pressure during two-phase blowdown is larger than that of the remaining period. Similarly, a larger uncertainty band is observed in the accumulator injection flow during the reflood phase. Importance analysis was also carried out, and standard rank regression coefficients were computed to quantify the effect of each individual input parameter on the output parameters. It was observed that the break discharge coefficient is the most important uncertain parameter for the prediction of all the primary side parameters and that the steam generator (SG) relief pressure setting is the most important parameter in predicting the SG secondary pressure.
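The LHS-plus-rank-regression machinery of such an analysis can be sketched in a few lines. The input ranges and the stand-in output function are invented, and a one-variable standardised rank coefficient is used, which approximates the SRRC only when the inputs are sampled independently:

```python
import random
from statistics import mean

random.seed(0)
N = 500

# Latin hypercube sampling: exactly one sample per equiprobable stratum
def lhs(n):
    cells = list(range(n))
    random.shuffle(cells)
    return [(c + random.random()) / n for c in cells]

# Three hypothetical uncertain inputs, scaled from the unit interval
x1 = [0.8 + 0.4 * u for u in lhs(N)]     # e.g. a discharge coefficient
x2 = [1.0 + 2.0 * u for u in lhs(N)]
x3 = [0.5 + 0.1 * u for u in lhs(N)]

# Stand-in code output; x1 dominates by construction
y = [10 * a + b + 0.1 * c + random.gauss(0, 0.1)
     for a, b, c in zip(x1, x2, x3)]

def ranks(v):
    order = sorted(range(len(v)), key=v.__getitem__)
    r = [0] * len(v)
    for i, j in enumerate(order):
        r[j] = i
    return r

# Standardised one-variable rank regression coefficient (equals Spearman's
# rho; a proxy for the SRRC under independent sampling)
def rank_coeff(x, yv):
    rx, ry = ranks(x), ranks(yv)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

coeffs = [rank_coeff(x, y) for x in (x1, x2, x3)]
assert coeffs[0] > abs(coeffs[1]) > abs(coeffs[2])   # x1 ranked most important
```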

  8. Interpretation of the peak areas in gamma-ray spectra that have a large relative uncertainty

    International Nuclear Information System (INIS)

    Korun, M.; Maver Modec, P.; Vodenik, B.

    2012-01-01

    Empirical evidence is provided that the areas of peaks having a relative uncertainty in excess of 30% are overestimated. This systematic influence is of a statistical nature and originates in the way the peak-analyzing routine recognizes small peaks. It is not easy to detect this influence, since it is smaller than the peak-area uncertainty. However, the systematic influence can be revealed in repeated measurements under the same experimental conditions, e.g., in background measurements. To evaluate the systematic influence, background measurements were analyzed with the peak-analyzing procedure described by Korun et al. (2008). The magnitude of the influence depends on the relative uncertainty of the peak area and may amount, in the conditions used in the peak analysis, to a factor of 5 at relative uncertainties exceeding 60%. From the measurements, the probability for type-II errors, as a function of the relative uncertainty of the peak area, was extracted. This probability is near zero below an uncertainty of 30% and rises to 90% at uncertainties exceeding 50%. - Highlights: ► A systematic influence affecting small peak areas in gamma-ray spectra is described. ► The influence originates in the peak-locating procedure, which uses a pre-determined sensitivity. ► The pre-determined sensitivity causes peak areas with large uncertainties to be overestimated. ► The influence depends on the relative uncertainty of the number of counts in the peak. ► Corrections exceeding a factor of 3 are attained at peak-area uncertainties exceeding 60%.

  9. Uncertainties in environmental impact assessments due to expert opinion. Case study. Radioactive waste in Slovenia

    International Nuclear Information System (INIS)

    Kontic, B.; Ravnik, M.

    1998-01-01

    A comprehensive study was carried out at the J. Stefan Institute in Ljubljana and the School of Environmental Sciences in Nova Gorica on the sources of uncertainty in long-term environmental impact assessment (EIA). The research examined two main components: first, the methodology of preparing an EIA, and second, the validity of expert opinion. Following the findings of the research, a survey was performed on the regulatory assessment of the acceptability of a radioactive waste repository. The components of dose evaluation in different time frames were examined in terms of susceptibility to uncertainty. The uncertainty associated with human exposure in the far future is so large that dose and risk, as individual numerical indicators of safety, should in our opinion not be used in compliance assessment for a radioactive waste repository. On the other hand, results of the calculations on the amount and activity of low- and intermediate-level waste and the spent fuel from the Krsko NPP show that the experts' understanding of the treated questions can be expressed in a transparent way, giving credible output from the models used. (author)

  10. Nuclear Physical Uncertainties in Modeling X-Ray Bursts

    Science.gov (United States)

    Regis, Eric; Amthor, A. Matthew

    2017-09-01

    Type I x-ray bursts occur when a neutron star accretes material from the surface of another star in a compact binary star system. For certain accretion rates and material compositions, much of the nuclear material is burned in short, explosive bursts. Using a one-dimensional stellar model, Kepler, and a comprehensive nuclear reaction rate library, ReacLib, we have simulated chains of type I x-ray bursts. Unfortunately, there are large remaining uncertainties in the nuclear reaction rates involved, since many of the isotopes reacting are unstable and have not yet been studied experimentally. Some individual reactions, when varied within their estimated uncertainty, alter the light curves dramatically. This limits our ability to understand the structure of the neutron star. Previous studies have looked at the effects of individual reaction rate uncertainties. We have applied a Monte Carlo method, simultaneously varying a set of reaction rates, in order to probe the expected uncertainty in x-ray burst behaviour due to the total uncertainty in all nuclear reaction rates. Furthermore, we aim to discover any nonlinear effects due to the coupling between different reaction rates. Early results show clear non-linear effects. This research was made possible by NSF-DUE Grant 1317446, BUScholars Program.
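The simultaneous-variation idea can be sketched with a toy yield function; the rate values, the log-normal factor uncertainty, and the model itself are illustrative, not ReacLib rates. Varying rates one at a time and then together exposes any non-additive, i.e. coupled, contribution to the output variance:

```python
import math
import random

random.seed(42)

# Toy two-step burning yield (not a real X-ray burst network): the output
# depends on the product and the competition of two reaction rates.
def yield_model(r1, r2):
    return r1 * r2 / (1.0 + r1 + r2)

r1_0, r2_0 = 2.0, 3.0      # nominal rates (illustrative)
sigma = 0.3                # log-normal rate-factor uncertainty (illustrative)

def variance(vary1, vary2, n=50_000):
    out = []
    for _ in range(n):
        f1 = math.exp(random.gauss(0, sigma)) if vary1 else 1.0
        f2 = math.exp(random.gauss(0, sigma)) if vary2 else 1.0
        out.append(yield_model(r1_0 * f1, r2_0 * f2))
    m = sum(out) / n
    return sum((v - m) ** 2 for v in out) / (n - 1)

v1 = variance(True, False)     # vary rate 1 only
v2 = variance(False, True)     # vary rate 2 only
v12 = variance(True, True)     # vary both rates simultaneously

# For a linear response the single-rate variances would simply add; any
# significant departure of v12 from v1 + v2 signals non-linear coupling.
print(v1, v2, v12, v12 - (v1 + v2))
```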

  11. Uncertainty in eddy covariance measurements and its application to physiological models

    Science.gov (United States)

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...
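The paired-measurement idea can be sketched as follows, with invented flux and error magnitudes: because two instruments share the true flux but not the measurement error, the variance of their difference is twice the error variance.

```python
import random

random.seed(2)

# Two instruments measure the same true flux with independent random
# errors (all magnitudes invented). Their difference cancels the flux
# itself and isolates the measurement error.
true_sd_error = 1.5
n = 100_000
flux = [random.gauss(5.0, 3.0) for _ in range(n)]   # the real flux varies
tower1 = [f + random.gauss(0.0, true_sd_error) for f in flux]
tower2 = [f + random.gauss(0.0, true_sd_error) for f in flux]

diffs = [a - b for a, b in zip(tower1, tower2)]
m = sum(diffs) / n
var_d = sum((d - m) ** 2 for d in diffs) / (n - 1)

# var(difference) = 2 * var(error), so halve it before taking the root
est_sd_error = (var_d / 2.0) ** 0.5
assert abs(est_sd_error - true_sd_error) < 0.05
```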

  12. Effects of uncertainty in model predictions of individual tree volume on large area volume estimates

    Science.gov (United States)

    Ronald E. McRoberts; James A. Westfall

    2014-01-01

    Forest inventory estimates of tree volume for large areas are typically calculated by adding model predictions of volumes for individual trees. However, the uncertainty in the model predictions is generally ignored with the result that the precision of the large area volume estimates is overestimated. The primary study objective was to estimate the effects of model...

  13. Uncertainty assessment of a model for biological nitrogen and phosphorus removal: Application to a large wastewater treatment plant

    Science.gov (United States)

    Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare

    In the last few years, the use of mathematical models of WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand detailed input data for their implementation, which must be evaluated by an extensive data-gathering campaign that cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is imperative. However, despite the importance of uncertainty quantification, only a few studies have been carried out in the wastewater treatment field, and those studies included only a few of the sources of model uncertainty. Seeking to develop the area, this paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has rarely been applied in the wastewater field. The model was based on activated-sludge models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis yielded useful insights for WWTP modelling, identifying the crucial aspects where the uncertainty is highest and where, therefore, more effort should be devoted to both data gathering and modelling practice.
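A minimal GLUE loop can be sketched as below, assuming a toy first-order decay model in place of ASM1/ASM2, a synthetic observation series, and an inverse-error informal likelihood (one of several choices used in GLUE practice):

```python
import math
import random

random.seed(7)

# Toy process model (stand-in for ASM1/ASM2): first-order decay of a
# concentration with rate parameter k
def model(k, times):
    return [10.0 * math.exp(-k * t) for t in times]

times = [0.5 * i for i in range(10)]
true_k = 0.4
obs = [v + random.gauss(0.0, 0.3) for v in model(true_k, times)]

# GLUE step 1: Monte Carlo sampling of the parameter from a prior range
candidates = []
for _ in range(5000):
    k = random.uniform(0.05, 1.0)
    sse = sum((s - o) ** 2 for s, o in zip(model(k, times), obs))
    candidates.append((k, 1.0 / sse))    # informal likelihood measure

# GLUE step 2: keep only "behavioural" runs above a likelihood threshold
candidates.sort(key=lambda c: c[1], reverse=True)
behavioural = candidates[: len(candidates) // 10]

# GLUE step 3: parameter uncertainty band from the behavioural set
ks = sorted(k for k, _ in behavioural)
lo, hi = ks[int(0.05 * len(ks))], ks[int(0.95 * len(ks))]
assert lo < hi
print(lo, hi)    # band bracketing a neighbourhood of the true value
```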

  14. Uncertainty in the Future Distribution of Tropospheric Ozone over West Africa due to Variability in Anthropogenic Emissions Estimates between 2025 and 2050

    Directory of Open Access Journals (Sweden)

    J. E. Williams

    2011-01-01

    Particle and trace gas emissions due to anthropogenic activity are expected to increase significantly in West Africa over the next few decades due to rising population and more energy-intensive lifestyles. Here we perform 3D global chemistry-transport model calculations for 2025 and 2050 using both a “business-as-usual” (A1B) and a “clean economy” (B1) future anthropogenic emission scenario to focus on the changes in the distribution of, and uncertainties associated with, tropospheric O3 under the various projected emission scenarios. When compared to the present-day troposphere we find that there are significant increases in tropospheric O3 for the A1B emission scenario, with the largest increases located in the lower troposphere near the source regions and into the Sahel around 15–20°N. In part this increase is due to more efficient NOx recycling related to increases in the background methane concentrations. Examining the uncertainty across different emission inventories reveals an associated uncertainty of up to ~20% in the predicted increases at 2025 and 2050. For the upper troposphere, where increases in O3 have a more pronounced impact on radiative forcing, the uncertainty is influenced by the transport of O3-rich air from Asia on the Tropical Easterly Jet.

  15. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  16. Characterizing Uncertainty In Electrical Resistivity Tomography Images Due To Subzero Temperature Variability

    Science.gov (United States)

    Herring, T.; Cey, E. E.; Pidlisecky, A.

    2017-12-01

    Time-lapse electrical resistivity tomography (ERT) is used to image changes in subsurface electrical conductivity (EC), e.g. due to a saline contaminant plume. Temperature variation also produces an EC response, which interferes with the signal of interest. Temperature compensation requires the temperature distribution and the relationship between EC and temperature, but this relationship at subzero temperatures is not well defined. The goal of this study is to examine how uncertainty in the subzero EC/temperature relationship manifests in temperature-corrected ERT images, especially with respect to relevant plume parameters (location, contaminant mass, etc.). First, a lab experiment was performed to determine the EC of fine-grained glass beads over a range of temperatures (-20 °C to 20 °C) and saturations. The measured EC/temperature relationship was then used to add temperature effects to a hypothetical EC model of a conductive plume. Forward simulations yielded synthetic field data to which temperature corrections were applied. Varying the temperature/EC relationship used in the temperature correction and comparing the temperature-corrected ERT results to the synthetic model enabled a quantitative analysis of the error in plume parameters associated with temperature variability. Modeling possible scenarios in this way helps to establish the feasibility of different time-lapse ERT applications by quantifying the uncertainty associated with the parameter(s) of interest.
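The sensitivity of a temperature correction to the assumed EC/temperature relationship can be illustrated with the standard linear ratio model; the reading, the reference temperature, and the coefficient range below are hypothetical numbers, not the study's measurements:

```python
# Ratio-model temperature compensation of bulk EC, with the correction
# coefficient treated as uncertain below 0 degC.
def compensate(ec_raw, temp_c, alpha, ref_c=25.0):
    # standard linear ratio model: EC_ref = EC(T) / (1 + alpha * (T - ref))
    return ec_raw / (1.0 + alpha * (temp_c - ref_c))

ec_raw, temp = 12.0, -5.0          # mS/m at -5 degC (hypothetical reading)

# Plausible range of the subzero coefficient (per degC): the usual ~0.02
# above freezing may not hold once pore water begins to freeze.
low = compensate(ec_raw, temp, 0.015)
mid = compensate(ec_raw, temp, 0.020)
high = compensate(ec_raw, temp, 0.025)

spread = (high - low) / mid        # relative spread in corrected EC
print(round(mid, 2), round(spread, 3))
```

Even a modest range in the assumed coefficient produces a large spread in the corrected EC at -5 degC, which is the kind of uncertainty the study propagates into plume parameters.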

  17. Robust design optimization method for centrifugal impellers under surface roughness uncertainties due to blade fouling

    Science.gov (United States)

    Ju, Yaping; Zhang, Chuhua

    2016-03-01

    Blade fouling has proved to be a great threat to compressor performance in the operating stage. Current research on the fouling-induced performance degradation of centrifugal compressors is based mainly on simplified roughness models that do not take into account realistic factors such as the spatial non-uniformity and randomness of the fouling-induced surface roughness. Moreover, little attention has been paid to the robust design optimization of centrifugal compressor impellers with consideration of blade fouling. In this paper, a multi-objective robust design optimization method is developed for centrifugal impellers under surface roughness uncertainties due to blade fouling. A three-dimensional surface roughness map is proposed to describe the non-uniformity and randomness of realistic fouling accumulations on blades. To lower the computational cost of robust design optimization, the support vector regression (SVR) metamodel is combined with the Monte Carlo simulation (MCS) method to conduct the uncertainty analysis of fouled impeller performance. The results show that the critical fouled region associated with impeller performance degradation lies at the leading edge of the blade tip. The SVR metamodel proves to be an efficient and accurate means of detecting impeller performance variations caused by roughness uncertainties. After design optimization, the robust optimal design is found to be more efficient and less sensitive to fouling uncertainties while maintaining good impeller performance in the clean condition. This research proposes a systematic design optimization method for centrifugal compressors with consideration of blade fouling, providing practical guidance for the design of advanced centrifugal compressors.
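The surrogate-plus-Monte-Carlo workflow can be sketched as below. A radial-basis-function interpolant stands in for the SVR metamodel, and the "CFD" function, training budget, and roughness distribution are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

# Expensive "CFD" stand-in: efficiency as a function of a roughness
# parameter (purely illustrative, not a real impeller model)
def cfd(ks):
    return 0.92 - 0.05 * np.tanh(3.0 * (ks - 0.5)) - 0.02 * ks**2

# Train a cheap kernel surrogate (RBF interpolant, standing in for the
# SVR metamodel) on a handful of expensive runs
x_train = np.linspace(0.0, 1.0, 9)
y_train = cfd(x_train)
gamma = 20.0
K = np.exp(-gamma * (x_train[:, None] - x_train[None, :]) ** 2)
w = np.linalg.solve(K + 1e-8 * np.eye(len(x_train)), y_train)

def surrogate(x):
    k = np.exp(-gamma * (np.asarray(x)[:, None] - x_train[None, :]) ** 2)
    return k @ w

# Monte Carlo through the surrogate: cheap uncertainty analysis of the
# performance under random fouling roughness
ks_samples = rng.beta(2.0, 5.0, size=50_000)
eff = surrogate(ks_samples)
print(eff.mean(), eff.std())

# Sanity check: the surrogate tracks the true model on unseen points
x_test = rng.uniform(0.05, 0.95, 200)
err = np.max(np.abs(surrogate(x_test) - cfd(x_test)))
assert err < 0.02
```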

  18. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  19. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for the nonlinearities and uncertainties associated with the multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  20. Fuzzy Uncertainty Evaluation for Fault Tree Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ki Beom; Shim, Hyung Jin [Seoul National University, Seoul (Korea, Republic of); Jae, Moo Sung [Hanyang University, Seoul (Korea, Republic of)

    2015-05-15

    The traditional probabilistic approach can calculate relatively accurate results. However, it requires long computation times because of the repeated sampling of the MC method. In addition, when informative data for statistical analysis are insufficient, or when some events are mainly caused by human error, the probabilistic approach may not be possible, because the uncertainties of these events are difficult to express as probability distributions. In order to reduce the computation time and to quantify the uncertainties of top events when there exist basic events whose uncertainties are difficult to express as probability distributions, fuzzy uncertainty propagation based on fuzzy set theory can be applied. In this paper, we develop a fuzzy uncertainty propagation code and apply it to the fault tree of the core damage accident following a large loss of coolant accident (LLOCA). The fuzzy uncertainty propagation code was implemented and tested on the fault tree of a radiation release accident. We apply this code to the fault tree of the core damage accident following the LLOCA in three cases and compare the results with those computed by probabilistic uncertainty propagation using the MC method. The results obtained by the fuzzy uncertainty propagation can be calculated in a relatively short time and cover the results obtained by the probabilistic uncertainty propagation.
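The alpha-cut mechanics of fuzzy propagation can be sketched on a toy fault tree; the gate structure and the triangular membership numbers below are hypothetical, not the LLOCA tree of the paper:

```python
# Alpha-cut propagation of triangular fuzzy probabilities through a small
# fault tree. A triangular fuzzy number (a, m, b): support [a, b], peak at m.

def alpha_cut(tri, alpha):
    a, m, b = tri
    return (a + alpha * (m - a), b - alpha * (b - m))

# Basic events whose uncertainty is hard to express probabilistically
pump_fails = (0.01, 0.02, 0.05)
valve_fails = (0.001, 0.003, 0.01)
operator_err = (0.05, 0.10, 0.30)

def and_gate(int1, int2):           # independent events: multiply
    (l1, h1), (l2, h2) = int1, int2
    return (l1 * l2, h1 * h2)

def or_gate(int1, int2):            # 1 - (1-p)(1-q), interval endpoints
    (l1, h1), (l2, h2) = int1, int2
    return (1 - (1 - l1) * (1 - l2), 1 - (1 - h1) * (1 - h2))

# Top event: (pump AND valve) OR operator error, at several alpha levels;
# the interval narrows to a point estimate as alpha rises to 1.
for alpha in (0.0, 0.5, 1.0):
    cut = or_gate(and_gate(alpha_cut(pump_fails, alpha),
                           alpha_cut(valve_fails, alpha)),
                  alpha_cut(operator_err, alpha))
    print(alpha, cut)
```

Endpoint evaluation is valid here because each gate expression is monotone increasing in every basic-event probability; a single pass over a few alpha levels replaces the many samples of an MC run.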

  1. Large contribution of natural aerosols to uncertainty in indirect forcing

    Science.gov (United States)

    Carslaw, K. S.; Lee, L. A.; Reddington, C. L.; Pringle, K. J.; Rap, A.; Forster, P. M.; Mann, G. W.; Spracklen, D. V.; Woodhouse, M. T.; Regayre, L. A.; Pierce, J. R.

    2013-11-01

    The effect of anthropogenic aerosols on cloud droplet concentrations and radiative properties is the source of one of the largest uncertainties in the radiative forcing of climate over the industrial period. This uncertainty affects our ability to estimate how sensitive the climate is to greenhouse gas emissions. Here we perform a sensitivity analysis on a global model to quantify the uncertainty in cloud radiative forcing over the industrial period caused by uncertainties in aerosol emissions and processes. Our results show that 45 per cent of the variance of aerosol forcing since about 1750 arises from uncertainties in natural emissions of volcanic sulphur dioxide, marine dimethylsulphide, biogenic volatile organic carbon, biomass burning and sea spray. Only 34 per cent of the variance is associated with anthropogenic emissions. The results point to the importance of understanding pristine pre-industrial-like environments, with natural aerosols only, and suggest that improved measurements and evaluation of simulated aerosols in polluted present-day conditions will not necessarily result in commensurate reductions in the uncertainty of forcing estimates.

  3. Exploring the uncertainty in attributing sediment contributions in fingerprinting studies due to uncertainty in determining element concentrations in source areas.

    Science.gov (United States)

    Gomez, Jose Alfonso; Owens, Phillip N.; Koiter, Alex J.; Lobb, David

    2016-04-01

    One of the major sources of uncertainty in attributing sediment sources in fingerprinting studies is the uncertainty in determining the concentrations of the elements used in the mixing model due to the variability of the concentrations of these elements in the source materials (e.g., Kraushaar et al., 2015). The uncertainty in determining the "true" concentration of a given element in each one of the source areas depends on several factors, among them the spatial variability of that element, the sampling procedure and sampling density. Researchers have limited control over these factors, and usually sampling density tends to be sparse, limited by time and the resources available. Monte Carlo analysis has been used regularly in fingerprinting studies to explore the probable solutions within the measured variability of the elements in the source areas, providing an appraisal of the probability of the different solutions (e.g., Collins et al., 2012). This problem can be considered analogous to the propagation of uncertainty in hydrologic models due to uncertainty in the determination of the values of the model parameters, and there are many examples of Monte Carlo analysis of this uncertainty (e.g., Freeze, 1980; Gómez et al., 2001). Some of these model analyses rely on the simulation of "virtual" situations that were calibrated from parameter values found in the literature, with the purpose of providing insight about the response of the model to different configurations of input parameters. This approach - evaluating the answer for a "virtual" problem whose solution could be known in advance - might be useful in evaluating the propagation of uncertainty in mixing models in sediment fingerprinting studies. In this communication, we present the preliminary results of an on-going study evaluating the effect of variability of element concentrations in source materials, sampling density, and the number of elements included in the mixing models. 
For this study a virtual
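
    The kind of Monte Carlo appraisal described above can be sketched for a toy two-source, two-tracer mixing model; all tracer statistics below are invented for illustration, not data from the study.

```python
# Sketch: Monte Carlo appraisal of source-apportionment uncertainty in a
# two-source, two-tracer mixing model. Tracer means/standard deviations
# are hypothetical.
import random

random.seed(42)

# Hypothetical tracer concentrations (mean, sd) in each source material.
SOURCE_A = [(10.0, 2.0), (50.0, 8.0)]
SOURCE_B = [(30.0, 3.0), (20.0, 5.0)]
MIXTURE = [18.0, 38.0]  # measured tracer concentrations in the sediment mixture

def solve_proportion(a, b, mix):
    """Least-squares proportion p of source A in mix = p*a + (1-p)*b, clipped to [0, 1]."""
    num = sum((m - bi) * (ai - bi) for m, ai, bi in zip(mix, a, b))
    den = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(1.0, max(0.0, num / den))

def monte_carlo(n=5000):
    """Resample source concentrations within their variability, solve each time."""
    props = []
    for _ in range(n):
        a = [random.gauss(mu, sd) for mu, sd in SOURCE_A]
        b = [random.gauss(mu, sd) for mu, sd in SOURCE_B]
        props.append(solve_proportion(a, b, MIXTURE))
    props.sort()
    return props

props = monte_carlo()
median = props[len(props) // 2]
lo, hi = props[int(0.025 * len(props))], props[int(0.975 * len(props))]
print(f"source A contribution: median {median:.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

The width of the resulting interval is the uncertainty of interest here: it grows with the within-source variability and shrinks as more (informative) tracers are added.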

  4. Uncertainties in effective dose estimates of adult CT head scans: The effect of head size

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2009-01-01

    Purpose: This study is an extension of a previous study where the uncertainties in effective dose estimates from adult CT head scans were calculated using four CT effective dose estimation methods, three of which were computer programs (CT-EXPO, CTDOSIMETRY, and IMPACTDOSE) and one that involved the dose length product (DLP). However, that study did not include the uncertainty contribution due to variations in head sizes. Methods: The uncertainties due to head size variations were estimated by first using the computer program data to calculate doses to small and large heads. These doses were then compared with doses calculated for the phantom heads used by the computer programs. An uncertainty was then assigned based on the difference between the small and large head doses and the doses of the phantom heads. Results: The uncertainties due to head size variations alone were found to be between 4% and 26% depending on the method used and the patient gender. When these uncertainties were included with the results of the previous study, the overall uncertainties in effective dose estimates (stated at the 95% confidence interval) were 20%-31% (CT-EXPO), 15%-30% (CTDOSIMETRY), 20%-36% (IMPACTDOSE), and 31%-40% (DLP). Conclusions: For the computer programs, the lower overall uncertainties were still achieved when measured values of CT dose index were used rather than tabulated values. For DLP dose estimates, head size variations made the largest (for males) and second largest (for females) contributions to effective dose uncertainty. An improvement in the uncertainty of the DLP method dose estimates will be achieved if head size variation can be taken into account.
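
    Combining independent relative uncertainty components is conventionally done in quadrature; the sketch below assumes that practice. The pairing of a 15% base component with a 26% head-size component is our illustration, chosen only to be consistent with the ranges quoted above.

```python
# Sketch: root-sum-square combination of independent relative uncertainty
# components (the conventional practice; component pairing is illustrative).
import math

def combine_quadrature(components):
    """Root-sum-square of independent relative uncertainty components."""
    return math.sqrt(sum(c * c for c in components))

# e.g. a 15% program uncertainty combined with a 26% head-size term
total = combine_quadrature([0.15, 0.26])
print(f"combined uncertainty: {100 * total:.0f}%")  # ~30%
```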

  5. Uncertainties in effective dose estimates of adult CT head scans: The effect of head size

    Energy Technology Data Exchange (ETDEWEB)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E. [Department of Medical Physics, Royal Adelaide Hospital, Adelaide, South Australia 5000 (Australia) and School of Electrical and Information Engineering (Applied Physics), University of South Australia, Mawson Lakes, South Australia 5095 (Australia); Division of Medical Imaging, Women's and Children's Hospital, North Adelaide, South Australia 5006 (Australia) and School of Electrical and Information Engineering (Applied Physics), University of South Australia, Mawson Lakes, South Australia 5095 (Australia); School of Electrical and Information Engineering (Applied Physics), University of South Australia, Mawson Lakes, South Australia 5095 (Australia)

    2009-09-15

    Purpose: This study is an extension of a previous study where the uncertainties in effective dose estimates from adult CT head scans were calculated using four CT effective dose estimation methods, three of which were computer programs (CT-EXPO, CTDOSIMETRY, and IMPACTDOSE) and one that involved the dose length product (DLP). However, that study did not include the uncertainty contribution due to variations in head sizes. Methods: The uncertainties due to head size variations were estimated by first using the computer program data to calculate doses to small and large heads. These doses were then compared with doses calculated for the phantom heads used by the computer programs. An uncertainty was then assigned based on the difference between the small and large head doses and the doses of the phantom heads. Results: The uncertainties due to head size variations alone were found to be between 4% and 26% depending on the method used and the patient gender. When these uncertainties were included with the results of the previous study, the overall uncertainties in effective dose estimates (stated at the 95% confidence interval) were 20%-31% (CT-EXPO), 15%-30% (CTDOSIMETRY), 20%-36% (IMPACTDOSE), and 31%-40% (DLP). Conclusions: For the computer programs, the lower overall uncertainties were still achieved when measured values of CT dose index were used rather than tabulated values. For DLP dose estimates, head size variations made the largest (for males) and second largest (for females) contributions to effective dose uncertainty. An improvement in the uncertainty of the DLP method dose estimates will be achieved if head size variation can be taken into account.

  6. Uncertainty in soil-structure interaction analysis of a nuclear power plant due to different analytical techniques

    International Nuclear Information System (INIS)

    Chen, J.C.; Chun, R.C.; Goudreau, G.L.; Maslenikov, O.R.; Johnson, J.J.

    1984-01-01

    This paper summarizes the results of the dynamic response analysis of the Zion reactor containment building using three different soil-structure interaction (SSI) analytical procedures: the substructure method, CLASSI; the equivalent linear finite element approach, ALUSH; and the nonlinear finite element procedure, DYNA3D. Uncertainties in analyzing a soil-structure system due to the choice of SSI analysis procedure were investigated. Responses at selected locations in the structure, peak accelerations and response spectra, were compared.

  7. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Science.gov (United States)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  8. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    Energy Technology Data Exchange (ETDEWEB)

    Huan, Xun [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Geraci, Gianluca [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eldred, Michael S. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Vane, Zachary P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Lacaze, Guilhem [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Oefelein, Joseph C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  9. Assessing Fatigue and Ultimate Load Uncertainty in Floating Offshore Wind Turbines Due to Varying Simulation Length

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, G.; Lackner, M.; Haid, L.; Matha, D.; Jonkman, J.; Robertson, A.

    2013-07-01

    With the push towards siting wind turbines farther offshore due to higher wind quality and less visibility, floating offshore wind turbines, which can be located in deep water, are becoming an economically attractive option. The International Electrotechnical Commission's (IEC) 61400-3 design standard covers fixed-bottom offshore wind turbines, but there are a number of new research questions that need to be answered to modify these standards so that they are applicable to floating wind turbines. One issue is the appropriate simulation length needed for floating turbines. This paper will discuss the results from a study assessing the impact of simulation length on the ultimate and fatigue loads of the structure, and will address uncertainties associated with changing the simulation length for the analyzed floating platform. Recommendations of required simulation length based on load uncertainty will be made and compared to current simulation length requirements.

  10. Confronting the Uncertainty in Aerosol Forcing Using Comprehensive Observational Data

    Science.gov (United States)

    Johnson, J. S.; Regayre, L. A.; Yoshioka, M.; Pringle, K.; Sexton, D.; Lee, L.; Carslaw, K. S.

    2017-12-01

    The effect of aerosols on cloud droplet concentrations and radiative properties is the largest uncertainty in the overall radiative forcing of climate over the industrial period. In this study, we take advantage of a large perturbed parameter ensemble of simulations from the UK Met Office HadGEM-UKCA model (the aerosol component of the UK Earth System Model) to comprehensively sample uncertainty in aerosol forcing. Uncertain aerosol and atmospheric parameters cause substantial aerosol forcing uncertainty in climatically important regions. As the aerosol radiative forcing itself is unobservable, we investigate the potential for observations of aerosol and radiative properties to act as constraints on the large forcing uncertainty. We test how eight different theoretically perfect aerosol and radiation observations can constrain the forcing uncertainty over Europe. We find that the achievable constraint is weak unless many diverse observations are used simultaneously. This is due to the complex relationships between model output responses and the multiple interacting parameter uncertainties: compensating model errors mean there are many ways to produce the same model output (known as model equifinality), which limits the achievable constraint. However, using all eight observable quantities together we show that the aerosol forcing uncertainty can potentially be reduced by around 50%. This reduction occurs as the large sample of model variants (over 1 million) covering the full parametric uncertainty is narrowed to the roughly 1% that are observationally plausible. Constraining the forcing uncertainty using real observations is a more complex undertaking, in which we must account for multiple further uncertainties, including measurement uncertainties, structural model uncertainties and the model discrepancy from reality. 
Here, we make a first attempt to determine the true potential constraint on the forcing uncertainty from our model that is achievable using a comprehensive

  11. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Czech Academy of Sciences Publication Activity Database

    Yu, X.; Lamačová, Anna; Duffy, Ch.; Krám, P.; Hruška, Jakub

    2016-01-01

    Roč. 90, part B (2016), s. 90-101 ISSN 0098-3004 R&D Projects: GA MŠk(CZ) LO1415 Institutional support: RVO:67179843 Keywords : Uncertainty * Evapotranspiration * Forest management * PIHM * Biome-BGC Subject RIV: DA - Hydrology ; Limnology OBOR OECD: Hydrology Impact factor: 2.533, year: 2016

  12. Methodologies of Uncertainty Propagation Calculation

    International Nuclear Information System (INIS)

    Chojnacki, Eric

    2002-01-01

    After recalling the theoretical principles and the practical difficulties of uncertainty propagation methodologies, the author discussed how to propagate input uncertainties. He said there were two kinds of input uncertainty: variability (uncertainty due to heterogeneity) and lack of knowledge (uncertainty due to ignorance). It was therefore necessary to use two different propagation methods. He demonstrated this in a simple example, which he then generalised, treating the variability uncertainty with probability theory and the lack-of-knowledge uncertainty with fuzzy theory. He cautioned, however, against the systematic use of probability theory, which may lead to unjustifiably precise answers. Mr Chojnacki's conclusions were that the importance of distinguishing variability from lack of knowledge increased as the problem became more and more complex in terms of the number of parameters or time steps, and that it was necessary to develop uncertainty propagation methodologies combining probability theory and fuzzy theory.
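
    The distinction drawn here can be illustrated with a minimal sketch that samples a "variability" input probabilistically while carrying a "lack of knowledge" input only as a bare interval; the toy model y = a * b and all numbers are invented.

```python
# Sketch: hybrid propagation through a toy model y = a * b, where `a` has
# variability (sampled probabilistically) and `b` is only known to lie in
# an interval (lack of knowledge). Numbers are illustrative.
import random

random.seed(1)

A_MEAN, A_SD = 10.0, 1.0  # variability: described by a distribution
B_LO, B_HI = 0.8, 1.2     # ignorance: described only by bounds

lows, highs = [], []
for _ in range(10000):
    a = random.gauss(A_MEAN, A_SD)
    # for each random draw of a, propagate the ignorance interval of b
    lows.append(a * B_LO)
    highs.append(a * B_HI)

lows.sort()
highs.sort()
n = len(lows)
# a 95% band on y that keeps the interval-valued (epistemic) part as bounds
print(f"y lies in [{lows[int(0.025 * n)]:.2f}, {highs[int(0.975 * n)]:.2f}] "
      "with ~95% probability over the variability in a")
```

Collapsing the interval on `b` into a single "best-estimate" distribution would have produced a narrower, more precise-looking answer, which is exactly the false precision the abstract warns against.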

  13. Assessing Groundwater Model Uncertainty for the Central Nevada Test Area

    International Nuclear Information System (INIS)

    Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd

    2002-01-01

    The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis is performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. This is primarily due to the small transport velocities, such that large changes in the uncertain input parameters cause only small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation.
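
    A minimal sketch of this kind of Monte Carlo propagation, using the standard retardation relation R = 1 + rho_b*Kd/theta for a toy transport distance and invented prior ranges (not site values):

```python
# Sketch: standard Monte Carlo propagation of parameter priors to a
# prediction interval on a contaminant-boundary radius, via the toy
# transport distance d = v*t/R with retardation R = 1 + rho_b*Kd/theta.
# Prior ranges are illustrative, not CNTA values.
import random

random.seed(7)

V = 5.0      # groundwater velocity, m/yr (fixed here for simplicity)
T = 100.0    # elapsed time, yr
RHO_B = 1.6  # bulk density, g/cm^3

radii = []
for _ in range(20000):
    theta = random.uniform(0.05, 0.20)    # effective porosity prior
    kd = random.lognormvariate(0.0, 0.5)  # sorption coefficient prior, mL/g
    retardation = 1.0 + RHO_B * kd / theta
    radii.append(V * T / retardation)

radii.sort()
n = len(radii)
print(f"boundary radius: median {radii[n // 2]:.0f} m, "
      f"95% interval [{radii[int(0.025 * n)]:.0f}, {radii[int(0.975 * n)]:.0f}] m")
```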

  14. Uncertainty analysis and validation of environmental models. The empirically based uncertainty analysis

    International Nuclear Information System (INIS)

    Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie

    1996-01-01

    The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of the model output and the logarithm of the experimental data, defined as d^2 = (1/n) Σ_{i=1}^{n} (ln M_i - ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of pairs (experimental value, predicted value), is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurements of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect may, in some circumstances, be corrected by means of simple formulae.
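
    The functional distance defined above can be computed directly; the paired values below are hypothetical, and the notation (M_i experimental, O_i model) follows the abstract.

```python
# Sketch: the 'functional distance' performance index,
# d^2 = (1/n) * sum_i (ln M_i - ln O_i)^2, for hypothetical paired data.
import math

def functional_distance(measured, modeled):
    """Root-mean-square log distance between paired positive values."""
    assert len(measured) == len(modeled)
    n = len(measured)
    return math.sqrt(sum((math.log(m) - math.log(o)) ** 2
                         for m, o in zip(measured, modeled)) / n)

# Hypothetical radiocaesium levels: model mostly within a factor ~1.3 of data.
measured = [120.0, 80.0, 45.0, 200.0]
modeled = [100.0, 100.0, 60.0, 150.0]
d = functional_distance(measured, modeled)
# exp(d) is a multiplicative 'typical ratio' between data and model
print(f"d = {d:.3f}, typical data/model factor = {math.exp(d):.2f}")
```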

  15. Uncertainties on decay heat power due to fission product data uncertainties; Incertitudes sur la puissance residuelle dues aux incertitudes sur les donnees de produits de fission

    Energy Technology Data Exchange (ETDEWEB)

    Rebah, J

    1998-08-01

    Following a reactor shutdown, after the fission process has completely faded out, a significant quantity of energy known as 'decay heat' continues to be generated in the core. Knowing with good precision the decay heat released in a fuel after reactor shutdown is necessary for: residual heat removal under normal operation or emergency shutdown conditions, the design of cooling systems, and spent fuel handling. In the summation calculation method, the decay heat is equal to the sum of the energies released by the individual fission products. When all nuclides that contribute significantly to the total decay heat are taken into account, the results from the summation method are comparable with measured ones. Without complete covariance information on the nuclear data, published uncertainty analyses of fission product decay heat summation calculations give underestimated errors. Through a variance/covariance analysis that takes into account the correlations between the basic nuclear data, we calculate in this work the uncertainties on the decay heat associated with the summation calculations. The contribution to the total error of the decay heat comes from uncertainties in three terms: fission yields, half-lives, and average beta and gamma decay energies. (author)
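
    A minimal sketch of the summation method with first-order, uncorrelated error propagation; the abstract's point is precisely that neglecting correlations in this way underestimates the error. All nuclide data below are invented.

```python
# Sketch: decay heat by the summation method for a few hypothetical,
# independently decaying fission products, P(t) = sum_i lambda_i*N_i(t)*E_i,
# with first-order propagation of (uncorrelated) yield and energy
# uncertainties. Nuclide data are illustrative only.
import math

# (initial atoms N0, half-life s, energy per decay MeV, rel. unc. yield, rel. unc. energy)
NUCLIDES = [
    (1.0e20, 3.0e2, 1.5, 0.05, 0.02),
    (5.0e19, 3.6e3, 0.8, 0.10, 0.05),
    (2.0e19, 8.6e4, 0.4, 0.15, 0.10),
]

def decay_heat(t):
    """Total decay power (MeV/s) and its 1-sigma uncertainty at time t."""
    total, var = 0.0, 0.0
    for n0, t_half, energy, u_yield, u_energy in NUCLIDES:
        lam = math.log(2.0) / t_half
        power = lam * n0 * math.exp(-lam * t) * energy
        total += power
        # uncorrelated contributions add in quadrature (absolute variances)
        var += (power * u_yield) ** 2 + (power * u_energy) ** 2
    return total, math.sqrt(var)

p, sigma = decay_heat(600.0)
print(f"P(600 s) = {p:.3e} MeV/s +/- {100 * sigma / p:.1f}%")
```

With correlated nuclear data, the diagonal sum above would be replaced by a full quadratic form over the covariance matrix, and the total error generally grows.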

  16. ESFR core optimization and uncertainty studies

    International Nuclear Information System (INIS)

    Rineiski, A.; Vezzoni, B.; Zhang, D.; Marchetti, M.; Gabrielli, F.; Maschek, W.; Chen, X.-N.; Buiron, L.; Krepel, J.; Sun, K.; Mikityuk, K.; Polidoro, F.; Rochman, D.; Koning, A.J.; DaCruz, D.F.; Tsige-Tamirat, H.; Sunderland, R.

    2015-01-01

    In the European Sodium Fast Reactor (ESFR) project supported by EURATOM in 2008-2012, a concept for a large 3600 MWth sodium-cooled fast reactor design was investigated. In particular, reference core designs with oxide and carbide fuel were optimized to improve their safety parameters. Uncertainties in these parameters were evaluated for the oxide option. Core modifications were performed first to reduce the sodium void reactivity effect. Introduction of a large sodium plenum with an absorber layer above the core and a lower axial fertile blanket improves the total sodium void effect appreciably, bringing it close to zero for a core with fresh fuel, in line with results obtained worldwide, while not influencing substantially other core physics parameters. Therefore an optimized configuration, CONF2, with a sodium plenum and a lower blanket was established first and used as a basis for further studies in view of deterioration of safety parameters during reactor operation. Further options to study were an inner fertile blanket, introduction of moderator pins, a smaller core height, and special designs for pins, such as 'empty' pins, and subassemblies. These special designs were proposed to facilitate melted fuel relocation in order to avoid core re-criticality under severe accident conditions. In the paper further CONF2 modifications are compared in terms of safety and fuel balance. They may bring further improvements in safety, but their accurate assessment requires additional studies, including transient analyses. Uncertainty studies were performed by employing a so-called Total Monte-Carlo method, for which a large number of nuclear data files is produced for single isotopes and then used in Monte-Carlo calculations. The uncertainties in the criticality, the sodium void and Doppler effects, and the effective delayed neutron fraction due to uncertainties in basic nuclear data were assessed for an ESFR core. They prove the applicability of the available nuclear data for the ESFR.
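
    The Total Monte Carlo idea can be sketched with a toy one-group criticality model: each "random nuclear data file" becomes a random perturbation of the group constants, and the output uncertainty is read from the spread of results. All constants below are invented.

```python
# Sketch of the Total Monte Carlo idea: sample many random variants of the
# basic nuclear data, run the same (here: toy one-group) criticality
# calculation with each variant, and read the output uncertainty from the
# spread of results. One-group constants are illustrative only.
import random

random.seed(3)

# Nominal one-group constants (invented): nu*sigma_f, sigma_a, and their
# relative uncertainties standing in for the nuclear-data covariances.
NU_SIGF, U_NU_SIGF = 0.0065, 0.010
SIG_A, U_SIG_A = 0.0060, 0.008

def k_inf(nu_sigf, sig_a):
    """Infinite-medium multiplication factor in one-group theory."""
    return nu_sigf / sig_a

ks = []
for _ in range(10000):
    # each 'random data file' perturbs the constants within their uncertainty
    nu_sigf = random.gauss(NU_SIGF, U_NU_SIGF * NU_SIGF)
    sig_a = random.gauss(SIG_A, U_SIG_A * SIG_A)
    ks.append(k_inf(nu_sigf, sig_a))

mean = sum(ks) / len(ks)
sd = (sum((k - mean) ** 2 for k in ks) / (len(ks) - 1)) ** 0.5
print(f"k_inf = {mean:.4f} +/- {sd:.4f} ({1e5 * sd / mean:.0f} pcm)")
```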

  17. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1984-01-01

    … The statistical uncertainty due to lack of information can, e.g., be taken into account by describing the variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with the mathematical modelling of the physical reality. When structural reliability analysis … is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space, then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2 …

  18. Two-dimensional cross-section and SED uncertainty analysis for the Fusion Engineering Device (FED)

    International Nuclear Information System (INIS)

    Embrechts, M.J.; Urban, W.T.; Dudziak, D.J.

    1982-01-01

    The theory of two-dimensional cross-section and secondary-energy-distribution (SED) sensitivity was implemented by developing a two-dimensional sensitivity and uncertainty analysis code, SENSIT-2D. Analyses of the Fusion Engineering Device (FED) conceptual inboard shield indicate that, although the calculated uncertainties in the 2-D model are of the same order of magnitude as those resulting from the 1-D model, there might be severe differences. The more complex the geometry, the more compulsory a 2-D analysis becomes. Specific results show that the uncertainty for the integral heating of the toroidal field (TF) coil for the FED is 114.6%. The main contributors to the cross-section uncertainty are chromium and iron. Contributions to the total uncertainty were smaller for nickel, copper, hydrogen and carbon. All analyses were performed with the Los Alamos 42-group cross-section library generated from ENDF/B-V data, and the COVFILS covariance matrix library. The large uncertainties due to chromium result mainly from large covariances for the chromium total and elastic scattering cross sections.
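
    First-order cross-section uncertainty propagation of this kind is conventionally written as the "sandwich rule", var(R)/R^2 = S C S^T; the sketch below assumes that formulation with invented sensitivities and covariances (not FED/COVFILS data).

```python
# Sketch: cross-section uncertainty propagation by the first-order
# 'sandwich rule', var(R)/R^2 = S C S^T, where S holds relative
# sensitivities of the response to each cross section and C is their
# relative covariance matrix. All numbers are illustrative.

# Hypothetical relative sensitivities of a shielding response
# (e.g. TF-coil heating) to Cr, Fe, Ni cross sections.
S = [-1.2, -0.8, -0.3]

# Hypothetical relative covariance matrix (variances on the diagonal;
# off-diagonal correlations set to zero here for simplicity).
C = [
    [0.30 ** 2, 0.0,        0.0],
    [0.0,       0.15 ** 2,  0.0],
    [0.0,       0.0,        0.10 ** 2],
]

def sandwich(sens, cov):
    """Return the relative standard deviation of the response."""
    var = 0.0
    for i, si in enumerate(sens):
        for j, sj in enumerate(sens):
            var += si * cov[i][j] * sj
    return var ** 0.5

rel_sd = sandwich(S, C)
print(f"relative uncertainty of the response: {100 * rel_sd:.1f}%")
```

Large diagonal terms for one material (here the stand-in for chromium) dominate the quadratic form, which mirrors the abstract's finding that the chromium covariances drive the total.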

  19. Uncertainty analysis methods for quantification of source terms using a large computer code

    International Nuclear Information System (INIS)

    Han, Seok Jung

    1997-02-01

    Quantification of uncertainties in the source term estimations by large computer codes, such as MELCOR and MAAP, is an essential process of current probabilistic safety assessments (PSAs). The main objectives of the present study are (1) to investigate the applicability of a combined procedure of the response surface method (RSM), based on input determined from a statistical design, and the Latin hypercube sampling (LHS) technique for the uncertainty analysis of CsI release fractions under a hypothetical severe accident sequence of a station blackout at the Young-Gwang nuclear power plant, using the MAAP3.0B code as a benchmark problem; and (2) to propose a new measure of uncertainty importance based on distributional sensitivity analysis. On the basis of the results obtained in the present work, the RSM is recommended as a principal tool for an overall uncertainty analysis in source term quantifications, while the LHS is used in the calculations of standardized regression coefficients (SRC) and standardized rank regression coefficients (SRRC) to determine the subset of the most important input parameters in the final screening step and to check the cumulative distribution functions (cdfs) obtained by RSM. Verification that the response surface model is sufficiently accurate is a prerequisite for the reliability of the final results obtained by the combined procedure proposed in the present work. In the present study a new measure has been developed that utilizes the metric distance obtained from cumulative distribution functions (cdfs). The measure has been evaluated for three different cases of distributions in order to assess its characteristics: in the first two cases the distribution is known analytically, while in the third it is unknown. The first case is given by symmetric analytical distributions. The second case consists of two asymmetric distributions whose skewness is nonzero
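
    A sketch of LHS sampling followed by a simple importance measure: with near-independent inputs and a linear model, the SRC of each input reduces to its Pearson correlation with the output, which is what is computed here. The toy response and coefficients are invented.

```python
# Sketch: Latin hypercube sampling of three inputs plus approximate
# standardized regression coefficients for a toy linear response. With
# (near-)independent inputs, SRC_i equals the Pearson correlation of
# input i with the output. All coefficients are illustrative.
import random

random.seed(11)

def lhs(n):
    """One LHS column on [0, 1]: one random point per stratum, shuffled."""
    pts = [(i + random.random()) / n for i in range(n)]
    random.shuffle(pts)
    return pts

N = 1000
x1, x2, x3 = lhs(N), lhs(N), lhs(N)
y = [3.0 * a + 1.0 * b + 0.2 * c + random.gauss(0, 0.05)
     for a, b, c in zip(x1, x2, x3)]

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
    print(f"SRC({name}) ~= {pearson(x, y):+.2f}")
```

Ranking inputs by these coefficients is the "final screening step" the abstract refers to; for monotone nonlinear responses one would rank-transform both sides first to obtain SRRCs.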

  20. Effect of uncertainties on probabilistic-based design capacity of hydrosystems

    Science.gov (United States)

    Tung, Yeou-Koung

    2018-02-01

    Hydrosystems engineering designs involve analysis of hydrometric data (e.g., rainfall, floods) and use of hydrologic/hydraulic models, all of which contribute various degrees of uncertainty to the design process. Uncertainties in hydrosystem designs can be generally categorized into aleatory and epistemic types. The former arises from the natural randomness of hydrologic processes whereas the latter are due to knowledge deficiency in model formulation and model parameter specification. This study shows that the presence of epistemic uncertainties induces uncertainty in determining the design capacity. Hence, the designer needs to quantify the uncertainty features of design capacity to determine the capacity with a stipulated performance reliability under the design condition. Using detention basin design as an example, the study illustrates a methodological framework by considering aleatory uncertainty from rainfall and epistemic uncertainties from the runoff coefficient, curve number, and sampling error in design rainfall magnitude. The effects of including different items of uncertainty and performance reliability on the design detention capacity are examined. A numerical example shows that the mean value of the design capacity of the detention basin increases with the design return period and this relation is found to be practically the same regardless of the uncertainty types considered. The standard deviation associated with the design capacity, when subject to epistemic uncertainty, increases with both design frequency and items of epistemic uncertainty involved. It is found that the epistemic uncertainty due to sampling error in rainfall quantiles should not be ignored. Even with a sample size of 80 (relatively large for a hydrologic application) the inclusion of sampling error in rainfall quantiles resulted in a standard deviation about 2.5 times higher than that considering only the uncertainty of the runoff coefficient and curve number. 
Furthermore, the

  1. Evidence Theory Based Uncertainty Quantification in Radiological Risk due to Accidental Release of Radioactivity from a Nuclear Power Plant

    International Nuclear Information System (INIS)

    Ingale, S. V.; Datta, D.

    2010-01-01

    The consequence of an accidental release of radioactivity from a nuclear power plant is assessed in terms of the exposure or dose to members of the public, and the assessment of risk is routed through this dose computation. Dose computation depends on the underlying dose assessment model and the exposure pathways; one such pathway is the ingestion of contaminated food. The aim of the present paper is to compute the uncertainty associated with the risk to members of the public due to the ingestion of contaminated food. Because the governing parameters of the ingestion dose assessment model are imprecise, we apply evidence theory to compute bounds on the risk. The uncertainty is addressed through the belief and plausibility fuzzy measures.
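
    The interval bound produced by belief and plausibility measures can be illustrated with a minimal Dempster-Shafer sketch; the focal elements, masses, and dose values below are hypothetical, not taken from the paper:

```python
def belief_plausibility(focal, event):
    """Bel/Pl of the interval 'event' for interval-valued focal elements."""
    a, b = event
    bel = sum(m for (lo, hi), m in focal if a <= lo and hi <= b)  # subsets of event
    pl = sum(m for (lo, hi), m in focal if hi >= a and lo <= b)   # intersecting event
    return bel, pl

# hypothetical focal elements: (ingestion dose interval in mSv, basic probability mass)
focal = [((0.0, 0.5), 0.4), ((0.2, 1.0), 0.4), ((0.8, 2.0), 0.2)]
bel, pl = belief_plausibility(focal, (0.0, 1.0))  # event: "dose stays below 1 mSv"
print(round(bel, 6), round(pl, 6))  # 0.8 1.0
```

    Belief and plausibility bracket the imprecise probability of the event from below and above, which is exactly the kind of bound the abstract describes for the risk.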

  2. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
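
    A minimal sketch of the auxiliary-variable idea, not the authors' implementation: the epistemic uncertainty is placed on a distribution parameter, and the probability integral transform maps one uniform auxiliary variable per sample into a realization of the input, so a single Monte Carlo loop covers both uncertainty types. The limit state and all numbers are made up:

```python
import random
import statistics

random.seed(1)

def failure_probability(n=50_000, capacity=3.0):
    """P(load > capacity) with an epistemically uncertain mean load."""
    fails = 0
    for _ in range(n):
        mu = random.uniform(-0.2, 0.2)   # epistemic: imprecisely known mean
        u = random.random()              # auxiliary uniform variable
        # probability integral transform: x = F^{-1}(u | mu) for N(mu, 1)
        x = statistics.NormalDist(mu, 1.0).inv_cdf(u)
        fails += x > capacity            # limit-state violation
    return fails / n

print(failure_probability())
```

    Because the epistemic parameter is sampled inside the same loop, no nested (double-loop) simulation is needed.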

  3. Multi-fidelity uncertainty quantification in large-scale predictive simulations of turbulent flow

    Science.gov (United States)

    Geraci, Gianluca; Jofre-Cruanyes, Lluis; Iaccarino, Gianluca

    2017-11-01

    The performance characterization of complex engineering systems often relies on accurate but computationally intensive numerical simulations. It is also well recognized that in order to obtain a reliable numerical prediction, the propagation of uncertainties needs to be included. Therefore, Uncertainty Quantification (UQ) plays a fundamental role in building confidence in predictive science. Despite the great improvement in recent years, even the more advanced UQ algorithms are still limited to fairly simplified applications and only moderate parameter dimensionality. Moreover, in the case of extremely large dimensionality, sampling methods, i.e. Monte Carlo (MC) based approaches, appear to be the only viable alternative. In this talk we describe and compare a family of approaches which aim to accelerate the convergence of standard MC simulations. These methods are based on hierarchies of generalized numerical resolutions (multi-level) or model fidelities (multi-fidelity), and attempt to leverage the correlation between Low- and High-Fidelity (HF) models to obtain a more accurate statistical estimator without introducing additional HF realizations. The performance of these methods is assessed on an irradiated particle-laden turbulent flow (PSAAP II solar energy receiver). This investigation was funded by the United States Department of Energy's (DoE) National Nuclear Security Administration (NNSA) under the Predictive Science Academic Alliance Program (PSAAP) II at Stanford University.

  4. Identifying and Analyzing Uncertainty Structures in the TRMM Microwave Imager Precipitation Product over Tropical Ocean Basins

    Science.gov (United States)

    Liu, Jianbo; Kummerow, Christian D.; Elsaesser, Gregory S.

    2016-01-01

    Despite continuous improvements in microwave sensors and retrieval algorithms, our understanding of precipitation uncertainty is quite limited, due primarily to inconsistent findings in studies that compare satellite estimates to in situ observations over different parts of the world. This study seeks to characterize the temporal and spatial properties of uncertainty in the Tropical Rainfall Measuring Mission Microwave Imager surface rainfall product over tropical ocean basins. Two uncertainty analysis frameworks are introduced to qualitatively evaluate the properties of uncertainty under a hierarchy of spatiotemporal data resolutions. The first framework (i.e. 'climate method') demonstrates that, apart from random errors and regionally dependent biases, a large component of the overall precipitation uncertainty is manifested in cyclical patterns that are closely related to large-scale atmospheric modes of variability. By estimating the magnitudes of major uncertainty sources independently, the climate method is able to explain 45-88% of the monthly uncertainty variability. The percentage is largely resolution dependent (with the lowest percentage explained associated with a 1 deg x 1 deg spatial/1 month temporal resolution, and highest associated with a 3 deg x 3 deg spatial/3 month temporal resolution). The second framework (i.e. 'weather method') explains regional mean precipitation uncertainty as a summation of uncertainties associated with individual precipitation systems. By further assuming that self-similar recurring precipitation systems yield qualitatively comparable precipitation uncertainties, the weather method can consistently resolve about 50 % of the daily uncertainty variability, with only limited dependence on the regions of interest.

  5. Numerical Continuation Methods for Intrusive Uncertainty Quantification Studies

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Phipps, Eric Todd [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-09-01

    Rigorous modeling of engineering systems relies on efficient propagation of uncertainty from input parameters to model outputs. In recent years, there has been substantial development of probabilistic polynomial chaos (PC) Uncertainty Quantification (UQ) methods, enabling studies in expensive computational models. One approach, termed "intrusive", involving reformulation of the governing equations, has been found to have superior computational performance compared to non-intrusive sampling-based methods in relevant large-scale problems, particularly in the context of emerging architectures. However, the utility of intrusive methods has been severely limited due to detrimental numerical instabilities associated with strong nonlinear physics. Previous methods for stabilizing these constructions tend to add unacceptably high computational costs, particularly in problems with many uncertain parameters. In order to address these challenges, we propose to adapt and improve numerical continuation methods for the robust time integration of intrusive PC system dynamics. We propose adaptive methods, starting with a small uncertainty for which the model has stable behavior and gradually moving to larger uncertainty where the instabilities are rampant, in a manner that provides a suitable solution.
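
    The continuation idea, ramping the uncertainty amplitude and warm-starting each solve from the previous solution, can be sketched on a scalar nonlinear equation; the residual below is illustrative, not an intrusive PC system:

```python
import math

def newton(f, df, x0, tol=1e-12, maxit=50):
    x = x0
    for _ in range(maxit):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton did not converge")

def solve_with_continuation(eps_target, steps=20):
    x = 1.0   # exact solution of the trivial eps = 0 problem: x - 1 = 0
    for k in range(1, steps + 1):
        eps = eps_target * k / steps                 # ramp the uncertainty amplitude
        f = lambda y: y - 1.0 + eps * math.exp(y)    # nonlinear residual at this eps
        df = lambda y: 1.0 + eps * math.exp(y)
        x = newton(f, df, x)                         # warm start from previous eps
    return x

root = solve_with_continuation(2.0)
print(root)   # satisfies x - 1 + 2*exp(x) = 0
```

    Each intermediate problem is easy because its initial guess is already close to the solution; the same principle underlies continuation in the uncertainty magnitude.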

  6. Large storage operations under climate change: expanding uncertainties and evolving tradeoffs

    Science.gov (United States)

    Giuliani, Matteo; Anghileri, Daniela; Castelletti, Andrea; Vu, Phuong Nam; Soncini-Sessa, Rodolfo

    2016-03-01

    In a changing climate and society, large storage systems can play a key role for securing water, energy, and food, and rebalancing their cross-dependencies. In this letter, we study the role of large storage operations as flexible means of adaptation to climate change. In particular, we explore the impacts of different climate projections for different future time horizons on the multi-purpose operations of the existing system of large dams in the Red River basin (China-Laos-Vietnam). We identify the main vulnerabilities of current system operations, understand the risk of failure across sectors by exploring the evolution of the system tradeoffs, quantify how the uncertainty associated with climate scenarios is expanded by the storage operations, and assess the expected costs if no adaptation is implemented. Results show that, depending on the climate scenario and the time horizon considered, the existing operations are predicted to change on average from -7 to +5% in hydropower production, +35 to +520% in flood damages, and +15 to +160% in water supply deficit. These negative impacts can be partially mitigated by adapting the existing operations to future climate, reducing the loss of hydropower to 5% and potentially saving around 34.4 million US$ per year at the national scale. Since the Red River is paradigmatic of many river basins across southeast Asia, where new large dams are under construction or planned to support fast-growing economies, our results can support policy makers in prioritizing responses and adaptation strategies to the changing climate.

  7. Uncertainty analysis of light water reactor unit fuel pin cells

    Energy Technology Data Exchange (ETDEWEB)

    Kamerow, S.; Ivanov, K., E-mail: sln107@PSU.EDU, E-mail: kni1@PSU.EDU [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, PA (United States); Moreno, C. Arenas, E-mail: cristina.arenas@UPC.EDU [Department of Physics and Nuclear Engineering, Technical University of Catalonia, Barcelona (Spain)

    2011-07-01

    The study explored the calculation of uncertainty based on available covariance data and computational tools. Uncertainties due to temperature changes and different fuel compositions are the main focus of this analysis. Selected unit fuel pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-1D sequence in SCALE 6.0. It was found that uncertainties increase with increasing temperature while k{sub eff} decreases. This increase in the uncertainty is due to the increase in sensitivity of the largest contributor of uncertainty, namely the {sup 238}U (n, gamma) reaction. The sensitivity grew larger as the capture cross-section of {sup 238}U expanded due to Doppler broadening. In addition, three different compositions (UOx, MOx, and UOxGd{sub 2}O{sub 3}) of fuel cells were analyzed. The analysis showed a remarkable increase in the uncertainty in k{sub eff} for the MOx and UOxGd{sub 2}O{sub 3} fuel cells. The increase in the uncertainty of k{sub eff} in UOxGd{sub 2}O{sub 3} fuel was nearly twice that in MOx fuel and almost four times that in UOx fuel. The components of the uncertainties in k{sub eff} in each case were examined and it was found that the neutron-nuclide reaction of {sup 238}U, mainly (n,n'), contributed the most to the uncertainties in the cases of MOx and UOxGd{sub 2}O{sub 3}. At higher energies, the covariance matrix of {sup 238}U (n,n') with {sup 238}U (n,n'), as well as the {sup 238}U (n,n') cross-section, showed very large values. Further examination of the UOxGd{sub 2}O{sub 3} case found that {sup 238}U (n,n') became the dominant contributor to the uncertainty because most of the thermal neutrons in the cell were absorbed by gadolinium in the UOxGd{sub 2}O{sub 3} case, thus shifting the neutron spectrum to higher energies. For the MOx case, on the other hand, {sup 239}Pu has a very strong absorption cross-section at low energy

  8. The state of the art of the impact of sampling uncertainty on measurement uncertainty

    Science.gov (United States)

    Leite, V. J.; Oliveira, E. C.

    2018-03-01

    Measurement uncertainty is a parameter that characterizes the reliability of a result and can be divided into two large groups: sampling and analytical variations. Analytical uncertainty arises from a controlled process performed in the laboratory. The same does not hold for sampling uncertainty, which has been neglected because it faces several practical obstacles and there is little clarity on how to perform the procedures, although it is admittedly indispensable to the measurement process. This paper aims at describing the state of the art of sampling uncertainty and at assessing its relevance to measurement uncertainty.
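
    When the two groups are independent, they combine in the usual root-sum-of-squares way; a trivial sketch with hypothetical standard uncertainties:

```python
import math

def combined_uncertainty(u_sampling, u_analytical):
    """Standard uncertainty of a result from independent sampling and
    analytical contributions (root sum of squares)."""
    return math.sqrt(u_sampling**2 + u_analytical**2)

u = combined_uncertainty(0.8, 0.3)   # sampling often dominates in practice
print(round(u, 3))  # 0.854
```

    The example makes the abstract's point concrete: with these (made-up) numbers, neglecting the sampling term would understate the combined uncertainty by almost a factor of three.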

  9. Damage assessment of composite plate structures with material and measurement uncertainty

    Science.gov (United States)

    Chandrashekhar, M.; Ganguli, Ranjan

    2016-06-01

    Composite materials are very useful in structural engineering, particularly in weight-sensitive applications. Two different test models of the same structure made from composite materials can display very different dynamic behavior due to large uncertainties associated with composite material properties. Also, composite structures can suffer from pre-existing imperfections like delaminations, voids or cracks introduced during fabrication. In this paper, we show that modeling and material uncertainties in composite structures can cause considerable problems in damage assessment. A recently developed C0 shear deformable locking-free refined composite plate element is employed in the numerical simulations to alleviate modeling uncertainty. A qualitative estimate of the impact of modeling uncertainty on the damage detection problem is made. A robust Fuzzy Logic System (FLS) with a sliding window defuzzifier is used for delamination damage detection in composite plate type structures. The FLS is designed using variations in modal frequencies due to randomness in material properties. Probabilistic analysis is performed using Monte Carlo Simulation (MCS) on a composite plate finite element model. It is demonstrated that the FLS shows excellent robustness in delamination detection at very high levels of randomness in input data.

  10. Application of status uncertainty analysis methods for AP1000 LBLOCA calculation

    International Nuclear Information System (INIS)

    Zhang Shunxiang; Liang Guoxing

    2012-01-01

    Parameter uncertainty analysis establishes, by a suitable method, the response relations between input parameter uncertainties and output uncertainties. Applying parameter uncertainty analysis makes the simulation of the plant state more accurate and improves plant economy while maintaining reasonable safety assurance. The AP1000 LBLOCA was analyzed in this paper and the results indicate that the random sampling statistical analysis method, the sensitivity analysis numerical method, and the traditional error propagation analysis method can all provide a sizeable peak cladding temperature (PCT) safety margin, which is helpful for choosing a suitable uncertainty analysis method to improve plant economy. Additionally, the random sampling statistical analysis method, which applies mathematical statistics theory, yields the largest safety margin because it reduces conservatism. Compared with the traditional conservative bounding parameter analysis method, the random sampling method can provide a PCT margin of 100 K, while the other two methods provide only 50-60 K. (authors)
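
    The abstract does not give the details of the random sampling method, but such methods in LOCA analysis are commonly based on Wilks' order-statistic formula, under which the largest output of n random code runs bounds a given quantile with a given confidence. A sketch of the one-sided, first-order sample-size computation:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest n such that the largest of n random runs bounds the
    'coverage' quantile of the output with probability 'confidence'."""
    n = 1
    while 1.0 - coverage**n < confidence:
        n += 1
    return n

print(wilks_sample_size())            # 59 runs for the classic 95/95 criterion
print(wilks_sample_size(0.95, 0.99))  # 90 runs for 95/99
```

    The sample size is independent of the number of uncertain input parameters, which is one reason the random sampling approach is attractive for codes as expensive as LBLOCA simulations.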

  11. Physical Uncertainty Bounds (PUB)

    Energy Technology Data Exchange (ETDEWEB)

    Vaughan, Diane Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Dean L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-03-19

    This paper introduces and motivates the need for a new methodology for determining upper bounds on the uncertainties in simulations of engineered systems due to limited fidelity in the composite continuum-level physics models needed to simulate the systems. We show that traditional uncertainty quantification methods provide, at best, a lower bound on this uncertainty. We propose to obtain bounds on the simulation uncertainties by first determining bounds on the physical quantities or processes relevant to system performance. By bounding these physics processes, as opposed to carrying out statistical analyses of the parameter sets of specific physics models or simply switching out the available physics models, one can obtain upper bounds on the uncertainties in simulated quantities of interest.

  12. Uncertainty Evaluation of Best Estimate Calculation Results

    International Nuclear Information System (INIS)

    Glaeser, H.

    2006-01-01

    Efforts are underway in Germany to perform analyses using best estimate computer codes and to include uncertainty evaluation in licensing. The German Reactor Safety Commission (RSK) recently issued a recommendation to perform uncertainty analysis in loss-of-coolant accident (LOCA) safety analyses. A more general requirement is included in a draft revision of the German Nuclear Regulation, an activity of the German Ministry of Environment and Reactor Safety (BMU). According to the recommendation of the German RSK, the following deterministic requirements still have to be applied when performing LOCA safety analyses in licensing: most unfavourable single failure; unavailability due to preventive maintenance; break location; break size and break type; double-ended break, 100 percent through 200 percent; large, medium and small break; loss of off-site power; core power (at accident initiation the most unfavourable conditions and values have to be assumed which may occur under normal operation, taking into account the set-points of integral power and power density control; measurement and calibration errors can be considered statistically); time of fuel cycle. Analysis using best estimate codes with evaluation of uncertainties is the only way to quantify conservatisms with regard to code models and uncertainties of plant and fuel parameters and decay heat. This is especially the case when approaching licensing limits, e.g. due to power up-rates, higher burn-up and higher enrichment. Broader use of best estimate analysis is therefore envisaged in the future. Since some deterministic unfavourable assumptions regarding the availability of NPP systems are still used, some conservatism in best-estimate analyses remains. Methods of uncertainty analysis have been developed and applied by the vendor Framatome ANP as well as by GRS in Germany. The GRS development was sponsored by the German Ministry of Economy and Labour (BMWA). (author)

  13. Output gap uncertainty and real-time monetary policy

    Directory of Open Access Journals (Sweden)

    Francesco Grigoli

    2015-12-01

    Output gap estimates are subject to a wide range of uncertainty owing principally to the difficulty in distinguishing between cycle and trend in real time. We show that country desks tend to overestimate economic slack, especially during recessions, and that uncertainty in initial output gap estimates persists for several years. Only a small share of output gap revisions is predictable based on output dynamics, data quality, and policy frameworks. We also show that for a group of Latin American inflation targeters the prescriptions from monetary policy rules are subject to large changes due to revised output gap estimates. These changes explain a sizable proportion of the deviation of inflation from target, suggesting this information is not accounted for in real-time policy decisions.

  14. Kalman filter application to mitigate the errors in the trajectory simulations due to the lunar gravitational model uncertainty

    International Nuclear Information System (INIS)

    Gonçalves, L D; Rocco, E M; De Moraes, R V; Kuga, H K

    2015-01-01

    This paper simulates part of the orbital trajectory of the Lunar Prospector mission to analyze the relevance of using a Kalman filter to estimate the trajectory. The study considers the disturbance due to the lunar gravitational potential using one of the most recent models, the LP100K model, which is based on spherical harmonics and considers degree and order up to 100. To simplify the expression of the gravitational potential and, consequently, to reduce the computational effort required in the simulation, lower values of degree and order are used in some cases. An analysis is then made of the error introduced into the simulations when such reduced values of degree and order are used to propagate the spacecraft trajectory and control. This analysis uses the standard deviation that characterizes the uncertainty for each of the values of degree and order used in the LP100K model for the satellite orbit. With the uncertainty of the adopted gravity model known, lunar orbital trajectory simulations may be carried out considering these uncertainty values. Furthermore, a Kalman filter is used, which accounts for the sensor uncertainty that defines the satellite position at each step of the simulation and for the model uncertainty, by means of the characteristic variance of the truncated gravity model. This procedure thus represents an effort to bring the results obtained with lower values of degree and order of the spherical harmonics closer to the results that would be attained if the maximum accuracy of the LP100K model were adopted. A comparison is also made between the error in the satellite position when the Kalman filter is used and when it is not. The data for the comparison were obtained from the standard deviation in the velocity increment of the space vehicle. (paper)
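
    The filtering idea can be reduced to a one-dimensional sketch in which the process noise stands in for the truncated-gravity-model variance and the measurement noise for the sensor uncertainty; all values are hypothetical, and the real problem is of course a multi-state orbital filter:

```python
import random

random.seed(3)

def kalman_1d(measurements, q=0.001, r=0.25, x0=0.0, p0=1.0):
    """Constant-state Kalman filter: q is the (truncated-model) process
    variance, r the sensor variance."""
    x, p = x0, p0
    for z in measurements:
        p = p + q                  # predict: inflate by model uncertainty
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with the sensor measurement
        p = (1.0 - k) * p
    return x, p

truth = 1.0
zs = [truth + random.gauss(0.0, 0.5) for _ in range(200)]  # noisy position sensor
x_est, p_est = kalman_1d(zs)
print(x_est, p_est)
```

    The estimate converges toward the true state with a posterior variance far below the raw sensor variance, which is the mechanism by which the filter compensates for the cheaper, truncated gravity model.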

  15. Uncertainty and sensitivity analysis in nuclear accident consequence assessment

    International Nuclear Information System (INIS)

    Karlberg, Olof.

    1989-01-01

    This report contains the results of a four-year project carried out under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident, and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by the parameter uncertainties. The most important parameters turned out to be different for each pathway of exposure, as could be expected. Overall, however, the most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
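
    Latin hypercube sampling itself is compact enough to sketch: one stratified draw per equal-probability interval in each dimension, with strata randomly paired across dimensions. A minimal version on the unit hypercube:

```python
import random

random.seed(0)

def latin_hypercube(n, dims):
    """n points in [0, 1)^dims: one draw per equal-probability stratum,
    independently shuffled in each dimension."""
    cols = []
    for _ in range(dims):
        col = [(i + random.random()) / n for i in range(n)]  # one per stratum
        random.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

sample = latin_hypercube(10, 2)
print(sample[0])
```

    Each marginal is guaranteed to cover all n strata, which is why LHS needs far fewer model runs than plain random sampling for the same coverage; in practice the uniform coordinates are then mapped through the inverse CDFs of the expert-elicited parameter distributions.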

  16. Uncertainty analysis of energy consumption in dwellings

    Energy Technology Data Exchange (ETDEWEB)

    Pettersen, Trine Dyrstad

    1997-12-31

    This thesis presents a comprehensive study of an energy estimation model that can be used to examine the uncertainty of predicted energy consumption in a dwelling. The variation and uncertainty of input parameters due to the outdoor climate, the building construction and the inhabitants are studied as a basis for further energy evaluations. The variations of energy consumption occurring in nominally similar dwellings are also investigated in order to verify the simulated energy consumption. The main topics are (1) a study of expected variations and uncertainties in both the input parameters used in energy consumption calculations and the energy consumption of the dwelling, (2) the development and evaluation of a simplified energy calculation model that considers uncertainties due to the input parameters, (3) an evaluation of the influence of the uncertain parameters on the total variation so that the most important parameters can be identified, and (4) the recommendation of a simplified procedure for treating uncertainties or possible deviations from average conditions. 90 refs., 182 figs., 73 tabs.

  17. Changes in Rectal Dose Due to Alterations in Beam Angles for Setup Uncertainty and Range Uncertainty in Carbon-Ion Radiotherapy for Prostate Cancer.

    Directory of Open Access Journals (Sweden)

    Yoshiki Kubota

    Carbon-ion radiotherapy of prostate cancer is challenging in patients with metal implants in one or both hips. Problems can be circumvented by using fields at oblique angles. To evaluate the influence of setup and range uncertainties accompanying oblique field angles, we calculated rectal dose changes with oblique orthogonal field angles, using a device with fixed fields at 0° and 90° and a rotating patient couch. Dose distributions were calculated at the standard angles of 0° and 90°, and then at 30° and 60°. Setup uncertainty was simulated with changes from -2 mm to +2 mm for fields in the anterior-posterior, left-right, and cranial-caudal directions, and dose changes from range uncertainty were calculated with a 1 mm water-equivalent path length added to the target isocenter at each angle. The dose distributions for the passive irradiation method were calculated using the K2 dose algorithm. The rectal volumes receiving 95% of the prescription dose with 0°, 30°, 60°, and 90° field angles were 3.4±0.9 cm3, 2.8±1.1 cm3, 2.2±0.8 cm3, and 3.8±1.1 cm3, respectively. As compared with 90° fields, 30° and 60° fields had significant advantages regarding setup uncertainty and significant disadvantages regarding range uncertainty, but were not significantly different from the 90° field setup and range uncertainties. The setup and range uncertainties calculated at 30° and 60° field angles were not associated with a significant change in rectal dose relative to those at 90°.

  18. The role of social cost-benefit analysis in societal decision-making under large uncertainties with application to robbery at a cash depot

    International Nuclear Information System (INIS)

    Jones-Lee, M.; Aven, T.

    2009-01-01

    Social cost-benefit analysis is a well-established method for guiding decisions about safety investments, particularly in situations in which it is possible to make accurate predictions of future performance. However, its direct applicability to situations involving large degrees of uncertainty is less obvious and this raises the question of the extent to which social cost-benefit analysis can provide a useful input to the decision framework that has been explicitly developed to deal with safety decisions in which uncertainty is a major factor, namely risk analysis. This is the main focus of the arguments developed in this paper. In particular, we provide new insights by examining the fundamentals of both approaches and our principal conclusion is that social cost-benefit analysis and risk analysis represent complementary input bases to the decision-making process, and even in the case of large uncertainties social cost-benefit analysis may provide very useful decision support. What is required is the establishment of a proper contextual framework which structures and gives adequate weight to the uncertainties. An application to the possibility of a robbery at a cash depot is examined as a practical example.

  19. Large scale applicability of a Fully Adaptive Non-Intrusive Spectral Projection technique: Sensitivity and uncertainty analysis of a transient

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Lathouwers, Danny; Kloosterman, Jan Leen; Hagen, Tim van der

    2014-01-01

    Highlights: • Grid and basis adaptive Polynomial Chaos techniques are presented for S and U analysis. • Dimensionality reduction and incremental polynomial order reduce computational costs. • An unprotected loss of flow transient is investigated in a Gas Cooled Fast Reactor. • S and U analysis is performed with MC and adaptive PC methods, for 42 input parameters. • PC accurately estimates means, variances, PDFs, sensitivities and uncertainties. - Abstract: Since the early years of reactor physics the most prominent sensitivity and uncertainty (S and U) analysis methods in the nuclear community have been adjoint-based techniques. While these are very effective for pure neutronics problems due to the linearity of the transport equation, they become complicated when coupled non-linear systems are involved. With the continuous increase in computational power such complicated multi-physics problems are becoming progressively tractable; hence, affordable and easily applicable S and U analysis tools also have to be developed in parallel. For reactor physics problems for which adjoint methods are prohibitive, Polynomial Chaos (PC) techniques offer an attractive alternative to traditional random sampling based approaches. At TU Delft such PC methods have been studied for a number of years and this paper presents a large scale application of our Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm for performing the sensitivity and uncertainty analysis of a Gas Cooled Fast Reactor (GFR) Unprotected Loss Of Flow (ULOF) transient. The transient was simulated using the Cathare 2 code system and a fully detailed model of the GFR2400 reactor design that was investigated in the European FP7 GoFastR project. Several sources of uncertainty were taken into account amounting to an unusually high number of stochastic input parameters (42) and numerous output quantities were investigated. The results show consistently good performance of the applied adaptive PC
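
    Non-intrusive spectral projection can be illustrated in one dimension: the PC coefficients of a model output are computed by quadrature on model evaluations, after which mean and variance follow from the coefficients alone. A minimal Legendre-chaos sketch, far simpler than the adaptive FANISP algorithm:

```python
import math

# 5-point Gauss-Legendre rule on [-1, 1]
nodes = [-0.906179845938664, -0.538469310105683, 0.0,
         0.538469310105683, 0.906179845938664]
weights = [0.236926885056189, 0.478628670499366, 0.568888888888889,
           0.478628670499366, 0.236926885056189]

# Legendre polynomials P0..P2 (orthogonal basis of the uniform chaos)
legendre = [lambda x: 1.0, lambda x: x, lambda x: 0.5 * (3.0 * x**2 - 1.0)]

def pc_coefficients(model, order=2):
    """c_k = (2k+1)/2 * integral of model(x) * P_k(x) dx on [-1, 1]."""
    return [(2 * k + 1) / 2.0
            * sum(w * model(x) * legendre[k](x) for x, w in zip(nodes, weights))
            for k in range(order + 1)]

c = pc_coefficients(lambda x: x**2)            # output y = xi^2, xi ~ U(-1, 1)
mean = c[0]                                    # exact value: 1/3
var = sum(c[k]**2 / (2 * k + 1) for k in range(1, len(c)))  # exact value: 4/45
print(mean, var)
```

    The model is only ever called at the quadrature nodes, which is what "non-intrusive" means: no reformulation of the governing equations is needed, in contrast to the intrusive PC approach.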

  20. Propagation of statistical and nuclear data uncertainties in Monte Carlo burn-up calculations

    International Nuclear Information System (INIS)

    Garcia-Herranz, Nuria; Cabellos, Oscar; Sanz, Javier; Juan, Jesus; Kuijper, Jim C.

    2008-01-01

    Two methodologies to propagate the uncertainties on the nuclide inventory in combined Monte Carlo-spectrum and burn-up calculations are presented, based on sensitivity/uncertainty and random sampling techniques (uncertainty Monte Carlo method). Both enable the assessment of the impact of uncertainties in the nuclear data as well as uncertainties due to the statistical nature of the Monte Carlo neutron transport calculation. The methodologies are implemented in our MCNP-ACAB system, which combines the neutron transport code MCNP-4C and the inventory code ACAB. A high burn-up benchmark problem is used to test the MCNP-ACAB performance in inventory predictions, with no uncertainties. A good agreement is found with the results of other participants. This benchmark problem is also used to assess the impact of nuclear data uncertainties and statistical flux errors in high burn-up applications. A detailed calculation is performed to evaluate the effect of cross-section uncertainties in the inventory prediction, taking into account the temporal evolution of the neutron flux level and spectrum. Very large uncertainties are found at the unusually high burn-up of this exercise (800 MWd/kgHM). To compare the impact of the statistical errors in the calculated flux with respect to the cross-section uncertainties, a simplified problem is considered, taking a constant neutron flux level and spectrum. It is shown that, provided that the flux statistical deviations in the Monte Carlo transport calculation do not exceed a given value, the effect of the flux errors on the calculated isotopic inventory is negligible (even at very high burn-up) compared to the effect of the large cross-section uncertainties available at present in the data files.
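
    The random sampling ("uncertainty Monte Carlo") idea can be sketched on a one-nuclide depletion problem with a constant flux, mirroring the simplified constant-spectrum case: sample the uncertain cross-section, propagate each sample through the depletion solution, and read off the spread of the final inventory. All numbers are illustrative, not from the benchmark:

```python
import math
import random
import statistics

random.seed(5)

phi = 1e14                      # constant one-group flux (n cm^-2 s^-1)
t = 3.15e7                      # irradiation time, about one year (s)
sigma0, rel_u = 2.0e-22, 0.10   # mean absorption cross-section (cm^2), 10% rel. std.

finals = []
for _ in range(20_000):
    sigma = random.gauss(sigma0, rel_u * sigma0)  # sampled nuclear datum
    finals.append(math.exp(-sigma * phi * t))     # N(t)/N0 for this sample

mean_n = statistics.mean(finals)
rel_spread = statistics.stdev(finals) / mean_n
print(mean_n, rel_spread)
```

    In the real system each sample would rerun the full MCNP-ACAB transport-depletion chain; the toy version only shows how the input covariance translates into an inventory spread.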

  1. Propagation of statistical and nuclear data uncertainties in Monte Carlo burn-up calculations

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Herranz, Nuria [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid, UPM (Spain)], E-mail: nuria@din.upm.es; Cabellos, Oscar [Departamento de Ingenieria Nuclear, Universidad Politecnica de Madrid, UPM (Spain); Sanz, Javier [Departamento de Ingenieria Energetica, Universidad Nacional de Educacion a Distancia, UNED (Spain); Juan, Jesus [Laboratorio de Estadistica, Universidad Politecnica de Madrid, UPM (Spain); Kuijper, Jim C. [NRG - Fuels, Actinides and Isotopes Group, Petten (Netherlands)

    2008-04-15

    Two methodologies to propagate the uncertainties in the nuclide inventory in combined Monte Carlo spectrum and burn-up calculations are presented, based on sensitivity/uncertainty and random sampling techniques (uncertainty Monte Carlo method). Both enable the assessment of the impact of uncertainties in the nuclear data as well as uncertainties due to the statistical nature of the Monte Carlo neutron transport calculation. The methodologies are implemented in our MCNP-ACAB system, which combines the neutron transport code MCNP-4C and the inventory code ACAB. A high burn-up benchmark problem is used to test the MCNP-ACAB performance in inventory predictions, with no uncertainties. A good agreement is found with the results of other participants. This benchmark problem is also used to assess the impact of nuclear data uncertainties and statistical flux errors in high burn-up applications. A detailed calculation is performed to evaluate the effect of cross-section uncertainties on the inventory prediction, taking into account the temporal evolution of the neutron flux level and spectrum. Very large uncertainties are found at the unusually high burn-up of this exercise (800 MWd/kgHM). To compare the impact of the statistical errors in the calculated flux with that of the cross-section uncertainties, a simplified problem is considered, taking a constant neutron flux level and spectrum. It is shown that, provided that the flux statistical deviations in the Monte Carlo transport calculation do not exceed a given value, the effect of the flux errors on the calculated isotopic inventory is negligible (even at very high burn-up) compared to the effect of the large cross-section uncertainties available at present in the data files.

  2. Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark

    International Nuclear Information System (INIS)

    Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.

    2013-01-01

    The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactor (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phase II and III are focused on uncertainty analysis of reactor core and system respectively. This paper discusses the progress made in Phase I calculations, the Specifications for Phase II and the incoming challenges in defining Phase 3 exercises. The challenges of applying uncertainty quantification to complex code systems, in particular the time-dependent coupled physics models are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)

  3. Uncertainty Analysis of Light Water Reactor Fuel Lattices

    Directory of Open Access Journals (Sweden)

    C. Arenas

    2013-01-01

    The study explored the calculation of uncertainty based on available cross-section covariance data and computational tools at the fuel lattice level, including pin cell and fuel assembly models. Uncertainty variations due to temperature changes and different fuel compositions are the main focus of this analysis. Selected assemblies and unit pin cells were analyzed according to the OECD LWR UAM benchmark specifications. Criticality and uncertainty analyses were performed using the TSUNAMI-2D sequence in SCALE 6.1. It was found that uncertainties increase with increasing temperature, while kinf decreases. This increase in uncertainty is due to the increased sensitivity of the largest contributing reaction, neutron capture 238U(n, γ), caused by Doppler broadening. In addition, three types of fuel material composition (UOX, MOX, and UOX-Gd2O3) were analyzed. A remarkable increase in the uncertainty in kinf was observed for the MOX fuel case, nearly twice the corresponding value in UOX fuel. The neutron-nuclide reactions of 238U, mainly inelastic scattering (n, n′), contributed the most to the uncertainties in the MOX fuel, which shifts the neutron spectrum to higher energies compared to the UOX fuel.
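
    The uncertainty figures quoted by such studies come from first-order sensitivity/uncertainty propagation of the kind TSUNAMI performs: the relative variance of kinf is the "sandwich" S C Sᵀ of a sensitivity profile S with the relative covariance matrix C of the nuclear data. A minimal sketch, with invented two-reaction numbers rather than evaluated covariance data:

```python
# "Sandwich rule": relative variance of k-eff = S * C * S^T, where S holds
# relative sensitivities (dk/k per dsigma/sigma) and C is the relative
# covariance matrix of the cross sections. Values are illustrative only.
def k_rel_uncertainty(S, C):
    var = sum(S[i] * C[i][j] * S[j]
              for i in range(len(S)) for j in range(len(S)))
    return var ** 0.5

S = [-0.25, 0.35]            # e.g. a capture and a fission sensitivity
C = [[4.0e-4, 0.0],          # 2% relative sd on reaction 1
     [0.0, 1.0e-4]]          # 1% relative sd on reaction 2

rel_sd_k = k_rel_uncertainty(S, C)   # relative standard deviation of k
```

    Off-diagonal covariance terms, zero here, are where cross-reaction correlations would enter.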

  4. Climate Projections and Uncertainty Communication.

    Science.gov (United States)

    Joslyn, Susan L; LeClerc, Jared E

    2016-01-01

    Lingering skepticism about climate change might be due in part to the way climate projections are perceived by members of the public. Variability between scientists' estimates might give the impression that scientists disagree about the fact of climate change rather than about details concerning the extent or timing. Providing uncertainty estimates might clarify that the variability is due in part to quantifiable uncertainty inherent in the prediction process, thereby increasing people's trust in climate projections. This hypothesis was tested in two experiments. Results suggest that including uncertainty estimates along with climate projections leads to an increase in participants' trust in the information. Analyses explored the roles of time, place, demographic differences (e.g., age, gender, education level, political party affiliation), and initial belief in climate change. Implications are discussed in terms of the potential benefit of adding uncertainty estimates to public climate projections. Copyright © 2015 Cognitive Science Society, Inc.

  5. Evaluation of photonuclear reaction cross-sections using the reduction method for large systematic uncertainties

    International Nuclear Information System (INIS)

    Varlamov, V.V.; Efimkin, N.G.; Ishkhanov, B.S.; Sapunenko, V.V.

    1994-12-01

    The authors describe a method, based on the reduction method, for the evaluation of photonuclear reaction cross-sections obtained under conditions where there are large systematic uncertainties (different instrumental functions, calibration and normalization errors). The evaluation method involves using the actual instrumental function (photon spectrum) of each individual experiment to reduce the data to a representation generated by an instrumental function of better quality. The objective is to find the most reasonably achievable monoenergetic representation of the information on the reaction cross-section derived from the results of various experiments, and to take into account the calibration and normalization errors in these experiments. The method was used to obtain the evaluated total photoneutron reaction cross-section (γ,xn) for a large number of nuclei. Data obtained for 16O and 208Pb are presented. (author). 36 refs, 6 figs, 4 tabs
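
    At its core, evaluation combines discrepant experiments while weighting their uncertainties. The full reduction method also maps each data set through its instrumental function and treats correlated calibration/normalization errors; the sketch below keeps only the final inverse-variance-weighted combination step, with invented data points, and so is a much-simplified stand-in.

```python
def combine(points):
    """Inverse-variance weighted mean of independent (value, sigma)
    measurements of the same cross-section point."""
    w = [1.0 / s ** 2 for _, s in points]
    mean = sum(v * wi for (v, _), wi in zip(points, w)) / sum(w)
    return mean, (1.0 / sum(w)) ** 0.5

# Three hypothetical measurements of one cross-section value (mb).
value, sigma = combine([(102.0, 5.0), (96.0, 4.0), (99.0, 8.0)])
```

    The combined uncertainty is smaller than that of the best single experiment, which is precisely why correlated systematic errors must be handled explicitly before such a combination is trusted.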

  6. Robust nonlinear control of nuclear reactors under model uncertainty

    International Nuclear Information System (INIS)

    Park, Moon Ghu

    1993-02-01

    A nonlinear model-based control method is developed for the robust control of a nuclear reactor. The nonlinear plant model is used to design a unique control law which covers a wide operating range. Robustness is a crucial factor for the fully automatic control of reactor power due to time-varying uncertain parameters, state estimation error, and unmodeled dynamics. A variable structure control (VSC) method is introduced which consists of an adaptive performance specification (fine control) after the tracking error reaches the narrow boundary layer by a time-optimal control (coarse control). Variable structure control is a powerful method for nonlinear system controller design which has inherent robustness to parameter variations or external disturbances using the known uncertainty bounds, and it requires very low computational effort. In spite of its desirable properties, conventional VSC presents several important drawbacks that limit its practical applicability. One of the most undesirable phenomena is chattering, which implies extremely high control activity and may excite high-frequency unmodeled dynamics. This problem is due to neglected actuator time-delay or sampling effects. The problem was partially remedied by replacing chattering control by a smooth control interpolation in a boundary layer neighboring a time-varying sliding surface. But for nuclear reactor systems, which have very fast dynamic response, the sampling effect may destroy the narrow boundary layer when a large uncertainty bound is used. Due to the very short neutron lifetime, a large uncertainty bound leads to high gain in feedback control. To resolve this problem, a derivative feedback is introduced that gives excellent performance by reducing the uncertainty bound. The stability of tracking error dynamics is guaranteed by the second method of Lyapunov using the two-level uncertainty bounds that are obtained from the knowledge of uncertainty bound and the estimated

  7. Benchmarking NLDAS-2 Soil Moisture and Evapotranspiration to Separate Uncertainty Contributions

    Science.gov (United States)

    Nearing, Grey S.; Mocko, David M.; Peters-Lidard, Christa D.; Kumar, Sujay V.; Xia, Youlong

    2016-01-01

    Model benchmarking allows us to separate uncertainty in model predictions caused by model inputs from uncertainty due to model structural error. We extend this method with a large-sample approach (using data from multiple field sites) to measure prediction uncertainty caused by errors in (i) forcing data, (ii) model parameters, and (iii) model structure, and use it to compare the efficiency of soil moisture state and evapotranspiration flux predictions made by the four land surface models in the North American Land Data Assimilation System Phase 2 (NLDAS-2). Parameters dominated uncertainty in soil moisture estimates and forcing data dominated uncertainty in evapotranspiration estimates; however, the models themselves used only a fraction of the information available to them. This means that there is significant potential to improve all three components of the NLDAS-2 system. In particular, continued work toward refining the parameter maps and look-up tables, the forcing data measurement and processing, and also the land surface models themselves, has potential to result in improved estimates of surface mass and energy balances.

  8. Brine migration resulting from CO2 injection into saline aquifers – An approach to risk estimation including various levels of uncertainty

    DEFF Research Database (Denmark)

    Walter, Lena; Binning, Philip John; Oladyshkin, Sergey

    2012-01-01

    Comprehensive risk assessment is a major task for large-scale projects such as geological storage of CO2. Basic hazards are damage to the integrity of caprocks, leakage of CO2, or reduction of groundwater quality due to intrusion of fluids. This study focuses on salinization of freshwater aquifers … resulting from displaced brine. Quantifying risk on the basis of numerical simulations requires consideration of different kinds of uncertainties, and this study considers both scenario uncertainty and statistical uncertainty. Addressing scenario uncertainty involves expert opinion on relevant geological … for large-scale 3D models including complex physics. Therefore, we apply a model reduction based on arbitrary polynomial chaos expansion combined with the probabilistic collocation method. It is shown that, dependent on data availability, both types of uncertainty can be equally significant. The presented study …

  9. Uncertainty in visual processes predicts geometrical optical illusions.

    Science.gov (United States)

    Fermüller, Cornelia; Malm, Henrik

    2004-03-01

    It is proposed in this paper that many geometrical optical illusions, as well as illusory patterns due to motion signals in line drawings, are due to the statistics of visual computations. The interpretation of image patterns is preceded by a step where image features such as lines, intersections of lines, or local image movement must be derived. However, there are many sources of noise or uncertainty in the formation and processing of images, and they cause problems in the estimation of these features; in particular, they cause bias. As a result, the locations of features are perceived erroneously and the appearance of the patterns is altered. The bias occurs with any visual processing of line features; under average conditions it is not large enough to be noticeable, but illusory patterns are such that the bias is highly pronounced. Thus, the broader message of this paper is that there is a general uncertainty principle which governs the workings of vision systems, and optical illusions are an artifact of this principle.

  10. Critical mid-term uncertainties in long-term decarbonisation pathways

    International Nuclear Information System (INIS)

    Usher, Will; Strachan, Neil

    2012-01-01

    Over the next decade, large energy investments are required in the UK to meet growing energy service demands and legally binding emission targets under a pioneering policy agenda. These are necessary despite deep mid-term (2025–2030) uncertainties over which national policy makers have little control. We investigate the effect of two critical mid-term uncertainties on optimal near-term investment decisions using a two-stage stochastic energy system model. The results show that where future fossil fuel prices are uncertain: (i) the near-term hedging strategy to 2030 differs from any one deterministic fuel price scenario and is structurally dissimilar to a simple ‘average’ of the deterministic scenarios, and (ii) multiple recourse strategies from 2030 are perturbed by path dependencies caused by hedging investments. Evaluating the uncertainty under a decarbonisation agenda shows that fossil fuel price uncertainty is very expensive, at around £20 billion. The addition of novel mitigation options reduces the value of fossil fuel price uncertainty to £11 billion. Uncertain biomass import availability shows a much lower value of uncertainty, at £300 million. This paper reveals the complex relationship between the flexibility of the energy system and mitigating the costs of uncertainty due to the path dependencies caused by the long lifetimes of both infrastructure and generation technologies. - Highlights: ► Critical mid-term uncertainties affect near-term investments in UK energy system. ► Deterministic scenarios give conflicting near-term actions. ► Stochastic scenarios give one near-term hedging strategy. ► Technologies exhibit path dependency or flexibility. ► Fossil fuel price uncertainty is very expensive, biomass availability uncertainty is not.
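
    Valuing uncertainty in a two-stage stochastic model amounts to comparing a here-and-now hedging solution against wait-and-see solutions. A toy version with invented costs (not the UK energy system model used in the study): choose now the share x of price-stable low-carbon capacity, then learn the fuel price of the remainder.

```python
# Toy two-stage investment problem. All numbers are hypothetical.
scenarios = [(0.5, 40.0), (0.5, 80.0)]   # (probability, fuel price per unit)
LOW_CARBON_COST = 65.0                   # per unit, price-stable by assumption

def cost(x, fuel_price):
    return LOW_CARBON_COST * x + fuel_price * (1.0 - x)

grid = [i / 100.0 for i in range(101)]

# Here-and-now hedging strategy: minimise expected cost over both scenarios.
x_hedge = min(grid, key=lambda x: sum(p * cost(x, f) for p, f in scenarios))

# Wait-and-see strategies: the best decision if the price were known upfront.
x_ws = [min(grid, key=lambda x: cost(x, f)) for _, f in scenarios]

# Expected value of perfect information: what resolving the fuel-price
# uncertainty in advance would be worth.
evpi = (sum(p * cost(x_hedge, f) for p, f in scenarios)
        - sum(p * cost(x, f) for (p, f), x in zip(scenarios, x_ws)))
```

    In a full energy system model the first stage is a vector of capacity investments and the recourse is a linear program per scenario, but the EVPI bookkeeping is the same.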

  11. Uncertainty, joint uncertainty, and the quantum uncertainty principle

    International Nuclear Information System (INIS)

    Narasimhachar, Varun; Poostindouz, Alireza; Gour, Gilad

    2016-01-01

    Historically, the element of uncertainty in quantum mechanics has been expressed through mathematical identities called uncertainty relations, a great many of which continue to be discovered. These relations use diverse measures to quantify uncertainty (and joint uncertainty). In this paper we use operational information-theoretic principles to identify the common essence of all such measures, thereby defining measure-independent notions of uncertainty and joint uncertainty. We find that most existing entropic uncertainty relations use measures of joint uncertainty that yield themselves to a small class of operational interpretations. Our notion relaxes this restriction, revealing previously unexplored joint uncertainty measures. To illustrate the utility of our formalism, we derive an uncertainty relation based on one such new measure. We also use our formalism to gain insight into the conditions under which measure-independent uncertainty relations can be found. (paper)
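
    A concrete instance of the entropic uncertainty relations discussed above is the Maassen–Uffink bound H(Z) + H(X) ≥ -2 log₂ c, where c is the maximum overlap between the two measurement bases. For the qubit computational and Hadamard bases, c = 1/√2, so the two Shannon entropies always sum to at least one bit. The sketch below checks this for an arbitrary test state (the state itself is our choice, not from the paper).

```python
import cmath
import math

def shannon(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 1e-12)

def outcome_probs(state, basis):
    """Born-rule probabilities of a qubit state in an orthonormal basis."""
    return [abs(b0.conjugate() * state[0] + b1.conjugate() * state[1]) ** 2
            for b0, b1 in basis]

Z = [(1.0, 0.0), (0.0, 1.0)]            # computational basis
s = 1.0 / math.sqrt(2.0)
X = [(s, s), (s, -s)]                   # Hadamard basis, overlap c = 1/sqrt(2)

# Maassen-Uffink: H(Z) + H(X) >= -2*log2(1/sqrt(2)) = 1 bit, for any state.
state = (math.cos(0.3), cmath.exp(0.7j) * math.sin(0.3))
total = shannon(outcome_probs(state, Z)) + shannon(outcome_probs(state, X))
```

    Sweeping `state` over the Bloch sphere never drives `total` below one bit, which is the measure-dependent fact that the measure-independent formalism of the paper generalizes.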

  12. Inventories and sales uncertainty

    OpenAIRE

    Caglayan, M.; Maioli, S.; Mateut, S.

    2011-01-01

    We investigate the empirical linkages between sales uncertainty and firms' inventory investment behavior while controlling for firms' financial strength. Using large panels of manufacturing firms from several European countries we find that higher sales uncertainty leads to larger stocks of inventories. We also identify an indirect effect of sales uncertainty on inventory accumulation through the financial strength of firms. Our results provide evidence that financial strength mitigates the a...

  13. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    The main objective of safety analysis is to demonstrate in a robust way that all safety requirements are met, i.e. that sufficient margins exist between the real values of important parameters and the threshold values at which damage to the barriers against release of radioactivity would occur. As stated in the IAEA Safety Requirements for Design of NPPs, 'a safety analysis of the plant design shall be conducted in which methods of both deterministic and probabilistic analysis shall be applied'. It is required that 'the computer programs, analytical methods and plant models used in the safety analysis shall be verified and validated, and adequate consideration shall be given to uncertainties'. Uncertainties are present in calculations due to the computer codes, initial and boundary conditions, plant state, fuel parameters, scaling, and the numerical solution algorithm. Conservative approaches, still widely used, were introduced to cover uncertainties due to the limited capability for modelling and understanding of physical phenomena at the early stages of safety analysis. The results obtained by this approach are quite unrealistic and the level of conservatism is not fully known. Another approach is the use of Best Estimate (BE) codes with realistic initial and boundary conditions. If this approach is selected, it should be based on statistically combined uncertainties for plant initial and boundary conditions, assumptions and code models. The current trend is toward best estimate codes with some conservative assumptions, realistic input data, and uncertainty analysis. The BE analysis with evaluation of uncertainties offers, in addition, a way to quantify the existing plant safety margins. Its broader use in the future is therefore envisaged, even though it is not always feasible because of the difficulty of quantifying code uncertainties with a sufficiently narrow range for every phenomenon and for each accident sequence. In this paper

  14. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but up to recently seemed to be intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.

  15. Measurement-based climatology of aerosol direct radiative effect, its sensitivities, and uncertainties from a background southeast US site

    Science.gov (United States)

    Sherman, James P.; McComiskey, Allison

    2018-03-01

    Aerosol optical properties measured at Appalachian State University's co-located NASA AERONET and NOAA ESRL aerosol network monitoring sites over a nearly four-year period (June 2012–Feb 2016) are used, along with satellite-based surface reflectance measurements, to study the seasonal variability of diurnally averaged clear-sky aerosol direct radiative effect (DRE) and radiative efficiency (RE) at the top-of-atmosphere (TOA) and at the surface. Aerosol chemistry and loading at the Appalachian State site are likely representative of the background southeast US (SE US), home to high summertime aerosol loading and one of only a few regions not to have warmed during the 20th century. This study is the first multi-year ground-truth DRE study in the SE US, using aerosol network data products that are often used to validate satellite-based aerosol retrievals. The study is also the first in the SE US to quantify DRE uncertainties and sensitivities to aerosol optical properties and surface reflectance, including their seasonal dependence. Median DRE for the study period is -2.9 W m-2 at the TOA and -6.1 W m-2 at the surface. Monthly median and monthly mean DRE at the TOA (surface) are -1 to -2 W m-2 (-2 to -3 W m-2) during winter months and -5 to -6 W m-2 (-10 W m-2) during summer months. The DRE cycles follow the annual cycle of aerosol optical depth (AOD), which is 9 to 10 times larger in summer than in winter. Aerosol RE is anti-correlated with DRE, with winter values 1.5 to 2 times more negative than summer values. Due to the large seasonal dependence of aerosol DRE and RE, we quantify the sensitivity of DRE to aerosol optical properties and surface reflectance using a calendar day representative of each season (21 December for winter, 21 March for spring, 21 June for summer, and 21 September for fall). We use these sensitivities along with measurement uncertainties of aerosol optical properties and surface reflectance to calculate DRE uncertainties. We also estimate

  16. Effect of activation cross section uncertainties in transmutation analysis of realistic low-activation steels for IFMIF

    Energy Technology Data Exchange (ETDEWEB)

    Cabellos, O.; Garcia-Herranz, N.; Sanz, J. [Institute of Nuclear Fusion, UPM, Madrid (Spain); Cabellos, O.; Garcia-Herranz, N.; Fernandez, P.; Fernandez, B. [Dept. of Nuclear Engineering, UPM, Madrid (Spain); Sanz, J. [Dept. of Power Engineering, UNED, Madrid (Spain); Reyes, S. [Safety, Environment and Health Group, ITER Joint Work Site, Cadarache Center (France)

    2008-07-01

    We address uncertainty analysis to draw conclusions on the reliability of the activation calculation in the International Fusion Materials Irradiation Facility (IFMIF) under the potential impact of activation cross section uncertainties. The Monte Carlo methodology implemented in the ACAB code gives the uncertainty estimates due to the synergetic/global effect of the complete set of cross section uncertainties. An element-by-element analysis has been demonstrated to be a helpful tool to easily analyse the transmutation performance of irradiated materials. The uncertainty analysis results showed that for times over about 24 h the relative error in the contact dose rate can be as large as 23 per cent. We have calculated the effect of cross section uncertainties in the IFMIF activation of all the different elements. For EUROFER, uncertainties in the H and He elements are 7.3% and 5.6%, respectively. We have found significant uncertainties in the transmutation response for C, P and Nb.

  17. Uncertainty quantification in resonance absorption

    International Nuclear Information System (INIS)

    Williams, M.M.R.

    2012-01-01

    We assess the uncertainty in the resonance escape probability due to uncertainty in the neutron and radiation line widths for the first 21 resonances in 232Th as given by . Simulation, quadrature and polynomial chaos methods are used and the resonance data are assumed to obey a beta distribution. We find the uncertainty in the total resonance escape probability to be the equivalent, in reactivity, of 75–130 pcm. Also shown are PDFs of the resonance escape probability for each resonance and the variation of the uncertainty with temperature. The viability of the polynomial chaos expansion method is clearly demonstrated.
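
    The simulation branch of such an analysis can be sketched as follows: sample the neutron and radiation widths from beta distributions, propagate each draw through an escape-probability model, and quote the spread in pcm. The single-resonance model and every number below (a 2 meV neutron width, 25 meV radiation width, ±20% beta support) are invented stand-ins, not the 232Th evaluation used in the paper.

```python
import math
import random

def escape_probability(gamma_n_mev, gamma_g_mev):
    """Toy escape probability; the 'resonance integral' shape is made up."""
    integral = 2.0 * gamma_g_mev / (gamma_n_mev + gamma_g_mev)
    return math.exp(-0.05 * integral)

def beta_sample(rng, mean, half_width=0.2):
    """Symmetric Beta(5,5) sample on the interval mean*(1 +/- half_width)."""
    u = rng.betavariate(5.0, 5.0)
    return mean * (1.0 - half_width + 2.0 * half_width * u)

def escape_uncertainty_pcm(n=20000, seed=2):
    """Relative spread of the escape probability, in pcm of reactivity."""
    rng = random.Random(seed)
    ps = [escape_probability(beta_sample(rng, 2.0), beta_sample(rng, 25.0))
          for _ in range(n)]
    mean = sum(ps) / n
    sd = (sum((p - mean) ** 2 for p in ps) / (n - 1)) ** 0.5
    return 1e5 * sd / mean

pcm = escape_uncertainty_pcm()
```

    Quadrature and polynomial chaos reach the same moments far more cheaply, which is the efficiency argument the paper makes for the chaos expansion.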

  18. Application of extended statistical combination of uncertainties methodology for digital nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    In, Wang Ki; Uh, Keun Sun; Chul, Kim Heui [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-02-01

    A technically more direct statistical combination of uncertainties methodology, extended SCU (XSCU), was applied to statistically combine the uncertainties associated with the DNBR alarm setpoint and the DNBR trip setpoint of digital nuclear power plants. The modified SCU (MSCU) methodology is currently used as the USNRC-approved design methodology to perform the same function. In this report, the MSCU and XSCU methodologies were compared in terms of the total uncertainties and the net margins to the DNBR alarm and trip setpoints. The MSCU methodology resulted in small total penalties due to significantly negative biases, which are quite large. The XSCU methodology, by contrast, gave virtually unbiased total uncertainties. The net margins to the DNBR alarm and trip setpoints by the MSCU methodology agree with those by the XSCU methodology within statistical variations. (Author) 12 refs., 17 figs., 5 tabs.
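
    The contrast between such methodologies hinges on how random uncertainty components and biases enter the combination. A generic root-sum-square combination with a deterministic bias term, using invented component magnitudes (not plant design values), looks like:

```python
# Generic statistical combination of uncertainties: independent random
# components combine root-sum-square; known biases add deterministically.
def combined_uncertainty(random_components, biases):
    rss = sum(c * c for c in random_components) ** 0.5
    return rss + sum(biases)

# Three hypothetical random components and one bias, as fractions of DNBR.
total = combined_uncertainty([0.03, 0.05, 0.02], biases=[0.01])
```

    The root-sum-square total is always smaller than the arithmetic sum of the components, which is the margin that statistical combination recovers over deterministic stacking.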

  19. Uncertainties in HTGR neutron-physical characteristics due to computational errors and technological tolerances

    International Nuclear Information System (INIS)

    Glushkov, E.S.; Grebennik, V.N.; Davidenko, V.G.; Kosovskij, V.G.; Smirnov, O.N.; Tsibul'skij, V.F.

    1991-01-01

    The paper is dedicated to the uncertainties in the neutron-physical characteristics (NPC) of high-temperature gas-cooled reactors (HTGR) with a core formed by a spherical fuel element bed, which are caused by computational errors and by deviations of HTGR parameters from the mean values affecting the NPC. The NPC include: the effective multiplication factor, burnup depth, reactivity effects, control element worth, and the distribution of neutrons and heat release over the reactor core. A short description of the calculational methods and codes used for HTGR calculations in the USSR is given, and estimates of NPC uncertainties of a methodological character are presented. In addition, the effect on the NPC of technological deviations in the parameters of the main reactor elements, such as the uranium content of a spherical fuel element and the amount of neutron-absorbing impurities in the reactor core and reflector, is analysed. Results of some experimental studies of the NPC of critical assemblies with graphite moderator are given as applied to HTGR. The comparison of calculation results with experiments on critical assemblies has made it possible to evaluate the uncertainties of the calculated description of HTGR NPC. (author). 8 refs, 8 figs, 6 tabs

  20. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Large-scale simulations involve such a variety of scales and such physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. Here, analysis of the uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to arrive at a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches, by reference to human reasoning processes. Our idea is to execute deductive and inductive simulations contrasted with deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  1. Procedure to approximately estimate the uncertainty of material ratio parameters due to inhomogeneity of surface roughness

    International Nuclear Information System (INIS)

    Hüser, Dorothee; Thomsen-Schmidt, Peter; Hüser, Jonathan; Rief, Sebastian; Seewig, Jörg

    2016-01-01

    Roughness parameters that characterize contacting surfaces with regard to friction and wear are commonly stated without uncertainties, or with an uncertainty taking into account only a very limited set of aspects, such as the repeatability or reproducibility (homogeneity) of the specimen. This makes it difficult to discriminate between different values of single roughness parameters. Therefore, uncertainty assessment methods are required that take all relevant aspects into account. In the literature this is rarely done, and examples specific to parameters used in friction and wear are not yet given. We propose a procedure to derive the uncertainty from a single profile, employing a statistical method based on the statistical moments of the amplitude distribution and the autocorrelation length of the profile. To show the possibilities and the limitations of this method we compare the uncertainty derived from a single profile with that derived from a high-statistics experiment. (paper)
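
    The single-profile idea can be sketched as follows: the autocorrelation length fixes an effective number of independent points, which in turn scales the standard uncertainty of a moment-based parameter such as the rms roughness Rq. The 1/e criterion and the n/(2·lcorr) rule below are common simplifying assumptions, not the paper's exact estimator.

```python
import math
import random

def autocorr_length(profile):
    """Lag at which the normalised autocorrelation first drops below 1/e."""
    n = len(profile)
    mean = sum(profile) / n
    z = [v - mean for v in profile]
    var = sum(x * x for x in z) / n
    for lag in range(1, n):
        c = sum(z[i] * z[i + lag] for i in range(n - lag)) / ((n - lag) * var)
        if c < 1.0 / math.e:
            return lag
    return n

def rq_with_uncertainty(profile):
    """Rms roughness Rq with a rough standard uncertainty scaled by the
    effective number of independent points n_eff = n / (2 * lcorr)."""
    n = len(profile)
    mean = sum(profile) / n
    rq = math.sqrt(sum((v - mean) ** 2 for v in profile) / n)
    n_eff = max(2.0, n / (2.0 * autocorr_length(profile)))
    return rq, rq / math.sqrt(2.0 * (n_eff - 1.0))

# Correlated synthetic profile: moving average of white noise.
rng = random.Random(0)
white = [rng.gauss(0.0, 1.0) for _ in range(2020)]
profile = [sum(white[i:i + 20]) / 20.0 for i in range(2000)]
rq, u_rq = rq_with_uncertainty(profile)
```

    Because neighbouring points are correlated, the naive 1/sqrt(n) uncertainty would be far too optimistic; dividing by the number of *independent* points is what makes the single-profile estimate honest.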

  2. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    Science.gov (United States)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.

  3. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    Science.gov (United States)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
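    A toy numpy version of the non-parametric route to the sampling uncertainty: subsample a gap-free record at the satellite's revisit interval, scale the sample mean up to the full period, and compare with the true accumulation over all phase offsets. The function and the 1-D series are our own simplifications; the study above uses multi-year radar rain fields.

```python
import numpy as np

def sampling_uncertainty(rain, step):
    """Relative RMS error of a rainfall accumulation estimated from
    regular samples taken every `step` time bins, evaluated over all
    possible sampling offsets (toy non-parametric estimate)."""
    true_total = rain.sum()
    errors = []
    for offset in range(step):
        sub = rain[offset::step]
        est = sub.mean() * rain.size   # scale the sample mean to the full period
        errors.append(est - true_total)
    return np.sqrt(np.mean(np.square(errors))) / max(true_total, 1e-12)
```

Steady rain is sampled perfectly at any interval; intermittent rain (the realistic case) produces a nonzero sampling error that grows with the sampling interval.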

  4. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate code simulations need uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear Power Plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As BE codes introduce uncertainties, due to uncertainty in input parameters and in modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks’ method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, standard UA and SA impose a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this context, different techniques exist to reconstruct the probability distribution using the information provided by a sample of values, for example finite mixture models. In this paper, the Expectation Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
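    A self-contained EM loop for a 1-D Gaussian mixture illustrates the finite-mixture reconstruction idea; in the setting above, `x` would be a sample of RELAP-5 output values. This is a generic textbook EM, not the authors' implementation, and the deterministic percentile initialisation is our own choice.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=200):
    """Fit a 1-D Gaussian mixture with a basic EM loop (illustrative
    stand-in for finite-mixture reconstruction of an output PDF)."""
    # deterministic initialisation: spread the means over the sample
    mu = np.percentile(x, np.linspace(10, 90, k))
    sig = np.full(k, x.std() + 1e-9)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        d = (x[:, None] - mu) / sig
        pdf = w * np.exp(-0.5 * d**2) / (sig * np.sqrt(2.0 * np.pi))
        r = pdf / pdf.sum(axis=1, keepdims=True)
        # M-step: update mixing weights, means and standard deviations
        nk = r.sum(axis=0)
        w = nk / x.size
        mu = (r * x[:, None]).sum(axis=0) / nk
        sig = np.sqrt((r * (x[:, None] - mu)**2).sum(axis=0) / nk) + 1e-9
    return w, mu, sig
```

On a bimodal sample the fitted means land on the two modes, recovering exactly the multimodal structure that a single tolerance interval would hide.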

  5. Uncertainty of forest carbon stock changes. Implications to the total uncertainty of GHG inventory of Finland

    International Nuclear Information System (INIS)

    Monni, S.; Savolainen, I.; Peltoniemi, M.; Lehtonen, A.; Makipaa, R.; Palosuo, T.

    2007-01-01

    Uncertainty analysis facilitates identification of the most important categories affecting greenhouse gas (GHG) inventory uncertainty and helps in prioritisation of the efforts needed for development of the inventory. This paper presents an uncertainty analysis of GHG emissions of all Kyoto sectors and gases for Finland consolidated with estimates of emissions/removals from LULUCF categories. In Finland, net GHG emissions in 2003 were around 69 Tg (±15 Tg) CO2 equivalents. The uncertainties in forest carbon sink estimates in 2003 were larger than in most other emission categories, but of the same order of magnitude as in carbon stock change estimates in other land use, land-use change and forestry (LULUCF) categories, and in N2O emissions from agricultural soils. Uncertainties in sink estimates of 1990 were lower, due to better availability of data. Results of this study indicate that inclusion of the forest carbon sink to GHG inventories reported to the UNFCCC increases uncertainties in net emissions notably. However, the decrease in precision is accompanied by an increase in the accuracy of the overall net GHG emissions due to improved completeness of the inventory. The results of this study can be utilised when planning future GHG mitigation protocols and emission trading schemes and when analysing environmental benefits of climate conventions
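    The precision-versus-completeness trade-off can be sketched with a small Monte Carlo over category uncertainties. The category totals and 95% half-widths below are invented illustrative numbers, not Finland's actual inventory values; they only reproduce the qualitative point that a wide forest-sink term dominates the net interval.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
# Hypothetical categories: (mean, half-width of 95% interval), Tg CO2-eq.
# These numbers are invented for illustration only.
categories = {
    "energy_and_other": (55.0, 3.0),
    "agricultural_N2O": (6.0, 3.5),
    "forest_sink":      (-20.0, 12.0),
}
total = sum(rng.normal(m, hw / 1.96, N) for m, hw in categories.values())
net = total.mean()
lo, hi = np.percentile(total, [2.5, 97.5])
# The wide forest-sink term dominates the width of the net interval:
# including the sink lowers precision even as it improves completeness.
```

Dropping the sink category narrows the interval but biases the net total, which is the accuracy/precision tension the abstract describes.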

  6. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing … data and representation of the results to the decision makers play an important role. Second, we introduce a selection of alternative, so-called “post-probabilistic”, risk management methods developed across different scientific fields to cope with uncertainty due to lack of knowledge. Possibilities for overcoming industrial PSS risk management challenges are suggested through application of post-probabilistic methods. We conclude with a discussion of the importance for the field to consider their application.

  7. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    International Nuclear Information System (INIS)

    Bailey, D; Spaans, J; Kumaraswamy, L; Podgorsak, M

    2016-01-01

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published
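    The inflation mechanism is easy to see in a stripped-down pass-rate calculation. This is our own toy, not SNC's algorithm: a pure global-normalised %Diff test with no distance-to-agreement component, where `widen` stands in for the Measurement Uncertainty function's automatic tolerance increase.

```python
import numpy as np

def pass_rate(meas, calc, pct_diff, widen=0.0):
    """Percentage of points passing a global-normalised percent-difference
    test.  `widen` mimics an automatic loosening of the user-set %Diff
    tolerance.  Deliberately simplified: no distance-to-agreement term,
    unlike the composite test the commercial software applies."""
    tol = (pct_diff + widen) / 100.0 * calc.max()
    return 100.0 * np.mean(np.abs(meas - calc) <= tol)
```

Widening the tolerance can only move points from fail to pass, and the closer the baseline pass rate is to 100%, the less room the widening has to act, consistent with the effect shrinking for well-matched planes.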

  8. SU-F-T-301: Planar Dose Pass Rate Inflation Due to the MapCHECK Measurement Uncertainty Function

    Energy Technology Data Exchange (ETDEWEB)

    Bailey, D [Northside Hospital Cancer Institute, Atlanta, GA (United States); Spaans, J; Kumaraswamy, L; Podgorsak, M [Roswell Park Cancer Institute, Buffalo, NY (United States)

    2016-06-15

    Purpose: To quantify the effect of the Measurement Uncertainty function on planar dosimetry pass rates, as analyzed with Sun Nuclear Corporation analytic software (“MapCHECK” or “SNC Patient”). This optional function is toggled on by default upon software installation, and automatically increases the user-defined dose percent difference (%Diff) tolerance for each planar dose comparison. Methods: Dose planes from 109 IMRT fields and 40 VMAT arcs were measured with the MapCHECK 2 diode array, and compared to calculated planes from a commercial treatment planning system. Pass rates were calculated within the SNC analytic software using varying calculation parameters, including Measurement Uncertainty on and off. By varying the %Diff criterion for each dose comparison performed with Measurement Uncertainty turned off, an effective %Diff criterion was defined for each field/arc corresponding to the pass rate achieved with MapCHECK Uncertainty turned on. Results: For 3%/3mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.8–1.1% average, depending on plan type and calculation technique, for an average pass rate increase of 1.0–3.5% (maximum +8.7%). For 2%, 2 mm analysis, the Measurement Uncertainty function increases the user-defined %Diff by 0.7–1.2% average, for an average pass rate increase of 3.5–8.1% (maximum +14.2%). The largest increases in pass rate are generally seen with poorly-matched planar dose comparisons; the MapCHECK Uncertainty effect is markedly smaller as pass rates approach 100%. Conclusion: The Measurement Uncertainty function may substantially inflate planar dose comparison pass rates for typical IMRT and VMAT planes. The types of uncertainties incorporated into the function (and their associated quantitative estimates) as described in the software user’s manual may not accurately estimate realistic measurement uncertainty for the user’s measurement conditions. Pass rates listed in published

  9. Fundamental uncertainty limit of optical flow velocimetry according to Heisenberg's uncertainty principle.

    Science.gov (United States)

    Fischer, Andreas

    2016-11-01

    Optical flow velocity measurements are important for understanding the complex behavior of flows. Although a huge variety of methods exist, they are either based on a Doppler or a time-of-flight measurement principle. Doppler velocimetry evaluates the velocity-dependent frequency shift of light scattered at a moving particle, whereas time-of-flight velocimetry evaluates the traveled distance of a scattering particle per time interval. Regarding the aim of achieving a minimal measurement uncertainty, it is unclear whether one principle allows lower uncertainties to be achieved or whether both principles can achieve equal uncertainties. For this reason, the natural, fundamental uncertainty limit according to Heisenberg's uncertainty principle is derived for the Doppler and time-of-flight measurement principles, respectively. The obtained limits of the velocity uncertainty are qualitatively identical, showing, e.g., a direct proportionality to the absolute value of the velocity to the power of 3/2 and an inverse proportionality to the square root of the scattered light power. Hence, both measurement principles have identical potentials regarding the fundamental uncertainty limit due to the quantum mechanical behavior of photons. This fundamental limit can be attained (at least asymptotically) in reality either with Doppler or time-of-flight methods, because the respective Cramér-Rao bounds for dominating photon shot noise, which is modeled as white Poissonian noise, are identical with the conclusions from Heisenberg's uncertainty principle.

  10. Uncertainty in soil carbon accounting due to unrecognized soil erosion.

    Science.gov (United States)

    Sanderman, Jonathan; Chappell, Adrian

    2013-01-01

    The movement of soil organic carbon (SOC) during erosion and deposition events represents a major perturbation to the terrestrial carbon cycle. Despite the recognized impact soil redistribution can have on the carbon cycle, few major carbon accounting models currently allow for soil mass flux. Here, we modified a commonly used SOC model to include a soil redistribution term and then applied it to scenarios which explore the implications of unrecognized erosion and deposition for SOC accounting. We show that models that assume a static landscape may be calibrated incorrectly, as erosion of SOC is hidden within the decay constants. This implicit inclusion of erosion then limits the predictive capacity of these models when applied to sites with different soil redistribution histories. Decay constants were found to be 15-50% slower when an erosion rate of 15 t soil ha⁻¹ yr⁻¹ was explicitly included in the SOC model calibration. Static models cannot account for SOC change resulting from agricultural management practices focused on reducing erosion rates. Without accounting for soil redistribution, a soil sampling scheme which uses a fixed depth to support model development can create large errors in actual and relative changes in SOC stocks. When modest levels of erosion were ignored, the combined uncertainty in carbon sequestration rates was 0.3-1.0 t CO2 ha⁻¹ yr⁻¹. This range is similar to expected sequestration rates for many management options aimed at increasing SOC levels. It is evident from these analyses that explicit recognition of soil redistribution is critical to the success of a carbon monitoring or trading scheme which seeks to credit agricultural activities. © 2012 Blackwell Publishing Ltd.
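    The core identity behind the mis-calibration argument can be shown with a one-pool, first-order SOC model extended by an erosion export term (a minimal sketch with invented parameter values; the paper modifies a more detailed model): a static calibration cannot distinguish decay at rate k with erosion e from decay at rate k + e.

```python
import numpy as np

def soc_trajectory(c0, inputs, k, e=0.0, years=50):
    """One-pool SOC stock under dC/dt = I - k*C - e*C, integrated with
    annual time steps.  k is the decay constant, e the fractional SOC
    export by erosion; all values are illustrative."""
    c, out = c0, []
    for _ in range(years):
        c = c + inputs - (k + e) * c
        out.append(c)
    return np.array(out)

# An eroding site (k=0.02, e=0.01) is indistinguishable from a static
# site with a faster apparent decay constant (k=0.03): erosion is
# "hidden within the decay constant" when calibration ignores it.
```

The two trajectories below are identical term by term, which is why a model calibrated without the erosion term overestimates decomposition and then fails at sites with different erosion histories.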

  11. A Statistical Modeling Framework for Characterising Uncertainty in Large Datasets: Application to Ocean Colour

    Directory of Open Access Journals (Sweden)

    Peter E. Land

    2018-05-01

    Full Text Available Uncertainty estimation is crucial to establishing confidence in any data analysis, and this is especially true for Essential Climate Variables, including ocean colour. Methods for deriving uncertainty vary greatly across data types, so a generic statistics-based approach applicable to multiple data types is an advantage to simplify the use and understanding of uncertainty data. Progress towards rigorous uncertainty analysis of ocean colour has been slow, in part because of the complexity of ocean colour processing. Here, we present a general approach to uncertainty characterisation, using a database of satellite-in situ matchups to generate a statistical model of satellite uncertainty as a function of its contributing variables. With an example NASA MODIS-Aqua chlorophyll-a matchups database mostly covering the north Atlantic, we demonstrate a model that explains 67% of the squared error in log(chlorophyll-a) as a potentially correctable bias, with the remaining uncertainty being characterised as standard deviation and standard error at each pixel. The method is quite general, depending only on the existence of a suitable database of matchups or reference values, and can be applied to other sensors and data types such as other satellite observed Essential Climate Variables, empirical algorithms derived from in situ data, or even model data.
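    A much-simplified sketch of the matchup-based approach: regress the satellite-minus-in situ error in log space on candidate contributing variables; the fitted part is the potentially correctable bias, and the residual spread is the remaining per-pixel uncertainty. The linear form and all names are our assumptions, not the paper's statistical model.

```python
import numpy as np

def matchup_bias_model(log_sat, log_insitu, predictors):
    """Linear model of the satellite error in log space.  Returns the
    coefficients, the fraction of squared error explained (the
    potentially correctable bias) and the residual standard deviation."""
    err = log_sat - log_insitu
    X = np.column_stack([np.ones_like(err)] + list(predictors))
    coef, *_ = np.linalg.lstsq(X, err, rcond=None)
    resid = err - X @ coef
    explained = 1.0 - resid.var() / err.var()
    return coef, explained, resid.std()
```

Applied to synthetic matchups with a known predictor-driven bias, the model recovers the bias coefficient and reports the explained fraction of the squared error.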

  12. Assessment of SFR Wire Wrap Simulation Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis over three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results

  13. Uncertainty propagation through dynamic models of assemblies of mechanical structures

    International Nuclear Information System (INIS)

    Daouk, Sami

    2016-01-01

    When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Return on experience shows, however, that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies by setting up a dynamic connector model that takes into account different types and sources of uncertainty in stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R&D, which aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to an assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)

  14. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    … discretization parameters. We show that the temporal resolution should be at least 1 h to ensure errors of less than 0.2 °C in modeled MAGT, and that the uppermost ground layer should be at most 20 mm thick. Within the topographic setting, the total parametric output uncertainties, expressed as the length of the 95% uncertainty interval of the Monte Carlo simulations, range from 0.5 to 1.5 °C for clay and silt, and from 0.5 to around 2.4 °C for peat, sand, gravel and rock. These uncertainties are comparable to the variability of ground surface temperatures measured within 10 m × 10 m grids in Switzerland. The increased uncertainties for sand, peat and gravel are largely due to their sensitivity to the hydraulic conductivity.

  15. Uncertainty propagation in urban hydrology water quality modelling

    NARCIS (Netherlands)

    Torres Matallana, Arturo; Leopold, U.; Heuvelink, G.B.M.

    2016-01-01

    Uncertainty is often ignored in urban hydrology modelling. Engineering practice typically ignores uncertainties and uncertainty propagation. This can have large impacts, such as the wrong dimensioning of urban drainage systems and the inaccurate estimation of pollution in the environment caused

  16. Errors in mean and fluctuating velocity due to PIV bias and precision uncertainties

    International Nuclear Information System (INIS)

    Wilson, B.; Smith, B.L.

    2011-01-01

    Particle Image Velocimetry is a powerful fluid velocity measurement tool that has recently become important for CFD validation experiments. Knowledge of experimental uncertainty is important to CFD validation, but the uncertainty of PIV is very complex and not well understood. Previous work has shown that PIV measurements can become 'noisy' in regions of high shear as well as regions of small displacement. This paper aims to demonstrate the impact of these effects on validation data by comparing PIV data to data acquired using hot-wire anemometry, which does not suffer from the same issues. It is confirmed that shear and insufficient particle displacements can result in elevated measurements of turbulence levels. (author)

  17. Relational uncertainty in service dyads

    DEFF Research Database (Denmark)

    Kreye, Melanie

    2017-01-01

    … in service dyads and how they resolve it through suitable organisational responses to increase the level of service quality. Design/methodology/approach: We apply the overall logic of Organisational Information-Processing Theory (OIPT) and present empirical insights from two industrial case studies collected … Findings: Resolving the relational uncertainty increased the functional quality, while resolving the partner’s organisational uncertainty increased the technical quality of the delivered service. Originality: We make two contributions. First, we introduce relational uncertainty to the OM literature as the inability to predict and explain the actions of a partnering organisation due to a lack of knowledge about their abilities and intentions. Second, we present suitable organisational responses to relational uncertainty and their effect on service quality.

  18. Uncertainty and its propagation in dynamics models

    International Nuclear Information System (INIS)

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics of uncertainty that arise when we deal with dynamic models, and therefore with the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows one to define a "subdynamics" where the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated to a decision
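    The Chapman-Kolmogorov equation invoked here is the standard composition law for transition densities of a Markov process; in textbook form (not the paper's specific "subdynamics" construction) it reads:

```latex
p(x_3, t_3 \mid x_1, t_1) \;=\; \int p(x_3, t_3 \mid x_2, t_2)\, p(x_2, t_2 \mid x_1, t_1)\, \mathrm{d}x_2,
\qquad t_1 < t_2 < t_3 .
```

    Averaging such transition densities over uncertain model parameters, while preserving this composition property, is one way a formalism of this kind can fold uncertainty into the evolution equation itself.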

  19. Total uncertainty of low velocity thermal anemometers for measurement of indoor air movements

    DEFF Research Database (Denmark)

    Jørgensen, F.; Popiolek, Z.; Melikov, Arsen Krikor

    2004-01-01

    For a specific thermal anemometer with omnidirectional velocity sensor, the expanded total uncertainty in measured mean velocity Û(Vmean) and the expanded total uncertainty in measured turbulence intensity Û(Tu) due to different error sources are estimated. The values are based on a previously developed mathematical model of the anemometer in combination with a large database of representative room flows measured with a 3-D Laser Doppler anemometer (LDA). A direct comparison between measurements with a thermal anemometer and a 3-D LDA in flows of varying velocity and turbulence intensity shows good agreement not only between the two instruments but also between the thermal anemometer and its mathematical model. The differences in the measurements performed with the two instruments are all well within the measurement uncertainty of both anemometers.
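    The "expanded total uncertainty" terminology follows the usual GUM recipe: combine independent standard-uncertainty contributions in quadrature and apply a coverage factor. This is a generic sketch of that recipe, not the authors' specific error budget; any numerical values would come from the instrument model.

```python
import math

def expanded_uncertainty(std_components, k=2.0):
    """Combine independent standard-uncertainty contributions in
    quadrature and apply a coverage factor k (k = 2 for roughly 95%
    coverage), in the GUM sense underlying expanded uncertainties
    such as U(V_mean) and U(Tu)."""
    return k * math.sqrt(sum(s * s for s in std_components))
```

For example, two independent contributions of 3 and 4 (in the same units) combine to a standard uncertainty of 5 and an expanded uncertainty of 10 at k = 2.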

  20. Uncertainty in sap flow-based transpiration due to xylem properties

    Science.gov (United States)

    Looker, N. T.; Hu, J.; Martin, J. T.; Jencso, K. G.

    2014-12-01

    Transpiration, the evaporative loss of water from plants through their stomata, is a key component of the terrestrial water balance, influencing streamflow as well as regional convective systems. From a plant physiological perspective, transpiration is both a means of avoiding destructive leaf temperatures through evaporative cooling and a consequence of water loss through stomatal uptake of carbon dioxide. Despite its hydrologic and ecological significance, transpiration remains a notoriously challenging process to measure in heterogeneous landscapes. Sap flow methods, which estimate transpiration by tracking the velocity of a heat pulse emitted into the tree sap stream, have proven effective for relating transpiration dynamics to climatic variables. To scale sap flow-based transpiration from the measured domain (often area) to the whole-tree level, researchers generally assume constancy of scale factors (e.g., wood thermal diffusivity (k), radial and azimuthal distributions of sap velocity, and conducting sapwood area (As)) through time, across space, and within species. For the widely used heat-ratio sap flow method (HRM), we assessed the sensitivity of transpiration estimates to uncertainty in k (a function of wood moisture content and density) and As. A sensitivity analysis informed by distributions of wood moisture content, wood density and As sampled across a gradient of water availability indicates that uncertainty in these variables can impart substantial error when scaling sap flow measurements to the whole tree. For species with variable wood properties, the application of the HRM assuming a spatially constant k or As may systematically over- or underestimate whole-tree transpiration rates, resulting in compounded error in ecosystem-scale estimates of transpiration.
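    The sensitivity to wood thermal diffusivity k is direct in the HRM equation. The sketch below uses a standard Burgess-style form of the heat-ratio equation with invented probe values (the abstract does not give the authors' exact formulation): because the heat pulse velocity is linear in k, a relative error in the assumed diffusivity maps one-to-one into the velocity and hence the transpiration estimate.

```python
import math

def heat_pulse_velocity(k, x, dT_ratio):
    """Heat-ratio method: heat pulse velocity (cm/h) from thermal
    diffusivity k (cm^2/s), probe spacing x (cm) and the ratio of
    downstream to upstream temperature rises.  Standard HRM form;
    the parameter values used below are illustrative."""
    return 3600.0 * (k / x) * math.log(dT_ratio)

# Vh is linear in k, so a 20% error in the assumed diffusivity
# propagates one-to-one into the velocity (and the scaled-up
# whole-tree transpiration).
v_true = heat_pulse_velocity(0.0025, 0.6, 1.2)
v_biased = heat_pulse_velocity(0.0025 * 1.2, 0.6, 1.2)
```

The same linearity holds for the sapwood area As used in whole-tree scaling, which is why spatially constant k or As can systematically bias transpiration estimates.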

  1. Climate change impact assessment and adaptation under uncertainty

    NARCIS (Netherlands)

    Wardekker, J.A.

    2011-01-01

    Expected impacts of climate change are associated with large uncertainties, particularly at the local level. Adaptation scientists, practitioners, and decision-makers will need to find ways to cope with these uncertainties. Several approaches have been suggested as ‘uncertainty-proof’ to some

  2. Controls on gas transfer velocities in a large river

    Science.gov (United States)

    The emission of biogenic gases from large rivers can be an important component of regional greenhouse gas budgets. However, emission rate estimates are often poorly constrained due to uncertainties in the air-water gas exchange rate. We used the floating chamber method to estim...

  3. Uncertainties of predictions from parton distribution functions. I. The Lagrange multiplier method

    International Nuclear Information System (INIS)

    Stump, D.; Pumplin, J.; Brock, R.; Casey, D.; Huston, J.; Kalk, J.; Lai, H. L.; Tung, W. K.

    2002-01-01

    We apply the Lagrange multiplier method to study the uncertainties of physical predictions due to the uncertainties of parton distribution functions (PDF's), using the cross section σ_W for W production at a hadron collider as an archetypal example. An effective χ² function based on the CTEQ global QCD analysis is used to generate a series of PDF's, each of which represents the best fit to the global data for some specified value of σ_W. By analyzing the likelihood of these 'alternative hypotheses', using available information on errors from the individual experiments, we estimate that the fractional uncertainty of σ_W due to current experimental input to the PDF analysis is approximately ±4% at the Fermilab Tevatron, and ±8-10% at the CERN Large Hadron Collider. We give sets of PDF's corresponding to these up and down variations of σ_W. We also present similar results on Z production at the colliders. Our method can be applied to any combination of physical variables in precision QCD phenomenology, and it can be used to generate benchmarks for testing the accuracy of approximate methods based on the error matrix.
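    The mechanics of the scan can be illustrated with a toy quadratic χ²: minimising Ψ(a, λ) = χ²(a) + λ·σ(a) for a range of multipliers λ traces out the constrained χ² as a function of the prediction σ, from which an uncertainty band is read off at a chosen Δχ² tolerance. The Hessian, gradient and best-fit point below are invented for illustration; the real analysis uses the full CTEQ effective χ².

```python
import numpy as np

H = np.array([[2.0, 0.3], [0.3, 1.0]])   # Hessian of chi^2 (assumed)
g = np.array([0.5, -0.2])                # gradient of sigma w.r.t. params
a0 = np.array([1.0, 2.0])                # unconstrained best-fit parameters

def scan(lam):
    """Minimise chi2(a) + lam*sigma(a) for quadratic chi2, linear sigma."""
    a = a0 - lam * np.linalg.solve(H, g)  # stationarity: H(a-a0) + lam*g = 0
    chi2 = 0.5 * (a - a0) @ H @ (a - a0)
    sigma = g @ a
    return sigma, chi2

# Each multiplier gives one point on the chi2-versus-sigma parabola;
# lam = 0 recovers the unconstrained minimum.
points = [scan(lam) for lam in np.linspace(-3.0, 3.0, 61)]
```

Because each scanned point is a genuine constrained minimum, the resulting χ²(σ) curve is exact along the direction of interest, unlike a quadratic error-matrix approximation.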

  4. A continental-scale hydrology and water quality model for Europe: Calibration and uncertainty of a high-resolution large-scale SWAT model

    Science.gov (United States)

    Abbaspour, K. C.; Rouholahnejad, E.; Vaghefi, S.; Srinivasan, R.; Yang, H.; Kløve, B.

    2015-05-01

    A combination of driving forces is increasing pressure on local, national, and regional water supplies needed for irrigation, energy production, industrial uses, domestic purposes, and the environment. In many parts of Europe groundwater quantity, and in particular quality, have come under severe degradation and water levels have decreased, resulting in negative environmental impacts. Rapid improvements in the economy of the eastern European block of countries and uncertainties with regard to freshwater availability create challenges for water managers. At the same time, climate change adds a new level of uncertainty with regard to freshwater supplies. In this research we build and calibrate an integrated hydrological model of Europe using the Soil and Water Assessment Tool (SWAT) program. Different components of water resources are simulated and crop yield and water quality are considered at the Hydrological Response Unit (HRU) level. The water resources are quantified at subbasin level with monthly time intervals. Leaching of nitrate into groundwater is also simulated at a finer spatial level (HRU). The use of large-scale, high-resolution water resources models enables consistent and comprehensive examination of integrated system behavior through physically-based, data-driven simulation. In this article we discuss issues with data availability, calibration of large-scale distributed models, and outline procedures for model calibration and uncertainty analysis. The calibrated model and results provide information support to the European Water Framework Directive and lay the basis for further assessment of the impact of climate change on water availability and quality. The approach and methods developed are general and can be applied to any large region around the world.

  5. On the relationship between aerosol model uncertainty and radiative forcing uncertainty.

    Science.gov (United States)

    Lee, Lindsay A; Reddington, Carly L; Carslaw, Kenneth S

    2016-05-24

    The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple "equifinal" models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.

  6. arXiv Addendum to: Predictions for Higgs production at the Tevatron and the associated uncertainties

    CERN Document Server

    Baglio, Julien

    2010-01-01

    We update the theoretical predictions for the production cross sections of the Standard Model Higgs boson at the Fermilab Tevatron collider, focusing on the two main search channels, the gluon-gluon fusion mechanism $gg \\to H$ and the Higgs-strahlung processes $q \\bar q \\to VH$ with $V=W/Z$, including all relevant higher order QCD and electroweak corrections in perturbation theory. We then estimate the various uncertainties affecting these predictions: the scale uncertainties which are viewed as a measure of the unknown higher order effects, the uncertainties from the parton distribution functions and the related errors on the strong coupling constant, as well as the uncertainties due to the use of an effective theory approach in the determination of the radiative corrections in the $gg \\to H$ process at next-to-next-to-leading order. We find that while the cross sections are well under control in the Higgs--strahlung processes, the theoretical uncertainties are rather large in the case of the gluon-gluon fus...

  7. Uncertainty in oil projects

    International Nuclear Information System (INIS)

    Limperopoulos, G.J.

    1995-01-01

    This report presents an oil project valuation under uncertainty by means of two well-known financial techniques: the Capital Asset Pricing Model (CAPM) and the Black-Scholes option pricing formula. CAPM gives a linear positive relationship between expected rate of return and risk, but does not take into consideration the aspect of flexibility, which is crucial for an irreversible investment such as an oil project. Introducing investment decision flexibility by using real options can increase the oil project value substantially. Some simple tests of the importance of stock market uncertainty for oil investments are performed. Uncertainty in stock returns is correlated with aggregate product market uncertainty according to Pindyck (1991). The results of the tests are not satisfactory due to the short data series, but introducing two additional explanatory variables, the interest rate and Gross Domestic Product, improves the fit. 36 refs., 18 figs., 6 tabs
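The option-pricing approach described in the abstract above can be sketched numerically. The snippet below is a minimal, self-contained Black-Scholes European call valuation; the project value, investment cost, rate, volatility, and horizon are all illustrative numbers, not figures from the report. It shows the point made in the abstract: higher uncertainty (volatility) raises the value of flexibility, which a static CAPM-style valuation ignores.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, r, sigma, T):
    """European call value: S = current project (asset) value,
    K = investment cost, r = risk-free rate, sigma = volatility,
    T = time to expiry of the investment option (years)."""
    d1 = (log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Doubling volatility raises the option value: the flexibility premium
print(black_scholes_call(100, 100, 0.05, 0.2, 2.0))
print(black_scholes_call(100, 100, 0.05, 0.4, 2.0))
```

The gap between the two printed values is the extra worth of being able to wait, which grows with uncertainty.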

  8. Uncertainty of SWAT model at different DEM resolutions in a large mountainous watershed.

    Science.gov (United States)

    Zhang, Peipei; Liu, Ruimin; Bao, Yimeng; Wang, Jiawei; Yu, Wenwen; Shen, Zhenyao

    2014-04-15

    The objective of this study was to enhance understanding of the sensitivity of the SWAT model to the resolutions of Digital Elevation Models (DEMs) based on the analysis of multiple evaluation indicators. The Xiangxi River, a large tributary of Three Gorges Reservoir in China, was selected as the study area. A range of 17 DEM spatial resolutions, from 30 to 1000 m, was examined, and the annual and monthly model outputs based on each resolution were compared. The following results were obtained: (i) sediment yield was greatly affected by DEM resolution; (ii) the prediction of dissolved oxygen load was significantly affected by DEM resolutions coarser than 500 m; (iii) Total Nitrogen (TN) load was not greatly affected by the DEM resolution; (iv) Nitrate Nitrogen (NO₃-N) and Total Phosphorus (TP) loads were slightly affected by the DEM resolution; and (v) flow and Ammonia Nitrogen (NH₄-N) load were essentially unaffected by the DEM resolution. The flow and dissolved oxygen load decreased more significantly in the dry season than in the wet and normal seasons. Excluding flow and dissolved oxygen, the uncertainties of the other Hydrology/Non-point Source (H/NPS) pollution indicators were greater in the wet season than in the dry and normal seasons. Considering the temporal distribution uncertainties, the optimal DEM resolution for flow was 30-200 m; for sediment and TP, 30-100 m; for dissolved oxygen and NO₃-N, 30-300 m; for NH₄-N, 30-70 m; and for TN, 30-150 m. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Uncertainty quantification theory, implementation, and applications

    CERN Document Server

    Smith, Ralph C

    2014-01-01

    The field of uncertainty quantification is evolving rapidly because of increasing emphasis on models that require quantified uncertainties for large-scale applications, novel algorithm development, and new computational architectures that facilitate implementation of these algorithms. Uncertainty Quantification: Theory, Implementation, and Applications provides readers with the basic concepts, theory, and algorithms necessary to quantify input and response uncertainties for simulation models arising in a broad range of disciplines. The book begins with a detailed discussion of applications where uncertainty quantification is critical for both scientific understanding and policy. It then covers concepts from probability and statistics, parameter selection techniques, frequentist and Bayesian model calibration, propagation of uncertainties, quantification of model discrepancy, surrogate model construction, and local and global sensitivity analysis. The author maintains a complementary web page where readers ca...

  10. Model uncertainty in financial markets : Long run risk and parameter uncertainty

    NARCIS (Netherlands)

    de Roode, F.A.

    2014-01-01

    Uncertainty surrounding key parameters of financial markets, such as the inflation and equity risk premium, constitutes a major risk for institutional investors with long investment horizons. Hedging the investors' inflation exposure can be challenging due to the lack of domestic inflation-linked

  11. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    Science.gov (United States)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area through a careful model calibration. However, a large number of parameters must be fitted in the model because field measurements of them are unavailable. Therefore, it is difficult to calibrate the model for a large number of potentially uncertain model parameters. This becomes even more challenging if the model covers a large watershed with multiple land uses and varied geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters, which affect calibrated model performance. There are many different calibration and uncertainty analysis algorithms, which can be run with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effect of a suitable algorithm in improving model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied to the San Joaquin Watershed in California, covering 19704 km2, to calibrate daily streamflow. Recently, severe water stress has been escalating in this watershed due to intensified climate variability, prolonged drought, and groundwater depletion for agricultural irrigation. Therefore, it is important to perform a proper uncertainty analysis, given the uncertainties inherent in hydrologic modeling, to predict the spatial and temporal variation of the hydrologic processes and evaluate the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow.
To evaluate the sensitivity of the calibrated parameters three different optimization algorithms (Sequential Uncertainty Fitting- SUFI-2, Generalized Likelihood Uncertainty Estimation- GLUE and Parameter Solution- ParaSol) were used with four different objective functions (coefficient of determination
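One of the algorithms named above, GLUE, can be sketched in a few lines. The toy model, parameter range, and behavioral threshold below are all assumptions for illustration, not the study's setup: sample parameter sets, score each simulation with an objective function (here Nash-Sutcliffe efficiency), and keep only "behavioral" sets above a threshold, weighting them by their score.

```python
import numpy as np

rng = np.random.default_rng(0)

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 is perfect, below 0 is worse than the mean
    return 1.0 - np.sum((obs - sim)**2) / np.sum((obs - obs.mean())**2)

# Toy "observed" flow and a one-parameter toy model (purely illustrative)
t = np.arange(100)
obs = 10 + 5 * np.sin(t / 8.0)

def model(k):
    return 10 + 5 * np.sin(t / 8.0) * k  # k scales the flow response

# GLUE: sample parameters, keep behavioral sets above an NSE threshold,
# and weight the retained simulations by their likelihood score
samples = rng.uniform(0.5, 1.5, 2000)
scores = np.array([nse(obs, model(k)) for k in samples])
behavioral = samples[scores > 0.8]
weights = scores[scores > 0.8]
weights = weights / weights.sum()
print(behavioral.size, behavioral.mean())
```

The weighted behavioral ensemble is then used to derive predictive uncertainty bounds for the simulated streamflow.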

  12. Estimated Uncertainty in Segmented Gamma Scanner Assay Results due to the Variation in Drum Tare Weights

    International Nuclear Information System (INIS)

    Bosko, A.; Croft, St.; Gulbransen, E.

    2009-01-01

    General purpose gamma scanners are often used to assay unknown drums that differ from those used to create the default calibration. This introduces a potential source of bias into the matrix correction when the correction is based on estimating the mean density of the drum contents from a weigh-scale measurement. In this paper we evaluate the magnitude of the bias that may be introduced by performing assay measurements with a system whose matrix correction algorithm was calibrated with a set of standard drums but applied to a population of drums whose tare weight may be different. The matrix correction factors are perturbed in such cases because the unknown difference in tare weight is reflected as a bias in the derived matrix density. This would be the only impact if the difference in tare weight were due solely to the weight of the lid or base, say. But in reality the reason for the difference may be that the steel wall of the drum is of a different thickness. Thus, there is an opposing interplay at work which tends to compensate. The purpose of this work is to evaluate and bound the magnitude of the resulting assay uncertainty introduced by tare weight variation. We compare the results obtained using simple analytical models and 3-D ray tracing with the ISOCS software to illustrate and quantify the problem. The numerical results allow a contribution to the Total Measurement Uncertainty (TMU) to be propagated into the final assay result. (authors)
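The tare-weight bias mechanism described above can be illustrated with a toy self-attenuation model. Everything here is an assumption for demonstration: the slab-type correction factor CF = x / (1 - exp(-x)), the attenuation coefficient, drum dimensions, and weights are stand-ins, not the paper's calibrated system (which used ISOCS ray tracing).

```python
import math

def matrix_cf(rho, mu=0.1, d=57.0):
    """Toy self-attenuation correction for a matrix of density rho (g/cm^3),
    mass attenuation coefficient mu (cm^2/g), and mean path length d (cm).
    CF = x / (1 - exp(-x)) with x = mu*rho*d is a common slab approximation;
    a real system would use a calibrated curve or ISOCS modelling."""
    x = mu * rho * d
    if x < 1e-9:
        return 1.0
    return x / (1.0 - math.exp(-x))

# The derived matrix density depends on an assumed tare weight;
# a wrong tare therefore biases the density and hence the correction.
gross = 250.0                            # kg, weigh-scale reading
fill_volume = 0.2                        # m^3 of matrix (illustrative)
assumed_tare, actual_tare = 25.0, 20.0   # kg (illustrative values)

rho_assumed = (gross - assumed_tare) / fill_volume / 1000.0  # g/cm^3
rho_true = (gross - actual_tare) / fill_volume / 1000.0

bias = matrix_cf(rho_assumed) / matrix_cf(rho_true) - 1.0
print(f"assay bias from tare error: {bias:+.1%}")
```

A 5 kg tare error produces a bias of a few percent in this sketch, which is the kind of contribution the paper propagates into the TMU.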

  13. Water supply infrastructure planning under multiple uncertainties: A differentiated approach

    Science.gov (United States)

    Fletcher, S.; Strzepek, K.

    2017-12-01

    Many water planners face increased pressure on water supply systems from increasing demands from population and economic growth in combination with uncertain water supply. Supply uncertainty arises from short-term climate variability and long-term climate change as well as uncertainty in groundwater availability. Social and economic uncertainties - such as sectoral competition for water, food and energy security, urbanization, and environmental protection - compound physical uncertainty. Further, the varying risk aversion of stakeholders and water managers makes it difficult to assess the necessity of expensive infrastructure investments to reduce risk. We categorize these uncertainties on two dimensions: whether they can be updated over time by collecting additional information, and whether the uncertainties can be described probabilistically or are "deep" uncertainties whose likelihood is unknown. Based on this, we apply a decision framework that combines simulation for probabilistic uncertainty, scenario analysis for deep uncertainty, and multi-stage decision analysis for uncertainties that are reduced over time with additional information. In light of these uncertainties and the investment costs of large infrastructure, we propose the assessment of staged, modular infrastructure and information updating as a hedge against risk. We apply this framework to cases in Melbourne, Australia and Riyadh, Saudi Arabia. Melbourne is a surface water system facing uncertain population growth and variable rainfall and runoff. A severe drought from 1997 to 2009 prompted investment in a 150 MCM/y reverse osmosis desalination plant with a capital cost of 3.5 billion. Our analysis shows that flexible design, in which a smaller portion of capacity is developed initially with the option to add modular capacity in the future, can mitigate uncertainty and reduce the expected lifetime costs by up to 1 billion. In Riyadh, urban water use relies on fossil groundwater aquifers and
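The value of staged, modular capacity can be shown with a deliberately simple Monte Carlo comparison. All costs and the demand distribution below are invented for illustration (they are not the Melbourne figures), and the sketch ignores unmet-demand penalties, discounting, and operating costs: it only captures the idea that in futures where demand stays low, the second module is never built.

```python
import random

random.seed(1)

# Illustrative capital costs: one large plant built up front, versus a
# smaller module now plus an option to expand later (with a modularity premium)
COST_FULL = 3.5e9     # build 150 units of capacity immediately
COST_MODULE = 1.5e9   # build 50 units now
COST_EXPAND = 2.5e9   # add the remaining capacity later if needed

def lifetime_cost_fixed(demand):
    return COST_FULL

def lifetime_cost_flexible(demand):
    # Expand only in futures where demand exceeds the first module's capacity
    return COST_MODULE + (COST_EXPAND if demand > 50 else 0.0)

# Uncertain future demand (arbitrary units, illustrative distribution)
demands = [random.gauss(60, 30) for _ in range(100000)]
ev_fixed = sum(map(lifetime_cost_fixed, demands)) / len(demands)
ev_flex = sum(map(lifetime_cost_flexible, demands)) / len(demands)
print(ev_fixed, ev_flex)
```

Because expansion is skipped in low-demand futures, the flexible design has the lower expected capital cost, which is the qualitative result the abstract reports.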

  14. Uncertainty analysis for secondary energy distributions

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1978-01-01

    In many transport calculations the integral design parameter of interest (response) is determined mainly by secondary particles such as gamma rays from (n,γ) reactions or secondary neutrons from inelastic scattering events or (n,2n) reactions. Standard sensitivity analysis usually allows one to calculate the sensitivities to the production cross sections of such secondaries, but an extended formalism is needed to also obtain the sensitivities to the energy distribution of the generated secondary particles. For a 30-group standard cross-section set, 84% of all non-zero table positions pertain to the description of secondary energy distributions (SED's) and only 16% to the actual reaction cross sections. Therefore, any sensitivity/uncertainty analysis which does not consider the effects of SED's is incomplete and neglects most of the input data. This paper describes how sensitivity profiles for SED's are obtained and used to estimate the uncertainty of an integral response due to uncertainties in these SED's. The detailed theory is documented elsewhere and implemented in the LASL sensitivity code SENSIT. SED sensitivity profiles have proven particularly valuable in cross-section uncertainty analyses for fusion reactors. Even when the production cross sections for secondary neutrons were assumed to be without error, the uncertainties in the energy distribution of these secondaries produced appreciable uncertainties in the calculated tritium breeding rate. However, complete error files for SED's are presently nonexistent. Therefore, methods will be described that allow rough error estimates due to estimated SED uncertainties based on integral SED sensitivities

  15. Uncertainties in historical pollution data from sedimentary records from an Australian urban floodplain lake

    Science.gov (United States)

    Lintern, A.; Leahy, P.; Deletic, A.; Heijnis, H.; Zawadzki, A.; Gadd, P.; McCarthy, D.

    2018-05-01

    Sediment cores from aquatic environments can provide valuable information about historical pollution levels and sources. However, there is little understanding of the uncertainties associated with these findings. The aim of this study is to fill this knowledge gap by proposing a framework for quantifying the uncertainties in historical heavy metal pollution records reconstructed from sediment cores. This uncertainty framework consists of six sources of uncertainty: uncertainties in (1) metals analysis methods, (2) spatial variability of sediment core heavy metal profiles, (3) sub-sampling intervals, (4) the sediment chronology, (5) the assumption that metal levels in bed sediments reflect the magnitude of metal inputs into the aquatic system, and (6) post-depositional transformation of metals. We apply this uncertainty framework to an urban floodplain lake in South-East Australia (Willsmere Billabong). We find that for this site, uncertainties in historical dated heavy metal profiles can be up to 176%, largely due to uncertainties in the sediment chronology, and in the assumption that the settled heavy metal mass is equivalent to the heavy metal mass entering the aquatic system. As such, we recommend that future studies reconstructing historical pollution records using sediment cores from aquatic systems undertake an investigation of the uncertainties in the reconstructed pollution record, using the uncertainty framework provided in this study. We envisage that quantifying and understanding the uncertainties associated with the reconstructed pollution records will facilitate the practical application of sediment core heavy metal profiles in environmental management projects.
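If the six uncertainty sources in the framework above are treated as independent, their relative uncertainties combine in quadrature. The sketch below uses placeholder values, not the figures reported for Willsmere Billabong; it simply shows how a dominant source (here the chronology and the deposition-equals-input assumption) controls the combined total.

```python
import math

# Illustrative relative uncertainties (as fractions) for the six sources;
# placeholder values, not those reported in the study
sources = {
    "metals analysis": 0.05,
    "spatial variability": 0.30,
    "sub-sampling interval": 0.10,
    "sediment chronology": 0.90,
    "deposition-equals-input assumption": 1.20,
    "post-depositional transformation": 0.20,
}

# Independent relative uncertainties add in quadrature
total = math.sqrt(sum(u**2 for u in sources.values()))
print(f"combined relative uncertainty: {total:.0%}")
```

With these placeholders the two dominant terms account for nearly all of the combined uncertainty, mirroring the study's finding that chronology and the deposition assumption drive the total.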

  16. Evaluating uncertainties in regional climate simulations over South America at the seasonal scale

    Energy Technology Data Exchange (ETDEWEB)

    Solman, Silvina A. [Centro de Investigaciones del Mar y la Atmosfera CIMA/CONICET-UBA, DCAO/FCEN, UMI-IFAECI/CNRS, CIMA-Ciudad Universitaria, Buenos Aires (Argentina); Pessacg, Natalia L. [Centro Nacional Patagonico (CONICET), Puerto Madryn, Chubut (Argentina)

    2012-07-15

    This work focuses on the evaluation of different sources of uncertainty affecting regional climate simulations over South America at the seasonal scale, using the MM5 model. The simulations cover a 3-month period for the austral spring season. Several four-member ensembles were performed in order to quantify the uncertainty due to: the internal variability; the definition of the regional model domain; the choice of physical parameterizations and the selection of physical parameters within a particular cumulus scheme. The uncertainty was measured by means of the spread among individual members of each ensemble during the integration period. Results show that the internal variability, triggered by differences in the initial conditions, represents the lowest level of uncertainty for every variable analyzed. The geographic distribution of the spread among ensemble members depends on the variable: for precipitation and temperature the largest spread is found over tropical South America while for the mean sea level pressure the largest spread is located over the southeastern Atlantic Ocean, where large synoptic-scale activity occurs. Using nudging techniques to ingest the boundary conditions dramatically reduces the internal variability. The uncertainty due to the domain choice displays a similar spatial pattern compared with the internal variability, except for the mean sea level pressure field, though its magnitude is larger over the whole model domain for every variable. The largest spread among ensemble members is found for the ensemble in which different combinations of physical parameterizations are selected. The perturbed physics ensemble produces a level of uncertainty slightly larger than the internal variability. This study suggests that no matter what the source of uncertainty is, the geographical distribution of the spread among members of the ensembles is invariant, particularly for precipitation and temperature. (orig.)
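The spread metric used above, the inter-member standard deviation at each grid point, is straightforward to compute. The ensemble below is synthetic (random fields standing in for four seasonal-mean precipitation simulations); only the operation is the point.

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy ensemble: 4 members, each a 20 x 30 seasonal-mean precipitation field
members = rng.gamma(shape=2.0, scale=1.5, size=(4, 20, 30))

# Spread = standard deviation across members at each grid point
spread = members.std(axis=0, ddof=1)
print(spread.shape, float(spread.mean()))
```

Mapping `spread` over the domain gives the geographic distribution of uncertainty discussed in the abstract; comparing its magnitude across ensembles ranks the uncertainty sources.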

  17. Sensitivity and uncertainty analysis

    CERN Document Server

    Cacuci, Dan G; Navon, Ionel Michael

    2005-01-01

    As computer-assisted modeling and analysis of physical processes have continued to grow and diversify, sensitivity and uncertainty analyses have become indispensable scientific tools. Sensitivity and Uncertainty Analysis. Volume I: Theory focused on the mathematical underpinnings of two important methods for such analyses: the Adjoint Sensitivity Analysis Procedure and the Global Adjoint Sensitivity Analysis Procedure. This volume concentrates on the practical aspects of performing these analyses for large-scale systems. The applications addressed include two-phase flow problems, a radiative c

  18. Calibration and Forward Uncertainty Propagation for Large-eddy Simulations of Engineering Flows

    Energy Technology Data Exchange (ETDEWEB)

    Templeton, Jeremy Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Blaylock, Myra L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Domino, Stefan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hewson, John C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kumar, Pritvi Raj [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ling, Julia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Najm, Habib N. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ruiz, Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Safta, Cosmin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Stewart, Alessia [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wagner, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The objective of this work is to investigate the efficacy of using calibration strategies from Uncertainty Quantification (UQ) to determine model coefficients for LES. As the target methods are for engineering LES, uncertainty from numerical aspects of the model must also be quantified. The ultimate goal of this research thread is to generate a cost versus accuracy curve for LES such that the cost could be minimized given an accuracy prescribed by an engineering need. Realization of this goal would enable LES to serve as a predictive simulation tool within the engineering design process.

  19. Estimating the uncertainty in thermochemical calculations for oxygen-hydrogen combustors

    Science.gov (United States)

    Sims, Joseph David

    The thermochemistry program CEA2 was combined with the statistical thermodynamics program PAC99 in a Monte Carlo simulation to determine the uncertainty in several CEA2 output variables due to uncertainty in thermodynamic reference values for the reactant and combustion species. In all, six typical performance parameters were examined, along with the required intermediate calculations (five gas properties and eight stoichiometric coefficients), for three hydrogen-oxygen combustors: a main combustor, an oxidizer preburner and a fuel preburner. The three combustors were analyzed in two different modes: design mode, where, for the first time, the uncertainty in thermodynamic reference values---taken from the literature---was considered (inputs to CEA2 were specified and so had no uncertainty); and data reduction mode, where inputs to CEA2 did have uncertainty. The inputs to CEA2 were contrived experimental measurements that were intended to represent the typical combustor testing facility. In design mode, uncertainties in the performance parameters were on the order of 0.1% for the main combustor, on the order of 0.05% for the oxidizer preburner and on the order of 0.01% for the fuel preburner. Thermodynamic reference values for H2O were the dominant sources of uncertainty, as was the assigned enthalpy for liquid oxygen. In data reduction mode, uncertainties in performance parameters increased significantly as a result of the uncertainties in experimental measurements compared to uncertainties in thermodynamic reference values. Main combustor and fuel preburner theoretical performance values had uncertainties of about 0.5%, while the oxidizer preburner had nearly 2%. Associated experimentally-determined performance values for all three combustors were 3% to 4%. The dominant sources of uncertainty in this mode were the propellant flowrates. These results only apply to hydrogen-oxygen combustors and should not be generalized to every propellant combination. 
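The design-mode procedure above, sampling thermodynamic reference values within their uncertainties and rerunning the thermochemistry calculation, is a standard Monte Carlo propagation. The sketch below replaces the CEA2/PAC99 pipeline with a fictitious linear sensitivity function; the sensitivity coefficient and the 1-sigma uncertainty on the H2O(g) reference enthalpy are assumptions chosen only to illustrate the mechanics.

```python
import random

random.seed(0)

H_REF = -241826.0   # J/mol, reference enthalpy of formation of H2O(g)
U_REF = 40.0        # assumed 1-sigma uncertainty in the reference value

def c_star(h_ref):
    # Fictitious stand-in for a full CEA2 run: a linear response of a
    # performance parameter (e.g. characteristic velocity, m/s) to the
    # H2O(g) reference enthalpy; the slope is illustrative, not physical
    return 2300.0 + 0.05 * (h_ref + 242000.0)

# Monte Carlo: sample the reference value, recompute the performance parameter
samples = [c_star(random.gauss(H_REF, U_REF)) for _ in range(50000)]
mean = sum(samples) / len(samples)
std = (sum((s - mean)**2 for s in samples) / (len(samples) - 1))**0.5
print(f"c* = {mean:.1f} +/- {std:.2f} ({std / mean:.3%})")
```

With these assumed inputs the relative uncertainty comes out near 0.1%, the same order the study reports for the main combustor in design mode, though that agreement is by construction here.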
Species for

  20. Rational consensus under uncertainty: Expert judgment in the EC-USNRC uncertainty study

    International Nuclear Information System (INIS)

    Cooke, R.; Kraan, B.; Goossens, L.

    1999-01-01

    Governmental bodies are confronted with the problem of achieving rational consensus in the face of substantial uncertainties. The area of accident consequence management for nuclear power plants affords a good example. Decisions with regard to evacuation, decontamination, and food bans must be taken on the basis of predictions of environmental transport of radioactive material, contamination through the food chain, cancer induction, and the like. These predictions use mathematical models containing scores of uncertain parameters. Decision makers want to take, and want to be perceived to take, these decisions in a rational manner. The question is, how can this be accomplished in the face of large uncertainties? Indeed, the very presence of uncertainty poses a threat to rational consensus. Decision makers will necessarily base their actions on the judgments of experts. The experts, however, will not agree among themselves, as otherwise we would not speak of large uncertainties. Any given expert's viewpoint will be favorable to the interests of some stakeholders, and hostile to the interests of others. If a decision maker bases his/her actions on the views of one single expert, then (s)he is invariably open to charges of partiality toward the interests favored by this viewpoint. An appeal to 'impartial' or 'disinterested' experts will fail for two reasons. First, experts have interests; they have jobs, mortgages and professional reputations. Second, even if expert interests could somehow be quarantined, even then the experts would disagree. Expert disagreement is not explained by diverging interests, and consensus cannot be reached by shielding the decision process from expert interests. If rational consensus requires expert agreement, then rational consensus is simply not possible in the face of uncertainty. If rational consensus under uncertainty is to be achieved, then evidently the views of a diverse set of experts must be taken into account. 
The question is how

  1. Computational chemical product design problems under property uncertainties

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Cignitti, Stefano; Abildskov, Jens

    2017-01-01

    Three different strategies of how to combine computational chemical product design with Monte Carlo based methods for uncertainty analysis of chemical properties are outlined. One method consists of a computer-aided molecular design (CAMD) solution and a post-processing property uncertainty...... fluid design. While the higher end of the uncertainty range of the process model output is similar for the best performing fluids, the lower end of the uncertainty range differs largely....

  2. Uncertainty of a hydrological climate change impact assessment - Is it really all about climate uncertainty?

    Science.gov (United States)

    Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian

    2013-04-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, which is mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to the scenario uncertainty and the GCM model uncertainty that is obvious at resolutions finer than continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling increases uncertainty with the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus between many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. There are only few studies, which found that the predictive uncertainty of hydrological models can be in the same range or even larger than climatic uncertainty. We carried out a

  3. Dealing with exploration uncertainties

    International Nuclear Information System (INIS)

    Capen, E.

    1992-01-01

    Exploration for oil and gas should fulfill the most adventurous in their quest for excitement and surprise. This paper tries to cover that tall order. The authors will touch on the magnitude of the uncertainty (which is far greater than in most other businesses), the effects of not knowing target sizes very well, how to build uncertainty into analyses naturally, how to tie reserves and chance estimates to economics, and how to look at the portfolio effect of an exploration program. With no apologies, the authors will be using a different language for some readers - the language of uncertainty, which means probability and statistics. These tools allow one to combine largely subjective exploration information with the more analytical data from the engineering and economic side

  4. Uncertainties in fission-product decay-heat calculations

    Energy Technology Data Exchange (ETDEWEB)

    Oyamatsu, K.; Ohta, H.; Miyazono, T.; Tasaka, K. [Nagoya Univ. (Japan)

    1997-03-01

    The present precision of the aggregate decay heat calculations is studied quantitatively for 50 fissioning systems. In this evaluation, nuclear data and their uncertainty data are taken from the ENDF/B-VI nuclear data library, and those which are not available in this library are supplemented by theoretical considerations. An approximate method is proposed to simplify the evaluation of the uncertainties in the aggregate decay heat calculations, so that we can easily point out nuclei which cause large uncertainties in the calculated decay heat values. In this paper, we attempt to clarify the justification of the approximation, which was not very clear at the early stage of the study. We find that the aggregate decay heat uncertainties for minor actinides such as Am and Cm isotopes are 3-5 times as large as those for {sup 235}U and {sup 239}Pu. The recommended values by the Atomic Energy Society of Japan (AESJ) were given for 3 major fissioning systems, {sup 235}U(t), {sup 239}Pu(t) and {sup 238}U(f). The present results are consistent with the AESJ values for these systems, although the two evaluations used different nuclear data libraries and approximations. Therefore, the present results can also be considered to supplement the uncertainty values for the remaining 17 fissioning systems in JNDC2, which were not treated in the AESJ evaluation. Furthermore, we attempt to list nuclear data which cause large uncertainties in decay heat calculations for the future revision of decay and yield data libraries. (author)
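Aggregate decay heat is a summation over nuclides, f(t) = Σ_i λ_i E_i N_i(t), so an uncertainty on each nuclide's decay energy propagates directly into the sum. The sketch below uses three fictitious fission products and an assumed 5% uncertainty on each recoverable energy, purely to show the summation-plus-Monte-Carlo mechanics; it ignores decay chains and yield uncertainties.

```python
import math
import random

random.seed(0)

# Three fictitious fission products: (half-life s, recoverable energy MeV, yield)
nuclides = [
    (88.0,   2.5, 0.06),
    (600.0,  1.2, 0.04),
    (3600.0, 0.8, 0.03),
]

def decay_heat(t, energies):
    # f(t) = sum_i lambda_i * E_i * N_i(0) * exp(-lambda_i * t),
    # with N_i(0) taken proportional to the yield (arbitrary units)
    total = 0.0
    for (t_half, _, y), e in zip(nuclides, energies):
        lam = math.log(2.0) / t_half
        total += lam * e * y * math.exp(-lam * t)
    return total

# Propagate an assumed 5% (1-sigma) uncertainty on each decay energy
base = [e for (_, e, _) in nuclides]
samples = []
for _ in range(20000):
    perturbed = [random.gauss(e, 0.05 * e) for e in base]
    samples.append(decay_heat(100.0, perturbed))
mean = sum(samples) / len(samples)
std = (sum((s - mean)**2 for s in samples) / (len(samples) - 1))**0.5
print(f"decay heat at t = 100 s: {mean:.3e} (relative uncertainty {std / mean:.1%})")
```

Because the short-lived nuclide dominates the sum at t = 100 s, its energy uncertainty dominates the total, which is exactly the kind of "nuclei causing large uncertainties" the approximate method is designed to flag.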

  5. Uncertainty in hydrological signatures

    Science.gov (United States)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around ±10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from ±10% to ±15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
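The kind of signature uncertainty analysis described above can be sketched with a small Monte Carlo: perturb the flow series with a multiplicative rating-curve error and read the signature's uncertainty interval off the resulting distribution. The flow series, the ±10% error magnitude, and the runoff-ratio signature below are all invented for illustration.

```python
import random
import statistics

random.seed(1)

# Synthetic daily flows (mm/day) and rainfall total -- illustrative only.
flow = [0.5 + 2.0 * random.random() for _ in range(365)]
rain_total = 1000.0

def percentile(xs, q):
    xs = sorted(xs)
    return xs[min(int(q * len(xs)), len(xs) - 1)]

# Monte Carlo: each realization perturbs the whole rating curve by one
# multiplicative error (here 10% 1-sigma, an assumed value), then the
# runoff-ratio signature is recomputed.
ratios = []
for _ in range(2000):
    err = random.gauss(1.0, 0.10)
    ratios.append(sum(f * err for f in flow) / rain_total)

lo, hi = percentile(ratios, 0.025), percentile(ratios, 0.975)
width = (hi - lo) / statistics.mean(ratios)  # relative 95% interval width
```

With a 10% rating error the 95% interval on the runoff ratio is roughly ±20%, i.e. comfortably inside the ±10-40% range the study reports for flow signatures.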

  6. Roughness coefficient and its uncertainty in gravel-bed river

    Directory of Open Access Journals (Sweden)

    Ji-Sung Kim

    2010-06-01

    Manning's roughness coefficient was estimated for a gravel-bed river reach using field measurements of water level and discharge, and the applicability of various methods used for estimation of the roughness coefficient was evaluated. Results show that the roughness coefficient tends to decrease with increasing discharge and water depth, and over a certain range it appears to remain constant. Comparison of roughness coefficients calculated by field measurement data with those estimated by other methods shows that, although the field-measured values provide approximate roughness coefficients for relatively large discharge, there seems to be rather high uncertainty due to the difference in resultant values. For this reason, uncertainty related to the roughness coefficient was analyzed in terms of change in computed variables. On average, a 20% increase of the roughness coefficient causes a 7% increase in the water depth and an 8% decrease in velocity, but there may be about a 15% increase in the water depth and an equivalent decrease in velocity for certain cross-sections in the study reach. Finally, the validity of estimated roughness coefficient based on field measurements was examined. A 10% error in discharge measurement may lead to more than 10% uncertainty in roughness coefficient estimation, but corresponding uncertainty in computed water depth and velocity is reduced to approximately 5%. Conversely, the necessity for roughness coefficient estimation by field measurement is confirmed.
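For reference, estimating Manning's n from field measurements inverts Manning's equation, Q = (1/n) A R^(2/3) S^(1/2), so a relative error in the measured discharge maps directly (inversely) into the estimated roughness coefficient, consistent with the ~10% figure quoted above. The channel geometry below is hypothetical.

```python
import math

# Manning's equation: Q = (1/n) * A * R**(2/3) * sqrt(S)
# Solved for the roughness coefficient from field measurements:
#   n = A * R**(2/3) * sqrt(S) / Q
# The channel numbers below are hypothetical, for illustration only.
A = 120.0   # flow area, m^2
R = 2.5     # hydraulic radius, m
S = 0.001   # energy slope
Q = 350.0   # measured discharge, m^3/s

def manning_n(Q, A=A, R=R, S=S):
    return A * R ** (2.0 / 3.0) * math.sqrt(S) / Q

n = manning_n(Q)
# Because n is inversely proportional to Q, a +10% discharge measurement
# error maps into roughly -9% in the estimated n.
rel_change = (manning_n(1.10 * Q) - n) / n
```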

  7. Unrealized Global Temperature Increase: Implications of Current Uncertainties

    Science.gov (United States)

    Schwartz, Stephen E.

    2018-04-01

    Unrealized increase in global mean surface air temperature (GMST) may result from the climate system not being in steady state with forcings and/or from cessation of negative aerosol forcing that would result from decreases in emissions. An observation-constrained method is applied to infer the dependence of Earth's climate sensitivity on forcing by anthropogenic aerosols within the uncertainty on that forcing given by the Fifth (2013) Assessment Report of the Intergovernmental Panel on Climate Change. Within these uncertainty ranges the increase in GMST due to temperature lag for future forcings held constant is slight (0.09-0.19 K over 20 years; 0.12-0.26 K over 100 years). However, the incremental increase in GMST that would result from a hypothetical abrupt cessation of sources of aerosols could be quite large but is highly uncertain, 0.1-1.3 K over 20 years. Decrease in CO2 abundance and forcing following abrupt cessation of emissions would offset these increases in GMST over 100 years by as little as 0.09 K to as much as 0.8 K. The uncertainties quantified here greatly limit confidence in projections of change in GMST that would result from any strategy for future reduction of emissions.

  8. Sensitivity and uncertainty studies of the CRAC2 computer code

    International Nuclear Information System (INIS)

    Kocher, D.C.; Ward, R.C.; Killough, G.G.; Dunning, D.E. Jr.; Hicks, B.B.; Hosker, R.P. Jr.; Ku, J.Y.; Rao, K.S.

    1985-05-01

    This report presents a study of the sensitivity of early fatalities, early injuries, latent cancer fatalities, and economic costs for hypothetical nuclear reactor accidents as predicted by the CRAC2 computer code (CRAC = Calculation of Reactor Accident Consequences) to uncertainties in selected models and parameters used in the code. The sources of uncertainty that were investigated in the CRAC2 sensitivity studies include (1) the model for plume rise, (2) the model for wet deposition, (3) the procedure for meteorological bin-sampling involving the selection of weather sequences that contain rain, (4) the dose conversion factors for inhalation as they are affected by uncertainties in the physical and chemical form of the released radionuclides, (5) the weathering half-time for external ground-surface exposure, and (6) the transfer coefficients for estimating exposures via terrestrial foodchain pathways. The sensitivity studies were performed for selected radionuclide releases, hourly meteorological data, land-use data, a fixed non-uniform population distribution, a single evacuation model, and various release heights and sensible heat rates. Two important general conclusions from the sensitivity and uncertainty studies are as follows: (1) The large effects on predicted early fatalities and early injuries that were observed in some of the sensitivity studies apparently are due in part to the presence of thresholds in the dose-response models. Thus, the observed sensitivities depend in part on the magnitude of the radionuclide releases. (2) Some of the effects on predicted early fatalities and early injuries that were observed in the sensitivity studies were comparable to effects that were due only to the selection of different sets of weather sequences in bin-sampling runs. 47 figs., 50 tabs
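Conclusion (1) above, that thresholds in the dose-response models amplify sensitivity, can be seen in a minimal sketch (with invented numbers; CRAC2's actual models are far more detailed): a perturbation that does nothing to a sub-threshold dose flips the outcome entirely near the threshold.

```python
# Minimal illustration of why a threshold in a dose-response model makes
# predicted early fatalities sensitive to the magnitude of the release.
# All numbers are invented for illustration.
THRESHOLD = 1.0   # dose (arbitrary units) below which no early fatalities occur

def early_fatalities(dose, population=1000):
    return population if dose >= THRESHOLD else 0

# A 20% perturbation of a well-sub-threshold dose changes nothing...
low = early_fatalities(0.7 * 1.2) - early_fatalities(0.7)
# ...but the same 20% perturbation near the threshold flips the outcome.
near = early_fatalities(0.9 * 1.2) - early_fatalities(0.9)
```

This is why the observed sensitivities in the report depend on the magnitude of the radionuclide releases: the same parameter perturbation can have no effect or a maximal effect depending on where the doses sit relative to the threshold.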

  9. Probabilistic modelling and uncertainty analysis of flux and water balance changes in a regional aquifer system due to coal seam gas development.

    Science.gov (United States)

    Sreekanth, J; Cui, Tao; Pickett, Trevor; Rassam, David; Gilfedder, Mat; Barrett, Damian

    2018-09-01

    Large scale development of coal seam gas (CSG) is occurring in many sedimentary basins around the world, including Australia, where commercial production of CSG has started in the Surat and Bowen basins. CSG development often involves extraction of large volumes of water, which depressurises aquifers that overlie and/or underlie the coal seams, thus perturbing their flow regimes. This can potentially impact regional aquifer systems that are used for many purposes such as irrigation, and stock and domestic water. In this study, we adopt a probabilistic approach to quantify the depressurisation of the Gunnedah coal seams and how this impacts fluxes to, and from, the overlying Great Artesian Basin (GAB) Pilliga Sandstone aquifer. The proposed method is suitable when the effects of a new resource development activity on the regional groundwater balance need to be assessed while accounting for large scale uncertainties in the groundwater flow system and the proposed activity. The results indicated that the extraction of water and gas from the coal seam could potentially induce additional fluxes from the Pilliga Sandstone to the deeper formations due to lowering of pressure heads in the coal seams. The median value of the rise in the maximum flux from the Pilliga Sandstone to the deeper formations is estimated to be 85 ML/year, which is considered insignificant as it forms only about 0.29% of the Long Term Annual Average Extraction Limit of 30 GL/year for the groundwater management area. The probabilistic simulation of the water balance components indicates only small changes being induced by CSG development in the interactions of the Pilliga Sandstone with the overlying and underlying formations and with surface water courses. These analyses, which quantify the potential maximum impacts of resource development on the regional water balance, can greatly underpin future management decisions.

  10. Uncertainty and Sensitivity Analysis of Afterbody Radiative Heating Predictions for Earth Entry

    Science.gov (United States)

    West, Thomas K., IV; Johnston, Christopher O.; Hosder, Serhat

    2016-01-01

    The objective of this work was to perform sensitivity analysis and uncertainty quantification for afterbody radiative heating predictions of the Stardust capsule during Earth entry at peak afterbody radiation conditions. The radiation environment in the afterbody region poses significant challenges for accurate uncertainty quantification and sensitivity analysis due to the complexity of the flow physics, computational cost, and large number of uncertain variables. In this study, first a sparse collocation non-intrusive polynomial chaos approach along with global non-linear sensitivity analysis was used to identify the most significant uncertain variables and reduce the dimensions of the stochastic problem. Then, a total order stochastic expansion was constructed over only the important parameters for an efficient and accurate estimate of the uncertainty in radiation. Based on previous work, 388 uncertain parameters were considered in the radiation model, which came from the thermodynamics, flow field chemistry, and radiation modeling. The sensitivity analysis showed that only four of these variables contributed significantly to afterbody radiation uncertainty, accounting for almost 95% of the uncertainty. These included the electronic-impact excitation rate for N between level 2 and level 5 and the rates of three chemical reactions influencing N, N(+), O, and O(+) number densities in the flow field.
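A minimal sketch of the screening idea, assuming independent standard-normal inputs and a degree-1 Hermite expansion: with orthonormal basis functions, each non-intrusive projection coefficient is a_i = E[y·ξ_i], and the squared coefficients rank the inputs by variance contribution. The 10-input black-box model below is invented; the real study handled 388 parameters and higher-order expansion terms.

```python
import random

random.seed(0)
D, N = 10, 20000   # number of uncertain inputs, number of model samples

# Invented black-box model: only inputs 0 and 3 actually matter (linear
# coefficients 3.0 and 1.0); the other 8 inputs are inert.  Inputs are
# independent standard normals, mimicking many uncertain parameters of
# unknown importance.
def model(xi):
    return 5.0 + 3.0 * xi[0] + 1.0 * xi[3] + 0.1 * xi[0] * xi[3]

samples = [[random.gauss(0, 1) for _ in range(D)] for _ in range(N)]
ys = [model(x) for x in samples]

# Degree-1 non-intrusive projection: for orthonormal Hermite bases the
# coefficient of input i is a_i = E[y * xi_i], estimated by sample means.
coef = [sum(y * x[i] for y, x in zip(ys, samples)) / N for i in range(D)]

# Each input's share of the degree-1 (main-effect) variance; inputs with a
# non-negligible share survive the screening.
total = sum(a * a for a in coef)
share = [a * a / total for a in coef]
important = [i for i, s in enumerate(share) if s > 0.05]
```

After screening, a higher-order expansion would be rebuilt over `important` only, which is the dimension-reduction step the abstract describes.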

  11. Uncertainties in model-independent extractions of amplitudes from complete experiments

    International Nuclear Information System (INIS)

    Hoblit, S.; Sandorfi, A.M.; Kamano, H.; Lee, T.-S.H.

    2012-01-01

    A new generation of over-complete experiments is underway, with the goal of performing a high-precision extraction of pseudoscalar meson photo-production amplitudes. Such experimentally determined amplitudes can be used both as a test to validate models and as a starting point for an analytic continuation in the complex plane to search for poles. Of crucial importance for both is the level of uncertainty in the extracted multipoles. We have probed these uncertainties by analyses of pseudo-data for KΛ photoproduction, first for the set of 8 observables that have been published for the K⁺Λ channel and then for pseudo-data on a complete set of 16 observables with the uncertainties expected from analyses of ongoing CLAS experiments. In fitting multipoles, we have used a combined Monte Carlo sampling of the amplitude space, with gradient minimization, and have found a shallow χ² valley pitted with a large number of local minima. This results in bands of solutions that are experimentally indistinguishable. All ongoing experiments will measure observables with limited statistics. We have found a dependence on the particular random choice of values of Gaussian-distributed pseudo-data, due to the presence of multiple local minima. This results in actual uncertainties for reconstructed multipoles that are often considerably larger than those returned by gradient minimization routines such as Minuit, which find a single local minimum. As intuitively expected, this additional level of uncertainty decreases as larger numbers of observables are included.
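The local-minima effect described above can be caricatured in a few lines: if the observable depends on the fit parameter only through p², the χ² surface has two experimentally indistinguishable minima, gradient descent lands in one basin per starting point, and the spread of solutions across random starts and pseudo-data realizations dwarfs any single minimum's local error. Everything below is an invented toy, not the KΛ amplitude fit.

```python
import random

random.seed(2)

# Toy amplitude-ambiguity fit: the observable depends on the fit parameter
# p only through p**2, so chi2(p) = (p**2 - d)**2 has two minima at
# p = +/-sqrt(d) -- a caricature of the multi-minimum landscape met when
# fitting multipoles to pseudo-data.
def fit_from_start(d, p0, lr=0.01, steps=2000):
    p = p0
    for _ in range(steps):
        grad = 4.0 * p * (p * p - d)   # d/dp of (p**2 - d)**2
        p -= lr * grad                 # plain gradient descent
    return p

minima = []
for _ in range(200):
    d = 1.0 + random.gauss(0.0, 0.05)   # pseudo-data around true value 1
    p0 = random.uniform(-2.0, 2.0)      # Monte Carlo choice of start point
    minima.append(fit_from_start(d, p0))

# The fits land in one of two basins depending on the starting point, so
# the full solution spread is far wider than either basin's local width.
pos = [p for p in minima if p > 0]
neg = [p for p in minima if p < 0]
spread = max(minima) - min(minima)
```

A single gradient-minimization run would report only the narrow width of one basin (here ~0.05), while the Monte Carlo over starts exposes the full, much larger ambiguity band, which is the paper's point about Minuit-style errors.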

  12. Report on the uncertainty methods study

    International Nuclear Information System (INIS)

    1998-06-01

    The Uncertainty Methods Study (UMS) Group, following a mandate from CSNI, has compared five methods for calculating the uncertainty in the predictions of advanced 'best estimate' thermal-hydraulic codes: the Pisa method (based on extrapolation from integral experiments) and four methods identifying and combining input uncertainties. Three of these, the GRS, IPSN and ENUSA methods, use subjective probability distributions, and one, the AEAT method, performs a bounding analysis. Each method has been used to calculate the uncertainty in specified parameters for the LSTF SB-CL-18 5% cold leg small break LOCA experiment in the ROSA-IV Large Scale Test Facility (LSTF). The uncertainty analysis was conducted essentially blind, and the participants did not use experimental measurements from the test as input apart from initial and boundary conditions. Participants calculated uncertainty ranges for experimental parameters including pressurizer pressure, primary circuit inventory and clad temperature (at a specified position) as functions of time.
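As context for the sample sizes such input-uncertainty methods use (not stated in the abstract above), GRS-type analyses are commonly sized with Wilks' order-statistic formula, which gives the number of code runs needed so that the extreme computed value is a tolerance bound. The sketch below reproduces the classic one-sided 95%/95% result.

```python
# Wilks' formula, one-sided first-order case: the largest of n independent
# code runs bounds the true p-quantile of the output with confidence
# 1 - p**n.  The smallest n with 1 - p**n >= c is the required run count.
def wilks_one_sided(coverage=0.95, confidence=0.95):
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

n_runs = wilks_one_sided()   # classic 95%/95% result: 59 runs
```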

  13. Geological-structural models used in SR 97. Uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saksa, P.; Nummela, J. [FINTACT Oy (Finland)

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained and an interdisciplinary approach in interpretation have taken place. Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. In site scale six additional structures were proposed both to Beberg and Ceberg to variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km{sup 3}. Aberg has highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. 
This means that

  14. Geological-structural models used in SR 97. Uncertainty analysis

    International Nuclear Information System (INIS)

    Saksa, P.; Nummela, J.

    1998-10-01

    The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained and an interdisciplinary approach in interpretation have taken place. Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. In site scale six additional structures were proposed both to Beberg and Ceberg to variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km 3 . Aberg has highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. 
This means that the

  15. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given uncertainties in dating, calibration, and modeling.

  16. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given uncertainties in dating, calibration, and modeling.

  17. OpenTURNS, an open source uncertainty engineering software

    International Nuclear Information System (INIS)

    Popelin, A.L.; Dufoy, A.

    2013-01-01

    The need to assess robust performance for complex systems has led to the emergence of a new industrial simulation challenge: taking uncertainties into account when dealing with complex numerical simulation frameworks. EDF has taken part in the development of an open source software platform dedicated to uncertainty propagation by probabilistic methods, named OpenTURNS, for Open source Treatment of Uncertainty, Risk and Statistics. OpenTURNS includes a large variety of qualified algorithms for managing uncertainties in industrial studies, from the uncertainty quantification step (with the possibility of modeling stochastic dependence thanks to copula theory and stochastic processes), to the uncertainty propagation step (with innovative simulation algorithms such as the ziggurat method for normal variables) and the sensitivity analysis step (with sensitivity indices based on the evaluation of means conditioned on the realization of a particular event). It also makes it possible to build response surfaces that can include the stochastic modeling (with the polynomial chaos method, for example). Generic wrappers to link OpenTURNS to modeling software are provided. Finally, OpenTURNS is extensively documented, with guidelines for use and contribution.

  18. Uncertainty of climate change impacts and consequences on the prediction of future hydrological trends

    International Nuclear Information System (INIS)

    Minville, M.; Brissette, F.; Leconte, R.

    2008-01-01

    In the future, water is very likely to be the resource that will be most severely affected by climate change. It has been shown that small perturbations in precipitation frequency and/or quantity can result in significant impacts on the mean annual discharge. Moreover, modest changes in natural inflows result in larger changes in reservoir storage. There is, however, great uncertainty linked to changes in both the magnitude and direction of future hydrological trends. This presentation discusses the various sources of this uncertainty and their potential impact on the prediction of future hydrological trends. A companion paper will look at adaptation potential, taking into account some of the sources of uncertainty discussed in this presentation. Uncertainty is separated into two main components: climatic uncertainty and 'model and methods' uncertainty. Climatic uncertainty is linked to uncertainty in future greenhouse gas emission scenarios (GHGES) and to general circulation models (GCMs), whose representation of topography and climate processes is imperfect, in large part due to computational limitations. The uncertainty linked to natural variability (which may or may not increase) is also part of the climatic uncertainty. 'Model and methods' uncertainty groups together the uncertainty linked to the different approaches and models needed to transform climate data for use by hydrological models (such as downscaling methods) and the uncertainty of the models themselves and of their use in a changed climate. The impacts of the various sources of uncertainty on the hydrology of a watershed are demonstrated on the Peribonka River basin (Quebec, Canada). The results indicate that all sources of uncertainty can be important and outline the importance of taking these sources into account for any impact and adaptation studies. Recommendations are outlined for such studies. (author)

  19. Hydroclimatic risks and uncertainty in the global power sector

    Science.gov (United States)

    Gidden, Matthew; Byers, Edward; Greve, Peter; Kahil, Taher; Parkinson, Simon; Raptis, Catherine; Rogelj, Joeri; Satoh, Yusuke; van Vliet, Michelle; Wada, Yoshide; Krey, Volker; Langan, Simon; Riahi, Keywan

    2017-04-01

    Approximately 80% of the world's electricity supply depends on reliable water resources. Thermoelectric and hydropower plants have been impacted by low flows and floods in recent years, notably in the US, Brazil, France, and China, amongst other countries. This dependence on reliable flows creates a large vulnerability in the electricity supply system due to hydrological variability and the impacts of climate change. Using an updated dataset of global electricity capacity with global climate and hydrological data from the ISI-MIP project, we present an overview analysis of power sector vulnerability to hydroclimatic risks, including low river flows and peak flows. We show how electricity generation in individual countries and transboundary river basins can be impacted, helping decision-makers identify key at-risk geographical regions. Furthermore, our use of a multi-model ensemble of climate and hydrological models allows us to quantify the uncertainty of projected impacts, such that basin-level risks and uncertainty can be compared.

  20. Uncertainty analysis in safety assessment

    International Nuclear Information System (INIS)

    Lemos, Francisco Luiz de; Sullivan, Terry

    1997-01-01

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, and geochemistry. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections filled with uncertainty. This paper addresses the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)

  1. Large-Scale Uncertainty and Error Analysis for Time-dependent Fluid/Structure Interactions in Wind Turbine Applications

    Energy Technology Data Exchange (ETDEWEB)

    Alonso, Juan J. [Stanford University; Iaccarino, Gianluca [Stanford University

    2013-08-25

    The following is the final report covering the entire period of this aforementioned grant, June 1, 2011 - May 31, 2013 for the portion of the effort corresponding to Stanford University (SU). SU has partnered with Sandia National Laboratories (PI: Mike S. Eldred) and Purdue University (PI: Dongbin Xiu) to complete this research project and this final report includes those contributions made by the members of the team at Stanford. Dr. Eldred is continuing his contributions to this project under a no-cost extension and his contributions to the overall effort will be detailed at a later time (once his effort has concluded) on a separate project submitted by Sandia National Laboratories. At Stanford, the team is made up of Profs. Alonso, Iaccarino, and Duraisamy, post-doctoral researcher Vinod Lakshminarayan, and graduate student Santiago Padron. At Sandia National Laboratories, the team includes Michael Eldred, Matt Barone, John Jakeman, and Stefan Domino, and at Purdue University, we have Prof. Dongbin Xiu as our main collaborator. The overall objective of this project was to develop a novel, comprehensive methodology for uncertainty quantification by combining stochastic expansions (nonintrusive polynomial chaos and stochastic collocation), the adjoint approach, and fusion with experimental data to account for aleatory and epistemic uncertainties from random variable, random field, and model form sources. The expected outcomes of this activity were detailed in the proposal and are repeated here to set the stage for the results that we have generated during the time period of execution of this project: 1. The rigorous determination of an error budget comprising numerical errors in physical space and statistical errors in stochastic space and its use for optimal allocation of resources; 2. A considerable increase in efficiency when performing uncertainty quantification with a large number of uncertain variables in complex non-linear multi-physics problems; 3. A

  2. Quantifying aggregated uncertainty in Plasmodium falciparum malaria prevalence and populations at risk via efficient space-time geostatistical joint simulation.

    Science.gov (United States)

    Gething, Peter W; Patil, Anand P; Hay, Simon I

    2010-04-01

    Risk maps estimating the spatial distribution of infectious diseases are required to guide public health policy from local to global scales. The advent of model-based geostatistics (MBG) has allowed these maps to be generated in a formal statistical framework, providing robust metrics of map uncertainty that enhance their utility for decision-makers. In many settings, decision-makers require spatially aggregated measures over large regions, such as the mean prevalence within a country or administrative region, or national populations living under different levels of risk. Existing MBG mapping approaches provide suitable metrics of local uncertainty--the fidelity of predictions at each mapped pixel--but have not been adapted for measuring uncertainty over large areas, due largely to a series of fundamental computational constraints. Here the authors present a new efficient approximating algorithm that can generate, for the first time, the necessary joint simulation of prevalence values across the very large prediction spaces needed for global-scale mapping. This new approach is implemented in conjunction with an established model for P. falciparum, allowing robust estimates of mean prevalence at any specified level of spatial aggregation. The model is used to provide estimates of national populations at risk under three policy-relevant prevalence thresholds, along with accompanying model-based measures of uncertainty. By overcoming previously unchallenged computational barriers, this study illustrates how MBG approaches, already at the forefront of infectious disease mapping, can be extended to provide large-scale aggregate measures appropriate for decision-makers.
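The computational point above, that local (per-pixel) uncertainty cannot simply be aggregated, can be illustrated with a common-factor toy model: the variance of a regional mean depends on the correlation between pixels, which per-pixel intervals do not carry, so a joint simulation is needed. The prevalence level, correlation, and factor structure below are invented; real MBG models use a full spatial covariance.

```python
import math
import random
import statistics

random.seed(3)
P, N = 100, 4000          # pixels in the region, joint realizations
sigma, rho = 0.1, 0.5     # per-pixel std-dev and inter-pixel correlation

# Joint simulation with a shared spatial factor: each realization draws one
# region-wide term plus independent pixel terms, giving correlation rho
# between any two pixels.
means = []
for _ in range(N):
    common = random.gauss(0.0, sigma * math.sqrt(rho))
    pix = [0.3 + common + random.gauss(0.0, sigma * math.sqrt(1.0 - rho))
           for _ in range(P)]
    means.append(sum(pix) / P)

sd_joint = statistics.stdev(means)
# Treating pixels as independent would predict sigma/sqrt(P) for the
# regional mean -- far too small when the pixels are correlated.
sd_independent = sigma / math.sqrt(P)
```

Here the joint-simulation uncertainty of the regional mean is about seven times the naive independent-pixel figure, which is why aggregate measures require the joint simulation the paper develops.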

  3. Sensitivity of Asteroid Impact Risk to Uncertainty in Asteroid Properties and Entry Parameters

    Science.gov (United States)

    Wheeler, Lorien; Mathias, Donovan; Dotson, Jessie L.; NASA Asteroid Threat Assessment Project

    2017-10-01

    A central challenge in assessing the threat posed by asteroids striking Earth is the large amount of uncertainty inherent throughout all aspects of the problem. Many asteroid properties are not well characterized and can range widely, from strong, dense, monolithic irons to loosely bound, highly porous rubble piles. Even for an object of known properties, the specific entry velocity, angle, and impact location can swing the potential consequence from no damage to millions of casualties. Due to the extreme rarity of large asteroid strikes, there are also large uncertainties in how different types of asteroids will interact with the atmosphere during entry, how readily they may break up or ablate, and how much surface damage will be caused by the resulting airbursts or impacts. In this work, we use our Probabilistic Asteroid Impact Risk (PAIR) model to investigate the sensitivity of asteroid impact damage to uncertainties in key asteroid properties, entry parameters, or modeling assumptions. The PAIR model combines physics-based analytic models of asteroid entry and damage in a probabilistic Monte Carlo framework to assess the risk posed by a wide range of potential impacts. The model samples from uncertainty distributions of asteroid properties and entry parameters to generate millions of specific impact cases, and models the atmospheric entry and damage for each case, including blast overpressure, thermal radiation, tsunami inundation, and global effects. To assess the risk sensitivity, we alternately fix and vary the different input parameters and compare the effect on the resulting range of damage produced. The goal of these studies is to help guide future efforts in asteroid characterization and model refinement by determining which properties most significantly affect the potential risk.
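The fix-and-vary study described above can be sketched with an invented damage proxy: draw all inputs from their uncertainty ranges, then alternately freeze each one at its nominal value and compare how much the output spread shrinks. The kinetic-energy-like proxy and all ranges below are hypothetical, not the PAIR model.

```python
import random
import statistics

random.seed(4)

# Invented damage proxy driven by uncertain diameter, density, velocity and
# entry angle (kinetic-energy-like; the real PAIR damage models are far
# more detailed).
def damage(diam, dens, vel, angle):
    return diam ** 3 * dens * vel ** 2 * angle

NOMINAL = {"diam": 100.0, "dens": 2500.0, "vel": 18.0, "angle": 0.65}

def draw(fixed=None):
    x = {
        "diam": random.uniform(50, 150),     # m
        "dens": random.uniform(1500, 3500),  # kg/m^3
        "vel": random.uniform(12, 25),       # km/s
        "angle": random.uniform(0.3, 1.0),   # sin(entry angle)
    }
    if fixed:
        x[fixed] = NOMINAL[fixed]   # freeze one input at its nominal value
    return damage(**x)

full = statistics.stdev(draw() for _ in range(5000))
# Fractional reduction in output spread when each input is fixed in turn:
# the bigger the reduction, the more that input drives the risk.
reduction = {p: 1.0 - statistics.stdev(draw(p) for _ in range(5000)) / full
             for p in ("diam", "dens", "vel", "angle")}
most_influential = max(reduction, key=reduction.get)
```

In this toy the cubed diameter dominates, which matches the intuition that size uncertainty is usually the first target for better asteroid characterization.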

  4. Correlated uncertainties in integral data

    International Nuclear Information System (INIS)

    McCracken, A.K.

    1978-01-01

    The use of correlated uncertainties in calculational data is shown, in the cases investigated, to lead to a reduction in the uncertainty of calculated quantities of importance to reactor design. It is stressed, however, that such reductions are likely to be important in a minority of cases of practical interest. The effect of uncertainties in detector cross-sections is considered and is seen to be, in some cases, of equal importance to that in the data used in calculations. Numerical investigations have been limited by the sparse information available on data correlations; some comparisons of these data reveal quite large inconsistencies for both detector cross-sections and cross-sections of interest for reactor calculations.
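The headline effect, correlations reducing the uncertainty of a calculated quantity, follows from the sandwich rule var(R) = sᵀ V s: when two sensitivities have opposite signs, positively correlated input errors partially cancel. The sensitivities and 5% uncertainties below are illustrative, not evaluated nuclear data.

```python
import math

# Sandwich rule for the variance of a calculated response R with
# sensitivity vector s to two input cross sections: var(R) = s^T V s,
# where V is the input covariance matrix.  Numbers are illustrative only.
s = [0.8, -0.6]     # relative sensitivity coefficients (opposite signs)
sig = [0.05, 0.05]  # 5% standard deviation on each input

def response_sd(corr):
    V = [[sig[0] ** 2,            corr * sig[0] * sig[1]],
         [corr * sig[0] * sig[1], sig[1] ** 2]]
    var = sum(s[i] * V[i][j] * s[j] for i in range(2) for j in range(2))
    return math.sqrt(var)

sd_uncorr = response_sd(0.0)   # correlations ignored
sd_corr = response_sd(0.9)     # strongly correlated input uncertainties
```

With opposite-signed sensitivities the correlated case is markedly smaller than the uncorrelated one; with same-signed sensitivities the inequality reverses, which is why the paper stresses that reductions occur only in a minority of practical cases.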

  5. Analyzing the uncertainty of ensemble-based gridded observations in land surface simulations and drought assessment

    Science.gov (United States)

    Ahmadalipour, Ali; Moradkhani, Hamid

    2017-12-01

    Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic modeling oriented analysis of the gridded observation data uncertainties over the Pacific Northwest (PNW) and its implications on drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing data to simulate hydrologic fluxes at 1/8° spatial resolution using Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent is investigated for each case. Results reveal large uncertainty of observed datasets at monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. The uncertainty eventuates in large disparities of drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations of the northern Rockies than the western Cascades. The agricultural areas of southern Idaho demonstrate decreasing trend of natural soil moisture as a result of precipitation decline, which implies higher appeal for anthropogenic water storage and irrigation systems.

  6. Uncertainty analysis of nonlinear systems employing the first-order reliability method

    International Nuclear Information System (INIS)

    Choi, Chan Kyu; Yoo, Hong Hee

    2012-01-01

    In most mechanical systems, the properties of the system elements are uncertain for several reasons. For example, the mass, the stiffness coefficient of a spring, the damping coefficient of a damper, or the friction coefficients have uncertain characteristics. The uncertain characteristics of the elements have a direct effect on the uncertainty of the system performance. It is very important to estimate the performance uncertainty, since it is directly related to manufacturing yield and consumer satisfaction; it should therefore be estimated accurately and considered in the system design. In this paper, performance measures are defined for nonlinear vibration systems and the performance measure uncertainties are estimated employing the first-order reliability method (FORM). It was found that the FORM provides good results in spite of the nonlinear characteristics of the systems. By comparison with results obtained by Monte Carlo simulation (MCS), the accuracy of the uncertainty analysis results obtained by the FORM is validated.
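
The FORM-versus-Monte-Carlo comparison described above can be illustrated with a minimal sketch. This is not the paper's vibration model: the limit state, means and standard deviations below are hypothetical. The Hasofer-Lind/Rackwitz-Fiessler iteration locates the most probable failure point in standard-normal space, and Φ(−β) approximates the failure probability that Monte Carlo estimates by brute-force sampling:

```python
import numpy as np
from math import erf, sqrt

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

# Hypothetical nonlinear limit state: failure when x1 * x2 <= 5,
# with x1, x2 independent N(3, 0.5^2).
mu, sd = np.array([3.0, 3.0]), np.array([0.5, 0.5])

def g_u(u):  # limit state mapped to standard-normal space
    x = mu + sd * u
    return x[0] * x[1] - 5.0

def grad(u, h=1e-6):  # central-difference gradient
    return np.array([(g_u(u + h * e) - g_u(u - h * e)) / (2 * h)
                     for e in np.eye(2)])

# HL-RF iteration to find the most probable failure point
u = np.zeros(2)
for _ in range(100):
    gv, gr = g_u(u), grad(u)
    u_new = (gr @ u - gv) / (gr @ gr) * gr
    if np.linalg.norm(u_new - u) < 1e-10:
        u = u_new
        break
    u = u_new

beta = np.linalg.norm(u)     # reliability index
pf_form = norm_cdf(-beta)    # FORM failure probability

# Crude Monte Carlo check of the same limit state
rng = np.random.default_rng(0)
x = rng.normal(mu, sd, size=(200000, 2))
pf_mc = np.mean(x[:, 0] * x[:, 1] <= 5.0)
print(beta, pf_form, pf_mc)
```

For this mildly curved limit state the FORM estimate (about 0.015) and the Monte Carlo estimate (about 0.019) agree to within typical first-order approximation error, at a tiny fraction of the sampling cost.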

  7. SENSIT: a cross-section and design sensitivity and uncertainty analysis code

    International Nuclear Information System (INIS)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
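
The variance propagation that SENSIT performs can be summarized by the "sandwich rule": the relative variance of an integral response is s·C·s, where s is the sensitivity profile and C the relative covariance matrix of the input data. A minimal sketch with a hypothetical 3-group profile and covariance (all numbers invented for illustration):

```python
import numpy as np

# Hypothetical 3-group sensitivity profile of an integral response R
# (percent change in R per percent change in the group cross section)
s = np.array([0.10, 0.35, 0.55])

# Hypothetical relative covariance matrix of the group cross sections:
# standard deviations of 5%, 8%, 10%, with partial correlation between
# neighbouring groups
sd = np.array([0.05, 0.08, 0.10])
corr = np.array([[1.0, 0.5, 0.0],
                 [0.5, 1.0, 0.5],
                 [0.0, 0.5, 1.0]])
C = np.outer(sd, sd) * corr

# "Sandwich rule": relative variance of R due to cross-section uncertainty
var_R = s @ C @ s
print("relative std. dev. of response: %.4f" % np.sqrt(var_R))
```

Dropping the off-diagonal correlation terms and rerunning shows directly how correlated uncertainties change the propagated variance relative to the uncorrelated case.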

  8. Dose uncertainties for large solar particle events: Input spectra variability and human geometry approximations

    International Nuclear Information System (INIS)

    Townsend, Lawrence W.; Zapp, E. Neal

    1999-01-01

    The true uncertainties in estimates of body organ absorbed dose and dose equivalent, from exposures of interplanetary astronauts to large solar particle events (SPEs), are essentially unknown. Variations in models used to parameterize SPE proton spectra for input into space radiation transport and shielding computer codes can result in uncertainty about the reliability of dose predictions for these events. Also, different radiation transport codes and their input databases can yield significant differences in dose predictions, even for the same input spectra. Different results may also be obtained for the same input spectra and transport codes if different spacecraft and body self-shielding distributions are assumed. Heretofore there have been no systematic investigations of the variations in dose and dose equivalent resulting from these assumptions and models. In this work we present a study of the variability in predictions of organ dose and dose equivalent arising from the use of different parameters to represent the same incident SPE proton data and from the use of equivalent sphere approximations to represent human body geometry. The study uses the BRYNTRN space radiation transport code to calculate dose and dose equivalent for the skin, ocular lens and bone marrow using the October 1989 SPE as a model event. Comparisons of organ dose and dose equivalent, obtained with a realistic human geometry model and with the oft-used equivalent sphere approximation, are also made. It is demonstrated that variations of 30-40% in organ dose and dose equivalent are obtained for slight variations in spectral fitting parameters obtained when various data points are included or excluded from the fitting procedure. It is further demonstrated that extrapolating spectra from low energy (≤30 MeV) proton fluence measurements, rather than using fluence data extending out to 100 MeV, results in dose and dose equivalent predictions that are underestimated by factors as large as 2.

  9. Uncertainty analysis of multiple canister repository model by large-scale calculation

    International Nuclear Information System (INIS)

    Tsujimoto, K.; Okuda, H.; Ahn, J.

    2007-01-01

    A prototype uncertainty analysis has been made by using the multiple-canister radionuclide transport code, VR, for performance assessment for the high-level radioactive waste repository. Fractures in the host rock determine the main conduit of groundwater, and thus significantly affect the magnitude of radionuclide release rates from the repository. In this study, the probability distribution function (PDF) for the number of connected canisters in the same fracture cluster that bears water flow has been determined in a Monte-Carlo fashion by running the FFDF code with assumed PDFs for fracture geometry. The uncertainty for the release rate of 237Np from a hypothetical repository containing 100 canisters has been quantitatively evaluated by using the VR code with PDFs for the number of connected canisters and the near field rock porosity. The calculation results show that the mass transport is greatly affected by (1) the magnitude of the radionuclide source determined by the number of canisters connected by the fracture cluster, and (2) the canister concentration effect in the same fracture network. The results also show two conflicting tendencies: the more fractures there are in the repository model space, the greater the average value, but the smaller the uncertainty, of the peak fractional release rate. To perform a vast amount of calculation, we have utilized the Earth Simulator and SR8000. The multi-level hybrid programming method is applied in the optimization to exploit high performance of the Earth Simulator. The Latin Hypercube Sampling has been utilized to reduce the number of samplings in Monte-Carlo calculation. (authors)

  10. Uncertainty estimates for theoretical atomic and molecular data

    International Nuclear Information System (INIS)

    Chung, H-K; Braams, B J; Bartschat, K; Császár, A G; Drake, G W F; Kirchner, T; Kokoouline, V; Tennyson, J

    2016-01-01

    Sources of uncertainty are reviewed for calculated atomic and molecular data that are important for plasma modeling: atomic and molecular structures and cross sections for electron-atom, electron-molecule, and heavy particle collisions. We concentrate on model uncertainties due to approximations to the fundamental many-body quantum mechanical equations and we aim to provide guidelines to estimate uncertainties as a routine part of computations of data for structure and scattering. (topical review)

  11. Uncertainty estimation with a small number of measurements, part II: a redefinition of uncertainty and an estimator method

    Science.gov (United States)

    Huang, Hening

    2018-01-01

    This paper is the second (Part II) in a series of two papers (Part I and Part II). Part I has quantitatively discussed the fundamental limitations of the t-interval method for uncertainty estimation with a small number of measurements. This paper (Part II) reveals that the t-interval is an ‘exact’ answer to a wrong question; it is actually misused in uncertainty estimation. This paper proposes a redefinition of uncertainty, based on the classical theory of errors and the theory of point estimation, and a modification of the conventional approach to estimating measurement uncertainty. It also presents an asymptotic procedure for estimating the z-interval. The proposed modification is to replace the t-based uncertainty with an uncertainty estimator (mean- or median-unbiased). The uncertainty estimator method is an approximate answer to the right question to uncertainty estimation. The modified approach provides realistic estimates of uncertainty, regardless of whether the population standard deviation is known or unknown, or if the sample size is small or large. As an application example of the modified approach, this paper presents a resolution to the Du-Yang paradox (i.e. Paradox 2), one of the three paradoxes caused by the misuse of the t-interval in uncertainty estimation.
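
The size of the effect discussed above is easy to see numerically. This is a generic illustration, not the paper's proposed estimator: for n = 3 hypothetical measurements, the 95% t-based half-width exceeds the z-based one by more than a factor of two, because t(0.975, df=2) ≈ 4.30 while z(0.975) ≈ 1.96:

```python
import math
from statistics import stdev, NormalDist

# Three repeated measurements (hypothetical values)
x = [9.8, 10.3, 10.1]
n = len(x)
s = stdev(x)                   # sample standard deviation
sem = s / math.sqrt(n)         # standard error of the mean

# 95% half-widths: t-based (2 degrees of freedom) vs z-based
t_975_df2 = 4.302652729911275  # t quantile for 97.5%, df = 2 (standard tables)
z_975 = NormalDist().inv_cdf(0.975)
print("t-based half-width:", t_975_df2 * sem)
print("z-based half-width:", z_975 * sem)
```

The ratio of the two half-widths (≈2.2 here) is driven entirely by the heavy tails of the t distribution at small degrees of freedom, which is the regime the paper's redefinition targets.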

  12. Local conditions and uncertainty bands for Semiscale Test S-02-9

    International Nuclear Information System (INIS)

    Varacalle, D.J. Jr.

    1979-01-01

    Analysis was performed to derive local-conditions heat transfer parameters and their uncertainties using computer codes and experimentally derived boundary conditions for the Semiscale core for LOCA Test S-02-9. The calculations consisted of nominal code cases using best-estimate input parameters and cases where the specified input parameters were perturbed in accordance with the response surface method of uncertainty analysis. The output parameters of interest were those used in film boiling heat transfer correlations, including enthalpy, pressure, quality, and coolant flow rate. Large uncertainty deviations occurred during low core mass flow periods, where the relative flow uncertainties were large. Utilizing the derived local conditions and their associated uncertainties, a study was then made which showed that the uncertainty in the film boiling heat transfer coefficient varied between 5 and 250%.
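
The response surface method referred to above replaces the expensive code with a cheap fitted surrogate: the code is run only at a small design of perturbed inputs, a low-order polynomial is fitted to those runs, and the input uncertainty is then propagated through the polynomial instead of the code. A minimal sketch, with an invented stand-in for the code and hypothetical input distributions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Stand-in for an expensive code run: film-boiling heat transfer coefficient
# as a hypothetical function of coolant quality and mass flux
def code_run(quality, flux):
    return 50.0 + 120.0 * flux - 80.0 * quality + 30.0 * quality * flux

# 1) Run the "code" only at a small factorial design of perturbed inputs
q_pts, g_pts = np.meshgrid([0.2, 0.5, 0.8], [0.5, 1.0, 1.5])
X = np.column_stack([q_pts.ravel(), g_pts.ravel()])
y = code_run(X[:, 0], X[:, 1])

# 2) Fit a response surface y ~ 1 + q + g + q*g by least squares
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] * X[:, 1]])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# 3) Propagate input uncertainty through the cheap surface, not the code
q = rng.normal(0.5, 0.1, 100000)
g = rng.normal(1.0, 0.15, 100000)
h = coef @ np.vstack([np.ones_like(q), q, g, q * g])
print("h mean %.1f, std %.1f" % (h.mean(), h.std()))
```

Only nine "code" runs are needed; the 100 000 uncertainty samples are evaluated on the fitted polynomial.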

  13. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  14. Sources of uncertainties in modelling black carbon at the global scale

    NARCIS (Netherlands)

    Vignati, E.; Karl, M.; Krol, M.C.; Wilson, J.; Stier, P.; Cavalli, F.

    2010-01-01

    Our understanding of the global black carbon (BC) cycle is essentially qualitative due to uncertainties in our knowledge of its properties. This work investigates two source of uncertainties in modelling black carbon: those due to the use of different schemes for BC ageing and its removal rate in

  15. Adaptive policymaking under deep uncertainty : Optimal preparedness for the next pandemic

    NARCIS (Netherlands)

    Hamarat, C.; Kwakkel, J.H.; Pruyt, E.

    2012-01-01

    The recent flu pandemic in 2009 caused a panic about the possible consequences due to deep uncertainty about an unknown virus. Overstock of vaccines or unnecessary social measures to be taken were all due to uncertainty. However, what should be the necessary actions to take in such deeply uncertain

  16. Reducing uncertainty in geostatistical description with well testing pressure data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, A.C.; He, Nanqun [Univ. of Tulsa, OK (United States); Oliver, D.S. [Chevron Petroleum Technology Company, La Habra, CA (United States)

    1997-08-01

    Geostatistics has proven to be an effective tool for generating realizations of reservoir properties conditioned to static data, e.g., core and log data and geologic knowledge. Due to the lack of closely spaced data in the lateral directions, there will be significant variability in reservoir descriptions generated by geostatistical simulation, i.e., significant uncertainty in the reservoir descriptions. In past work, we have presented procedures based on inverse problem theory for generating reservoir descriptions (rock property fields) conditioned to pressure data and geostatistical information represented as prior means for log-permeability and porosity and variograms. Although we have shown that the incorporation of pressure data reduces the uncertainty below the level contained in the geostatistical model based only on static information (the prior model), our previous results did not explicitly account for uncertainties in the prior means and the parameters defining the variogram model. In this work, we investigate how pressure data can help detect errors in the prior means. If errors in the prior means are large and are not taken into account, realizations conditioned to pressure data represent incorrect samples of the a posteriori probability density function for the rock property fields, whereas, if the uncertainty in the prior mean is incorporated properly into the model, one obtains realistic realizations of the rock property fields.

  17. Modeling and solving a large-scale generation expansion planning problem under uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shan; Ryan, Sarah M. [Iowa State University, Department of Industrial and Manufacturing Systems Engineering, Ames (United States); Watson, Jean-Paul [Sandia National Laboratories, Discrete Math and Complex Systems Department, Albuquerque (United States); Woodruff, David L. [University of California Davis, Graduate School of Management, Davis (United States)

    2011-11-15

    We formulate a generation expansion planning problem to determine the type and quantity of power plants to be constructed over each year of an extended planning horizon, considering uncertainty regarding future demand and fuel prices. Our model is expressed as a two-stage stochastic mixed-integer program, which we use to compute solutions independently minimizing the expected cost and the Conditional Value-at-Risk; i.e., the risk of significantly larger-than-expected operational costs. We introduce stochastic process models to capture demand and fuel price uncertainty, which are in turn used to generate scenario trees that accurately represent the uncertainty space. Using a realistic problem instance based on the Midwest US, we explore two fundamental, unexplored issues that arise when solving any stochastic generation expansion model. First, we introduce and discuss the use of an algorithm for computing confidence intervals on obtained solution costs, to account for the fact that a finite sample of scenarios was used to obtain a particular solution. Second, we analyze the nature of solutions obtained under different parameterizations of this method, to assess whether the recommended solutions themselves are invariant to changes in costs. These issues are critical for decision makers who seek truly robust recommendations for generation expansion planning. (orig.)
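
The two-stage structure described above can be reduced to a toy example: capacity is chosen before demand is known (first stage), and any shortfall is covered by expensive purchases once a demand scenario realizes (second stage). All costs and scenarios below are hypothetical, and exhaustive search stands in for the stochastic mixed-integer solver used in the paper:

```python
# Hypothetical two-stage toy: pick capacity before demand is known; unmet
# demand is covered by expensive peaking purchases in the second stage.
build_cost = 50.0       # per MW of capacity (hypothetical)
run_cost = 20.0         # per MWh generated from own capacity
purchase_cost = 120.0   # per MWh bought when demand exceeds capacity

# Demand scenarios with probabilities (a stand-in for a scenario tree)
scenarios = [(0.3, 80.0), (0.5, 100.0), (0.2, 140.0)]

def expected_total_cost(capacity):
    cost = build_cost * capacity
    for p, demand in scenarios:
        own = min(capacity, demand)      # second-stage dispatch
        bought = demand - own            # shortfall purchased at high cost
        cost += p * (run_cost * own + purchase_cost * bought)
    return cost

# Exhaustive search over candidate capacities (a real model would use a
# stochastic mixed-integer solver instead)
best = min(range(0, 201), key=expected_total_cost)
print(best, expected_total_cost(best))
```

The optimum lands where the marginal build cost (50) stops being justified by the probability-weighted marginal saving (100 per MWh short), i.e. at the newsvendor-style critical ratio of 0.5, which here picks the middle demand scenario.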

  18. Assessment of uncertainties in core melt phenomenology and their impact on risk at the Z/IP facilities

    International Nuclear Information System (INIS)

    Pratt, W.T.; Ludewig, H.; Bari, R.A.; Meyer, J.F.

    1983-01-01

    An evaluation of core meltdown accidents in the Z/IP facilities has been performed. Containment event trees have been developed to relate the progression of a given accident to various potential containment building failure modes. An extensive uncertainty analysis related to core melt phenomenology has been performed. A major conclusion of the study is that large variations in parameters associated with major phenomenological uncertainties have a relatively minor impact on risk when external initiators are considered. This is due to the inherent capability of the Z/IP containment buildings to contain a wide range of core meltdown accidents. 12 references, 2 tables

  19. Range uncertainties in proton therapy and the role of Monte Carlo simulations

    International Nuclear Information System (INIS)

    Paganetti, Harald

    2012-01-01

    The main advantages of proton therapy are the reduced total energy deposited in the patient as compared to photon techniques and the finite range of the proton beam. The latter adds an additional degree of freedom to treatment planning. The range in tissue is associated with considerable uncertainties caused by imaging, patient setup, beam delivery and dose calculation. Reducing the uncertainties would allow a reduction of the treatment volume and thus allow a better utilization of the advantages of protons. This paper summarizes the role of Monte Carlo simulations when aiming at a reduction of range uncertainties in proton therapy. Differences in dose calculation when comparing Monte Carlo with analytical algorithms are analyzed as well as range uncertainties due to material constants and CT conversion. Range uncertainties due to biological effects and the role of Monte Carlo for in vivo range verification are discussed. Furthermore, the current range uncertainty recipes used at several proton therapy facilities are revisited. We conclude that a significant impact of Monte Carlo dose calculation can be expected in complex geometries where local range uncertainties due to multiple Coulomb scattering will reduce the accuracy of analytical algorithms. In these cases Monte Carlo techniques might reduce the range uncertainty by several mm. (topical review)

  20. Nonlinear unbiased minimum-variance filter for Mars entry autonomous navigation under large uncertainties and unknown measurement bias.

    Science.gov (United States)

    Xiao, Mengli; Zhang, Yongbo; Fu, Huimin; Wang, Zhihua

    2018-05-01

    A high-precision navigation algorithm is essential for the future Mars pinpoint landing mission. The unknown inputs caused by large uncertainties in atmospheric density and aerodynamic coefficients, as well as unknown measurement biases, may cause large estimation errors in conventional Kalman filters. This paper proposes a derivative-free version of the nonlinear unbiased minimum-variance filter for Mars entry navigation. The filter is designed to solve this problem by estimating the state and the unknown measurement biases simultaneously in a derivative-free manner, leading to a high-precision algorithm for Mars entry navigation. IMU/radio beacons integrated navigation is introduced in the simulation, and the results show that, with or without radio blackout, the proposed filter achieves an accurate state estimate, much better than the conventional unscented Kalman filter, demonstrating its suitability as a high-precision Mars entry navigation algorithm. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
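
The general idea of estimating the state and an unknown measurement bias simultaneously can be sketched with an ordinary linear Kalman filter whose state vector is augmented with the bias. This is a generic illustration only, not the paper's derivative-free nonlinear unbiased minimum-variance filter; the 1-D dynamics, the two-sensor setup and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy 1-D problem: two position sensors, one with an unknown constant bias.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity dynamics
q = 1e-4                                # process noise variance
R = np.diag([0.04, 0.09])               # sensor noise variances
true_bias = 0.8                         # unknown to the filter

# Augment the state with the bias: x = [pos, vel, bias], bias modeled constant
Fa = np.eye(3); Fa[:2, :2] = F
Ha = np.array([[1.0, 0.0, 1.0],         # biased sensor measures pos + bias
               [1.0, 0.0, 0.0]])        # unbiased sensor measures pos
Qa = np.diag([q, q, 0.0])

x_true = np.array([0.0, 1.0])
x_est = np.zeros(3)
P = np.diag([1.0, 1.0, 4.0])

for _ in range(400):
    x_true = F @ x_true + rng.normal(0.0, np.sqrt(q), 2)
    z = np.array([x_true[0] + true_bias, x_true[0]]) + \
        rng.normal(0.0, np.sqrt(np.diag(R)))
    # predict
    x_est = Fa @ x_est
    P = Fa @ P @ Fa.T + Qa
    # update (state and bias corrected together)
    S = Ha @ P @ Ha.T + R
    K = P @ Ha.T @ np.linalg.inv(S)
    x_est = x_est + K @ (z - Ha @ x_est)
    P = (np.eye(3) - K @ Ha) @ P

print("estimated bias:", x_est[2])
```

The second (unbiased) sensor is what makes the bias observable here; in the paper's setting that role is played by the structure of the integrated IMU/beacon measurements.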

  1. Characterizing Epistemic Uncertainty for Launch Vehicle Designs

    Science.gov (United States)

    Novack, Steven D.; Rogers, Jim; Hark, Frank; Al Hassan, Mohammad

    2016-01-01

    NASA Probabilistic Risk Assessment (PRA) has the task of estimating the aleatory (randomness) and epistemic (lack of knowledge) uncertainty of launch vehicle loss of mission and crew risk and communicating the results. Launch vehicles are complex engineered systems designed with sophisticated subsystems that are built to work together to accomplish mission success. Some of these systems or subsystems are in the form of heritage equipment, while some have never been previously launched. For these cases, characterizing the epistemic uncertainty is of foremost importance, and it is anticipated that the epistemic uncertainty of a modified launch vehicle design would be greater than that of a design of well understood heritage equipment. For reasons that will be discussed, standard uncertainty propagation methods using Monte Carlo simulation produce counterintuitive results and significantly underestimate epistemic uncertainty for launch vehicle models. Furthermore, standard PRA methods such as Uncertainty-Importance analyses used to identify components that are significant contributors to uncertainty are rendered obsolete, since sensitivity to uncertainty changes is not reflected in propagation of uncertainty using Monte Carlo methods. This paper provides a basis for the uncertainty underestimation for complex systems and, due to nuances of launch vehicle logic, especially for launch vehicles. It then suggests several alternative methods for estimating uncertainty and provides examples of estimation results. Lastly, the paper shows how to implement an Uncertainty-Importance analysis using one alternative approach, describes the results, and suggests ways to reduce epistemic uncertainty by focusing on additional data or testing of selected components.

  2. Uncertainty calculations made easier

    International Nuclear Information System (INIS)

    Hogenbirk, A.

    1994-07-01

    The results are presented of a neutron cross section sensitivity/uncertainty analysis performed in a complicated 2D model of the NET shielding blanket design inside the ITER torus design, surrounded by the cryostat/biological shield as planned for ITER. The calculations were performed with a code system developed at ECN Petten, with which sensitivity/uncertainty calculations become relatively simple. In order to check the deterministic neutron transport calculations (performed with DORT), calculations were also performed with the Monte Carlo code MCNP. Care was taken to model the 2.0 cm wide gaps between two blanket segments, as the neutron flux behind the vacuum vessel is largely determined by neutrons streaming through these gaps. The resulting neutron flux spectra are in excellent agreement up to the end of the cryostat. It is noted that at this position the attenuation of the neutron flux is about 11 orders of magnitude. The uncertainty in the energy integrated flux at the beginning of the vacuum vessel and at the beginning of the cryostat was determined in the calculations. The uncertainty appears to be strongly dependent on the exact geometry: if the gaps are filled with stainless steel, the neutron spectrum changes strongly, which results in an uncertainty of 70% in the energy integrated flux at the beginning of the cryostat in the no-gap-geometry, compared to an uncertainty of only 5% in the gap-geometry. Therefore, it is essential to take into account the exact geometry in sensitivity/uncertainty calculations. Furthermore, this study shows that an improvement of the covariance data is urgently needed in order to obtain reliable estimates of the uncertainties in response parameters in neutron transport calculations. (orig./GL)

  3. Climate change impacts on tree ranges: model intercomparison facilitates understanding and quantification of uncertainty.

    Science.gov (United States)

    Cheaib, Alissar; Badeau, Vincent; Boe, Julien; Chuine, Isabelle; Delire, Christine; Dufrêne, Eric; François, Christophe; Gritti, Emmanuel S; Legay, Myriam; Pagé, Christian; Thuiller, Wilfried; Viovy, Nicolas; Leadley, Paul

    2012-06-01

    Model-based projections of shifts in tree species range due to climate change are becoming an important decision support tool for forest management. However, poorly evaluated sources of uncertainty require more scrutiny before relying heavily on models for decision-making. We evaluated uncertainty arising from differences in model formulations of tree response to climate change based on a rigorous intercomparison of projections of tree distributions in France. We compared eight models ranging from niche-based to process-based models. On average, models project large range contractions of temperate tree species in lowlands due to climate change. There was substantial disagreement between models for temperate broadleaf deciduous tree species, but differences in the capacity of models to account for rising CO2 impacts explained much of the disagreement. There was good quantitative agreement among models concerning the range contractions for Scots pine. For the dominant Mediterranean tree species, Holm oak, all models foresee substantial range expansion. © 2012 Blackwell Publishing Ltd/CNRS.

  4. Confidence level in the calculations of HCDA consequences using large codes

    International Nuclear Information System (INIS)

    Nguyen, D.H.; Wilburn, N.P.

    1979-01-01

    The probabilistic approach to nuclear reactor safety is playing an increasingly significant role. For the liquid-metal fast breeder reactor (LMFBR) in particular, the ultimate application of this approach could be to determine the probability of achieving the goal of a specific line-of-assurance (LOA). Meanwhile a more pressing problem is that of quantifying the uncertainty in a calculated consequence of a hypothetical core disruptive accident (HCDA) using large codes. Such uncertainty arises from imperfect modeling of phenomenology and/or from inaccuracy in input data. A method is presented to determine the confidence level in consequences calculated by a large computer code due to known uncertainties in input variables. A particular application was made to the initial time of pin failure in a transient overpower HCDA calculated by the code MELT-IIIA in order to demonstrate the method. A probability distribution function (pdf) for the time of failure was first constructed; the confidence level for predicting this failure parameter within a desired range was then determined.
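
The confidence-level construction described above (build a pdf of the failure parameter from perturbed-input code runs, then read off the probability of a desired range) can be sketched as follows, with a cheap hypothetical function standing in for a MELT-IIIA run and invented input distributions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Stand-in for a large-code response: time to pin failure as a simple
# hypothetical function of two uncertain inputs. In the study each sample
# would correspond to a MELT-IIIA run.
def failure_time(power_ramp, gap_conductance):
    return 2.0 + 8.0 / power_ramp + 0.3 * gap_conductance

# Sample the input uncertainties (hypothetical distributions)
ramp = rng.normal(4.0, 0.4, 10000)
cond = rng.normal(1.0, 0.2, 10000)
t_fail = failure_time(ramp, cond)

# Empirical pdf -> confidence level that failure falls in a target window
lo, hi = 3.8, 4.8
confidence = np.mean((t_fail >= lo) & (t_fail <= hi))
print("P(%.1f s <= t_fail <= %.1f s) = %.3f" % (lo, hi, confidence))
```

The empirical distribution of `t_fail` plays the role of the constructed pdf; the fraction of samples inside the window is the confidence level for predicting the failure parameter within that range.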

  5. Uncertainty analysis methods for estimation of reliability of passive system of VHTR

    International Nuclear Information System (INIS)

    Han, S.J.

    2012-01-01

    An estimation of the reliability of passive systems for the probabilistic safety assessment (PSA) of a very high temperature reactor (VHTR) is under development in Korea. The essential approach of this estimation is to measure the uncertainty of the system performance under a specific accident condition. The uncertainty propagation approach, based on simulation of phenomenological models (computer codes), is adopted as a typical method for this purpose. This presentation introduced the uncertainty propagation and discussed the related issues, focusing on the propagation object and its surrogates. To achieve a sufficient level of depth in the uncertainty results, the applicability of the propagation should be carefully reviewed. For an example study, the Latin hypercube sampling (LHS) method was tested as a direct propagation approach for a specific accident sequence of the VHTR. The reactor cavity cooling system (RCCS) developed by KAERI was considered for this example study. This is an air-cooled passive system that has no active components for its operation. The accident sequence is a low pressure conduction cooling (LPCC) accident, which is considered a design basis accident for the safety design of the VHTR. This sequence is due to a large failure of the pressure boundary of the reactor system, such as a guillotine break of the coolant pipe lines. The presentation discussed the insights obtained (benefits and weaknesses) on applying this approach to the estimation of passive system reliability.
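
Latin hypercube sampling stratifies each input into n equal-probability bins and draws exactly one sample per bin, so far fewer code runs are needed than with plain random sampling to cover the input space. A minimal sketch (the two inputs, their ranges and the "code run" below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

def latin_hypercube(n_samples, n_dims, rng):
    # one stratified draw per equal-probability bin, order shuffled per column
    u = (rng.random((n_samples, n_dims))
         + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_dims):
        u[:, j] = u[rng.permutation(n_samples), j]
    return u  # uniform(0,1) LHS design

# Example: propagate two uncertain inputs through a toy "code" with only
# 50 runs (model and values are hypothetical)
u = latin_hypercube(50, 2, rng)
decay_heat = 1.0 + 0.2 * u[:, 0]             # maps to uniform [1.0, 1.2]
emissivity = 0.7 + 0.2 * u[:, 1]             # maps to uniform [0.7, 0.9]
peak_temp = 1200.0 * decay_heat / emissivity  # stand-in for one code run

print(peak_temp.mean(), peak_temp.std())
```

Each column of `u` contains exactly one point in each of the 50 strata, which is what reduces the number of Monte Carlo samplings the abstract refers to.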

  6. The importance of input interactions in the uncertainty and sensitivity analysis of nuclear fuel behavior

    Energy Technology Data Exchange (ETDEWEB)

    Ikonen, T., E-mail: timo.ikonen@vtt.fi; Tulkki, V.

    2014-08-15

    Highlights: • Uncertainty and sensitivity analysis of modeled nuclear fuel behavior is performed. • Burnup dependency of the uncertainties and sensitivities is characterized. • Input interactions significantly increase output uncertainties for irradiated fuel. • Identification of uncertainty sources is greatly improved with higher order methods. • Results stress the importance of using methods that take interactions into account. - Abstract: The propagation of uncertainties in a PWR fuel rod under steady-state irradiation is analyzed by computational means. A hypothetical steady-state scenario of the Three Mile Island 1 reactor fuel rod is modeled with the fuel performance code FRAPCON, using realistic input uncertainties for the fabrication and model parameters, boundary conditions and material properties. The uncertainty and sensitivity analysis is performed by extensive Monte Carlo sampling of the inputs’ probability distribution and by applying correlation coefficient and Sobol’ variance decomposition analyses. The latter includes evaluation of the second order and total effect sensitivity indices, allowing the study of interactions between input variables. The results show that the interactions play a large role in the propagation of uncertainties, and first order methods such as the correlation coefficient analyses are in general insufficient for sensitivity analysis of the fuel rod. Significant improvement over the first order methods can be achieved by using higher order methods. The results also show that both the magnitude of the uncertainties and their propagation depends not only on the output in question, but also on burnup. The latter is due to onset of new phenomena (such as the fission gas release) and the gradual closure of the pellet-cladding gap with increasing burnup. Increasing burnup also affects the importance of input interactions. Interaction effects are typically highest in the moderate burnup (of the order of 10–40 MWd
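
The Sobol' variance decomposition referred to above can be sketched with the standard Saltelli pick-and-freeze estimators on a toy model containing an interaction term. The model is invented for illustration (analytically S1 ≈ 0.32 and S2 ≈ 0.64, with total-effect indices larger by the interaction share), and a real analysis would evaluate a fuel performance code instead:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model with an interaction term (hypothetical stand-in for a code output)
def model(x):
    return x[:, 0] + 2.0 * x[:, 1] + 3.0 * x[:, 0] * x[:, 1]

n, d = 200000, 2
A = rng.random((n, d))          # two independent sample matrices
B = rng.random((n, d))
fA, fB = model(A), model(B)
var = np.var(np.concatenate([fA, fB]))

S, ST = [], []
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]         # A with column i taken from B
    fABi = model(ABi)
    S.append(np.mean(fB * (fABi - fA)) / var)         # first-order index
    ST.append(0.5 * np.mean((fA - fABi) ** 2) / var)  # total-effect index

print("S  =", np.round(S, 3))
print("ST =", np.round(ST, 3))
```

The gap between ST and S for each input, and the fact that the total-effect indices sum to more than one, is exactly the interaction contribution that first-order correlation methods miss.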

  7. Uncertainty analysis in safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)

    1997-12-31

    Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydrogeology, meteorology, geochemistry, etc. In addition, the waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov

  8. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLV) have the typical characteristics of complex aerodynamic shape and propulsion system coupling, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influences of the uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of the RLV, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. Different factors that cause uncertainties during model building are then analyzed and summarized. After that, the model uncertainties are expressed according to the additive uncertainty model, choosing the maximum singular values of the uncertainty matrices as the boundary model, and using the norm of the uncertainty matrices to measure how much each uncertainty factor influences the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller for this kind of aircraft (like the RLV).
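
The boundary-model idea in this abstract (rank uncertainty factors by the maximum singular value, i.e. the induced 2-norm, of their additive uncertainty matrices) can be sketched with invented perturbation matrices:

```python
import numpy as np

# Hypothetical additive uncertainty matrices for two groups of factors,
# expressed as perturbations of the closed-loop state matrix
deltas = {
    "aerodynamic": np.array([[0.02, 0.10],
                             [0.00, 0.05]]),
    "inertial":    np.array([[0.15, 0.02],
                             [0.08, 0.20]]),
}

# Boundary model: the maximum singular value bounds the gain of each
# perturbation, so it ranks how strongly the factor can affect stability
sigma_max = {name: np.linalg.svd(d, compute_uv=False)[0]
             for name, d in deltas.items()}
for name, s in sigma_max.items():
    print("%-12s sigma_max = %.4f" % (name, s))
```

With these invented numbers the inertial perturbation has the larger induced 2-norm, mirroring the paper's conclusion that inertial factors dominate; the same one-number-per-factor comparison works for any uncertainty grouping.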

  9. Towards a different attitude to uncertainty

    Directory of Open Access Journals (Sweden)

    Guy Pe'er

    2014-10-01

    Full Text Available The ecological literature deals with uncertainty primarily from the perspective of how to reduce it to acceptable levels. However, the current rapid and ubiquitous environmental changes, as well as anticipated rates of change, pose novel conditions and complex dynamics due to which many sources of uncertainty are difficult or even impossible to reduce. These include both uncertainty in knowledge (epistemic uncertainty) and societal responses to it. Under these conditions, an increasing number of studies ask how one can deal with uncertainty as it is. Here, we explore the question of how to adopt an overall alternative attitude to uncertainty, which accepts or even embraces it. First, we show that seeking to reduce uncertainty may be counterproductive under some circumstances. It may yield overconfidence, ignoring early warning signs, policy and societal stagnation, or irresponsible behaviour if personal certainty is offered by externalization of environmental costs. We then demonstrate that uncertainty can have positive impacts by driving improvements in knowledge, promoting cautious action, contributing to keeping societies flexible and adaptable, enhancing awareness, support and involvement of the public in nature conservation, and enhancing cooperation and communication. We discuss the risks of employing a certainty paradigm on uncertain knowledge, the potential benefits of adopting an alternative attitude to uncertainty, and the need to implement such an attitude across scales – from adaptive management at the local scale, to the evolving Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) at the global level.

  10. Approximating uncertainty of annual runoff and reservoir yield using stochastic replicates of global climate model data

    Science.gov (United States)

    Peel, M. C.; Srikanthan, R.; McMahon, T. A.; Karoly, D. J.

    2015-04-01

    Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between global climate models (GCMs) and within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when a scenario is run multiple times, with each run starting from slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) data sets limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to present a proof-of-concept approximation of within-GCM uncertainty for monthly precipitation and temperature projections and to assess the impact of within-GCM uncertainty on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. We adopt stochastic replicates of available GCM runs to approximate within-GCM uncertainty because large ensembles, hundreds of runs, for a given GCM and scenario are unavailable, other than the Climateprediction.net data set for the Hadley Centre GCM. To date within-GCM uncertainty has received little attention in the hydrologic climate change impact literature and this analysis provides an approximation of the uncertainty in projected runoff, and reservoir yield, due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. 
(2015) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better performing GCMs from

  11. Investigation of the uncertainty of a validation experiment due to uncertainty in its boundary conditions

    International Nuclear Information System (INIS)

    Harris, J.; Nani, D.; Jones, K.; Khodier, M.; Smith, B.L.

    2011-01-01

    Elements contributing to uncertainty in experimental repeatability are quantified for data acquisition in a bank of cylinders. The cylinder bank resembles the lower plenum of a high temperature reactor with cylinders arranged on equilateral triangles with a pitch to diameter ratio of 1.7. The 3-D as-built geometry was measured by imaging reflections off the internal surfaces of the facility. This information is useful for building CFD grids for validation studies. Time-averaged Particle Image Velocimetry (PIV) measurements were acquired daily over several months along with the pressure drop between two cylinders. The atmospheric pressure was measured along with the data set. The PIV data and pressure drop were correlated with atmospheric conditions and changes in experimental setup. It was found that atmospheric conditions play little role in the channel velocity, but impact the pressure drop significantly. The adjustments made to the experiment setup did not change the results. However, in some cases, the wake behind a cylinder was shifted significantly from one day to the next. These changes did not correlate with ambient pressure, room temperature, or tear-downs/rebuilds of the facility. (author)

  12. ICYESS 2013: Understanding and Interpreting Uncertainty

    Science.gov (United States)

    Rauser, F.; Niederdrenk, L.; Schemann, V.; Schmidt, A.; Suesser, D.; Sonntag, S.

    2013-12-01

    We will report the outcomes and highlights of the Interdisciplinary Conference of Young Earth System Scientists (ICYESS) on Understanding and Interpreting Uncertainty in September 2013, Hamburg, Germany. This conference is aimed at early career scientists (Masters to Postdocs) from a large variety of scientific disciplines and backgrounds (natural, social and political sciences) and will enable 3 days of discussions on a variety of uncertainty-related aspects: 1) How do we deal with implicit and explicit uncertainty in our daily scientific work? What is uncertain for us, and for which reasons? 2) How can we communicate these uncertainties to other disciplines? E.g., is uncertainty in cloud parameterization and respectively equilibrium climate sensitivity a concept that is understood equally well in natural and social sciences that deal with Earth System questions? Or vice versa, is, e.g., normative uncertainty as in choosing a discount rate relevant for natural scientists? How can those uncertainties be reconciled? 3) How can science communicate this uncertainty to the public? Is it useful at all? How are the different possible measures of uncertainty understood in different realms of public discourse? Basically, we want to learn from all disciplines that work together in the broad Earth System Science community how to understand and interpret uncertainty - and then transfer this understanding to the problem of how to communicate with the public, or its different layers / agents. ICYESS is structured in a way that participation is only possible via presentation, so every participant will give their own professional input into how the respective disciplines deal with uncertainty. Additionally, a large focus is put onto communication techniques; there are no 'standard presentations' in ICYESS. Keynote lectures by renowned scientists and discussions will lead to a deeper interdisciplinary understanding of what we do not really know, and how to deal with it. Many

  13. The uncertainties in estimating measurement uncertainties

    International Nuclear Information System (INIS)

    Clark, J.P.; Shull, A.H.

    1994-01-01

    All measurements include some error. Whether measurements are used for accountability, environmental programs or process support, they are of little value unless accompanied by an estimate of the measurement's uncertainty. This fact is often overlooked by the individuals who need measurements to make decisions. This paper will discuss the concepts of measurement, measurement errors (accuracy or bias, and precision or random error), physical and error models, measurement control programs, examples of measurement uncertainty, and uncertainty as related to measurement quality. Measurements are comparisons of unknowns to knowns, yielding estimates of some true value plus uncertainty, and are no better than the standards to which they are compared. Direct comparisons of unknowns that match the composition of known standards will normally have small uncertainties. In the real world, measurements usually involve indirect comparisons of significantly different materials (e.g., measuring a physical property of a chemical element in a sample having a matrix that is significantly different from that of the calibration standards). Consequently, there are many sources of error involved in measurement processes that can affect the quality of a measurement and its associated uncertainty. How the uncertainty estimates are determined and what they mean is as important as the measurement. The process of calculating the uncertainty of a measurement itself has uncertainties that must be handled correctly. Examples of chemistry laboratory measurements are reviewed in this report, and recommendations are made for improving measurement uncertainties.

  14. Uncertainty propagation in probabilistic risk assessment: A comparative study

    International Nuclear Information System (INIS)

    Ahmed, S.; Metcalf, D.R.; Pegram, J.W.

    1982-01-01

    Three uncertainty propagation techniques, namely method of moments, discrete probability distribution (DPD), and Monte Carlo simulation, generally used in probabilistic risk assessment, are compared and conclusions drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate method is to propagate uncertainty in the discrete form either by DPD method without sampling or by Monte Carlo. (orig.)
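The trade-off the abstract describes can be illustrated with a small sketch (the lognormal unavailabilities below are hypothetical, not the paper's data): for a product of two uncertain basic-event unavailabilities, the first-order method of moments drops a cross term in the variance, and the omission only matters when the relative uncertainty is large, which is exactly the regime where the abstract recommends DPD or Monte Carlo.

```python
import math
import random
import statistics

random.seed(0)

# Hypothetical lognormal unavailabilities for two independent basic events.
sigma = 1.0  # large logarithmic spread: the "large uncertainty" case
mu1, mu2 = math.log(1e-3), math.log(2e-3)

# Lognormal means and variances of each basic event
m1, m2 = (math.exp(mu + sigma**2 / 2) for mu in (mu1, mu2))
v1 = (math.exp(sigma**2) - 1) * m1**2
v2 = (math.exp(sigma**2) - 1) * m2**2

# First-order method of moments for q = p1 * p2 (drops the v1*v2 cross term)
mom_mean = m1 * m2
mom_var = m2**2 * v1 + m1**2 * v2

# Exact variance of a product of independent variables, for reference
exact_var = v1 * v2 + m2**2 * v1 + m1**2 * v2

# Monte Carlo propagation of the same quantity
q = [math.exp(random.gauss(mu1, sigma)) * math.exp(random.gauss(mu2, sigma))
     for _ in range(100_000)]
mc_mean = statistics.fmean(q)

print(f"method of moments underestimates the variance by {1 - mom_var / exact_var:.0%}")
```

With a small logarithmic spread the dropped cross term is negligible and all three methods agree, consistent with the abstract's conclusion.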

  15. Propagation of dynamic measurement uncertainty

    International Nuclear Information System (INIS)

    Hessling, J P

    2011-01-01

    The time-dependent measurement uncertainty has been evaluated in a number of recent publications, starting from a known uncertain dynamic model. This could be defined as the 'downward' propagation of uncertainty from the model to the targeted measurement. The propagation of uncertainty 'upward' from the calibration experiment to a dynamic model traditionally belongs to system identification. The use of different representations (time, frequency, etc) is ubiquitous in dynamic measurement analyses. An expression of uncertainty in dynamic measurements is formulated for the first time in this paper independent of representation, joining upward as well as downward propagation. For applications in metrology, the high quality of the characterization may be prohibitive for any reasonably large and robust model to pass the whiteness test. This test is therefore relaxed by not directly requiring small systematic model errors in comparison to the randomness of the characterization. Instead, the systematic error of the dynamic model is propagated to the uncertainty of the measurand, analogously but differently to how stochastic contributions are propagated. The pass criterion of the model is thereby transferred from the identification to acceptance of the total accumulated uncertainty of the measurand. This increases the relevance of the test of the model as it relates to its final use rather than the quality of the calibration. The propagation of uncertainty hence includes the propagation of systematic model errors. For illustration, the 'upward' propagation of uncertainty is applied to determine if an appliance box is damaged in an earthquake experiment. In this case, relaxation of the whiteness test was required to reach a conclusive result

  16. The role of scientific uncertainty in compliance with the Kyoto Protocol to the Climate Change Convention

    International Nuclear Information System (INIS)

    Gupta, Joyeeta; Olsthoorn, Xander; Rotenberg, Edan

    2003-01-01

    Under the climate change treaties, developed countries are under a quantitative obligation to limit their emissions of greenhouse gases (GHG). This paper argues that although the climate change regime is setting up various measures and mechanisms, there will still be significant uncertainty about the actual emission reductions, and the effectiveness of the regime will depend largely on how countries actually implement their obligations in practice. These uncertainties arise from calculating emissions from each source, tallying up these emissions, adding or deducting changes due to land use change and forestry (LUCF), and finally subtracting or adding emission reduction units (ERUs). Further, it points to the problem of uncertainty in the reductions as opposed to the uncertainty in the inventories themselves. The protocols have temporarily opted to deal with these problems through harmonisation in reporting methodologies and to seek transparency by calling on parties involved to use specific guidelines and to report on their uncertainty. This paper concludes that this harmonisation of reporting methodologies does not account for regional differences, and that while transparency will indicate when countries are adopting strategies that have high uncertainty, it will not help to increase the effectiveness of the protocol. Uncertainty about compliance then becomes a critical issue. This paper proposes to reduce this uncertainty in compliance by setting a minimum requirement for the probability of compliance.

  17. Understanding and reducing statistical uncertainties in nebular abundance determinations

    Science.gov (United States)

    Wesson, R.; Stock, D. J.; Scicluna, P.

    2012-06-01

    Whenever observations are compared to theories, an estimate of the uncertainties associated with the observations is vital if the comparison is to be meaningful. However, many or even most determinations of temperatures, densities and abundances in photoionized nebulae do not quote the associated uncertainty. Those that do typically propagate the uncertainties using analytical techniques which rely on assumptions that generally do not hold. Motivated by this issue, we have developed the Nebular Empirical Analysis Tool (NEAT), a new code for calculating chemical abundances in photoionized nebulae. The code carries out a standard analysis of lists of emission lines using long-established techniques to estimate the amount of interstellar extinction, calculate representative temperatures and densities, compute ionic abundances from both collisionally excited lines and recombination lines, and finally to estimate total elemental abundances using an ionization correction scheme. NEAT uses a Monte Carlo technique to robustly propagate uncertainties from line flux measurements through to the derived abundances. We show that, for typical observational data, this approach is superior to analytic estimates of uncertainties. NEAT also accounts for the effect of upward biasing on measurements of lines with low signal-to-noise ratio, allowing us to accurately quantify the effect of this bias on abundance determinations. We find not only that the effect can result in significant overestimates of heavy element abundances derived from weak lines, but also that taking it into account reduces the uncertainty of these abundance determinations. Finally, we investigate the effect of possible uncertainties in R, the ratio of selective-to-total extinction, on abundance determinations. We find that the uncertainty due to this parameter is negligible compared to the statistical uncertainties due to typical line flux measurement uncertainties.
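The Monte Carlo propagation that NEAT performs can be sketched in a few lines (the line fluxes below are hypothetical, and a plain flux ratio stands in for the real abundance calculation): resampling the measured fluxes within their errors and recomputing the derived quantity exposes the asymmetric, upward-biased distribution that symmetric analytic propagation misses for weak lines.

```python
import random
import statistics

random.seed(1)

# Hypothetical fluxes (arbitrary units): a strong line and a weak line at S/N ~ 3.
strong, strong_err = 1000.0, 20.0
weak, weak_err = 10.0, 3.0

central = strong / weak  # symmetric, analytic central value

# Monte Carlo propagation: resample both fluxes within their errors and
# recompute the derived ratio each time (discard unphysical negative fluxes).
samples = []
while len(samples) < 50_000:
    w = random.gauss(weak, weak_err)
    if w <= 0:
        continue
    samples.append(random.gauss(strong, strong_err) / w)
samples.sort()

mc_mean = statistics.fmean(samples)
median = samples[len(samples) // 2]
lo, hi = samples[int(0.16 * len(samples))], samples[int(0.84 * len(samples))]

# The distribution is skewed and the mean is biased upward, which a
# symmetric analytic error bar on the ratio cannot capture.
print(f"central={central:.1f}  MC mean={mc_mean:.1f}  68% interval=({lo:.1f}, {hi:.1f})")
```

The upward bias of the mean for the low signal-to-noise line mirrors, in simplified form, the weak-line bias that NEAT quantifies for abundance determinations.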

  18. Uncertainty in projected point precipitation extremes for hydrological impact analysis of climate change

    Science.gov (United States)

    Van Uytven, Els; Willems, Patrick

    2017-04-01

    temperature and PET observations at Uccle and a large ensemble of 160 global climate model runs (CMIP5). They cover all four representative concentration pathway based greenhouse gas scenarios. While evaluating the downscaled meteorological series, particular attention was given to the performance of extreme value metrics (e.g. for precipitation, by means of intensity-duration-frequency statistics). Moreover, the total uncertainty was decomposed in the fractional uncertainties for each of the uncertainty sources considered. Research assessing the additional uncertainty due to parameter and structural uncertainties of the hydrological impact model is ongoing.

  19. Experimental Research Examining how People can Cope with Uncertainty through Soft Haptic Sensations

    NARCIS (Netherlands)

    Van Horen, F.; Mussweiler, T.

    2015-01-01

    Human beings are constantly surrounded by uncertainty and change. The question arises how people cope with such uncertainty. To date, most research has focused on the cognitive strategies people adopt to deal with uncertainty. However, especially when uncertainty is due to unpredictable societal

  20. Uncertainty Quantification in Numerical Aerodynamics

    KAUST Repository

    Litvinenko, Alexander

    2017-05-16

    We consider the uncertainty quantification problem in aerodynamic simulations. We identify input uncertainties, classify them, suggest an appropriate statistical model and, finally, estimate the propagation of these uncertainties into the solution (pressure, velocity and density fields as well as the lift and drag coefficients). The deterministic problem under consideration is a compressible transonic Reynolds-averaged Navier-Stokes flow around an airfoil with random/uncertain data. Input uncertainties include: uncertain angle of attack, the Mach number, random perturbations in the airfoil geometry, mesh, shock location, turbulence model and parameters of this turbulence model. This problem requires efficient numerical/statistical methods since it is computationally expensive, especially for the uncertainties caused by random geometry variations, which involve a large number of variables. In the numerical section we compare five methods, including quasi-Monte Carlo quadrature, polynomial chaos with coefficients determined by sparse quadrature, a gradient-enhanced version of Kriging, radial basis functions, and point collocation polynomial chaos, in their efficiency in estimating statistics of aerodynamic performance upon random perturbation to the airfoil geometry [D. Liu et al. '17]. For modeling we used the TAU code, developed at DLR, Germany.

  1. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte ... (1) uncertainty due to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing ...

  2. Uncertainties in Nuclear Proliferation Modeling

    International Nuclear Information System (INIS)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok

    2015-01-01

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the possibility of providing warning for the international community to prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if other advanced modeling methods are developed. Before developing sophisticated models based on hypotheses of time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different codings of proliferation history is small. Serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause huge uncertainties when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested by qualitative nuclear proliferation studies.

  3. Uncertainty of Forest Biomass Estimates in North Temperate Forests Due to Allometry: Implications for Remote Sensing

    Directory of Open Access Journals (Sweden)

    Razi Ahmed

    2013-06-01

    Full Text Available Estimates of above ground biomass density in forests are crucial for refining global climate models and understanding climate change. Although data from field studies can be aggregated to estimate carbon stocks on global scales, the sparsity of such field data, temporal heterogeneity and methodological variations introduce large errors. Remote sensing measurements from spaceborne sensors are a realistic alternative for global carbon accounting; however, the uncertainty of such measurements is not well known and remains an active area of research. This article describes an effort to collect field data at the Harvard and Howland Forest sites, set in the temperate forests of the Northeastern United States in an attempt to establish ground truth forest biomass for calibration of remote sensing measurements. We present an assessment of the quality of ground truth biomass estimates derived from three different sets of diameter-based allometric equations over the Harvard and Howland Forests to establish the contribution of errors in ground truth data to the error in biomass estimates from remote sensing measurements.
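The allometry-driven spread the abstract quantifies can be mimicked with a toy calculation (the coefficients and stem diameters below are illustrative placeholders, not the equation sets or field data actually used at Harvard or Howland): applying several diameter-based equations of the common form ln(AGB) = a + b·ln(DBH) to the same stems shows how the choice of allometry alone moves the plot-level estimate.

```python
import math

# Three hypothetical diameter-based allometric equations of the form
#   ln(AGB [kg]) = a + b * ln(DBH [cm])
# (coefficients are illustrative only).
equation_sets = {
    "set A": (-2.48, 2.48),
    "set B": (-2.00, 2.40),
    "set C": (-2.60, 2.50),
}

def stand_biomass(dbh_cm_list, a, b):
    """Sum per-tree biomass over a plot's stem diameters."""
    return sum(math.exp(a + b * math.log(d)) for d in dbh_cm_list)

# A small hypothetical plot: stem diameters at breast height, in cm
plot = [12.0, 18.5, 25.0, 31.0, 44.5]

estimates = {name: stand_biomass(plot, a, b) for name, (a, b) in equation_sets.items()}
mean = sum(estimates.values()) / len(estimates)
spread = (max(estimates.values()) - min(estimates.values())) / mean

for name, agb in estimates.items():
    print(f"{name}: {agb:.0f} kg")
print(f"relative spread due to allometry alone: {spread:.0%}")
```

Even with identical field measurements, the plot total moves by tens of percent across equation sets, which is the "ground truth" error component the article propagates into remote sensing calibration.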

  4. Evaluation of uncertainties in MUF for a LWR fuel fabrication plant. Pt.2 - Pt.4

    International Nuclear Information System (INIS)

    Mennerdahl, D.

    1984-09-01

    MUF (Material Unaccounted For) is a parameter defined as the estimated loss of materials during a certain period of time. A suitable method for uncertainty and bias estimations has been developed. The method was specifically adjusted for a facility like the ASEA-ATOM fuel fabrication plant. Operations that are expected to contribute to the uncertainties have been compiled. Information that is required for the application of the developed method is described. Simplifications of the required information that do not sacrifice accuracy are proposed. ASEA-ATOM had earlier determined uncertainty data for the scales that are used for nuclear materials. The statistical uncertainties included random errors and short-term and long-term systematic errors. Information for the determination of biases was also determined (constants and formulas). The method proposed by ASEA-ATOM for the determination of uncertainties due to the scales is compatible with the method proposed in this report. For operations other than weighing, the information from ASEA-ATOM is limited. Such operations completely dominate the total uncertainty in MUF. Examples of calculations of uncertainties and bias are given for uranium oxide powders in large containers. The examples emphasize the differences between various statistical errors (random and systematic errors) and biases (known errors). The importance of correlations between different items in the inventories is explained. A specific correlation of great importance is the use of nominal factors (uranium concentration). A portable personal computer can be used to determine uncertainties in MUF. (author)

  5. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.

  6. Small break LOCA RELAP5/MOD3 uncertainty quantification: Bias and uncertainty evaluation for important phenomena

    International Nuclear Information System (INIS)

    Ortiz, M.G.; Ghan, L.S.; Vogl, J.

    1991-01-01

    The Nuclear Regulatory Commission (NRC) revised the Emergency Core Cooling System (ECCS) licensing rule to allow the use of Best Estimate (BE) computer codes, provided the uncertainty of the calculations are quantified and used in the licensing and regulation process. The NRC developed a generic methodology called Code Scaling, Applicability and Uncertainty (CSAU) to evaluate BE code uncertainties. The CSAU methodology was demonstrated with a specific application to a pressurized water reactor (PWR), experiencing a postulated large break loss-of-coolant accident (LBLOCA). The current work is part of an effort to adapt and demonstrate the CSAU methodology to a small break (SB) LOCA in a PWR of B and W design using RELAP5/MOD3 as the simulation tool. The subject of this paper is the Assessment and Ranging of Parameters (Element 2 of the CSAU methodology), which determines the contribution to uncertainty of specific models in the code

  7. Risk Assessment Uncertainties in Cybersecurity Investments

    Directory of Open Access Journals (Sweden)

    Andrew Fielder

    2018-06-01

    Full Text Available When undertaking cybersecurity risk assessments, it is important to be able to assign numeric values to metrics to compute the final expected loss that represents the risk that an organization is exposed to due to cyber threats. Even if risk assessment is motivated by real-world observations and data, there is always a high chance of assigning inaccurate values due to different uncertainties involved (e.g., the evolving threat landscape, human errors, and the natural difficulty of quantifying risk). Existing models empower organizations to compute optimal cybersecurity strategies given their financial constraints, i.e., available cybersecurity budget. Further, a general game-theoretic model with uncertain payoffs (probability-distribution-valued payoffs) shows that such uncertainty can be incorporated in the game-theoretic model by allowing payoffs to be random. This paper extends previous work in the field to tackle uncertainties in risk assessment that affect cybersecurity investments. The findings from simulated examples indicate that although uncertainties in cybersecurity risk assessment lead, on average, to different cybersecurity strategies, they do not play a significant role in the final expected loss of the organization when utilising a game-theoretic model and methodology to derive these strategies. The model determines robust defending strategies even when knowledge regarding risk assessment values is not accurate. As a result, it is possible to show that the cybersecurity investments tool is capable of providing effective decision support.

  8. Quantifying and Reducing Curve-Fitting Uncertainty in Isc

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-06-14

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty. However, more data points can make the uncertainty in the fit arbitrarily small, and this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
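The localized straight-line fit near Isc that the measurement standards specify can be treated as an ordinary least-squares regression; a minimal sketch on synthetic I-V points (not real measurement data, and noise levels are assumed) shows how the intercept at V = 0 yields Isc together with a statistical fit uncertainty.

```python
import math
import random

random.seed(7)

# Synthetic I-V points near short circuit: I ≈ Isc + slope * V plus Gaussian noise
# (hypothetical values; real data would come from the measurement).
true_isc, true_slope, noise = 5.000, -0.020, 0.002
volts = [i * 0.01 for i in range(12)]  # data window: 0.00 .. 0.11 V
amps = [true_isc + true_slope * v + random.gauss(0.0, noise) for v in volts]

# Ordinary least-squares straight-line fit: I = b0 + b1 * V
n = len(volts)
xbar = sum(volts) / n
ybar = sum(amps) / n
sxx = sum((x - xbar) ** 2 for x in volts)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(volts, amps))
b1 = sxy / sxx
b0 = ybar - b1 * xbar  # intercept at V = 0: the estimated Isc

# Residual variance and standard error of the intercept
sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(volts, amps))
s2 = sse / (n - 2)
se_isc = math.sqrt(s2 * (1.0 / n + xbar**2 / sxx))

print(f"Isc = {b0:.4f} A +/- {se_isc:.4f} A (1-sigma, fit uncertainty only)")
```

As the abstract cautions, this regression uncertainty can be made arbitrarily small by widening the data window, while the dominant model-discrepancy term (is the I-V curve actually straight there?) is left out; that gap is what the evidence-based window selection addresses.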

  9. Effects of High-Latitude Forcing Uncertainty on the Low-Latitude and Midlatitude Ionosphere

    Science.gov (United States)

    Pedatella, N. M.; Lu, G.; Richmond, A. D.

    2018-01-01

    Ensemble simulations are performed using the Thermosphere-Ionosphere-Electrodynamics General Circulation Model (TIE-GCM) in order to understand the role of high-latitude forcing uncertainty on the low-latitude and midlatitude ionosphere response to the April 2010 geomagnetic storm. The ensemble is generated by perturbing either the high-latitude electric potential or auroral energy flux in the assimilative mapping for ionosphere electrodynamics (AMIE). Simulations with perturbed high-latitude electric potential result in substantial intraensemble variability in the low-latitude and midlatitude ionosphere response to the geomagnetic storm, and the ensemble standard deviation for the change in NmF2 reaches 50-100% of the mean change. Such large intraensemble variability is not seen when perturbing the auroral energy flux. In this case, the effects of the forcing uncertainty are primarily confined to high latitudes. We therefore conclude that the specification of high-latitude electric fields is an important source of uncertainty when modeling the low-latitude and midlatitude ionosphere response to a geomagnetic storm. A multiple linear regression analysis of the results indicates that uncertainty in the storm time changes in the equatorial electric fields, neutral winds, and neutral composition can all contribute to the uncertainty in the ionosphere electron density. The results of the present study provide insight into the possible uncertainty in simulations of the low-latitude and midlatitude ionosphere response to geomagnetic storms due to imperfect knowledge of the high-latitude forcing.

  10. Piezoelectric energy harvesting with parametric uncertainty

    International Nuclear Information System (INIS)

    Ali, S F; Friswell, M I; Adhikari, S

    2010-01-01

    The design and analysis of energy harvesting devices has become increasingly important in recent years. Most of the literature has focused on the deterministic analysis of these systems, and the problem of uncertain parameters has received less attention. Energy harvesting devices exhibit parametric uncertainty due to errors in measurement, errors in modelling and variability in the parameters during manufacture. This paper investigates the effect of parametric uncertainty in the mechanical system on the harvested power, and derives approximate explicit formulae for the optimal electrical parameters that maximize the mean harvested power. The maximum of the mean harvested power decreases with increasing uncertainty, and the optimal frequency at which the maximum mean power occurs shifts. The effects of the parameter variance on the optimal electrical time constant and optimal coupling coefficient are reported. Monte Carlo based simulation results are used to further analyse the system under parametric uncertainty.

  11. Uncertainty in simulating wheat yields under climate change

    DEFF Research Database (Denmark)

    Asseng, A; Ewert, F; Rosenzweig, C

    2013-01-01

    …of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models… than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi…

  12. Uncertainty analysis techniques

    International Nuclear Information System (INIS)

    Marivoet, J.; Saltelli, A.; Cadelli, N.

    1987-01-01

    The origin of the uncertainty affecting performance assessments, as well as its propagation to dose and risk results, is discussed. The analysis focuses essentially on the uncertainties introduced by the input parameters, the values of which may range over some orders of magnitude and may be given as probability distribution functions. The paper briefly reviews the existing sampling techniques used for Monte Carlo simulations and the methods for characterizing the output curves and determining their convergence and confidence limits. Annual doses, expectation values of the doses and risks are computed for a particular case of a possible repository in clay, in order to illustrate the significance of such output characteristics as the mean, the logarithmic mean and the median, as well as their ratios. The report concludes that, provisionally and owing to its greater robustness, an estimator such as the 90th percentile may be substituted for the arithmetic mean when comparing the estimated doses with acceptance criteria. In any case, the results obtained through uncertainty analyses must be interpreted with caution as long as input data distribution functions are not derived from experiments reasonably reproducing the situation in a well characterized repository and site.
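
The report's point about summary statistics of heavily skewed output distributions can be illustrated with a toy Monte Carlo sample (the lognormal model and all numbers are illustrative assumptions, not the repository results):

```python
import numpy as np

# Hypothetical Monte Carlo output: doses spanning orders of magnitude,
# modelled here as lognormal samples.
rng = np.random.default_rng(1)
doses = rng.lognormal(mean=-2.0, sigma=1.5, size=100_000)

arithmetic_mean = doses.mean()
log_mean = np.exp(np.log(doses).mean())  # the "logarithmic mean" (geometric mean)
median = np.median(doses)
p90 = np.percentile(doses, 90)           # the more robust summary the report suggests
```

For a strongly right-skewed distribution the arithmetic mean sits well above the median while the 90th percentile characterizes the upper tail directly, which is why the report proposes it for comparison against acceptance criteria.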

  13. Quantification of Safety-Critical Software Test Uncertainty

    International Nuclear Information System (INIS)

    Khalaquzzaman, M.; Cho, Jaehyun; Lee, Seung Jun; Jung, Wondea

    2015-01-01

    The method conservatively assumes that the failure probability of a software for the untested inputs is 1, and that the failure probability becomes 0 after successful testing of all test cases. In reality, however, a chance of failure remains due to test uncertainty. Some studies have been carried out to identify the test attributes that affect test quality; Cao discussed the testing effort, testing coverage, and testing environment, and the management of test uncertainties has also been discussed. In this study, the test uncertainty has been considered in estimating the software failure probability, because the software testing process is inherently uncertain. A reliability estimation of software is very important for the probabilistic safety analysis of a digital safety-critical system of NPPs. This study focused on the estimation of the probability of a software failure that considers the uncertainty in software testing. In our study, a BBN has been employed as an example model for software test uncertainty quantification. Although it can be argued that direct expert elicitation of test uncertainty is much simpler than BBN estimation, the BBN approach provides more insights and a basis for uncertainty estimation.

  14. Uncertainty in hydraulic tests in fractured rock

    International Nuclear Information System (INIS)

    Ji, Sung-Hoon; Koh, Yong-Kwon

    2014-01-01

    Interpretation of hydraulic tests in fractured rock involves uncertainty because the hydraulic properties of a fractured rock differ from those of a porous medium. In this study, we reviewed several interesting phenomena which show uncertainty in a hydraulic test at a fractured rock and discussed their origins and how they should be considered during site characterisation. Our results show that the hydraulic parameters of a fractured rock estimated from a hydraulic test are associated with uncertainty due to the changed aperture and non-linear groundwater flow during the test. Although the magnitude of these two uncertainties is site-dependent, the results suggest that a hydraulic test should be conducted with as little disturbance to the natural groundwater flow as possible in order to account for them. Other effects reported from laboratory and numerical experiments, such as the trapping zone effect (Boutt, 2006) and the slip condition effect (Lee, 2014), can also introduce uncertainty to a hydraulic test and should be evaluated in a field test. It is necessary to consider how to evaluate the uncertainty in the hydraulic property during site characterisation and how to apply it to the safety assessment of a subsurface repository. (authors)

  15. Uncertainty analysis

    International Nuclear Information System (INIS)

    Thomas, R.E.

    1982-03-01

    An evaluation is made of the suitability of analytical and statistical sampling methods for making uncertainty analyses. The adjoint method is found to be well-suited for obtaining sensitivity coefficients for computer programs involving large numbers of equations and input parameters. For this purpose the Latin Hypercube Sampling method is found to be inferior to conventional experimental designs. The Latin hypercube method can be used to estimate output probability density functions, but requires supplementary rank transformations followed by stepwise regression to obtain uncertainty information on individual input parameters. A simple Cork and Bottle problem is used to illustrate the efficiency of the adjoint method relative to certain statistical sampling methods. For linear models of the form Ax=b it is shown that a complete adjoint sensitivity analysis can be made without formulating and solving the adjoint problem. This can be done either by using a special type of statistical sampling or by reformulating the primal problem and using suitable linear programming software
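
A minimal sketch of the Latin hypercube sampling compared above (one stratified draw per equal-probability bin in each dimension; a generic textbook implementation, not the one used in the report):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng):
    """Stratify [0, 1) into n_samples equal bins per dimension and draw
    exactly one point per bin, permuting the bins independently per dimension."""
    perms = np.stack([rng.permutation(n_samples) for _ in range(n_dims)], axis=1)
    return (perms + rng.random((n_samples, n_dims))) / n_samples

rng = np.random.default_rng(2)
sample = latin_hypercube(10, 3, rng)  # 10 points in 3 dimensions
```

Each column of `sample` contains exactly one point per decile of [0, 1), which is the stratification property that lets LHS estimate output distributions with fewer runs than simple random sampling.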

  16. Mitigating Provider Uncertainty in Service Provision Contracts

    Science.gov (United States)

    Smith, Chris; van Moorsel, Aad

    Uncertainty is an inherent property of open, distributed and multiparty systems. The viability of the mutually beneficial relationships which motivate these systems relies on rational decision-making by each constituent party under uncertainty. Service provision in distributed systems is one such relationship. Uncertainty is experienced by the service provider in his ability to deliver a service with selected quality level guarantees due to inherent non-determinism, such as load fluctuations and hardware failures. Statistical estimators utilized to model this non-determinism introduce additional uncertainty through sampling error. Inability of the provider to accurately model and analyze uncertainty in the quality level guarantees can result in the formation of sub-optimal service provision contracts. Emblematic consequences include loss of revenue, inefficient resource utilization and erosion of reputation and consumer trust. We propose a utility model for contract-based service provision to provide a systematic approach to optimal service provision contract formation under uncertainty. Performance prediction methods to enable the derivation of statistical estimators for quality level are introduced, with analysis of their resultant accuracy and cost.

  17. Sensitivity analysis of local uncertainties in large break loss-of-coolant accident (LB-LOCA) thermo-mechanical simulations

    Energy Technology Data Exchange (ETDEWEB)

    Arkoma, Asko, E-mail: asko.arkoma@vtt.fi; Ikonen, Timo

    2016-08-15

    Highlights: • A sensitivity analysis using the data from EPR LB-LOCA simulations is done. • A procedure to analyze such complex data is outlined. • Both visual and quantitative methods are used. • Input factors related to core design are identified as most significant. - Abstract: In this paper, a sensitivity analysis for the data originating from a large break loss-of-coolant accident (LB-LOCA) analysis of an EPR-type nuclear power plant is presented. In the preceding LOCA analysis, the number of failing fuel rods in the accident was established (Arkoma et al., 2015). However, the underlying causes for the rod failures were not addressed. It is essential to bring out which input parameters and boundary conditions are significant to the outcome of the analysis, i.e. the ballooning and burst of the rods. Due to the complexity of the existing data, the first part of the analysis consists of defining the relevant input parameters for the sensitivity analysis. Then, selected sensitivity measures are calculated between the chosen input and output parameters. The ultimate goal is to develop a systematic procedure for the sensitivity analysis of statistical LOCA simulation that takes into account the various sources of uncertainty in the calculation chain. In the current analysis, the most relevant parameters with respect to the cladding integrity are the decay heat power during the transient, the thermal-hydraulic conditions in the rod’s location in the reactor, and the steady-state irradiation history of the rod. Meanwhile, the tolerances in fuel manufacturing parameters were found to have a negligible effect on cladding deformation.

  18. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. 
In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding

  19. Uncertainty evaluation in normalization of isotope delta measurement results against international reference materials.

    Science.gov (United States)

    Meija, Juris; Chartrand, Michelle M G

    2018-01-01

    Isotope delta measurements are normalized against international reference standards. Although multi-point normalization is becoming a standard practice, the existing uncertainty evaluation practices are either undocumented or incomplete. For multi-point normalization, we present errors-in-variables regression models for explicit accounting of the measurement uncertainty of the international standards along with the uncertainty that is attributed to their assigned values. This manuscript presents a framework to account for the uncertainty that arises due to a small number of replicate measurements, and discusses multi-laboratory data reduction while accounting for the inevitable correlations between laboratories due to the use of identical reference materials for calibration. Both frequentist and Bayesian methods of uncertainty analysis are discussed.

  20. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    Science.gov (United States)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  1. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    International Nuclear Information System (INIS)

    Kirchner, G.; Peterson, R.

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years, efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: to compare differences between predictions from different people, all using the same model and the same scenario description, with the statistical uncertainties calculated by the model; to investigate the main reasons for different interpretations by users; and to create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: for any set of predictions, the variation in best estimates was greater than one order of magnitude; often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  2. Uncertainty and validation. Effect of user interpretation on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Kirchner, G. [Univ. of Bremen (Germany); Peterson, R. [AECL, Chalk River, ON (Canada)] [and others

    1996-11-01

    Uncertainty in predictions of environmental transfer models arises from, among other sources, the adequacy of the conceptual model, the approximations made in coding the conceptual model, the quality of the input data, the uncertainty in parameter values, and the assumptions made by the user. In recent years, efforts to quantify the confidence that can be placed in predictions have been increasing, but have concentrated on a statistical propagation of the influence of parameter uncertainties on the calculational results. The primary objective of this Working Group of BIOMOVS II was to test the user's influence on model predictions on a more systematic basis than has been done before. The main goals were as follows: to compare differences between predictions from different people, all using the same model and the same scenario description, with the statistical uncertainties calculated by the model; to investigate the main reasons for different interpretations by users; and to create a better awareness of the potential influence of the user on the modeling results. Terrestrial food chain models driven by deposition of radionuclides from the atmosphere were used. Three codes were obtained and run with three scenarios by a maximum of 10 users. A number of conclusions can be drawn, some of which are general and independent of the type of models and processes studied, while others are restricted to the few processes that were addressed directly: for any set of predictions, the variation in best estimates was greater than one order of magnitude; often the range increased from deposition to pasture to milk, probably due to additional transfer processes. The 95% confidence intervals about the predictions calculated from the parameter distributions prepared by the participants did not always overlap the observations; similarly, sometimes the confidence intervals on the predictions did not overlap. Often the 95% confidence intervals of individual predictions were smaller than the

  3. Citizen Candidates Under Uncertainty

    OpenAIRE

    Eguia, Jon X.

    2005-01-01

    In this paper we make two contributions to the growing literature on "citizen-candidate" models of representative democracy. First, we add uncertainty about the total vote count. We show that in a society with a large electorate, where the outcome of the election is uncertain and where winning candidates receive a large reward from holding office, there will be a two-candidate equilibrium and no equilibria with a single candidate. Second, we introduce a new concept of equilibrium, which we te...

  4. Uncertainty of the peak flow reconstruction of the 1907 flood in the Ebro River in Xerta (NE Iberian Peninsula)

    Science.gov (United States)

    Ruiz-Bellet, Josep Lluís; Castelltort, Xavier; Balasch, J. Carles; Tuset, Jordi

    2017-02-01

    There is no clear, unified and accepted method to estimate the uncertainty of hydraulic modelling results. In historical flood reconstruction, due to the lower precision of the input data, the magnitude of this uncertainty can reach a high value. With the objectives of estimating the peak flow error of a typical historical flood reconstruction with the model HEC-RAS and of providing a quick, simple uncertainty assessment that an end user could easily apply, the uncertainty of the reconstructed peak flow of a major flood in the Ebro River (NE Iberian Peninsula) was calculated with a set of local sensitivity analyses on six input variables. The peak flow total error was estimated at ±31%, and water height was found to be the most influential variable on peak flow, followed by Manning's n. However, the latter, due to its large uncertainty, was the greatest contributor to the peak flow total error. In addition, the HEC-RAS peak flow was compared to the ones obtained with the 2D model Iber and with Manning's equation; all three methods gave similar peak flows, with Manning's equation giving almost the same result as HEC-RAS. The main conclusion is that, to ensure the lowest peak flow error, the reliability and precision of the flood mark should be thoroughly assessed.
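
Manning's equation, used above as a cross-check on the HEC-RAS result, is straightforward to compute; the cross-section numbers below are hypothetical and not the Xerta reach data:

```python
import math

def manning_discharge(n, area, wetted_perimeter, slope):
    """Manning's equation in SI units: Q = (1/n) * A * R^(2/3) * S^(1/2), R = A/P."""
    r = area / wetted_perimeter  # hydraulic radius
    return area * r ** (2.0 / 3.0) * math.sqrt(slope) / n

# Hypothetical large-river cross-section at the flood mark
q = manning_discharge(n=0.035, area=2500.0, wetted_perimeter=320.0, slope=0.0005)

# Local sensitivity to roughness: Q scales as 1/n, so a +10% error in n
# lowers Q by about 9%, illustrating why n dominates the peak flow error.
q_high_n = manning_discharge(n=0.0385, area=2500.0, wetted_perimeter=320.0, slope=0.0005)
```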

  5. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Biros, George [Univ. of Texas, Austin, TX (United States)

    2018-01-12

    Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model as well as the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is construction of likelihood functions that capture unmodeled couplings between observations and parameters. We will create parallel algorithms for non-parametric density estimation using high dimensional N-body methods and combine them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a

  6. Impacts of tides on tsunami propagation due to potential Nankai Trough earthquakes in the Seto Inland Sea, Japan

    Science.gov (United States)

    Lee, Han Soo; Shimoyama, Tomohisa; Popinet, Stéphane

    2015-10-01

    The impacts of tides on extreme tsunami propagation due to potential Nankai Trough earthquakes in the Seto Inland Sea (SIS), Japan, are investigated through numerical experiments. Tsunami experiments are conducted based on five scenarios that consider tides at four different phases: flood, high, ebb, and low tides. The probes that were selected arbitrarily in the Bungo and Kii Channels show less significant effects of tides on tsunami heights and the arrival times of the first waves than those that experience large tidal ranges in the inner basins and bays of the SIS. For instance, the maximum tsunami height and the arrival time at Toyomaesi differ by more than 0.5 m and nearly 1 h, respectively, depending on the tidal phase. The uncertainties, defined in terms of the calculated maximum tsunami heights due to tides, illustrate that the calculated maximum tsunami heights in the inner SIS with standing tides have much larger uncertainties than those of the two channels with propagating tides. Particularly in Harima Nada, the uncertainties due to the impacts of tides are greater than 50% of the tsunami heights without tidal interaction. The results recommend simulating tsunamis together with tides in shallow water environments to reduce the uncertainties involved in tsunami modeling and predictions for tsunami hazard preparedness. This article was corrected on 26 OCT 2015. See the end of the full text for details.

  7. Impacts of Korea's Exchange Rate Uncertainty on Exports

    Directory of Open Access Journals (Sweden)

    Kwon Sik Kim

    2003-12-01

    This paper examines the effects of two types of uncertainty related to the real effective exchange rate (REER) in Korea on export trends. To decompose the uncertainty into two types of component, I propose an advanced generalized Markov switching model, as developed by Hamilton (1989) and then expanded by Kim and Kim (1996). The proposed model is useful in uncovering two sources of uncertainty: the permanent component of the REER and the purely transitory component. The two types of uncertainty have different effects on export trends in Korea: the transitory component of the REER has no effect on the export trend at 5-percent significance, but the permanent component has an effect at this level. In addition, the degree of uncertainty, consisting of low, medium and high uncertainty in the permanent component and low, medium and high uncertainty in the transitory component of the REER, also has different effects on export trends in Korea; only high uncertainty in the permanent component affects export trends. The results show that when the policy authority intends to prevent the shrinkage of exports due to deepening uncertainties in the foreign exchange market, the economic impacts of its intervention could differ according to the characteristics and degree of the uncertainties. They therefore imply that economic measures which fail to properly identify the sources of uncertainty may even bring economic costs.

  8. Optimization Under Uncertainty for Wake Steering Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-08-03

    Offsetting turbines' yaw orientations from the incoming wind is a powerful tool that may be leveraged to reduce undesirable wake effects on downstream turbines. First, we examine a simple two-turbine case to gain intuition as to how inflow direction uncertainty affects the optimal solution. The turbines are modeled with unidirectional inflow such that one turbine directly wakes the other, using ten rotor diameters of spacing. We perform optimization under uncertainty (OUU) via a parameter sweep of the front turbine; the OUU solution generally prefers less steering. We then perform this optimization for a 60-turbine wind farm with unidirectional inflow, varying the degree of inflow uncertainty and approaching this OUU problem by nesting a polynomial chaos expansion uncertainty quantification routine within an outer optimization. We examine how different levels of uncertainty in the inflow direction affect the ratio of the expected values of the deterministic and OUU steering solutions for the large wind farm, assuming the directional uncertainty used to reach the OUU solution (this ratio is defined as the value of the stochastic solution, or VSS).

  9. Resolving uncertainty in chemical speciation determinations

    Science.gov (United States)

    Smith, D. Scott; Adams, Nicholas W. H.; Kramer, James R.

    1999-10-01

    Speciation determinations involve uncertainty in system definition and experimentation. Identification of appropriate metals and ligands from basic chemical principles, analytical window considerations, types of species and checking for consistency in equilibrium calculations are considered in system definition uncertainty. A systematic approach to system definition limits uncertainty in speciation investigations. Experimental uncertainty is discussed with an example of proton interactions with Suwannee River fulvic acid (SRFA). A Monte Carlo approach was used to estimate uncertainty in experimental data, resulting from the propagation of uncertainties in electrode calibration parameters and experimental data points. Monte Carlo simulations revealed large uncertainties at high (>9-10) and low pH when the data were modelled with monoprotic ligands. Least-squares fit the data with 21 sites, whereas linear programming fit the data equally well with 9 sites. Multiresponse fitting, involving simultaneous fluorescence and pH measurements, improved model discrimination. Deconvolution of the excitation versus emission fluorescence surface for SRFA establishes a minimum of five sites. Diprotic sites are also required for the five fluorescent sites, and one non-fluorescent monoprotic site was added to accommodate the pH data. Consistent with its greater complexity, the multiresponse method had broader confidence limits than the uniresponse methods, but corresponded better with the accepted total carboxylic content for SRFA. Overall there was a 40% standard deviation in total carboxylic content for the multiresponse fitting, versus 10% and 1% for least-squares and linear programming, respectively.
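
The Monte Carlo propagation of electrode-calibration uncertainty described above can be sketched as follows (the calibration model and all numbers are hypothetical stand-ins, not the SRFA data):

```python
import numpy as np

# pH is recovered from a measured potential E via the linear calibration
# E = E0 + s * pH; both calibration parameters and E carry uncertainty.
rng = np.random.default_rng(3)
n = 50_000
e0 = rng.normal(400.0, 1.0, n)       # intercept, mV, with its standard uncertainty
s = rng.normal(-59.2, 0.3, n)        # slope, mV per pH unit
e_meas = rng.normal(104.0, 0.5, n)   # one measured potential, mV

ph = (e_meas - e0) / s               # all three uncertainties propagate at once
ph_mean, ph_u = ph.mean(), ph.std(ddof=1)
```

Sampling the calibration parameters and the data point together, rather than propagating each analytically, is what lets the Monte Carlo approach expose where in the titration curve the uncertainty balloons.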

  10. Construction strategies and lifetime uncertainties for nuclear projects: A real option analysis

    International Nuclear Information System (INIS)

    Jain, Shashi; Roelofs, Ferry; Oosterlee, Cornelis W.

    2013-01-01

    Highlights: • Real options can be used to value flexibility of modular reactors. • Value of NPPs increases with implementation of long term cost reductions. • Levels of uncertainties affect the choice between projects. -- Abstract: Small and medium sized reactors, SMRs (according to IAEA, ‘small’ are reactors with power less than 300 MWe, and ‘medium’ with power less than 700 MWe) are considered as an attractive option for investment in nuclear power plants. SMRs may benefit from flexibility of investment, reduced upfront expenditure, and easy integration with small sized grids. Large reactors on the other hand have been an attractive option due to economy of scale. In this paper we focus on the advantages of flexibility due to modular construction of SMRs. Using real option analysis (ROA) we help a utility determine the value of sequential modular SMRs. Numerical results under different considerations, like possibility of rare events, learning, uncertain lifetimes are reported for a single large unit and modular SMRs

  11. Construction strategies and lifetime uncertainties for nuclear projects: A real option analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Shashi, E-mail: s.jain@cwi.nl [TU Delft, Delft Institute of Applied Mathematics, Delft (Netherlands); Nuclear Research Group, Petten (Netherlands); Roelofs, Ferry, E-mail: roelofs@nrg.eu [Nuclear Research Group, Petten (Netherlands); Oosterlee, Cornelis W., E-mail: c.w.oosterlee@cwi.nl [CWI – Centrum Wiskunde and Informatica, Amsterdam (Netherlands); TU Delft, Delft Institute of Applied Mathematics, Delft (Netherlands)

    2013-12-15

    Highlights: • Real options can be used to value flexibility of modular reactors. • Value of NPPs increases with implementation of long term cost reductions. • Levels of uncertainties affect the choice between projects. -- Abstract: Small and medium sized reactors, SMRs (according to IAEA, ‘small’ are reactors with power less than 300 MWe, and ‘medium’ with power less than 700 MWe) are considered as an attractive option for investment in nuclear power plants. SMRs may benefit from flexibility of investment, reduced upfront expenditure, and easy integration with small sized grids. Large reactors on the other hand have been an attractive option due to economy of scale. In this paper we focus on the advantages of flexibility due to modular construction of SMRs. Using real option analysis (ROA) we help a utility determine the value of sequential modular SMRs. Numerical results under different considerations, like possibility of rare events, learning, uncertain lifetimes are reported for a single large unit and modular SMRs.
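The core real-option idea in these two records, that staged modular construction lets a utility defer the next module until uncertainty resolves, can be illustrated with a deliberately simple one-period model. All figures below (prices, costs, output, discount rate) are hypothetical, and this is not the ROA model of the paper:

```python
def annuity_factor(years, r):
    """Present value of 1 per year for `years` years at discount rate r."""
    return (1 - (1 + r) ** -years) / r

def module_npv(price, opex, output_mwh, capex, years, r):
    """NPV of one module at a given electricity price ($/MWh)."""
    return (price - opex) * output_mwh * annuity_factor(years, r) - capex

def committed_value(prices, probs, **kw):
    """Commit to the second module now, whatever the price turns out to be."""
    return sum(p * module_npv(s, **kw) for s, p in zip(prices, probs)) / (1 + kw["r"])

def flexible_value(prices, probs, **kw):
    """Real option: build the second module only in states where its NPV > 0."""
    return sum(p * max(module_npv(s, **kw), 0.0)
               for s, p in zip(prices, probs)) / (1 + kw["r"])

# Hypothetical two-state price lottery: 80 or 40 $/MWh with equal probability.
kw = dict(opex=20.0, output_mwh=2.0e6, capex=1.5e9, years=40, r=0.07)
v_commit = committed_value([80.0, 40.0], [0.5, 0.5], **kw)
v_flex = flexible_value([80.0, 40.0], [0.5, 0.5], **kw)
option_premium = v_flex - v_commit
```

The gap between the flexible and committed values is the value of the modular construction option; in the paper this logic is extended to many modules, rare events, learning effects, and uncertain lifetimes.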

  12. Evaluation of uncertainty of adaptive radiation therapy

    International Nuclear Information System (INIS)

    Garcia Molla, R.; Gomez Martin, C.; Vidueira, L.; Juan-Senabre, X.; Garcia Gomez, R.

    2013-01-01

This work is part of the tests to be performed for its acceptance into clinical practice. The uncertainties of adaptive radiotherapy, around which the study is structured, can be divided into two large parts: dosimetry in the CBCT and RDI. At each stage their uncertainties are quantified, and from the total a level of action may be obtained from which it would be reasonable to adapt the plan. (Author)

  13. Dosimetric uncertainty in prostate cancer proton radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Lin Liyong; Vargas, Carlos; Hsi Wen; Indelicato, Daniel; Slopsema, Roelf; Li Zuofeng; Yeung, Daniel; Horne, Dave; Palta, Jatinder [University of Florida Proton Therapy Institute, Jacksonville, Florida 32206 (United States)

    2008-11-15

    option producing a 2 mm sharper penumbra at the isocenter can reduce the magnitude of maximal doses to the RW by 2% compared to the alternate option utilizing the same block margin of 7 mm. The dose to 0.1 cc of the femoral head on the distal side of the lateral-posterior oblique beam is increased by 25 CGE for a patient with 25 cc of rectal gas. Conclusion: Variation in the rectal and bladder wall DVHs due to uncertainty in the position of the organs relative to the location of sharp dose falloff gradients should be accounted for when evaluating treatment plans. The proton beam delivery option producing a sharper penumbra reduces maximal doses to the rectal wall. Lateral-posterior oblique beams should be avoided in patients prone to develop a large amount of rectal gas.

  14. Dosimetric uncertainty in prostate cancer proton radiotherapy.

    Science.gov (United States)

    Lin, Liyong; Vargas, Carlos; Hsi, Wen; Indelicato, Daniel; Slopsema, Roelf; Li, Zuofeng; Yeung, Daniel; Horne, Dave; Palta, Jatinder

    2008-11-01

    magnitude of maximal doses to the RW by 2% compared to the alternate option utilizing the same block margin of 7 mm. The dose to 0.1 cc of the femoral head on the distal side of the lateral-posterior oblique beam is increased by 25 CGE for a patient with 25 cc of rectal gas. Variation in the rectal and bladder wall DVHs due to uncertainty in the position of the organs relative to the location of sharp dose falloff gradients should be accounted for when evaluating treatment plans. The proton beam delivery option producing a sharper penumbra reduces maximal doses to the rectal wall. Lateral-posterior oblique beams should be avoided in patients prone to develop a large amount of rectal gas.

  15. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Directory of Open Access Journals (Sweden)

    Villemereuil Pierre de

    2012-06-01

Full Text Available Abstract Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible

  16. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  17. Quantifying the uncertainty in heritability.

    Science.gov (United States)

    Furlotte, Nicholas A; Heckerman, David; Lippert, Christoph

    2014-05-01

    The use of mixed models to determine narrow-sense heritability and related quantities such as SNP heritability has received much recent attention. Less attention has been paid to the inherent variability in these estimates. One approach for quantifying variability in estimates of heritability is a frequentist approach, in which heritability is estimated using maximum likelihood and its variance is quantified through an asymptotic normal approximation. An alternative approach is to quantify the uncertainty in heritability through its Bayesian posterior distribution. In this paper, we develop the latter approach, make it computationally efficient and compare it to the frequentist approach. We show theoretically that, for a sufficiently large sample size and intermediate values of heritability, the two approaches provide similar results. Using the Atherosclerosis Risk in Communities cohort, we show empirically that the two approaches can give different results and that the variance/uncertainty can remain large.
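The Bayesian alternative sketched in this abstract can be illustrated with a simple grid approximation for a one-component variance model. The flat prior, the profiled-out scale parameter, and the simulated kinship matrix in the test are simplifying assumptions, not the authors' efficient implementation:

```python
import numpy as np

def h2_posterior(y, K, grid=None):
    """Grid approximation to the posterior of heritability h2 under
    y ~ N(0, s2 * (h2*K + (1 - h2)*I)), flat prior on h2, with the
    overall scale s2 profiled out at its conditional MLE."""
    if grid is None:
        grid = np.linspace(0.01, 0.99, 99)
    n = len(y)
    lam, U = np.linalg.eigh(K)            # K = U diag(lam) U^T
    yt = U.T @ y                          # rotate once; V(h2) is diagonal here
    loglik = np.empty(len(grid))
    for i, h2 in enumerate(grid):
        d = h2 * lam + (1.0 - h2)         # eigenvalues of h2*K + (1-h2)*I
        s2_hat = np.sum(yt ** 2 / d) / n  # profiled scale parameter
        loglik[i] = -0.5 * (n * np.log(s2_hat) + np.sum(np.log(d)) + n)
    w = np.exp(loglik - loglik.max())     # flat prior: posterior ~ likelihood
    return grid, w / w.sum()
```

The posterior width read off this grid is exactly the "uncertainty in heritability" the paper contrasts with the asymptotic-normal frequentist interval.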

  18. Compilation of information on uncertainties involved in deposition modeling

    International Nuclear Information System (INIS)

    Lewellen, W.S.; Varma, A.K.; Sheng, Y.P.

    1985-04-01

    The current generation of dispersion models contains very simple parameterizations of deposition processes. The analysis here looks at the physical mechanisms governing these processes in an attempt to see if more valid parameterizations are available and what level of uncertainty is involved in either these simple parameterizations or any more advanced parameterization. The report is composed of three parts. The first, on dry deposition model sensitivity, provides an estimate of the uncertainty existing in current estimates of the deposition velocity due to uncertainties in independent variables such as meteorological stability, particle size, surface chemical reactivity and canopy structure. The range of uncertainty estimated for an appropriate dry deposition velocity for a plume generated by a nuclear power plant accident is three orders of magnitude. The second part discusses the uncertainties involved in precipitation scavenging rates for effluents resulting from a nuclear reactor accident. The conclusion is that major uncertainties are involved both as a result of the natural variability of the atmospheric precipitation process and due to our incomplete understanding of the underlying process. The third part involves a review of the important problems associated with modeling the interaction between the atmosphere and a forest. It gives an indication of the magnitude of the problem involved in modeling dry deposition in such environments. Separate analytics have been done for each section and are contained in the EDB

  19. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
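The SROM idea of replacing a Monte Carlo loop with a handful of weighted deterministic model runs can be sketched as below. This is a plain least-squares toy, not the constrained-optimization formulation in the SROM literature, and the target distribution and model are hypothetical:

```python
import numpy as np

def fit_srom(samples, m=7):
    """Fit a crude stochastic reduced order model: m support points x_k with
    weights p_k chosen so the weighted points reproduce the mean, second
    moment, and a few CDF values of a large Monte Carlo sample."""
    x = np.quantile(samples, np.linspace(0.05, 0.95, m))   # support points
    A = [x, x ** 2]                                        # moment conditions
    b = [samples.mean(), np.mean(samples ** 2)]
    for t in np.quantile(samples, [0.25, 0.5, 0.75]):      # CDF conditions
        A.append((x <= t).astype(float))
        b.append(np.mean(samples <= t))
    A.append(np.ones(m))                                   # weights sum to 1
    b.append(1.0)
    p, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    p = np.clip(p, 0.0, None)
    return x, p / p.sum()                                  # enforce feasibility

def propagate(model, x, p):
    """Replace a full Monte Carlo loop with m deterministic model calls."""
    y = np.array([model(xi) for xi in x])
    mean = float(np.sum(p * y))
    var = float(np.sum(p * (y - mean) ** 2))
    return mean, var
```

In the topology-optimization setting of the paper, `model` would be the deterministic finite element/optimization call, so the stochastic problem costs only m independent deterministic solves.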

  20. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    Energy Technology Data Exchange (ETDEWEB)

    Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.

    1996-09-01

In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead, simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come to focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root

  1. Uncertainty quantification in flood risk assessment

    Science.gov (United States)

    Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto

    2017-04-01

    Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.

  2. An Adaptation Dilemma Caused by Impacts-Modeling Uncertainty

    Science.gov (United States)

    Frieler, K.; Müller, C.; Elliott, J. W.; Heinke, J.; Arneth, A.; Bierkens, M. F.; Ciais, P.; Clark, D. H.; Deryng, D.; Doll, P. M.; Falloon, P.; Fekete, B. M.; Folberth, C.; Friend, A. D.; Gosling, S. N.; Haddeland, I.; Khabarov, N.; Lomas, M. R.; Masaki, Y.; Nishina, K.; Neumann, K.; Oki, T.; Pavlick, R.; Ruane, A. C.; Schmid, E.; Schmitz, C.; Stacke, T.; Stehfest, E.; Tang, Q.; Wisser, D.

    2013-12-01

Ensuring future well-being for a growing population under either strong climate change or an aggressive mitigation strategy requires a subtle balance of potentially conflicting response measures. In the case of competing goals, uncertainty in impact estimates plays a central role when high confidence in achieving a primary objective (such as food security) directly implies an increased probability of uncertainty induced failure with regard to a competing target (such as climate protection). We use cross sectoral consistent multi-impact model simulations from the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP, www.isi-mip.org) to illustrate this uncertainty dilemma: RCP projections from 7 global crop, 11 hydrological, and 7 biomes models are combined to analyze irrigation and land use changes as possible responses to climate change and increasing crop demand due to population growth and economic development. We show that - while a no-regrets option with regard to climate protection - additional irrigation alone is not expected to balance the demand increase by 2050. In contrast, a strong expansion of cultivated land closes the projected production-demand gap in some crop models. However, it comes at the expense of a loss of natural carbon sinks of order 50%. Given the large uncertainty of state of the art crop model projections even these strong land use changes would not bring us 'on the safe side' with respect to food supply. In a world where increasing carbon emissions continue to shrink the overall solution space, we demonstrate that current impacts-modeling uncertainty is a luxury we cannot afford. ISI-MIP is intended to provide cross sectoral consistent impact projections for model intercomparison and improvement as well as cross-sectoral integration. The results presented here were generated within the first Fast-Track phase of the project covering global impact projections. The second phase will also include regional projections.
It is the aim

  3. Uncertainty and validation. Effect of model complexity on uncertainty estimates

    International Nuclear Information System (INIS)

    Elert, M.

    1996-09-01

In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models needed to consider all aspects of radionuclide transport in a soil with a variable hydrology are often impractical to use in safety assessments. Instead, simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come to focus in this study: There are large differences in the predicted soil hydrology and as a consequence also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration.
The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root

  4. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

…Volume-delay functions express travel time as a function of traffic flows and the theoretical capacity of the modeled facility. The U.S. Bureau of Public Roads (BPR) formula is one of the most extensively applied volume-delay functions in practice. This study investigated uncertainty in the BPR parameters. Initially… …-stage Danish national transport model. The results clearly highlight the importance to modeling purposes of taking into account BPR formula parameter uncertainty, expressed as a distribution of values rather than assumed point values. Indeed, the model output demonstrates a noticeable sensitivity to parameter…
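The BPR volume-delay function referenced in this record is t = t0 * (1 + alpha * (v/c)^beta), and treating alpha and beta as distributions rather than point values is a one-line change. The uniform ranges below are purely illustrative, not the distributions calibrated in the study:

```python
import random

def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4.0):
    """BPR volume-delay function: t = t0 * (1 + alpha * (v/c)**beta),
    with the conventional default parameter values alpha=0.15, beta=4."""
    return t0 * (1.0 + alpha * (volume / capacity) ** beta)

def travel_time_under_uncertainty(t0, volume, capacity, n=20_000):
    """Sample alpha and beta from (illustrative) distributions instead of
    using point values, and summarize the resulting travel-time spread."""
    times = []
    for _ in range(n):
        alpha = random.uniform(0.10, 0.20)
        beta = random.uniform(3.0, 5.0)
        times.append(bpr_travel_time(t0, volume, capacity, alpha, beta))
    times.sort()
    return times[n // 2], times[int(0.95 * n)]  # median and 95th percentile
```

Feeding the sampled link times back into assignment is what makes the full model output sensitive to the parameter distributions, as the abstract notes.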

  5. Uncertainties of flood frequency estimation approaches based on continuous simulation using data resampling

    Science.gov (United States)

    Arnaud, Patrick; Cantet, Philippe; Odry, Jean

    2017-11-01

    Flood frequency analyses (FFAs) are needed for flood risk management. Many methods exist ranging from classical purely statistical approaches to more complex approaches based on process simulation. The results of these methods are associated with uncertainties that are sometimes difficult to estimate due to the complexity of the approaches or the number of parameters, especially for process simulation. This is the case of the simulation-based FFA approach called SHYREG presented in this paper, in which a rainfall generator is coupled with a simple rainfall-runoff model in an attempt to estimate the uncertainties due to the estimation of the seven parameters needed to estimate flood frequencies. The six parameters of the rainfall generator are mean values, so their theoretical distribution is known and can be used to estimate the generator uncertainties. In contrast, the theoretical distribution of the single hydrological model parameter is unknown; consequently, a bootstrap method is applied to estimate the calibration uncertainties. The propagation of uncertainty from the rainfall generator to the hydrological model is also taken into account. This method is applied to 1112 basins throughout France. Uncertainties coming from the SHYREG method and from purely statistical approaches are compared, and the results are discussed according to the length of the recorded observations, basin size and basin location. Uncertainties of the SHYREG method decrease as the basin size increases or as the length of the recorded flow increases. Moreover, the results show that the confidence intervals of the SHYREG method are relatively small despite the complexity of the method and the number of parameters (seven). This is due to the stability of the parameters and takes into account the dependence of uncertainties due to the rainfall model and the hydrological calibration. Indeed, the uncertainties on the flow quantiles are on the same order of magnitude as those associated with
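The bootstrap step used for the hydrological parameter whose theoretical distribution is unknown can be sketched generically: resample the observed record with replacement, recalibrate, and take percentile confidence limits. The calibration routine below is a placeholder (a simple sample mean); in the paper it would be the rainfall-runoff model calibration:

```python
import random
import statistics

def bootstrap_parameter_ci(observations, calibrate, n_boot=1000, alpha=0.05):
    """Percentile-bootstrap confidence interval for a calibrated parameter:
    resample the observed record with replacement, recalibrate on each
    resample, and take empirical quantiles of the estimates."""
    estimates = []
    for _ in range(n_boot):
        resample = [random.choice(observations) for _ in observations]
        estimates.append(calibrate(resample))
    estimates.sort()
    lo = estimates[int(n_boot * alpha / 2)]
    hi = estimates[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

Propagating each bootstrap estimate through the rainfall generator and hydrological model, as SHYREG does, then turns this parameter interval into uncertainty bands on the flood quantiles.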

  6. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

Highlights: ► Two types of uncertainty methods for k_eff Monte Carlo computations are examined. ► Sampling method has the least restrictions on perturbation but requires more computing resources. ► Analytical method is limited to small perturbation on material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (k_eff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of a substantial impact of the administrative margin of subcriticality on economics and safety of nuclear fuel cycle operations, recently increasing interest in reducing the administrative margin of subcriticality makes the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular k_eff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the k_eff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.

  7. Nuclear data sensitivity and uncertainty for the Canadian supercritical water-cooled reactor II: Full core analysis

    International Nuclear Information System (INIS)

    Langton, S.E.; Buijs, A.; Pencer, J.

    2015-01-01

Highlights: • H-2, Pu-239, and Th-232 make large contributions to SCWR modelling sensitivity. • H-2, Pu-239, and Th-232 make large contributions to SCWR modelling uncertainty. • Isotopes of Zr make large contributions to SCWR modelling uncertainty. - Abstract: Uncertainties in nuclear data are a fundamental source of uncertainty in reactor physics calculations. To determine their contribution to uncertainties in calculated reactor physics parameters, a nuclear data sensitivity and uncertainty study is performed on the Canadian supercritical water reactor (SCWR) concept. The nuclear data uncertainty contributions to the neutron multiplication factor k_eff are 6.31 mk for the SCWR at the beginning of cycle (BOC) and 6.99 mk at the end of cycle (EOC). Both of these uncertainties have a statistical uncertainty of 0.02 mk. The nuclear data uncertainty contributions to Coolant Void Reactivity (CVR) are 1.0 mk and 0.9 mk for BOC and EOC, respectively, both with statistical uncertainties of 0.1 mk. The nuclear data uncertainty contributions to other reactivity parameters range from as low as 3% to as high as ten times the values of the reactivity coefficients. The largest contributors to the uncertainties in the reactor physics parameters are Pu-239, Th-232, H-2, and isotopes of zirconium.

  8. BEPU methods and combining of uncertainties

    International Nuclear Information System (INIS)

    Prosek, A.; Mavko, B.

    2004-01-01

After approval of the revised rule on the acceptance of emergency core cooling system (ECCS) performance in 1988 there has been significant interest in the development of codes and methodologies for best-estimate loss-of-coolant accident (LOCA) analyses. The Code Scaling, Applicability and Uncertainty (CSAU) evaluation method was developed and demonstrated for large-break (LB) LOCA in a pressurized water reactor. Later several new best estimate plus uncertainty (BEPU) methods were developed in the world. The purpose of the paper is to identify and compare the statistical approaches of BEPU methods and present their important plant and licensing applications. The study showed that uncertainty analysis with random sampling of input parameters and the use of order statistics for desired tolerance limits of output parameters is today a commonly accepted approach. The existing BEPU methods seem mature enough, while future research may be focused on codes with internal assessment of uncertainty. (author)
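At first order, the order-statistics approach mentioned in this record reduces to Wilks' formula for the number of code runs: the largest of N runs bounds the desired quantile once 1 - coverage^N reaches the required confidence. A minimal sketch:

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of random code runs N such that the largest output
    bounds the `coverage` quantile with probability >= `confidence`
    (first-order, one-sided Wilks criterion): 1 - coverage**N >= confidence."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

runs_95_95 = wilks_sample_size()                  # the familiar 59-run rule
runs_95_99 = wilks_sample_size(confidence=0.99)   # tighter confidence needs more runs
```

The 95/95 result of 59 runs is why many BEPU analyses sample the input parameters exactly 59 times, independently of how many uncertain parameters there are.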

  9. Environmental impact and risk assessments and key factors contributing to the overall uncertainties

    International Nuclear Information System (INIS)

    Salbu, Brit

    2016-01-01

There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition or prior and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors will contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors are contributing to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines activity concentrations and atom ratios of radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms

  10. Nordic reference study on uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Hirschberg, S.; Jacobsson, P.; Pulkkinen, U.; Porn, K.

    1989-01-01

This paper provides a review of the first phase of the Nordic reference study on uncertainty and sensitivity analysis. The main objective of this study is to use experiences from previous Nordic Benchmark Exercises and reference studies concerning critical modeling issues such as common cause failures and human interactions, and to demonstrate the impact of associated uncertainties on the uncertainty of the investigated accident sequence. This has been done independently by three working groups which used different approaches to modeling and to uncertainty analysis. The estimated uncertainty interval for the analyzed accident sequence is large. Also the discrepancies between the groups are substantial but can be explained. Sensitivity analyses which have been carried out concern e.g. use of different CCF-quantification models, alternative handling of CCF-data, time windows for operator actions and time dependences in phased mission operation, impact of state-of-knowledge dependences and ranking of dominating uncertainty contributors. Specific findings with respect to these issues are summarized in the paper

  11. Displaying results of direct detection dark matter experiments free of astrophysical uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Ludwig [Max Planck Institut fuer Kernphysik, Heidelberg (Germany); Collaboration: Collaboration XENON 100

    2015-07-01

    A number of experiments attempt to measure WIMP interactions using different detector technologies and target elements; hence, energy thresholds and sensitivities to light or heavy WIMP masses differ. Moreover, due to large systematic uncertainties in the parameters defining the dark matter halo, a direct comparison of detectors is difficult. By mapping experimental results from the traditional cross section vs. dark matter mass parameter space into a dark-matter-halo-independent phase space, direct comparisons between experiments can be made. This is possible due to the monotonicity of the velocity integral, which makes it possible to combine all astrophysical assumptions into one parameter common to all experiments. In this talk the motivation as well as the mapping method are explained based on the XENON100 data.
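The monotonicity argument can be made concrete with a short equation (a hedged sketch of the standard halo-independent formalism, not drawn from this abstract): detector event rates depend on the halo velocity distribution f(v) only through the velocity integral

```latex
\eta(v_{\mathrm{min}}) \;=\; \int_{v_{\mathrm{min}}}^{\infty} \frac{f(v)}{v}\,\mathrm{d}v
```

Because f(v) is non-negative, eta is monotonically non-increasing in v_min, so each experiment's result can be recast as a bound on the single function eta(v_min) that is common to all experiments, independent of the assumed halo model.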

  12. On economic resolution and uncertainty in hydrocarbon exploration assessment

    International Nuclear Information System (INIS)

    Lerche, I.

    1998-01-01

    When the parameters of a decision tree for a hydrocarbon exploration project can lie within estimated ranges, it is shown that the ensemble average expected value has two sorts of uncertainty: one due to the expected value of each realization of the decision tree being different from the average, and a second due to the intrinsic variance of each decision tree. The total standard error of the average expected value combines both sorts. The use of additional statistical measures, such as standard error, volatility, and the cumulative probability of making a profit, provides insight into the selection process, leading to a more appropriate decision. In addition, the use of relative contributions and relative importance for the uncertainty measures guides one to a better determination of those parameters that dominantly influence the total ensemble uncertainty. In this way one can concentrate resources on efforts to minimize the uncertainty ranges of such dominant parameters. A numerical illustration is provided to indicate how such calculations can be performed simply with a hand calculator. (author)
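The two-component combination described above can be sketched in a few lines (a hedged illustration with made-up numbers, not the author's example): the between-realization spread of expected values and the average intrinsic variance add in quadrature to give the total standard error.

```python
import math

# Hypothetical decision-tree realizations: each has an expected value (ev)
# and an intrinsic variance (var) from its own chance nodes.
realizations = [
    {"ev": 12.0, "var": 4.0},
    {"ev": 8.0,  "var": 9.0},
    {"ev": 10.0, "var": 6.0},
]

n = len(realizations)
mean_ev = sum(r["ev"] for r in realizations) / n

# Component 1: spread of the realization EVs about the ensemble average.
var_between = sum((r["ev"] - mean_ev) ** 2 for r in realizations) / n

# Component 2: average intrinsic variance of the individual trees.
var_within = sum(r["var"] for r in realizations) / n

# Total standard error of the ensemble average expected value.
total_se = math.sqrt(var_between + var_within)
print(mean_ev, total_se)
```

As the abstract notes, the arithmetic is simple enough for a hand calculator; the code only makes the two variance components explicit.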

  13. Assessing the Roles of Regional Climate Uncertainty, Policy, and Economics on Future Risks to Water Stress: A Large-Ensemble Pilot Case for Southeast Asia

    Science.gov (United States)

    Schlosser, C. A.; Strzepek, K. M.; Gao, X.; Fant, C. W.; Blanc, E.; Monier, E.; Sokolov, A. P.; Paltsev, S.; Arndt, C.; Prinn, R. G.; Reilly, J. M.; Jacoby, H.

    2013-12-01

    The fate of natural and managed water resources is controlled to varying degrees by interlinked energy, agricultural, and environmental systems, as well as the hydro-climate cycles. The need for risk-based assessments of impacts and adaptation to regional change calls for likelihood quantification of outcomes via the representation of uncertainty - to the fullest extent possible. A hybrid approach of the MIT Integrated Global System Model (IGSM) framework provides probabilistic projections of regional climate change - generated in tandem with consistent socio-economic projections. A Water Resources System (WRS) then tracks water allocation and availability across these competing demands. As such, the IGSM-WRS is an integrated tool that provides quantitative insights on the risks and sustainability of water resources over large river basins. This pilot project focuses the IGSM-WRS on Southeast Asia (Figure 1). This region presents exceptional challenges toward sustainable water resources given its texture of basins that traverse and interconnect developing nations as well as large, ascending economies and populations - such as China and India. We employ the IGSM-WRS in a large ensemble of outcomes spanning hydro-climatic, economic, and policy uncertainties. For computational efficiency, a Gaussian Quadrature procedure sub-samples these outcomes (Figure 2). The IGSM-WRS impacts are quantified through frequency distributions of water stress changes. The results allow for interpretation of: the effects of policy measures; impacts on food production; and the value of design flexibility of infrastructure/institutions. An area of model development and exploration is the feedback of water-stress shocks to economic activity (i.e. GDP and land use). We discuss these results further (where possible), as well as other efforts to refine: uncertainty methods, greater basin-level and climate detail, and process-level representation of glacial melt-water sources.

  14. Quantifying and Reducing Curve-Fitting Uncertainty in Isc: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mark; Duck, Benjamin; Emery, Keith

    2015-09-28

    Current-voltage (I-V) curve measurements of photovoltaic (PV) devices are used to determine performance parameters and to establish traceable calibration chains. Measurement standards specify localized curve fitting methods, e.g., straight-line interpolation/extrapolation of the I-V curve points near the short-circuit current, Isc. By considering such fits as statistical linear regressions, uncertainties in the performance parameters are readily quantified. However, the legitimacy of such a computed uncertainty requires that the model be a valid (local) representation of the I-V curve and that the noise be sufficiently well characterized. Using more data points often has the advantage of lowering the uncertainty; however, because more data points can make the uncertainty in the fit arbitrarily small, this fit uncertainty misses the dominant residual uncertainty due to so-called model discrepancy. Using objective Bayesian linear regression for straight-line fits for Isc, we investigate an evidence-based method to automatically choose data windows of I-V points with reduced model discrepancy. We also investigate noise effects. Uncertainties, aligned with the Guide to the Expression of Uncertainty in Measurement (GUM), are quantified throughout.
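The basic regression step can be sketched as follows (a hedged ordinary-least-squares illustration with invented I-V points, not the paper's objective Bayesian method): a straight line I = a + b*V fitted to points near short circuit gives Isc as the intercept, with a standard uncertainty from the residual scatter.

```python
import math

# Hypothetical I-V points near short circuit (V in volts, I in amps);
# a straight-line fit I = a + b*V gives Isc as the intercept at V = 0.
V = [0.00, 0.05, 0.10, 0.15, 0.20]
I = [5.001, 4.998, 4.993, 4.990, 4.984]

n = len(V)
vbar = sum(V) / n
ibar = sum(I) / n
sxx = sum((v - vbar) ** 2 for v in V)
b = sum((v - vbar) * (i - ibar) for v, i in zip(V, I)) / sxx  # slope
a = ibar - b * vbar                                           # intercept = Isc

# Residual variance and the standard uncertainty of the intercept.
resid = [i - (a + b * v) for v, i in zip(V, I)]
s2 = sum(r ** 2 for r in resid) / (n - 2)
u_isc = math.sqrt(s2 * (1.0 / n + vbar ** 2 / sxx))

print(f"Isc = {a:.4f} A, u(Isc) = {u_isc:.5f} A")
```

This is the fit uncertainty the abstract warns about: it shrinks as points are added and says nothing about whether the straight-line model itself is adequate over the chosen window.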

  15. Measuring uncertainty in dose delivered to the cochlea due to setup error during external beam treatment of patients with cancer of the head and neck

    Energy Technology Data Exchange (ETDEWEB)

    Yan, M.; Lovelock, D.; Hunt, M.; Mechalakos, J.; Hu, Y.; Pham, H.; Jackson, A., E-mail: jacksona@mskcc.org [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York 10065 (United States)

    2013-12-15

    , the standard deviation of setup error was reduced by 31%, 42%, and 54% in the RL, AP, and SI directions, respectively, and consequently, the uncertainty of the mean dose to the cochlea was reduced by more than 50%. The authors estimate that the effects of these uncertainties on the probability of hearing loss for an individual patient could be as large as 10%.

  16. Uncertainties in Forecasting Streamflow using Entropy Theory

    Science.gov (United States)

    Cui, H.; Singh, V. P.

    2017-12-01

    Streamflow forecasting is essential in river restoration, reservoir operation, power generation, irrigation, navigation, and water management. However, uncertainties always accompany a forecast; they may affect the forecasting results and lead to large variations. Therefore, uncertainties must be considered and assessed properly when forecasting streamflow for water management. The aim of our work is to quantify the uncertainties involved in forecasting streamflow and to provide a reliable streamflow forecast. Although streamflow time series are stochastic, they exhibit seasonal and periodic patterns. Streamflow forecasting therefore entails modeling the seasonality, periodicity, and correlation structure, and assessing the uncertainties. This study applies entropy theory to forecast streamflow and to measure uncertainties during the forecasting process. To apply entropy theory for streamflow forecasting, spectral analysis is combined with time series analysis, as spectral analysis can be employed to characterize patterns of streamflow variation and identify the periodicity of streamflow. That is, it permits the extraction of significant information for understanding the streamflow process and its prediction. Application of entropy theory for streamflow forecasting involves determination of the spectral density, determination of parameters, and extension of the autocorrelation function. The uncertainties introduced by the precipitation input, the forecasting model, and the forecasted results are measured separately using entropy. Using information theory, we describe how these uncertainties are transported and aggregated during these processes.
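The use of entropy as an uncertainty measure can be illustrated in miniature (a hedged sketch with invented forecast errors, not the study's method): discretize the errors into bins and compute the Shannon entropy H = -sum p*log2(p); a tightly clustered error distribution has lower entropy, i.e. less forecast uncertainty.

```python
import math
from collections import Counter

# Shannon entropy of a binned sample: lower entropy means the values are
# more concentrated, i.e. the forecast is less uncertain.
def shannon_entropy(values, bin_width):
    bins = Counter(math.floor(v / bin_width) for v in values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in bins.values())

sharp_errors = [0.1, 0.2, 0.1, 0.15, 0.12, 0.18]   # tightly clustered
vague_errors = [0.1, 1.2, -0.9, 2.5, -1.7, 0.6]    # widely spread

h_sharp = shannon_entropy(sharp_errors, bin_width=0.5)
h_vague = shannon_entropy(vague_errors, bin_width=0.5)
print(h_sharp, h_vague)
```

Measuring the entropy of the precipitation input, the model output, and the final forecast separately, as the abstract describes, amounts to applying such a measure at each stage of the forecasting chain.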

  17. Uncertainty in BMP evaluation and optimization for watershed management

    Science.gov (United States)

    Chaubey, I.; Cibin, R.; Sudheer, K.; Her, Y.

    2012-12-01

    Use of computer simulation models has increased substantially to make watershed management decisions and to develop strategies for water quality improvements. These models are often used to evaluate the potential benefits of various best management practices (BMPs) for reducing losses of pollutants from source areas into receiving waterbodies. Similarly, use of simulation models in optimizing the selection and placement of best management practices under single (maximization of crop production or minimization of pollutant transport) and multiple objective functions has increased recently. One of the limitations of the currently available assessment and optimization approaches is that the BMP strategies are considered deterministic. Uncertainties in input data (e.g. measured precipitation, streamflow, sediment, nutrient and pesticide losses, land use) and model parameters may result in considerable uncertainty in watershed response under various BMP options. We have developed and evaluated options to include uncertainty in BMP evaluation and optimization for watershed management. We have also applied these methods to evaluate uncertainty in ecosystem services from mixed land use watersheds. In this presentation, we will discuss methods to quantify uncertainties in BMP assessment and optimization solutions due to uncertainties in model inputs and parameters. We have used a watershed model (Soil and Water Assessment Tool, or SWAT) to simulate the hydrology and water quality in a mixed land use watershed located in the Midwest USA. The SWAT model was also used to represent various BMPs in the watershed needed to improve water quality. SWAT model parameters, land use change parameters, and climate change parameters were considered uncertain. It was observed that model parameters, land use, and climate changes resulted in considerable uncertainties in BMP performance in reducing P, N, and sediment loads. In addition, climate change scenarios also affected uncertainties in SWAT

  18. Regime-dependent forecast uncertainty of convective precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Keil, Christian; Craig, George C. [Muenchen Univ. (Germany). Meteorologisches Inst.

    2011-04-15

    Forecast uncertainty of convective precipitation is influenced by all scales, but in different ways in different meteorological situations. Forecasts of the high resolution ensemble prediction system COSMO-DE-EPS of Deutscher Wetterdienst (DWD) are used to examine the dominant sources of uncertainty of convective precipitation. A validation with radar data using traditional as well as spatial verification measures highlights differences in precipitation forecast performance in differing weather regimes. When the forecast uncertainty can primarily be associated with local, small-scale processes, individual members run with the same variation of the physical parameterisation but driven by different global models outperform all other ensemble members. In contrast, when the precipitation is governed by the large-scale flow, all ensemble members perform similarly. Application of the convective adjustment time scale confirms this separation and shows a regime-dependent forecast uncertainty of convective precipitation. (orig.)

  19. Automated uncertainty analysis methods in the FRAP computer codes

    International Nuclear Information System (INIS)

    Peck, S.O.

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts
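The input-to-output propagation idea can be sketched generically (a hedged Monte Carlo illustration; the response function and numbers are invented stand-ins, not FRAP models): sample each uncertain input from its distribution, run the code on each sample, and summarize the spread of the output.

```python
import random
import statistics

random.seed(0)

def fuel_response(gap_conductance, power):
    # Stand-in for a code-calculated output (e.g. a peak temperature);
    # a real analysis would call the simulation code here.
    return 500.0 + 0.8 * power / gap_conductance

samples = []
for _ in range(10_000):
    g = random.gauss(1.0, 0.1)     # uncertain input 1 (10% uncertainty)
    p = random.gauss(200.0, 10.0)  # uncertain input 2 (5% uncertainty)
    samples.append(fuel_response(g, p))

mean = statistics.fmean(samples)
sd = statistics.pstdev(samples)
print(f"output: {mean:.1f} +/- {sd:.1f}")
```

For an expensive code, direct sampling like this is often replaced by response-surface or perturbation techniques, which is part of what makes automated uncertainty analysis of large codes non-trivial.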

  20. A new system to quantify uncertainties in LEO satellite position determination due to space weather events

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a new system for quantitative assessment of uncertainties in LEO satellite position caused by storm time changes in space environmental...

  1. Radiotherapy for breast cancer: respiratory and set-up uncertainties

    International Nuclear Information System (INIS)

    Saliou, M.G.; Giraud, P.; Simon, L.; Fournier-Bidoz, N.; Fourquet, A.; Dendale, R.; Rosenwald, J.C.; Cosset, J.M.

    2005-01-01

    Adjuvant radiotherapy has been shown to significantly reduce locoregional recurrence, but this advantage is associated with increased cardiovascular and pulmonary morbidity. All uncertainties inherent to conformal radiation therapy must be identified in order to increase the precision of treatment; misestimation of these uncertainties increases the potential risk of geometrical misses with, as a consequence, under-dosage of the tumor and/or overdosage of healthy tissues. Geometric uncertainties due to respiratory movements or set-up errors are well known. Two strategies have been proposed to limit their effect: quantification of these uncertainties, which are then taken into account in the final calculation of safety margins, and/or reduction of respiratory and set-up uncertainties by efficient immobilization or gating systems. Measured on portal films with two tangential fields, the CLD (central lung distance), defined as the distance between the deep field edge and the interior chest wall at the central axis, seems to be the best predictor of set-up uncertainties. Using the CLD, estimated mean set-up errors from the literature are 3.8 and 3.2 mm for the systematic and random errors, respectively. These depend partly on the type of immobilization device and could be reduced by the use of portal imaging systems. Furthermore, the breast is mobile during respiration, with motion amplitudes as high as 0.8 to 10 mm in the anteroposterior direction. Respiratory gating techniques, currently under evaluation, have the potential to reduce the effect of these movements. Each radiotherapy department should perform its own assessments and determine the geometric uncertainties with respect to the equipment used and its particular treatment practices. This paper is a review of the main geometric uncertainties in breast treatment, due to respiration and set-up, and of the solutions proposed to limit their impact. (author)

  2. Sensitivity and uncertainty analysis for fission product decay heat calculations

    International Nuclear Information System (INIS)

    Rebah, J.; Lee, Y.K.; Nimal, J.C.; Nimal, B.; Luneville, L.; Duchemin, B.

    1994-01-01

    The uncertainty in calculated decay heat due to the uncertainty in the basic nuclear data given in the CEA86 library is presented. Uncertainties in the summation calculation arise from several sources: fission product yields, half-lives, and average decay energies. The correlation between basic data is taken into account. The uncertainty analyses were performed for thermal-neutron-induced fission of U235 and Pu239 for the cases of burst fission and finite irradiation time. The decay heat calculated in this study is compared with experimental results and with a new calculation using the JEF2 library. (from authors) 6 figs., 19 refs
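Propagating correlated data uncertainties through a summation calculation follows the standard "sandwich rule" (a hedged sketch with invented numbers, not CEA86 values): the output variance is s^T C s, where s holds the sensitivities of the decay heat to the basic data and C is their covariance matrix, with correlations entering off-diagonal.

```python
import math

# Illustrative sensitivities of the decay heat to three basic-data items
# (e.g. a yield, a half-life, an average decay energy), and an illustrative
# relative covariance matrix with one correlated pair (entries 0 and 1).
s = [0.9, 0.5, 0.7]
C = [[0.0004, 0.0001, 0.0],
     [0.0001, 0.0009, 0.0],
     [0.0,    0.0,    0.0001]]

# Sandwich rule: var = s^T C s (off-diagonal terms carry the correlations).
var = sum(s[i] * C[i][j] * s[j] for i in range(3) for j in range(3))
rel_uncertainty = math.sqrt(var)
print(f"relative uncertainty: {rel_uncertainty:.4f}")
```

Neglecting the off-diagonal covariance terms would change the result, which is why the abstract stresses that correlations between basic data are taken into account.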

  3. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

    We propose a systematic framework for the selection of optimal processing pathways for a microalgae-based biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among...... and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution...... the accounting of uncertainty are compared with respect to different objectives. (C) 2015 Elsevier Ltd. All rights reserved....

  4. Managing Uncertainty for an Integrated Fishery

    Directory of Open Access Journals (Sweden)

    MB Hasan

    2012-06-01

    This paper investigates ways to deal with the uncertainties in fishing trawler scheduling and production planning in a quota-based integrated commercial fishery. A commercial fishery faces uncertainty mainly from variation in catch rate, which may be due to weather and other environmental factors. The firm tries to manage this uncertainty through planned co-ordination of fishing trawler scheduling, catch quota, processing and labour allocation, and inventory control. Scheduling must necessarily be done over some finite planning horizon, and the trawler schedule itself introduces man-made variability, which in turn induces inventory in the processing plant. This induced inventory must be managed, complicated by the inability to plan easily beyond the current planning horizon. We develop a surprisingly simple innovation in inventory, which we have not seen in other papers on production management: requiring beginning inventory to equal ending inventory. This tool gives management a way to calculate a profit-maximizing safety stock that counteracts the man-made variability due to the trawler scheduling. We found that the variability of catch rate had virtually no effect on profitability with inventory. We report numerical results for several planning horizon models, based on data for a major New Zealand fishery.

  5. A Study on the uncertainty and sensitivity in numerical simulation of parametric roll

    DEFF Research Database (Denmark)

    Choi, Ju-hyuck; Nielsen, Ulrik Dam; Jensen, Jørgen Juncher

    2016-01-01

    Uncertainties related to numerical modelling of parametric roll have been investigated by using a 6-DOFs model with nonlinear damping and roll restoring forces. At first, uncertainty on damping coefficients and its effect on the roll response is evaluated. Secondly, uncertainty due to the “effect...

  6. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization. • Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation, • Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets, • Biomedical visualization is a vast field with select subtopics addressed, from scanning methodologies to structural applications to biological applications, • Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms. Scientific Visualization will be useful to practitioners of scientific visualization, ...

  7. Uncertainties in Safety Analysis. A literature review

    International Nuclear Information System (INIS)

    Ekberg, C.

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs

  8. Uncertainties in Safety Analysis. A literature review

    Energy Technology Data Exchange (ETDEWEB)

    Ekberg, C [Chalmers Univ. of Technology, Goeteborg (Sweden). Dept. of Nuclear Chemistry

    1995-05-01

    The purpose of the presented work has been to give a short summary of the origins of many uncertainties arising in the design and performance assessment of a repository for spent nuclear fuel. Some different methods to treat these uncertainties are also included. The methods and conclusions are in many cases general, in the sense that they are applicable to many other disciplines where simulations are used. As a conclusion it may be noted that uncertainties of different origins have been discussed and debated, but one large group, e.g. computer simulations, for which methods for a more explicit investigation exist, has not been investigated in a satisfying way. 50 refs.

  9. All wind farm uncertainty is not the same: The economics of common versus independent causes

    International Nuclear Information System (INIS)

    Veers, P.S.

    1994-01-01

    There is uncertainty in the performance of wind energy installations due to unknowns in the local wind environment, machine response to the environment, and the durability of materials. Some of the unknowns are inherently independent from machine to machine while other uncertainties are common to the entire fleet equally. The FAROW computer software for fatigue and reliability of wind turbines is used to calculate the probability of component failure due to a combination of all sources of uncertainty. Although the total probability of component failure due to all effects is sometimes interpreted as the percentage of components likely to fail, this perception is often far from correct. Different amounts of common versus independent uncertainty are reflected in economic risk due to either high probabilities that a small percentage of the fleet will experience problems or low probabilities that the entire fleet will have problems. The average, or expected cost is the same as would be calculated by combining all sources of uncertainty, but the risk to the fleet may be quite different in nature. Present values of replacement costs are compared for two examples reflecting different stages in the design and development process. Results emphasize that an engineering effort to test and evaluate the design assumptions is necessary to advance a design from the high uncertainty of the conceptual stages to the lower uncertainty of a well engineered and tested machine
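The distinction the abstract draws can be demonstrated with a small Monte Carlo sketch (hedged, with invented probabilities; this is not the FAROW calculation): with the same per-machine failure probability, independent uncertainty produces a stable fraction of failures per fleet, while common (shared) uncertainty makes whole-fleet outcomes swing together, so the expected cost is the same but the risk profile differs.

```python
import random
import statistics

random.seed(1)
FLEET, TRIALS, P = 100, 2000, 0.1

def fleet_failures(shared):
    if shared:
        # One shared draw decides a fleet-wide outcome (common uncertainty).
        return FLEET if random.random() < P else 0
    # Independent draws per machine (independent uncertainty).
    return sum(random.random() < P for _ in range(FLEET))

indep = [fleet_failures(False) for _ in range(TRIALS)]
common = [fleet_failures(True) for _ in range(TRIALS)]

# Expected failures per fleet are similar...
print(statistics.fmean(indep), statistics.fmean(common))
# ...but the spread (the fleet-level risk) differs sharply.
print(statistics.pstdev(indep), statistics.pstdev(common))
```

The independent case yields roughly 10 failures every time, whereas the common case yields either zero or the whole fleet, matching the abstract's point that the same expected cost can hide very different risks.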

  10. Uncertainty analysis of neutron transport calculation

    International Nuclear Information System (INIS)

    Oka, Y.; Furuta, K.; Kondo, S.

    1987-01-01

    A cross section sensitivity-uncertainty analysis code, SUSD, was developed. The code calculates sensitivity coefficients for one- and two-dimensional transport problems based on first-order perturbation theory. The variance and standard deviation of detector responses or design parameters can be obtained using the cross section covariance matrix. The code is able to perform sensitivity-uncertainty analysis for the secondary neutron angular distribution (SAD) and the secondary neutron energy distribution (SED). Covariances of the 6Li and 7Li neutron cross sections in JENDL-3PR1 were evaluated, including SAD and SED. Covariances of Fe and Be were also evaluated. The uncertainty of the tritium breeding ratio, fast neutron leakage flux, and neutron heating was analysed for four types of blanket concepts for a commercial tokamak fusion reactor. The uncertainty of the tritium breeding ratio was less than 6 percent. Contributions from SAD/SED uncertainties are significant for some parameters. Formulas to estimate the errors of the numerical solution of the transport equation were derived based on perturbation theory. This method enables us to deterministically estimate the numerical errors due to the iterative solution, spatial discretization, and Legendre polynomial expansion of the transfer cross-sections. The calculational errors of the tritium breeding ratio and the fast neutron leakage flux of the fusion blankets were analysed. (author)

  11. Reducing uncertainty in dust monitoring to detect aeolian sediment transport responses to land cover change

    Science.gov (United States)

    Webb, N.; Chappell, A.; Van Zee, J.; Toledo, D.; Duniway, M.; Billings, B.; Tedela, N.

    2017-12-01

    Anthropogenic land use and land cover change (LULCC) influence global rates of wind erosion and dust emission, yet our understanding of the magnitude of the responses remains poor. Field measurements and monitoring provide essential data to resolve aeolian sediment transport patterns and assess the impacts of human land use and management intensity. Data collected in the field are also required for dust model calibration and testing, as models have become the primary tool for assessing LULCC-dust cycle interactions. However, there is considerable uncertainty in estimates of dust emission due to the spatial variability of sediment transport. Field sampling designs are currently rudimentary and considerable opportunities are available to reduce the uncertainty. Establishing the minimum detectable change is critical for measuring spatial and temporal patterns of sediment transport, detecting potential impacts of LULCC and land management, and for quantifying the uncertainty of dust model estimates. Here, we evaluate the effectiveness of common sampling designs (e.g., simple random sampling, systematic sampling) used to measure and monitor aeolian sediment transport rates. Using data from the US National Wind Erosion Research Network across diverse rangeland and cropland cover types, we demonstrate how only large changes in sediment mass flux (of the order of 200% to 800%) can be detected when small sample sizes are used, crude sampling designs are implemented, or the spatial variation is large. We then show how statistical rigour and the straightforward application of a sampling design can reduce the uncertainty and detect change in sediment transport over time and between land use and land cover types.
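The dependence of detectable change on sample size and variability can be sketched with a textbook-style two-sample formula (a hedged illustration, not the network's analysis): for a simple random sample, the minimum detectable change as a fraction of the mean scales roughly as z * CV * sqrt(2/n), where CV is the coefficient of variation.

```python
import math

# Approximate minimum detectable relative change for comparing two means,
# each estimated from n samples, at roughly 95% confidence (z = 1.96).
def min_detectable_change(cv, n, z=1.96):
    return z * cv * math.sqrt(2.0 / n)

for n in (3, 10, 30):
    for cv in (0.5, 2.0):   # moderate vs. highly variable transport
        print(n, cv, round(min_detectable_change(cv, n), 2))
```

With n = 3 and a CV of 2 (plausible for aeolian mass flux), only changes above roughly 300% are detectable, which is consistent with the 200% to 800% range the abstract reports for small samples and crude designs.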

  12. Remediation of the Faultless Underground Nuclear Test: Moving Forward in the Face of Model Uncertainty

    International Nuclear Information System (INIS)

    Chapman, J. B.; Pohlmann, K.; Pohll, G.; Hassan, A.; Sanders, P.; Sanchez, M.; Jaunarajs, S.

    2002-01-01

    parameter values and the additive effects of multiple sources of uncertainty. Ultimately, the question was whether new data collection would substantially reduce uncertainty in the model. A Data Decision Analysis (DDA) was performed to quantify uncertainty in the existing model and determine the most cost-beneficial activities for reducing uncertainty, if reduction was needed. The DDA indicated that though there is large uncertainty present in some model parameters, the overall uncertainty in the calculated contaminant boundary during the 1,000-year regulatory timeframe is relatively small. As a result, limited uncertainty reduction can be expected from expensive characterization activities. With these results, DOE and NDEP have determined that the site model is suitable for moving forward in the corrective action process. Key to this acceptance is acknowledgment that the model requires independent validation data and the site requires long-term monitoring. Developing the validation and monitoring plans, and calculating contaminant boundaries are the tasks now being pursued for the site. The significant progress made for the site is due to the close cooperation and communication of the parties involved and an acceptance and understanding of the role of uncertainty

  13. Do oil shocks predict economic policy uncertainty?

    Science.gov (United States)

    Rehman, Mobeen Ur

    2018-05-01

    Oil price fluctuations play an influential role in the global economic policies of developed as well as emerging countries. I investigate the effect of international oil prices, disintegrated into structural (i) oil supply shocks, (ii) aggregate demand shocks, and (iii) oil market specific demand shocks, based on the work of Kilian (2009) using a structural VAR framework, on the economic policy uncertainty of the sampled markets. Economic policy uncertainty, due to its non-linear behavior, is modeled in a regime switching framework with the disintegrated structural oil shocks. Our results highlight that Indian, Spanish, and Japanese economic policy uncertainty responds to global oil price shocks, whereas aggregate demand shocks fail to induce any change. Oil specific demand shocks are significant only for China and India in the high volatility state.

  14. Managing structural uncertainty in health economic decision models: a discrepancy approach

    OpenAIRE

    Strong, M.; Oakley, J.; Chilcott, J.

    2012-01-01

    Healthcare resource allocation decisions are commonly informed by computer model predictions of population mean costs and health effects. It is common to quantify the uncertainty in the prediction due to uncertain model inputs, but methods for quantifying uncertainty due to inadequacies in model structure are less well developed. We introduce an example of a model that aims to predict the costs and health effects of a physical activity promoting intervention. Our goal is to develop a framewor...

  15. Determination of uncertainties in energy and exergy analysis of a power plant

    International Nuclear Information System (INIS)

    Ege, Ahmet; Şahin, Hacı Mehmet

    2014-01-01

    Highlights: • Energy and exergy efficiency uncertainties in a large thermal power plant are examined. • Sensitivity analysis shows the importance of basic measurements in efficiency analysis. • A quick and practical approach is provided for determining efficiency uncertainties. • Extreme case analysis characterizes the maximum possible boundaries of the uncertainties. • Uncertainty determination in a plant is a dynamic process with real data. - Abstract: In this study, the energy and exergy efficiency uncertainties of a large-scale lignite-fired power plant cycle and various measurement parameter sensitivities were investigated for five different design power outputs (100%, 85%, 80%, 60% and 40%) with real data from the plant. For that purpose, a black box method was employed, considering the coal flow with its Lower Heating Value (LHV) as the single input and the electricity produced as the single output of the plant. The uncertainty of the energy and exergy efficiency of the plant was evaluated with this method by applying a sensitivity analysis of measurement parameters such as the LHV, coal mass flow rate, and cell generator output voltage/current. In addition, an extreme case analysis was performed to determine the maximum range of the uncertainties. Results of the black box method showed that the uncertainties varied between 1.82% and 1.98% for the energy efficiency and between 1.32% and 1.43% for the exergy efficiency of the plant at operating power levels of 40-100% of full power. It was concluded that the LHV determination was the most important source of uncertainty in the energy and exergy efficiency of the plant. The uncertainties of the extreme case analysis were determined to be between 2.30% and 2.36% for the energy efficiency and between 1.66% and 1.70% for the exergy efficiency for 40-100% power output, respectively. The proposed method provides a practical approach for understanding the major uncertainties as well as the effects of some measurement parameters in a large-scale thermal power plant
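The black box propagation can be sketched with a GUM-style quadrature combination (a hedged illustration; the plant values and uncertainties below are invented, not the paper's data): with efficiency eta = P_el / (m_coal * LHV), the relative uncertainty of eta combines the relative uncertainties of the three measured quantities.

```python
import math

# Illustrative plant measurements with absolute standard uncertainties.
P_el, u_P = 600.0, 3.0        # electrical output, MW
m_coal, u_m = 250.0, 1.5      # coal mass flow, kg/s
LHV, u_LHV = 9.0, 0.15        # lower heating value, MJ/kg (lignite)

eta = P_el / (m_coal * LHV)

# For a product/quotient, relative uncertainties add in quadrature.
u_eta_rel = math.sqrt((u_P / P_el) ** 2
                      + (u_m / m_coal) ** 2
                      + (u_LHV / LHV) ** 2)
print(f"eta = {eta:.3f} +/- {eta * u_eta_rel:.4f}")
```

With these illustrative numbers the LHV term dominates the quadrature sum, mirroring the paper's conclusion that LHV determination is the most important uncertainty source.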

  16. Large-scale determinants of diversity across Spanish forest habitats: accounting for model uncertainty in compositional and structural indicators

    Energy Technology Data Exchange (ETDEWEB)

    Martin-Quller, E.; Torras, O.; Alberdi, I.; Solana, J.; Saura, S.

    2011-07-01

    An integral understanding of forest biodiversity requires the exploration of the many aspects it comprises and of the numerous potential determinants of their distribution. The landscape ecological approach provides a necessary complement to conventional local studies that focus on individual plots or forest ownerships. However, most previous landscape studies used equally-sized cells as units of analysis to identify the factors affecting forest biodiversity distribution. Stratification of the analysis by habitats with a relatively homogeneous forest composition might be more adequate to capture the underlying patterns associated with the formation and development of a particular ensemble of interacting forest species. Here we used a landscape perspective in order to improve our understanding of the influence of large-scale explanatory factors on forest biodiversity indicators in Spanish habitats, covering a wide latitudinal and altitudinal range. We considered six forest biodiversity indicators estimated from more than 30,000 field plots in the Spanish national forest inventory, distributed in 213 forest habitats over 16 Spanish provinces. We explored biodiversity response to various environmental (climate and topography) and landscape configuration (fragmentation and shape complexity) variables through multiple linear regression models (built and assessed through the Akaike Information Criterion). In particular, we took into account the inherent model uncertainty when dealing with a complex and large set of variables, and considered different plausible models and their probability of being the best candidate for the observed data. Our results showed that compositional indicators (species richness and diversity) were mostly explained by environmental factors. Models for structural indicators (standing deadwood and stand complexity) had the worst fits and selection uncertainties, but did show significant associations with some configuration metrics. In general

  17. Strain gauge measurement uncertainties on hydraulic turbine runner blade

    International Nuclear Information System (INIS)

    Arpin-Pont, J; Gagnon, M; Tahan, S A; Coutu, A; Thibault, D

    2012-01-01

    Strains experimentally measured with strain gauges can differ from those evaluated using the Finite Element (FE) method. This difference is due mainly to the assumptions and uncertainties inherent to each method. To circumvent this difficulty, we developed a numerical method based on Monte Carlo simulations to evaluate measurement uncertainties produced by the behaviour of a unidirectional welded gauge, its position uncertainty and its integration effect. This numerical method uses the displacement fields of the studied part evaluated by an FE analysis. The paper presents a study case using in situ data measured on a hydraulic turbine runner. The FE analysis of the turbine runner blade was computed, and our numerical method used to evaluate uncertainties on strains measured at five locations with welded strain gauges. Then, measured strains and their uncertainty ranges are compared to the estimated strains. The uncertainty ranges obtained extended from 74 με to 165 με. Furthermore, the biases observed between the median of the uncertainty ranges and the FE strains varied from −36 to 36 με. Note that strain gauge measurement uncertainties depend mainly on displacement fields and gauge geometry.
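
    The Monte Carlo idea in this record can be sketched in a few lines: repeatedly sample the gauge's position error and read the strain at the perturbed location. The linear strain field below is a hypothetical stand-in for the strain interpolated from an FE displacement field, and all numbers are illustrative:

```python
import random
import statistics

def strain_field(x, y):
    # Stand-in for strain interpolated from an FE displacement field
    # (hypothetical analytic field, in microstrain; x, y in mm).
    return 500.0 + 40.0 * x - 25.0 * y

def mc_gauge_uncertainty(x0, y0, pos_sigma, n=10000, seed=1):
    """Monte Carlo estimate of the measured-strain scatter caused by
    gauge position uncertainty (normal, std dev pos_sigma in mm)."""
    rng = random.Random(seed)
    samples = [strain_field(x0 + rng.gauss(0, pos_sigma),
                            y0 + rng.gauss(0, pos_sigma)) for _ in range(n)]
    return statistics.mean(samples), statistics.stdev(samples)

mean_strain, std_strain = mc_gauge_uncertainty(10.0, 5.0, pos_sigma=2.0)
```

    A full treatment would also sample gauge orientation and apply the averaging (integration) effect over the gauge grid area, as the paper does.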

  18. Greenhouse gas scenario sensitivity and uncertainties in precipitation projections for central Belgium

    Science.gov (United States)

    Van Uytven, E.; Willems, P.

    2018-03-01

    Climate change impact assessment on meteorological variables involves large uncertainties as a result of incomplete knowledge of future greenhouse gas concentrations and climate model physics, next to the inherent internal variability of the climate system. Given that the alteration in greenhouse gas concentrations is the driver for the change, one expects the impacts to be highly dependent on the considered greenhouse gas scenario (GHS). In this study, we denote this behavior as GHS sensitivity. Due to the climate model related uncertainties, this sensitivity is, at local scale, not always as strong as expected. This paper aims to study the GHS sensitivity and its contributing role to climate scenarios for a case study in Belgium. An ensemble of 160 CMIP5 climate model runs is considered and climate change signals are studied for precipitation accumulation, daily precipitation intensities and wet day frequencies. This was done for the different seasons of the year and the scenario periods 2011-2040, 2031-2060, 2051-2081 and 2071-2100. By means of variance decomposition, the total variance in the climate change signals was separated into the contributions of the differences in GHSs and the other model-related uncertainty sources. These contributions were found to depend on the variable and season. Following the time of emergence concept, the GHS uncertainty contribution is found to depend on the time horizon and to increase over time. For the most distant time horizon (2071-2100), the climate model uncertainty accounts for the largest uncertainty contribution. The GHS differences explain up to 18% of the total variance in the climate change signals. The results point further at the importance of the climate model ensemble design, specifically the ensemble size and the combination of climate models, whereupon climate scenarios are based. The numerical noise, introduced at scales smaller than the skillful scale, e.g. at local scale, was not considered in this study.
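
    The variance decomposition used here can be sketched as a one-way split of the ensemble spread into a between-scenario part and a residual (model and internal variability) part. The scenario labels and signal values below are invented for illustration, not CMIP5 output:

```python
import statistics

def variance_decomposition(signals_by_ghs):
    """Split the total variance of climate change signals into a
    between-GHS fraction and a residual (model-related) fraction.
    signals_by_ghs: dict mapping scenario name -> list of signals."""
    all_vals = [v for vals in signals_by_ghs.values() for v in vals]
    grand = statistics.mean(all_vals)
    n = len(all_vals)
    # Between-scenario variance: spread of scenario means around the grand mean.
    between = sum(len(vals) * (statistics.mean(vals) - grand) ** 2
                  for vals in signals_by_ghs.values()) / n
    total = statistics.pvariance(all_vals)
    return between / total, 1.0 - between / total

# Illustrative precipitation change signals (%), two scenarios, four models each:
frac_ghs, frac_model = variance_decomposition({
    "RCP4.5": [2.0, 3.5, 1.0, 2.5],
    "RCP8.5": [4.0, 5.5, 3.0, 4.5],
})
```

    The two fractions sum to one by construction; a small `frac_ghs` corresponds to the paper's finding that climate model uncertainty, not scenario choice, dominates for most horizons.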

  19. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX® fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)

  20. Uncertainties in human health risk assessment of environmental contaminants: A review and perspective.

    Science.gov (United States)

    Dong, Zhaomin; Liu, Yanju; Duan, Luchun; Bekele, Dawit; Naidu, Ravi

    2015-12-01

    Addressing uncertainties in human health risk assessment is a critical issue when evaluating the effects of contaminants on public health. A range of uncertainties exist through the source-to-outcome continuum, including exposure assessment, hazard and risk characterisation. While various strategies have been applied to characterising uncertainty, classical approaches largely rely on how to maximise the available resources. Expert judgement, defaults and tools for characterising quantitative uncertainty attempt to fill the gap between data and regulation requirements. The experiences of researching 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD) illustrated uncertainty sources and how to maximise available information to determine uncertainties, and thereby provide an 'adequate' protection to contaminant exposure. As regulatory requirements and recurring issues increase, the assessment of complex scenarios involving a large number of chemicals requires more sophisticated tools. Recent advances in exposure and toxicology science provide a large data set for environmental contaminants and public health. In particular, biomonitoring information, in vitro data streams and computational toxicology are the crucial factors in the NexGen risk assessment, as well as uncertainties minimisation. Although in this review we cannot yet predict how the exposure science and modern toxicology will develop in the long-term, current techniques from emerging science can be integrated to improve decision-making. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Communicating uncertainties in earth sciences in view of user needs

    Science.gov (United States)

    de Vries, Wim; Kros, Hans; Heuvelink, Gerard

    2014-05-01

    Uncertainties are inevitable in all results obtained in the earth sciences, regardless of whether these are based on field observations, experimental research or predictive modelling. When informing decision and policy makers or stakeholders, it is important that these uncertainties are also communicated. In communicating results, it is important to apply a "Progressive Disclosure of Information (PDI)", moving from non-technical through more specialised information according to the user needs. Generalized information is directed towards non-scientific audiences and intended for policy advice. Decision makers have to be aware of the implications of the uncertainty associated with results, so that they can account for it in their decisions. Detailed information on the uncertainties is generally intended for scientific audiences to give insight into underlying approaches and results. When communicating uncertainties, it is important to distinguish between scientific results that allow presentation in terms of probabilistic measures of uncertainty and more intrinsic uncertainties and errors that cannot be expressed in mathematical terms. Examples of earth science research that allow probabilistic measures of uncertainty, involving sophisticated statistical methods, are uncertainties in spatial and/or temporal variations in results of: • Observations, such as soil properties measured at sampling locations. In this case, the interpolation uncertainty, caused by a lack of data collected in space, can be quantified by e.g. kriging standard deviation maps or animations of conditional simulations. • Experimental measurements, comparing impacts of treatments at different sites and/or under different conditions. In this case, an indication of the average and range in measured responses to treatments can be obtained from a meta-analysis, summarizing experimental findings between replicates and across studies, sites, ecosystems, etc. • Model predictions due to

  2. Calibration and Measurement Uncertainty Estimation of Radiometric Data: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Habte, A.; Sengupta, M.; Reda, I.; Andreas, A.; Konings, J.

    2014-11-01

    Evaluating the performance of photovoltaic cells, modules, and arrays that form large solar deployments relies on accurate measurements of the available solar resource. Therefore, determining the accuracy of these solar radiation measurements provides a better understanding of investment risks. This paper provides guidelines and recommended procedures for estimating the uncertainty in calibrations and measurements by radiometers, using methods that follow the Guide to the Expression of Uncertainty in Measurement (GUM) published by the International Bureau of Weights and Measures. Standardized analysis based on these procedures ensures that the uncertainty quoted is well documented.
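
    A minimal sketch of the GUM combination such procedures follow: uncorrelated input uncertainties, weighted by their sensitivity coefficients, add in quadrature, and an expanded uncertainty is then reported at a chosen coverage factor. The budget entries below are invented for illustration, not NREL's published values:

```python
import math

def combined_standard_uncertainty(contributions):
    """GUM-style combination for uncorrelated inputs:
    u_c = sqrt(sum_i (c_i * u_i)^2), with sensitivity coefficients c_i."""
    return math.sqrt(sum((c * u) ** 2 for c, u in contributions))

def expanded_uncertainty(u_c, k=1.96):
    """Expanded uncertainty at roughly 95% coverage (k ~ 1.96)."""
    return k * u_c

# Illustrative pyranometer calibration budget, in W/m^2 (hypothetical numbers):
u_c = combined_standard_uncertainty([
    (1.0, 2.5),   # reference irradiance
    (1.0, 1.2),   # data logger voltage measurement
    (1.0, 0.8),   # thermal offset
])
U95 = expanded_uncertainty(u_c)
```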

  3. The influence of climate change on flood risks in France - first estimates and uncertainty analysis

    OpenAIRE

    Dumas, Patrice; Hallegatte, Stéphane; Quintana-Seguí, Pere; Martin, Eric

    2013-01-01

    This paper proposes a methodology to project the possible evolution of river flood damages due to climate change, and applies it to mainland France. Its main contributions are (i) to demonstrate a methodology to investigate the full causal chain from global climate change to local economic flood losses; (ii) to show that future flood losses may change in a very significant manner over France; (iii) to show that a very large uncertainty arises from the climate...

  4. Signal detection in global mean temperatures after "Paris": an uncertainty and sensitivity analysis

    Science.gov (United States)

    Visser, Hans; Dangendorf, Sönke; van Vuuren, Detlef P.; Bregman, Bram; Petersen, Arthur C.

    2018-02-01

    In December 2015, 195 countries agreed in Paris to hold the increase in global mean surface temperature (GMST) well below 2.0 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C. Since large financial flows will be needed to keep GMSTs below these targets, it is important to know how GMST has progressed since pre-industrial times. However, the Paris Agreement is not conclusive as regards methods to calculate it. Should trend progression be deduced from GCM simulations or from instrumental records by (statistical) trend methods? Which simulations or GMST datasets should be chosen, and which trend models? What is pre-industrial and, finally, are the Paris targets formulated for total warming, originating from both natural and anthropogenic forcing, or do they refer to anthropogenic warming only? To find answers to these questions we performed an uncertainty and sensitivity analysis where datasets and model choices have been varied. For all cases we evaluated trend progression along with uncertainty information. To do so, we analysed four trend approaches and applied these to the five leading observational GMST products. We find GMST progression to be largely independent of various trend model approaches. However, GMST progression is significantly influenced by the choice of GMST datasets. Uncertainties due to natural variability are largest in size. As a parallel path, we calculated GMST progression from an ensemble of 42 GCM simulations. Mean progression derived from GCM-based GMSTs appears to lie in the range of trend-dataset combinations. A difference between both approaches appears to be the width of uncertainty bands: GCM simulations show a much wider spread. Finally, we discuss various choices for pre-industrial baselines and the role of warming definitions. Based on these findings we propose an estimate for signal progression in GMSTs since pre-industrial.

  5. A statistical method for lung tumor segmentation uncertainty in PET images based on user inference.

    Science.gov (United States)

    Zheng, Chaojie; Wang, Xiuying; Feng, Dagan

    2015-01-01

    PET has been widely accepted as an effective imaging modality for lung tumor diagnosis and treatment. However, standard criteria for delineating tumor boundary from PET have yet to be developed, largely due to the relatively low quality of PET images, uncertain tumor boundary definition, and the variety of tumor characteristics. In this paper, we propose a statistical solution to segmentation uncertainty on the basis of user inference. We first define the uncertainty segmentation band on the basis of the segmentation probability map constructed from the Random Walks (RW) algorithm; then, based on the extracted features of the user inference, we use Principal Component Analysis (PCA) to formulate the statistical model for labeling the uncertainty band. We validated our method on 10 lung PET-CT phantom studies from the public RIDER collections [1] and 16 clinical PET studies where tumors were manually delineated by two experienced radiologists. The methods were validated using the Dice similarity coefficient (DSC) to measure the spatial volume overlap. Our method achieved an average DSC of 0.878 ± 0.078 on phantom studies and 0.835 ± 0.039 on clinical studies.
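
    The overlap metric used for validation is standard and easy to state; a minimal sketch over toy voxel index sets (not the RIDER data):

```python
def dice_coefficient(seg_a, seg_b):
    """Dice similarity coefficient between two voxel label sets:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    a, b = set(seg_a), set(seg_b)
    if not a and not b:
        return 1.0  # two empty segmentations agree perfectly
    return 2.0 * len(a & b) / (len(a) + len(b))

# Toy 2-D voxel index sets standing in for automatic and manual delineations:
auto = {(0, 0), (0, 1), (1, 0), (1, 1)}
manual = {(0, 1), (1, 0), (1, 1), (2, 1)}
dsc = dice_coefficient(auto, manual)  # 2*3 / (4+4) = 0.75
```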

  6. Additional challenges for uncertainty analysis in river engineering

    Science.gov (United States)

    Berends, Koen; Warmink, Jord; Hulscher, Suzanne

    2016-04-01

    the proposed intervention. The implicit assumption underlying such analysis is that both models are commensurable. We hypothesize that they are commensurable only to a certain extent. In an idealised study we have demonstrated that prediction performance loss should be expected with increasingly large engineering works. When accounting for parametric uncertainty of floodplain roughness in model identification, we see uncertainty bounds for predicted effects of interventions increase with increasing intervention scale. Calibration of these types of models therefore seems to have a shelf-life, beyond which it no longer improves prediction. Therefore a qualification scheme for model use is required that can be linked to model validity. In this study, we characterize model use along three dimensions: extrapolation (using the model with different external drivers), extension (using the model for different output or indicators) and modification (using modified models). Such use of models is expected to have implications for the applicability of surrogate modelling for efficient uncertainty analysis as well, which is recommended for future research. Warmink, J. J.; Straatsma, M. W.; Huthoff, F.; Booij, M. J. & Hulscher, S. J. M. H. 2013. Uncertainty of design water levels due to combined bed form and vegetation roughness in the Dutch river Waal. Journal of Flood Risk Management 6, 302-318. DOI: 10.1111/jfr3.12014

  7. The role of general relativity in the uncertainty principle

    International Nuclear Information System (INIS)

    Padmanabhan, T.

    1986-01-01

    The role played by general relativity in quantum mechanics (especially as regards the uncertainty principle) is investigated. It is confirmed that the validity of time-energy uncertainty does depend on gravitational time dilation. It is also shown that there exists an intrinsic lower bound to the accuracy with which acceleration due to gravity can be measured. The notion of the equivalence principle in quantum mechanics is clarified. (author)

  8. Uncertainties in Climatological Seawater Density Calculations

    Science.gov (United States)

    Dai, Hao; Zhang, Xining

    2018-03-01

    In most applications, with seawater conductivity, temperature, and pressure data measured in situ by various observation instruments, e.g., Conductivity-Temperature-Depth (CTD) instruments, the density, which has strong ties to ocean dynamics, is computed according to equations of state for seawater. This paper, based on density computational formulae in the Thermodynamic Equation of Seawater 2010 (TEOS-10), follows the Guide to the Expression of Uncertainty in Measurement (GUM) and assesses the main sources of uncertainties. By virtue of climatological decades-average temperature/Practical Salinity/pressure data sets in the global ocean provided by the National Oceanic and Atmospheric Administration (NOAA), correlation coefficients between uncertainty sources are determined and the combined standard uncertainties u_c(ρ) in seawater density calculations are evaluated. For grid points in the world ocean with 0.25° resolution, the standard deviations of u_c(ρ) in vertical profiles cover the magnitude order of 10^-4 kg m^-3. The u_c(ρ) means in vertical profiles of the Baltic Sea are about 0.028 kg m^-3 due to the larger scatter of Absolute Salinity anomaly. The distribution of the u_c(ρ) means in vertical profiles of the world ocean except for the Baltic Sea, which covers the range (0.004, 0.01) kg m^-3, is related to the correlation coefficient r(SA, p) between Absolute Salinity SA and pressure p. The results in the paper are based on sensors' measuring uncertainties of high accuracy CTD. Larger uncertainties in density calculations may arise if connected with lower sensors' specifications. This work may provide valuable uncertainty information required for reliability considerations of ocean circulation and global climate models.
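
    When input quantities are correlated, as Absolute Salinity and pressure are here, the GUM combination gains a cross term. A minimal sketch with invented sensitivity coefficients and uncertainties (not TEOS-10 values), using the abstract's r(SA, p) notation for the correlation:

```python
import math

def combined_uncertainty_correlated(c, u, r):
    """GUM combined standard uncertainty for correlated inputs:
    u_c^2 = sum_i (c_i u_i)^2 + 2 sum_{i<j} c_i c_j u_i u_j r_ij.
    c: sensitivity coefficients, u: standard uncertainties,
    r: dict {(i, j): correlation coefficient}, other pairs uncorrelated."""
    var = sum((ci * ui) ** 2 for ci, ui in zip(c, u))
    for (i, j), rij in r.items():
        var += 2.0 * c[i] * c[j] * u[i] * u[j] * rij
    return math.sqrt(var)

# Hypothetical density sensitivities/uncertainties for inputs (S_A, T, p):
u_rho = combined_uncertainty_correlated(
    c=[0.75, -0.2, 4.5e-3],   # d(rho)/dS_A, d(rho)/dT, d(rho)/dp (illustrative)
    u=[0.005, 0.002, 0.5],    # u(S_A), u(T), u(p) (illustrative)
    r={(0, 2): 0.3},          # r(S_A, p) > 0 inflates u_c here
)
```

    With a positive r(SA, p) and same-signed sensitivities the cross term inflates u_c(ρ), which is why the paper's mapped uncertainties track that correlation coefficient.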

  9. Reporting and analyzing statistical uncertainties in Monte Carlo-based treatment planning

    International Nuclear Information System (INIS)

    Chetty, Indrin J.; Rosu, Mihaela; Kessler, Marc L.; Fraass, Benedick A.; Haken, Randall K. ten; Kong, Feng-Ming; McShan, Daniel L.

    2006-01-01

    Purpose: To investigate methods of reporting and analyzing statistical uncertainties in doses to targets and normal tissues in Monte Carlo (MC)-based treatment planning. Methods and Materials: Methods for quantifying statistical uncertainties in dose, such as uncertainty specification to specific dose points, or to volume-based regions, were analyzed in MC-based treatment planning for 5 lung cancer patients. The effect of statistical uncertainties on target and normal tissue dose indices was evaluated. The concept of uncertainty volume histograms for targets and organs at risk was examined, along with its utility, in conjunction with dose volume histograms, in assessing the acceptability of the statistical precision in dose distributions. The uncertainty evaluation tools were extended to four-dimensional planning for application on multiple instances of the patient geometry. All calculations were performed using the Dose Planning Method MC code. Results: For targets, generalized equivalent uniform doses and mean target doses converged at 150 million simulated histories, corresponding to relative uncertainties of less than 2% in the mean target doses. For the normal lung tissue (a volume-effect organ), mean lung dose and normal tissue complication probability converged at 150 million histories despite the large range in the relative organ uncertainty volume histograms. For 'serial' normal tissues such as the spinal cord, large fluctuations exist in point dose relative uncertainties. Conclusions: The tools presented here provide useful means for evaluating statistical precision in MC-based dose distributions. Tradeoffs between uncertainties in doses to targets, volume-effect organs, and 'serial' normal tissues must be considered carefully in determining acceptable levels of statistical precision in MC-computed dose distributions
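
    The relative statistical uncertainty tracked above can be estimated with the standard batch method: split the simulated histories into batches and take the standard error of the batch means relative to the overall mean. Since this uncertainty scales as 1/√N, halving it requires roughly four times as many histories. A toy sketch with invented numbers, not Dose Planning Method output:

```python
import math
import random
import statistics

def batch_relative_uncertainty(scores_per_batch):
    """Batch estimate of the relative statistical uncertainty of a Monte
    Carlo tally: standard error of the batch means over the overall mean."""
    m = statistics.mean(scores_per_batch)
    sem = statistics.stdev(scores_per_batch) / math.sqrt(len(scores_per_batch))
    return sem / m

# Toy per-batch dose tallies (Gy); a real MC dose engine tallies per voxel.
rng = random.Random(0)
batches = [2.0 + rng.gauss(0, 0.05) for _ in range(10)]
rel_u = batch_relative_uncertainty(batches)
```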

  10. Uncertainty quantification using evidence theory in multidisciplinary design optimization

    International Nuclear Information System (INIS)

    Agarwal, Harish; Renaud, John E.; Preston, Evan L.; Padmanabhan, Dhanesh

    2004-01-01

    Advances in computational performance have led to the development of large-scale simulation tools for design. Systems generated using such simulation tools can fail in service if the uncertainty of the simulation tool's performance predictions is not accounted for. In this research an investigation of how uncertainty can be quantified in multidisciplinary systems analysis subject to epistemic uncertainty associated with the disciplinary design tools and input parameters is undertaken. Evidence theory is used to quantify uncertainty in terms of the uncertain measures of belief and plausibility. To illustrate the methodology, multidisciplinary analysis problems are introduced as an extension to the epistemic uncertainty challenge problems identified by Sandia National Laboratories. After uncertainty has been characterized mathematically the designer seeks the optimum design under uncertainty. The measures of uncertainty provided by evidence theory are discontinuous functions. Such non-smooth functions cannot be used in traditional gradient-based optimizers because the sensitivities of the uncertain measures are not properly defined. In this research surrogate models are used to represent the uncertain measures as continuous functions. A sequential approximate optimization approach is used to drive the optimization process. The methodology is illustrated in application to multidisciplinary example problems

  11. A Quantitative Measure For Evaluating Project Uncertainty Under Variation And Risk Effects

    Directory of Open Access Journals (Sweden)

    A. Chenarani

    2017-10-01

    The effects of uncertainty on a project and the risk event as the consequence of uncertainty are analyzed. The uncertainty index is proposed as a quantitative measure for evaluating the uncertainty of a project. This is done by employing entropy as the indicator of system disorder and lack of information. By employing this index, the uncertainty of each activity and its increase due to risk effects as well as project uncertainty changes as a function of time can be assessed. The results are implemented and analyzed for a small turbojet engine development project as the case study. The results of this study can be useful for project managers and other stakeholders for selecting the most effective risk management and uncertainty controlling method.
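
    An entropy-based uncertainty index of the kind described can be sketched as normalized Shannon entropy over the outcome probabilities of an activity; the normalization and the four-bin example below are illustrative assumptions, not the paper's exact formulation:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum p*log2(p), with 0*log(0) := 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def uncertainty_index(probs):
    """Entropy normalized by its maximum log2(n): 0 means full
    information, 1 means maximum disorder (all outcomes equally likely)."""
    n = len(probs)
    return entropy(probs) / math.log2(n) if n > 1 else 0.0

# Activity duration falling in one of 4 bins:
flat = uncertainty_index([0.25, 0.25, 0.25, 0.25])   # maximum uncertainty
known = uncertainty_index([1.0, 0.0, 0.0, 0.0])      # no uncertainty
```

    A risk event that spreads the probability mass over more bins raises the index, which is how risk effects increase activity uncertainty in this framing.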

  12. Assessing uncertainty and risk in exploited marine populations

    International Nuclear Information System (INIS)

    Fogarty, M.J.; Mayo, R.K.; O'Brien, L.; Serchuk, F.M.; Rosenberg, A.A.

    1996-01-01

    The assessment and management of exploited fish and invertebrate populations is subject to several types of uncertainty. This uncertainty translates into risk to the population in the development and implementation of fishery management advice. Here, we define risk as the probability that exploitation rates will exceed a threshold level where long term sustainability of the stock is threatened. We distinguish among several sources of error or uncertainty due to (a) stochasticity in demographic rates and processes, particularly in survival rates during the early life stages; (b) measurement error resulting from sampling variation in the determination of population parameters or in model estimation; and (c) the lack of complete information on population and ecosystem dynamics. The first represents a form of aleatory uncertainty while the latter two factors represent forms of epistemic uncertainty. To illustrate these points, we evaluate the recent status of the Georges Bank cod stock in a risk assessment framework. Short term stochastic projections are made accounting for uncertainty in population size and for random variability in the number of young surviving to enter the fishery. We show that recent declines in this cod stock can be attributed to exploitation rates that have substantially exceeded sustainable levels
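
    A short-term stochastic projection of this kind can be sketched as a Monte Carlo loop over random recruitment, counting how often the stock ends below a sustainability threshold. This is a toy dynamics model with invented parameters, not the Georges Bank assessment:

```python
import random

def risk_of_depletion(n_runs, stock0, recruit_mean, recruit_sd,
                      harvest, threshold, years=5, seed=42):
    """Probability that the stock falls below a sustainability threshold
    under stochastic recruitment (toy additive dynamics, illustrative only)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_runs):
        stock = stock0
        for _ in range(years):
            # Random recruitment minus a fixed harvest; stock cannot go negative.
            stock = max(0.0, stock + rng.gauss(recruit_mean, recruit_sd) - harvest)
        if stock < threshold:
            failures += 1
    return failures / n_runs

p_risk = risk_of_depletion(5000, stock0=100.0, recruit_mean=10.0,
                           recruit_sd=8.0, harvest=15.0, threshold=60.0)
```

    Measurement error in the initial stock size could be layered in by also sampling `stock0` each run, mirroring the paper's separation of aleatory and epistemic sources.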

  13. Uncertainty in Simulating Wheat Yields Under Climate Change

    Science.gov (United States)

    Asseng, S.; Ewert, F.; Rosenzweig, Cynthia; Jones, J. W.; Hatfield, J. W.; Ruane, A. C.; Boote, K. J.; Thornburn, P. J.; Rotter, R. P.; Cammarano, D.; hide

    2013-01-01

    Projections of climate change impacts on crop yields are inherently uncertain1. Uncertainty is often quantified when projecting future greenhouse gas emissions and their influence on climate2. However, multi-model uncertainty analysis of crop responses to climate change is rare because systematic and objective comparisons among process-based crop simulation models1,3 are difficult4. Here we present the largest standardized model intercomparison for climate change impacts so far. We found that individual crop models are able to simulate measured wheat grain yields accurately under a range of environments, particularly if the input information is sufficient. However, simulated climate change impacts vary across models owing to differences in model structures and parameter values. A greater proportion of the uncertainty in climate change impact projections was due to variations among crop models than to variations among downscaled general circulation models. Uncertainties in simulated impacts increased with CO2 concentrations and associated warming. These impact uncertainties can be reduced by improving temperature and CO2 relationships in models and better quantified through use of multi-model ensembles. Less uncertainty in describing how climate change may affect agricultural productivity will aid adaptation strategy development and policymaking.

  14. Robustness of dynamic systems with parameter uncertainties

    CERN Document Server

    Balemi, S; Truöl, W

    1992-01-01

    Robust Control is one of the fastest growing and promising areas of research today. In many practical systems there exist uncertainties which have to be considered in the analysis and design of control systems. In the last decade methods were developed for dealing with dynamic systems with unstructured uncertainties such as H∞- and ℓ1-optimal control. For systems with parameter uncertainties, the seminal paper of V. L. Kharitonov has triggered a large amount of very promising research. An international workshop dealing with all aspects of robust control was successfully organized by S. P. Bhattacharyya and L. H. Keel in San Antonio, Texas, USA in March 1991. We organized the second international workshop in this area in Ascona, Switzerland in April 1992. However, this second workshop was restricted to robust control of dynamic systems with parameter uncertainties with the objective to concentrate on some aspects of robust control. This book contains a collection of papers presented at the International W...

  15. Policy Uncertainty and the US Ethanol Industry

    Directory of Open Access Journals (Sweden)

    Jason P. H. Jones

    2017-11-01

    The Renewable Fuel Standard (RFS2), as implemented, has introduced uncertainty for US ethanol producers and the supporting commodity market. First, the fixed mandate for what is mainly cornstarch-based ethanol has increased feedstock price volatility and exerts a general effect across the agricultural sector. Second, the large discrepancy between the original Energy Independence and Security Act (EISA) intentions and the actual RFS2 implementation for some fuel classes has increased the investment uncertainty facing investors in biofuel production, distribution, and consumption. Here we discuss and analyze the sources of uncertainty and evaluate the effect of potential RFS2 adjustments as they influence these uncertainties. This includes the use of a flexible, production-dependent mandate on cornstarch ethanol. We find that a flexible mandate on cornstarch ethanol, relaxed during drought, could significantly reduce commodity price spikes and alleviate the decline of livestock production in cases of feedstock production shortfalls, but it would increase the risk for ethanol investors.

  16. Risk Management and Uncertainty in Infrastructure Projects

    DEFF Research Database (Denmark)

    Harty, Chris; Neerup Themsen, Tim; Tryggestad, Kjell

    2014-01-01

    The assumption that large complex projects should be managed in order to reduce uncertainty and increase predictability is not new. What is relatively new, however, is that uncertainty reduction can and should be obtained through formal risk management approaches. We question both assumptions... by addressing a more fundamental question about the role of knowledge in current risk management practices. Inquiries into the predominant approaches to risk management in large infrastructure and construction projects reveal their assumptions about knowledge and we discuss the ramifications these have... for project and construction management. Our argument and claim is that predominant risk management approaches tend to reinforce conventional ideas of project control whilst undermining other notions of value and relevance of built assets and project management process. These approaches fail to consider...

  17. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  18. Understanding uncertainty

    CERN Document Server

    Lindley, Dennis V

    2013-01-01

    Praise for the First Edition: "...a reference for everyone who is interested in knowing and handling uncertainty." -Journal of Applied Statistics. The critically acclaimed First Edition of Understanding Uncertainty provided a study of uncertainty addressed to scholars in all fields, showing that uncertainty could be measured by probability, and that probability obeyed three basic rules that enabled uncertainty to be handled sensibly in everyday life. These ideas were extended to embrace the scientific method and to show how decisions, containing an uncertain element, could be rationally made.

  19. Uncertainty modelling and code calibration for composite materials

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Branner, Kim; Mishnaevsky, Leon, Jr

    2013-01-01

    and measurement uncertainties which are introduced on the different scales. Typically, these uncertainties are taken into account in the design process using characteristic values and partial safety factors specified in a design standard. The value of the partial safety factors should reflect a reasonable balance...... to wind turbine blades are calibrated for two typical lay-ups using a large number of load cases and ratios between the aerodynamic forces and the inertia forces....

  20. Automatic Voltage Control (AVC) System under Uncertainty from Wind Power

    DEFF Research Database (Denmark)

    Qin, Nan; Abildgaard, Hans; Flynn, Damian

    2016-01-01

    An automatic voltage control (AVC) system maintains the voltage profile of a power system in an acceptable range and minimizes the operational cost by coordinating the regulation of controllable components. Typically, all of the parameters in the optimization problem are assumed to be certain...... and constant in the decision making process. However, for high shares of wind power, uncertainty in the decision process due to wind power variability may result in an infeasible AVC solution. This paper proposes a voltage control approach which considers the voltage uncertainty from wind power production...... The proposed method improves the performance and the robustness of a scenario based approach by estimating the potential voltage variations due to fluctuating wind power production, and introduces a voltage margin to protect the decision against uncertainty for each scenario. The effectiveness of the proposed...

  1. Analogy as a strategy for supporting complex problem solving under uncertainty.

    Science.gov (United States)

    Chan, Joel; Paletz, Susannah B F; Schunn, Christian D

    2012-11-01

    Complex problem solving in naturalistic environments is fraught with uncertainty, which has significant impacts on problem-solving behavior. Thus, theories of human problem solving should include accounts of the cognitive strategies people bring to bear to deal with uncertainty during problem solving. In this article, we present evidence that analogy is one such strategy. Using statistical analyses of the temporal dynamics between analogy and expressed uncertainty in the naturalistic problem-solving conversations among scientists on the Mars Rover Mission, we show that spikes in expressed uncertainty reliably predict analogy use (Study 1) and that expressed uncertainty reduces to baseline levels following analogy use (Study 2). In addition, in Study 3, we show with qualitative analyses that this relationship between uncertainty and analogy is not due to miscommunication-related uncertainty but, rather, is primarily concentrated on substantive problem-solving issues. Finally, we discuss a hypothesis about how analogy might serve as an uncertainty reduction strategy in naturalistic complex problem solving.

  2. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    Science.gov (United States)

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or prior to and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors will contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contribute to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of the released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple-stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms

  3. Covariance methodology applied to uncertainties in I-126 disintegration rate measurements

    International Nuclear Information System (INIS)

    Fonseca, K.A.; Koskinas, M.F.; Dias, M.S.

    1996-01-01

    The covariance methodology applied to uncertainties in ¹²⁶I disintegration rate measurements is described. Two different coincidence systems were used due to the complex decay scheme of this radionuclide. The parameters involved in the determination of the disintegration rate in each experimental system present correlated components. In this case, the conventional statistical methods to determine the uncertainties (law of propagation) result in wrong values for the final uncertainty. Therefore, use of the covariance matrix methodology is necessary. The data from both systems were combined taking into account all possible correlations between the partial uncertainties. (orig.)
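The correlated-combination point can be illustrated with a small generalized least-squares sketch: two results sharing an uncertainty component are combined through their covariance matrix. All numbers below are hypothetical, not taken from the ¹²⁶I measurements.

```python
import numpy as np

# Two disintegration-rate results from two coincidence systems, with a
# shared (correlated) uncertainty component; all numbers are invented.
x = np.array([1000.0, 1020.0])          # Bq
V = np.array([[16.0, 6.0],              # covariance matrix: variances on the
              [6.0, 25.0]])             # diagonal, shared part off-diagonal

# Generalized least-squares combination:
#   xbar = (1' V^-1 x) / (1' V^-1 1),  var(xbar) = 1 / (1' V^-1 1)
ones = np.ones(2)
Vinv = np.linalg.inv(V)
denom = ones @ Vinv @ ones
xbar = (ones @ Vinv @ x) / denom
var = 1.0 / denom

# Ignoring the off-diagonal term (treating the systems as independent)
# understates the combined uncertainty in this example.
var_naive = 1.0 / (1 / 16.0 + 1 / 25.0)
print(f"combined = {xbar:.1f} +/- {var ** 0.5:.1f} Bq "
      f"(naive +/- {var_naive ** 0.5:.1f} Bq)")
```

With a positive off-diagonal term the correlated variance exceeds the naive independent-combination variance, which is the error the abstract warns about.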

  4. Nuclear data uncertainties for local power densities in the Martin-Hoogenboom benchmark

    International Nuclear Information System (INIS)

    Van der Marck, S.C.; Rochman, D.A.

    2013-01-01

    The recently developed method of fast Total Monte Carlo to propagate nuclear data uncertainties was applied to the Martin-Hoogenboom benchmark. This benchmark prescribes that one calculates local pin powers (of a light-water-cooled reactor) with a statistical uncertainty lower than 1% everywhere. Here we report, for the first time, an estimate of the nuclear data uncertainties for these local pin powers. For each of the more than 6 million local power tallies, the uncertainty due to nuclear data uncertainties was calculated, based on random variation of data for ²³⁵U, ²³⁸U, ²³⁹Pu and H in H₂O thermal scattering. In the center of the core region, the nuclear data uncertainty is 0.9%. Towards the edges of the core, this uncertainty increases to roughly 3%. The nuclear data uncertainties have been shown to be larger than the statistical uncertainties that the benchmark prescribes.
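The Total Monte Carlo idea can be sketched with a toy stand-in for the transport calculation: rerun a tally many times with randomly varied nuclear data and read the data-induced uncertainty from the spread. The `pin_power` stub and every number here are invented for illustration; a real application runs a full transport code per sample.

```python
import random

random.seed(1)

def pin_power(xs_scale):
    """Stand-in for a full transport calculation: a pin-power tally that
    responds linearly to a (hypothetical) cross-section scaling factor,
    plus statistical noise from the finite number of histories."""
    return 1.0 * xs_scale + random.gauss(0.0, 0.005)

# Total Monte Carlo: repeat the calculation with randomly varied nuclear
# data; the spread of the results mixes data and statistical components.
runs = [pin_power(random.gauss(1.0, 0.02)) for _ in range(500)]
mean = sum(runs) / len(runs)
spread = (sum((r - mean) ** 2 for r in runs) / (len(runs) - 1)) ** 0.5

# Subtracting the known statistical variance isolates the data component.
data_sigma = max(spread ** 2 - 0.005 ** 2, 0.0) ** 0.5
print(f"total spread = {spread:.4f}, nuclear-data part = {data_sigma:.4f}")
```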

  5. Uncertainties in Organ Burdens Estimated from PAS

    International Nuclear Information System (INIS)

    La Bone, T.R.

    2004-01-01

    To calculate the committed effective dose equivalent, one needs to know the quantity of the radionuclide in all significantly irradiated organs (the organ burden) as a function of time following the intake. There are two major sources of uncertainty in an organ burden estimated from personal air sampling (PAS) data: (1) the uncertainty in going from the exposure measured with the PAS to the quantity of aerosol inhaled by the individual, and (2) the uncertainty in going from the intake to the organ burdens at any given time, taking into consideration the biological variability of the biokinetic models from person to person (inter-person variability) and in one person over time (intra-person variability). We have been using biokinetic modeling methods developed by researchers at the University of Florida to explore the impact of inter-person variability on the uncertainty of organ burdens estimated from PAS data. These initial studies suggest that the uncertainties are so large that PAS might be considered a qualitative (rather than quantitative) technique. These results indicate that more studies should be performed to properly characterize the reliability and usefulness of PAS monitoring data for estimating organ burdens, organ dose, and ultimately the CEDE.
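The inter-person variability argument can be made concrete with a minimal one-compartment biokinetic sketch. The deposition fraction, biological half-life, and their lognormal spreads below are invented, not values from any published biokinetic model; the point is only how quickly parameter scatter widens the burden estimate.

```python
import math
import random

random.seed(3)

# One-compartment sketch: an intake deposits a fraction f in the organ,
# which then clears with biological half-life T. Inter-person variability
# enters as lognormal scatter on f and T; every number here is invented.
intake = 100.0        # Bq inhaled, from the PAS exposure estimate
t = 30.0              # days after intake

burdens = []
for _ in range(5000):
    f = 0.1 * math.exp(random.gauss(0.0, 0.4))           # deposition fraction
    half_life = 20.0 * math.exp(random.gauss(0.0, 0.5))  # days
    lam = math.log(2) / half_life
    burdens.append(intake * f * math.exp(-lam * t))

burdens.sort()
median = burdens[len(burdens) // 2]
p5, p95 = burdens[250], burdens[4750]
print(f"organ burden at {t:.0f} d: median {median:.2f} Bq, "
      f"90% range [{p5:.2f}, {p95:.2f}] Bq")
```

Even this modest parameter scatter produces a 90% range spanning several-fold, consistent with the abstract's suggestion that PAS-derived burdens may be only qualitative.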

  6. Large abnormal peak on capillary zone electrophoresis due to contrast agent.

    Science.gov (United States)

    Wheeler, Rachel D; Zhang, Liqun; Sheldon, Joanna

    2017-01-01

    Background: Some iodinated radio-contrast media absorb ultraviolet light and can therefore be detected by capillary zone electrophoresis. If seen, these peaks are typically small, with 'quantifications' of below 5 g/L. Here, we describe the detection of a large peak on capillary zone electrophoresis that was due to the radio-contrast agent, Omnipaque™. Methods: Serum from a patient was analysed by capillary zone electrophoresis, and the IgG, IgA, IgM and total protein concentrations were measured. The serum sample was further analysed by gel electrophoresis and immunofixation. Results: Capillary zone electrophoresis results for the serum sample showed a large peak with a concentration high enough to warrant urgent investigation. However, careful interpretation alongside the serum immunoglobulin concentrations and total protein concentration showed that the abnormal peak was a pseudoparaprotein rather than a monoclonal immunoglobulin. This was confirmed by analysis with gel electrophoresis and also serum immunofixation. The patient had had a CT angiogram with the radio-contrast agent Omnipaque™; addition of Omnipaque™ to a normal serum sample gave a peak with comparable mobility to the pseudoparaprotein in the patient's serum. Conclusions: Pseudoparaproteins can appear as a large band on capillary zone electrophoresis. This case highlights the importance of a laboratory process that detects significant electrophoretic abnormalities promptly and interprets them in the context of the immunoglobulin concentrations. This should avoid incorrect reporting of pseudoparaproteins, which could result in the patient having unnecessary investigations.

  7. Exploring Higher Dimensional Black Holes at the Large Hadron Collider

    CERN Document Server

    Harris, C M; Parker, M A; Richardson, P; Sabetfakhri, A; Webber, Bryan R

    2005-01-01

    In some extra dimension theories with a TeV fundamental Planck scale, black holes could be produced in future collider experiments. Although cross sections can be large, measuring the model parameters is difficult due to the many theoretical uncertainties. Here we discuss those uncertainties and then we study the experimental characteristics of black hole production and decay at a typical detector using the ATLAS detector as a guide. We present a new technique for measuring the temperature of black holes that applies to many models. We apply this technique to a test case with four extra dimensions and, using an estimate of the parton-level production cross section error of 20%, determine the Planck mass to 15% and the number of extra dimensions to ±0.75.

  8. Exploring higher dimensional black holes at the Large Hadron Collider

    International Nuclear Information System (INIS)

    Harris, Christopher M.; Palmer, Matthew J.; Parker, Michael A.; Richardson, Peter; Sabetfakhri, Ali; Webber, Bryan R.

    2005-01-01

    In some extra dimension theories with a TeV fundamental Planck scale, black holes could be produced in future collider experiments. Although cross sections can be large, measuring the model parameters is difficult due to the many theoretical uncertainties. Here we discuss those uncertainties and then we study the experimental characteristics of black hole production and decay at a typical detector using the ATLAS detector as a guide. We present a new technique for measuring the temperature of black holes that applies to many models. We apply this technique to a test case with four extra dimensions and, using an estimate of the parton-level production cross section error of 20%, determine the Planck mass to 15% and the number of extra dimensions to ±0.75

  9. Fuzzy uncertainty modeling applied to AP1000 nuclear power plant LOCA

    International Nuclear Information System (INIS)

    Ferreira Guimaraes, Antonio Cesar; Franklin Lapa, Celso Marcelo; Lamego Simoes Filho, Francisco Fernando; Cabral, Denise Cunha

    2011-01-01

    Research highlights: → This article presents an uncertainty modelling study using a fuzzy approach. → The AP1000 Westinghouse NPP is provided with passive safety systems. → The use of advanced passive safety systems in NPPs has limited operational experience. → Failure rates and basic event probabilities are used in the fault tree analysis. → A fuzzy uncertainty approach was applied to the reliability of the AP1000 large LOCA. - Abstract: This article presents an uncertainty modeling study using a fuzzy approach applied to the Westinghouse advanced nuclear reactor. The AP1000 Westinghouse Nuclear Power Plant (NPP) is provided with passive safety systems, based on thermophysical phenomena, that require no operator actions soon after an incident has been detected. The use of advanced passive safety systems in NPPs has limited operational experience. As in any reliability study, the scarcity of statistically significant event reports introduces a significant level of uncertainty in the failure rates and basic event probabilities used in the fault tree analysis (FTA). In order to model this uncertainty, a fuzzy approach was employed for the reliability analysis of the AP1000 large-break Loss of Coolant Accident (LOCA). The final results reveal that the proposed approach may be successfully applied to the modeling of uncertainties in safety studies.
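A minimal sketch of fuzzy fault-tree arithmetic with triangular membership numbers: each basic-event probability is a (low, mode, high) triple, and gates operate componentwise (the usual vertex approximation). The events and probability values are invented, not AP1000 data.

```python
# Triangular fuzzy numbers (low, mode, high) for basic-event probabilities;
# the events and values are invented. Gate arithmetic is the standard
# vertex approximation applied componentwise.

def f_and(a, b):
    """AND gate: componentwise product of the triangular bounds."""
    return tuple(x * y for x, y in zip(a, b))

def f_or(a, b):
    """OR gate: componentwise 1 - (1-a)(1-b)."""
    return tuple(1 - (1 - x) * (1 - y) for x, y in zip(a, b))

valve_fails = (1e-4, 5e-4, 1e-3)
sensor_fails = (1e-3, 2e-3, 5e-3)
pump_fails = (1e-3, 3e-3, 8e-3)

# Top event: (valve AND sensor) OR pump
top = f_or(f_and(valve_fails, sensor_fails), pump_fails)
print("top-event probability (low, mode, high):", top)
```

The top-event result carries the input fuzziness through the tree, so the width of the output triple expresses how sparse failure data propagates into the reliability estimate.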

  10. Outcome and value uncertainties in global-change policy

    International Nuclear Information System (INIS)

    Hammitt, J.K.

    1995-01-01

    Choices among environmental policies can be informed by analysis of the potential physical, biological, and social outcomes of alternative choices, and analysis of social preferences among these outcomes. Frequently, however, the consequences of alternative policies cannot be accurately predicted because of substantial outcome uncertainties concerning physical, chemical, biological, and social processes linking policy choices to consequences. Similarly, assessments of social preferences among alternative outcomes are limited by value uncertainties arising from limitations of moral principles, the absence of economic markets for many environmental attributes, and other factors. Outcome and value uncertainties relevant to global-change policy are described and their magnitudes are examined for two cases: stratospheric-ozone depletion and global climate change. Analysis of information available in the mid 1980s, when international ozone regulations were adopted, suggests that contemporary uncertainties surrounding CFC emissions and the atmospheric response were so large that plausible ozone depletion, absent regulation, ranged from negligible to catastrophic, a range that exceeded the plausible effect of the regulations considered. Analysis of climate change suggests that, important as outcome uncertainties are, uncertainties about values may be even more important for policy choice. 53 refs., 3 figs., 3 tabs

  11. An uncertainty inventory demonstration - a primary step in uncertainty quantification

    Energy Technology Data Exchange (ETDEWEB)

    Langenbrunner, James R. [Los Alamos National Laboratory; Booker, Jane M [Los Alamos National Laboratory; Hemez, Francois M [Los Alamos National Laboratory; Salazar, Issac F [Los Alamos National Laboratory; Ross, Timothy J [UNM

    2009-01-01

    Tools, methods, and theories for assessing and quantifying uncertainties vary by application. Uncertainty quantification tasks have unique desiderata and circumstances. To realistically assess uncertainty requires the engineer/scientist to specify mathematical models, the physical phenomena of interest, and the theory or framework for assessments. For example, Probabilistic Risk Assessment (PRA) specifically identifies uncertainties using probability theory, and therefore PRAs lack formal procedures for quantifying uncertainties that are not probabilistic. The Phenomena Identification and Ranking Technique (PIRT) proceeds by ranking phenomena using scoring criteria that result in linguistic descriptors, such as importance ranked with words, 'High/Medium/Low.' The use of words allows PIRT to be flexible, but the analysis may then be difficult to combine with other uncertainty theories. We propose that a necessary step for the development of a procedure or protocol for uncertainty quantification (UQ) is the application of an Uncertainty Inventory. An Uncertainty Inventory should be considered and performed in the earliest stages of UQ.

  12. Account of the uncertainty factor in forecasting nuclear power development

    International Nuclear Information System (INIS)

    Chernavskij, S.Ya.

    1979-01-01

    Minimization of total discounted costs under linear constraints is commonly used in forecasting nuclear energy growth. This approach is considered inadequate due to the uncertainty of the exogenous variables of the model. A method of forecasting that takes the presence of uncertainty into account is elaborated. An example that demonstrates the expediency of the method and its advantage over the conventional approximation method used for taking uncertainty into account is given. In the framework of the example, the optimal strategy for nuclear energy growth over a period of 500 years is determined.
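The contrast between optimizing for a single best-guess future and accounting for uncertainty can be sketched as a two-scenario expected-discounted-cost comparison. The strategies, cost streams, probabilities, and discount rate are all invented for illustration, not the paper's model.

```python
# Pick the build strategy minimizing expected discounted cost over
# demand scenarios, rather than the cost of one assumed trajectory.
# All numbers are invented.
r = 0.05                                              # discount rate
scenarios = {"low_demand": 0.4, "high_demand": 0.6}   # probabilities

# cost[strategy][scenario] = annual cost stream (arbitrary units)
cost = {
    "build_early": {"low_demand": [10, 5, 5], "high_demand": [10, 5, 5]},
    "build_late":  {"low_demand": [2, 2, 4],  "high_demand": [2, 8, 12]},
}

def discounted(stream):
    """Present value of an annual cost stream."""
    return sum(c / (1 + r) ** t for t, c in enumerate(stream))

expected = {s: sum(p * discounted(cost[s][sc]) for sc, p in scenarios.items())
            for s in cost}
best = min(expected, key=expected.get)
print(best, {s: round(v, 2) for s, v in expected.items()})
```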

  13. Uncertainty in the inelastic resonant scattering assisted by phonons

    International Nuclear Information System (INIS)

    Garcia, N.; Garcia-Sanz, J.; Solana, J.

    1977-01-01

    We have analyzed the inelastic minima observed in new results for He atoms scattered from LiF(001) surfaces. This is done by considering bound-state resonance processes assisted by phonons. The analysis presents large uncertainties. Within the range of uncertainty, we find two "possible" bands associated with the vibrations of F⁻ and Li⁺, respectively. Many more experimental data are necessary to confirm the existence of these processes.

  14. Recognizing and responding to uncertainty: a grounded theory of nurses' uncertainty.

    Science.gov (United States)

    Cranley, Lisa A; Doran, Diane M; Tourangeau, Ann E; Kushniruk, Andre; Nagle, Lynn

    2012-08-01

    There has been little research to date exploring nurses' uncertainty in their practice. Understanding nurses' uncertainty is important because it has potential implications for how care is delivered. The purpose of this study is to develop a substantive theory to explain how staff nurses experience and respond to uncertainty in their practice. Between 2006 and 2008, a grounded theory study was conducted that included in-depth semi-structured interviews. Fourteen staff nurses working in adult medical-surgical intensive care units at two teaching hospitals in Ontario, Canada, participated in the study. The theory, 'recognizing and responding to uncertainty', characterizes the processes through which nurses' uncertainty manifested and how it was managed. Recognizing uncertainty involved the processes of assessing, reflecting, questioning, and/or being unable to predict aspects of the patient situation. Nurses' responses to uncertainty highlighted the cognitive-affective strategies used to manage uncertainty. Study findings highlight the importance of acknowledging uncertainty and having collegial support to manage it. The theory adds to our understanding of the processes involved in recognizing uncertainty, the strategies and outcomes of managing uncertainty, and the influencing factors. Tailored nursing education programs should be developed to assist nurses in developing skills in articulating and managing their uncertainty. Further research is needed to extend, test and refine the theory of recognizing and responding to uncertainty to develop strategies for managing uncertainty. This theory advances the nursing perspective on uncertainty in clinical practice. The theory is relevant to nurses who are faced with uncertainty and complex clinical decisions, to managers who support nurses in their clinical decision-making, and to researchers who investigate ways to improve decision-making and care delivery. ©2012 Sigma Theta Tau International.

  15. Data-Driven Model Uncertainty Estimation in Hydrologic Data Assimilation

    Science.gov (United States)

    Pathiraja, S.; Moradkhani, H.; Marshall, L.; Sharma, A.; Geenens, G.

    2018-02-01

    The increasing availability of earth observations necessitates mathematical methods to optimally combine such data with hydrologic models. Several algorithms exist for such purposes, under the umbrella of data assimilation (DA). However, DA methods are often applied in a suboptimal fashion for complex real-world problems, due largely to several practical implementation issues. One such issue is error characterization, which is known to be critical for a successful assimilation. Mischaracterized errors lead to suboptimal forecasts, and in the worst case, to degraded estimates even compared to the no assimilation case. Model uncertainty characterization has received little attention relative to other aspects of DA science. Traditional methods rely on subjective, ad hoc tuning factors or parametric distribution assumptions that may not always be applicable. We propose a novel data-driven approach (named SDMU) to model uncertainty characterization for DA studies where (1) the system states are partially observed and (2) minimal prior knowledge of the model error processes is available, except that the errors display state dependence. It includes an approach for estimating the uncertainty in hidden model states, with the end goal of improving predictions of observed variables. The SDMU is therefore suited to DA studies where the observed variables are of primary interest. Its efficacy is demonstrated through a synthetic case study with low-dimensional chaotic dynamics and a real hydrologic experiment for one-day-ahead streamflow forecasting. In both experiments, the proposed method leads to substantial improvements in the hidden states and observed system outputs over a standard method involving perturbation with Gaussian noise.
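The "standard method involving perturbation with Gaussian noise" that the SDMU is compared against can be sketched with an ensemble forecast whose model error is additive Gaussian noise with a fixed, subjectively tuned sigma. The toy storage model, forcing, and noise level are invented for illustration.

```python
import random

random.seed(0)

def model_step(state):
    """Toy hydrologic model: linear storage with fixed recession and a
    constant mean forcing of 2.0 (both invented)."""
    return 0.9 * state + 2.0

# Standard model-error treatment: perturb each ensemble member's forecast
# with additive Gaussian noise of a fixed, ad hoc sigma.
sigma_model = 1.5
ensemble = [50.0] * 30
for _ in range(10):
    ensemble = [model_step(s) + random.gauss(0.0, sigma_model)
                for s in ensemble]

mean = sum(ensemble) / len(ensemble)
spread = (sum((s - mean) ** 2 for s in ensemble) / 29) ** 0.5
print(f"10-step forecast: mean = {mean:.1f}, spread = {spread:.1f}")
```

The ensemble spread here is dictated entirely by the chosen sigma, which is exactly the ad hoc tuning the abstract criticizes; a data-driven scheme would instead let the error statistics depend on the state.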

  16. A simplified analysis of uncertainty propagation in inherently controlled ATWS events

    International Nuclear Information System (INIS)

    Wade, D.C.

    1987-01-01

    The quasi-static approach can be used to provide useful insight concerning the propagation of uncertainties in the inherent response to ATWS events. At issue is how uncertainties in the reactivity coefficients and in the thermal-hydraulics and materials properties propagate to yield uncertainties in the asymptotic temperatures attained upon inherent shutdown. The basic notion to be quantified is that many of the same physical phenomena contribute to both the reactivity increase from power reduction and the reactivity decrease from core temperature rise. Since these reactivities cancel by definition, a good deal of uncertainty cancellation must also occur of necessity. For example, if the Doppler coefficient is overpredicted, too large a positive reactivity insertion is predicted upon power reduction and collapse of the ΔT across the fuel pin. However, too large a negative reactivity is also predicted upon the compensating increase in the isothermal core average temperature, which includes the fuel Doppler effect.
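The cancellation argument can be made concrete with a toy Monte Carlo reactivity balance in which the same uncertain Doppler coefficient appears in both the insertion and the feedback term. The coefficients and the balance are deliberately oversimplified and invented; only the correlated-versus-uncorrelated comparison is the point.

```python
import random

random.seed(5)

# Toy cancellation sketch: the Doppler coefficient kd sets the reactivity
# inserted when the fuel-to-coolant dT collapses at low power AND part of
# the negative feedback on the core-average temperature rise.
dT_fuel = 200.0        # K, fuel-to-coolant dT that collapses on shutdown
coeff_other = -0.5     # cents/K, non-Doppler temperature feedback

def temp_rise(kd_insertion, kd_feedback):
    insertion = -kd_insertion * dT_fuel          # cents inserted
    feedback = -(kd_feedback + coeff_other)      # cents removed per K
    return insertion / feedback                  # asymptotic rise, K

rises_corr, rises_uncorr = [], []
for _ in range(10000):
    kd = random.gauss(-0.2, 0.04)       # one draw used in both terms
    kd2 = random.gauss(-0.2, 0.04)      # independent draw, for comparison
    rises_corr.append(temp_rise(kd, kd))
    rises_uncorr.append(temp_rise(kd, kd2))

def spread(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / (len(xs) - 1)) ** 0.5

print(f"spread with shared kd: {spread(rises_corr):.1f} K, "
      f"with independent draws: {spread(rises_uncorr):.1f} K")
```

Because the same draw enters numerator and denominator, part of the coefficient error cancels, and the spread of the asymptotic temperature rise is visibly smaller than in the (unphysical) uncorrelated case.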

  17. Quantifying Uncertainty in Instantaneous Orbital Data Products of TRMM over Indian Subcontinent

    Science.gov (United States)

    Jayaluxmi, I.; Nagesh, D.

    2013-12-01

    In the last 20 years, microwave radiometers have taken satellite images of earth's weather, proving to be a valuable tool for quantitative estimation of precipitation from space. However, along with the widespread acceptance of microwave-based precipitation products, it has also been recognized that they contain large uncertainties. While most uncertainty evaluation studies focus on the accuracy of rainfall accumulated over time (e.g., a season or year), evaluations of instantaneous rainfall intensities from satellite orbital data products are relatively rare. These instantaneous products are known to potentially cause large uncertainties during real-time flood forecasting studies at the watershed scale, especially over land regions, where the highly varying land surface emissivity offers a myriad of complications hindering accurate rainfall estimation. The error components of orbital data products also tend to interact nonlinearly with hydrologic modeling uncertainty. Keeping these in mind, the present study fosters the development of uncertainty analysis using instantaneous satellite orbital data products (version 7 of 1B11, 2A25, 2A23) derived from the passive and active sensors onboard the Tropical Rainfall Measuring Mission (TRMM) satellite, namely the TRMM Microwave Imager (TMI) and Precipitation Radar (PR). The study utilizes 11 years of orbital data from 2002 to 2012 over the Indian subcontinent and examines the influence of various error sources on the convective and stratiform precipitation types. The analysis, conducted over the land regions of India, investigates three sources of uncertainty in detail. These include (1) errors due to improper delineation of the rainfall signature within the microwave footprint (rain/no-rain classification), (2) uncertainty in the transfer function linking rainfall with TMI low-frequency channels, and (3) sampling errors owing to the narrow swath and infrequent visits of the TRMM sensors. Case study results obtained during the Indian summer

  18. Uncertainty Quantification in High Throughput Screening ...

    Science.gov (United States)

    Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of biochemical and cellular processes, including endocrine disruption, cytotoxicity, and zebrafish development. Over 2.6 million concentration response curves are fit to models to extract parameters related to potency and efficacy. Models built on ToxCast results are being used to rank and prioritize the toxicological risk of tested chemicals and to predict the toxicity of tens of thousands of chemicals not yet tested in vivo. However, the data size also presents challenges. When fitting the data, the choice of models, model selection strategy, and hit call criteria must reflect the need for computational efficiency and robustness, requiring hard and somewhat arbitrary cutoffs. When coupled with unavoidable noise in the experimental concentration response data, these hard cutoffs cause uncertainty in model parameters and the hit call itself. The uncertainty will then propagate through all of the models built on the data. Left unquantified, this uncertainty makes it difficult to fully interpret the data for risk assessment. We used bootstrap resampling methods to quantify the uncertainty in fitting models to the concentration response data. Bootstrap resampling determines confidence intervals for
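The bootstrap idea can be sketched on synthetic concentration-response data: refit a potency-related parameter on resampled data sets and read a confidence interval from the percentiles of the refits. The log-linear model and all numbers below are invented for illustration, not ToxCast's actual fitting pipeline.

```python
import math
import random

random.seed(42)

# Synthetic concentration-response data with a log-linear trend; the model
# and numbers are invented (three replicates per concentration).
concs = [0.1, 0.3, 1.0, 3.0, 10.0, 30.0]
data = [(c, 10.0 + 20.0 * math.log10(c) + random.gauss(0, 3.0))
        for c in concs for _ in range(3)]

def fit_slope(points):
    """Closed-form least-squares slope of response vs. log10(concentration)."""
    xs = [math.log10(c) for c, _ in points]
    ys = [r for _, r in points]
    n = len(points)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

# Bootstrap: refit on resampled data sets; the percentile interval of the
# refitted parameter quantifies the fitting uncertainty.
slope = fit_slope(data)
boots = sorted(fit_slope(random.choices(data, k=len(data)))
               for _ in range(2000))
lo, hi = boots[49], boots[1949]        # ~95 % percentile interval
print(f"slope = {slope:.1f}, 95% CI = [{lo:.1f}, {hi:.1f}]")
```

The same resampling loop can wrap any fitted parameter (potency, efficacy, or even the hit call itself), which is how the per-curve uncertainty can be propagated into downstream models.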

  19. Principles and applications of measurement and uncertainty analysis in research and calibration

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.V.

    1992-11-01

    Interest in Measurement Uncertainty Analysis has grown in the past several years as it has spread to new fields of application, and research and development of uncertainty methodologies have continued. This paper discusses the subject from the perspectives of both research and calibration environments. It presents a history of the development and an overview of the principles of uncertainty analysis embodied in the United States National Standard, ANSI/ASME PTC 19.1-1985, Measurement Uncertainty. Examples are presented in which uncertainty analysis was utilized or is needed to gain further knowledge of a particular measurement process and to characterize final results. Measurement uncertainty analysis provides a quantitative estimate of the interval about a measured value or an experiment result within which the true value of that quantity is expected to lie. Years ago, Harry Ku of the United States National Bureau of Standards stated that "The informational content of the statement of uncertainty determines, to a large extent, the worth of the calibrated value." Today, that statement is just as true about calibration or research results as it was in 1968. Why is that true? What kind of information should we include in a statement of uncertainty accompanying a calibrated value? How and where do we get the information to include in an uncertainty statement? How should we interpret and use measurement uncertainty information? This discussion will provide answers to these and other questions about uncertainty in research and in calibration. The methodology to be described has been developed by national and international groups over the past nearly thirty years, and individuals were publishing information even earlier. Yet the work is largely unknown in many science and engineering arenas. I will illustrate various aspects of uncertainty analysis with some examples drawn from the radiometry measurement and calibration discipline from research activities.
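The interval statement described here can be illustrated with a standard combination of a Type A (statistical) component and Type B (systematic) components in quadrature, expanded with a coverage factor of k = 2, in the spirit of ANSI/ASME PTC 19.1. The readings and component values are hypothetical.

```python
import math

# Hypothetical calibration: Type A uncertainty from repeat readings,
# combined in quadrature with Type B components, then expanded (k = 2).
readings = [10.021, 10.025, 10.019, 10.023, 10.022]   # made-up repeats
n = len(readings)
mean = sum(readings) / n
s = math.sqrt(sum((r - mean) ** 2 for r in readings) / (n - 1))
u_typeA = s / math.sqrt(n)              # standard uncertainty of the mean

u_ref = 0.002                           # reference standard, from certificate
u_res = 0.001 / math.sqrt(3)            # resolution, rectangular distribution

u_c = math.sqrt(u_typeA ** 2 + u_ref ** 2 + u_res ** 2)
U = 2 * u_c                             # expanded uncertainty, k = 2
print(f"result = {mean:.4f} +/- {U:.4f} (k = 2)")
```

The itemized components are the "informational content" the abstract refers to: a reader can see which term dominates and therefore how much the calibrated value is worth.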

  1. W nano-fuzzes: A metastable state formed due to large-flux He+ irradiation at an elevated temperature

    International Nuclear Information System (INIS)

    Wu, Yunfeng; Liu, Lu; Lu, Bing; Ni, Weiyuan; Liu, Dongping

    2016-01-01

    W nano-fuzzes have been formed due to large-flux and low-energy (200 eV) He+ irradiation at a W surface temperature of 1480 °C. The microscopic evolution of W nano-fuzzes during annealing or low-energy (200 eV) He+ bombardment has been observed using scanning electron microscopy and thermal desorption spectroscopy. Our measurements show that both annealing and He+ bombardment can significantly alter the structure of W nano-fuzzes. W nano-fuzzes are thermally unstable due to He release during annealing, and they are easily sputtered during He+ bombardment. The current study shows that W nano-fuzzes act as a metastable state during low-energy and large-flux He+ irradiation at an elevated temperature. - Highlights: • The microscopic evolution of W nano-fuzzes during annealing or He+ irradiation has been measured. • W nano-fuzzes are thermally unstable due to He release during annealing. • He is released from the top layer of W fuzzes by annealing. • Metastable W nano-fuzzes are formed due to He+ irradiation at an elevated temperature.

  2. Development of a Dynamic Lidar Uncertainty Framework

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, Andrew [WindForS; Bonin, Timothy [CIRES/NOAA ESRL; Choukulkar, Aditya [CIRES/NOAA ESRL; Brewer, W. Alan [NOAA ESRL; Delgado, Ruben [University of Maryland Baltimore County

    2017-08-07

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote-sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote-sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote-sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards for quantifying remote sensing device uncertainty for power performance testing consider uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device and are generally fixed, leading to climatic uncertainty values that apply to the entire measurement campaign. However, real-world experience and a consideration of the fundamentals of the measurement process have shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we describe the development of a new dynamic lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. 
The framework is applied to lidar data from a field measurement site to assess the ability of the framework to predict
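
    The line-of-sight estimation step mentioned above can be illustrated with a minimal sketch: a Doppler lidar measures only radial (line-of-sight) speeds, and the three-dimensional wind vector is reconstructed by least squares over several beam directions, through which a per-beam radial-speed uncertainty propagates. The beam angles and the radial-speed standard deviation below are hypothetical; this is not the dynamic framework presented in the abstract.

```python
import numpy as np

# Hypothetical scan geometry: azimuth/elevation (deg) of five beams
az = np.radians([0.0, 90.0, 180.0, 270.0, 0.0])
el = np.radians([75.0, 75.0, 75.0, 75.0, 90.0])

# Rows are unit vectors along each line of sight; v_los = A @ (u, v, w)
A = np.column_stack([np.cos(el) * np.sin(az),
                     np.cos(el) * np.cos(az),
                     np.sin(el)])

true_wind = np.array([8.0, 3.0, 0.2])   # (u, v, w) in m/s
v_los = A @ true_wind                   # noise-free radial speeds

# Least-squares reconstruction of the three-dimensional wind vector
wind_hat, *_ = np.linalg.lstsq(A, v_los, rcond=None)

# Propagate an assumed independent per-beam radial-speed uncertainty
sigma_r = 0.15                               # m/s, illustrative
cov = sigma_r**2 * np.linalg.inv(A.T @ A)    # covariance of the (u, v, w) estimate
sigma_uvw = np.sqrt(np.diag(cov))
```

    With mostly high-elevation beams, the vertical component is the best constrained; in a dynamic framework the effective sigma_r would itself vary with atmospheric conditions.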

  3. Uncertainty of the 20th century sea-level rise due to vertical land motion errors

    Science.gov (United States)

    Santamaría-Gómez, Alvaro; Gravelle, Médéric; Dangendorf, Sönke; Marcos, Marta; Spada, Giorgio; Wöppelmann, Guy

    2017-09-01

    Assessing the vertical land motion (VLM) at tide gauges (TG) is crucial to understanding global and regional mean sea-level changes (SLC) over the last century. However, estimating VLM with accuracy better than a few tenths of a millimeter per year is not a trivial undertaking and many factors, including the reference frame uncertainty, must be considered. Using a novel reconstruction approach and updated geodetic VLM corrections, we found the terrestrial reference frame and the estimated VLM uncertainty may contribute to the global SLC rate error by ± 0.2 mm yr^-1. In addition, a spurious global SLC acceleration may be introduced up to ± 4.8 × 10^-3 mm yr^-2. Regional SLC rate and acceleration errors may be inflated by a factor of 3 compared to the global ones. The difference of VLM from two independent Glacio-Isostatic Adjustment models introduces global SLC rate and acceleration biases at the level of ± 0.1 mm yr^-1 and 2.8 × 10^-3 mm yr^-2, increasing up to 0.5 mm yr^-1 and 9 × 10^-3 mm yr^-2 for the regional SLC. Errors in VLM corrections need to be budgeted when considering past and future SLC scenarios.

  4. Synthesis of Optimal Processing Pathway for Microalgae-based Biorefinery under Uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Lee, Jay H.; Gani, Rafiqul

    2015-01-01

    The research field of microalgae-based biofuels and chemicals is in an early phase of development, and therefore a wide range of uncertainties exist due to inconsistencies among, and a shortage of, technical information. In order to handle and address these uncertainties to ensure robust decision making, we propose a systematic framework for the synthesis and optimal design of a microalgae-based processing network under uncertainty. By incorporating major uncertainties into the biorefinery superstructure model we developed previously, a stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery structure under given parameter uncertainties modelled as sampled scenarios. The solution to the sMINLP problem determines the optimal decisions with respect to processing technologies, material flows, and product portfolio in the presence of these uncertainties.

  5. Modeling of uncertainties in biochemical reactions.

    Science.gov (United States)

    Mišković, Ljubiša; Hatzimanikatis, Vassily

    2011-02-01

    Mathematical modeling is an indispensable tool for research and development in biotechnology and bioengineering. The formulation of kinetic models of biochemical networks depends on knowledge of the kinetic properties of the enzymes of the individual reactions. However, kinetic data acquired from experimental observations bring along uncertainties due to various experimental conditions and measurement methods. In this contribution, we propose a novel way to model the uncertainty in the enzyme kinetics and to predict quantitatively the responses of metabolic reactions to the changes in enzyme activities under uncertainty. The proposed methodology accounts explicitly for mechanistic properties of enzymes and physico-chemical and thermodynamic constraints, and is based on formalism from systems theory and metabolic control analysis. We achieve this by observing that kinetic responses of metabolic reactions depend: (i) on the distribution of the enzymes among their free form and all reactive states; (ii) on the equilibrium displacements of the overall reaction and that of the individual enzymatic steps; and (iii) on the net fluxes through the enzyme. Relying on this observation, we develop a novel, efficient Monte Carlo sampling procedure to generate all states within a metabolic reaction that satisfy imposed constraints. Thus, we derive the statistics of the expected responses of the metabolic reactions to changes in enzyme levels and activities, in the levels of metabolites, and in the values of the kinetic parameters. We present aspects of the proposed framework through an example of the fundamental three-step reversible enzymatic reaction mechanism. We demonstrate that the equilibrium displacements of the individual enzymatic steps have an important influence on kinetic responses of the enzyme. Furthermore, we derive the conditions that must be satisfied by a reversible three-step enzymatic reaction operating far away from the equilibrium in order to respond to
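
    The constrained Monte Carlo idea (generating only enzyme states that satisfy conservation and thermodynamic constraints) can be sketched as accept-reject sampling. The three-state enzyme partition, the uniform priors, and the overall equilibrium displacement value below are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Overall displacement from equilibrium of the three-step reaction; the step
# displacements must reproduce it as a product g1*g2*g3. Value is illustrative.
GAMMA_OVERALL = 0.1
N_SAMPLES = 10_000

samples = []
while len(samples) < N_SAMPLES:
    # Enzyme distributed among free form E and complexes ES, EP: a Dirichlet
    # draw enforces the conservation constraint sum(fractions) == 1.
    fractions = rng.dirichlet([1.0, 1.0, 1.0])
    # Draw two step displacements in (0, 1); the third is fixed by the
    # thermodynamic constraint on the product.
    g1, g2 = rng.uniform(0.0, 1.0, size=2)
    g3 = GAMMA_OVERALL / (g1 * g2)
    if g3 < 1.0:   # reject thermodynamically inconsistent draws
        samples.append((fractions, (g1, g2, g3)))

# Every accepted state reproduces the overall equilibrium displacement
products = np.array([g1 * g2 * g3 for _, (g1, g2, g3) in samples])
```

    Statistics of any response quantity can then be taken over the accepted states only, which is the essence of sampling under imposed constraints.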

  6. Multi-scenario modelling of uncertainty in stochastic chemical systems

    International Nuclear Information System (INIS)

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-01-01

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
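
    The composite-state idea (averaging the effect of samples drawn from the uncertain parameter distributions over stochastic trajectories) can be sketched with a standard Gillespie simulation of the isomerization system. The rate constants, the lognormal uncertainty model, and the sample sizes are illustrative assumptions, and this plain sampling sketch is not the Chemical Master Equation-based model proposed in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

def gillespie_isomerization(k1, k2, n_a0, t_end):
    """Stochastic simulation (SSA) of A <-> B; returns the A count at t_end."""
    n_a, n_b, t = n_a0, 0, 0.0
    while True:
        a1, a2 = k1 * n_a, k2 * n_b       # propensities of A->B and B->A
        a0 = a1 + a2
        t += rng.exponential(1.0 / a0)    # time to the next reaction event
        if t > t_end:
            return n_a
        if rng.uniform() * a0 < a1:
            n_a, n_b = n_a - 1, n_b + 1
        else:
            n_a, n_b = n_a + 1, n_b - 1

# Uncertain forward rate: sampled scenarios (the lognormal spread is assumed)
k1_samples = rng.lognormal(mean=np.log(1.0), sigma=0.3, size=50)
k2 = 0.5
runs = [gillespie_isomerization(k1, k2, n_a0=100, t_end=10.0)
        for k1 in k1_samples]

# "Composite state": the averaged effect over the sampled parameter values
composite_mean_A = np.mean(runs)
```

    The spread of `runs` mixes intrinsic stochasticity with parameter uncertainty, which is exactly why a direct SSA approach is expensive and motivates the averaged composite-state approximation.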

  7. Managing Measurement Uncertainty in Building Acoustics

    Directory of Open Access Journals (Sweden)

    Chiara Scrosati

    2015-12-01

    Full Text Available In general, uncertainties should preferably be determined following the principles laid down in ISO/IEC Guide 98-3, the Guide to the expression of uncertainty in measurement (GUM:1995. According to current knowledge, it seems impossible to formulate these models for the different quantities in building acoustics. Therefore, the concepts of repeatability and reproducibility are necessary to determine the uncertainty of building acoustics measurements. This study shows the uncertainty of field measurements of a lightweight wall, a heavyweight floor, a façade with a single glazing window and a façade with a double glazing window that were analyzed by a Round Robin Test (RRT, conducted in a full-scale experimental building at ITC-CNR (Construction Technologies Institute of the National Research Council of Italy. The single number quantities and their uncertainties were evaluated in both the narrow and the enlarged range, and it was shown that including or excluding the low frequencies leads to very significant differences, except in the case of the sound insulation of façades with a single glazing window. The results obtained in these RRTs were compared with other results from the literature, which confirm the increase of the uncertainty of single number quantities due to the low-frequency extension. Having stated the measurement uncertainty for a single measurement in building acoustics, it is also very important to deal with sampling for the purposes of classification of buildings or building units. Therefore, this study also shows an application of the sampling included in the Italian Standard on the acoustic classification of building units on a serial-type building consisting of 47 building units. It was found that the greatest variability is observed in the façade, and it depends both on the great variability of window typologies and on workmanship. Finally, it is suggested how to manage the uncertainty in building acoustics, both for one single

  8. Sensitivity and uncertainty analyses for performance assessment modeling

    International Nuclear Information System (INIS)

    Doctor, P.G.

    1988-08-01

    Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
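
    The two approaches contrasted above can be made concrete for a toy model: first-order Taylor propagation of input variances versus Monte Carlo simulation through the same function. The model function and input uncertainties are invented for illustration, not taken from a repository performance model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model output y = f(x1, x2), standing in for a complex physical model
def f(x1, x2):
    return x1 * np.exp(-0.5 * x2)

mu = np.array([2.0, 1.0])       # input means
sigma = np.array([0.1, 0.2])    # input standard deviations (independent)

# Deterministic approach: first-order Taylor propagation via partial derivatives
d1 = np.exp(-0.5 * mu[1])                  # df/dx1 at the mean
d2 = -0.5 * mu[0] * np.exp(-0.5 * mu[1])   # df/dx2 at the mean
var_taylor = (d1 * sigma[0])**2 + (d2 * sigma[1])**2

# Statistical approach: Monte Carlo simulation through the same model
x = rng.normal(mu, sigma, size=(100_000, 2))
var_mc = f(x[:, 0], x[:, 1]).var()
```

    For this mildly nonlinear model the two estimates agree to within a few percent; the Taylor approach degrades as nonlinearity grows, which is one of the trade-offs noted in the abstract.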

  9. Sources of uncertainty in future changes in local precipitation

    Energy Technology Data Exchange (ETDEWEB)

    Rowell, David P. [Met Office Hadley Centre, Exeter (United Kingdom)

    2012-10-15

    This study considers the large uncertainty in projected changes in local precipitation. It aims to map, and begin to understand, the relative roles of uncertain modelling and natural variability, using 20-year mean data from four perturbed physics or multi-model ensembles. The largest (280-member) ensemble illustrates a rich pattern in the varying contribution of modelling uncertainty, with similar features found using a CMIP3 ensemble (despite its limited sample size, which restricts its value in this context). The contribution of modelling uncertainty to the total uncertainty in local precipitation change is found to be highest in the deep tropics, particularly over South America, Africa, the east and central Pacific, and the Atlantic. In the moist maritime tropics, the highly uncertain modelling of sea-surface temperature changes is transmitted to a large uncertain modelling of local rainfall changes. Over tropical land and summer mid-latitude continents (and to a lesser extent, the tropical oceans), uncertain modelling of atmospheric processes, land surface processes and the terrestrial carbon cycle all appear to play an additional substantial role in driving the uncertainty of local rainfall changes. In polar regions, inter-model variability of anomalous sea ice drives an uncertain precipitation response, particularly in winter. In all these regions, there is therefore the potential to reduce the uncertainty of local precipitation changes through targeted model improvements and observational constraints. In contrast, over much of the arid subtropical and mid-latitude oceans, over Australia, and over the Sahara in winter, internal atmospheric variability dominates the uncertainty in projected precipitation changes. Here, model improvements and observational constraints will have little impact on the uncertainty of time means shorter than at least 20 years.
Last, a supplementary application of the metric developed here is that it can be interpreted as a measure

  10. Uncertainty for calculating transport on Titan: A probabilistic description of bimolecular diffusion parameters

    Science.gov (United States)

    Plessis, S.; McDougall, D.; Mandt, K.; Greathouse, T.; Luspay-Kuti, A.

    2015-11-01

    Bimolecular diffusion coefficients are important parameters used by atmospheric models to calculate altitude profiles of minor constituents in an atmosphere. Unfortunately, laboratory measurements of these coefficients were never conducted at temperature conditions relevant to the atmosphere of Titan. Here we conduct a detailed uncertainty analysis of the bimolecular diffusion coefficient parameters as applied to Titan's upper atmosphere to provide a better understanding of the impact of uncertainty in this parameter on models. Because temperature and pressure conditions are much lower than the laboratory conditions in which the bimolecular diffusion parameters were measured, we apply a Bayesian framework, which is problem-agnostic, to determine parameter estimates and associated uncertainties. We solve the Bayesian calibration problem using the open-source QUESO library, which also performs a propagation of uncertainties in the calibrated parameters to temperature and pressure conditions observed in Titan's upper atmosphere. Our results show that, after propagating uncertainty through the Massman model, the uncertainty in molecular diffusion is highly correlated to temperature, and we observe no noticeable correlation with pressure. We propagate the calibrated molecular diffusion estimate and associated uncertainty to obtain an estimate with uncertainty due to bimolecular diffusion for the methane molar fraction as a function of altitude. Results show that the uncertainty in methane abundance due to molecular diffusion is in general small compared to eddy diffusion and the chemical kinetics description. However, methane abundance is most sensitive to uncertainty in molecular diffusion above 1200 km, where the errors are nontrivial and could have important implications for scientific research based on diffusion models in this altitude range.
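
    The propagation step can be sketched for a Massman-style power-law diffusion model, D(T, p) = D0 (T/T0)^s (p0/p): sample the uncertain parameters and evaluate the model at low temperature and pressure. All numerical values below (parameter means, uncertainties, and the Titan-like conditions) are illustrative stand-ins, not the calibrated QUESO posteriors.

```python
import numpy as np

rng = np.random.default_rng(3)

# Massman-style power-law form: D(T, p) = D0 * (T / T0)**s * (p0 / p).
T0, p0 = 273.15, 1.013e5                             # reference K, Pa
D0_samples = rng.normal(1.9e-5, 1.0e-6, 20_000)      # m^2/s at (T0, p0), assumed
s_samples = rng.normal(1.75, 0.05, 20_000)           # temperature exponent, assumed

def diffusion(D0, s, T, p):
    return D0 * (T / T0)**s * (p0 / p)

# Conditions loosely resembling Titan's upper atmosphere (approximate)
T_titan, p_titan = 150.0, 1.0e-2                     # K, Pa
D = diffusion(D0_samples, s_samples, T_titan, p_titan)
D_mean, D_std = D.mean(), D.std()
```

    Because the pressure enters only as an exact 1/p factor, the sampled relative spread of D is controlled by D0 and by s through the factor ln(T/T0), mirroring the temperature correlation reported in the abstract.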

  11. A computationally inexpensive model for estimating dimensional measurement uncertainty due to x-ray computed tomography instrument misalignments

    Science.gov (United States)

    Ametova, Evelina; Ferrucci, Massimiliano; Chilingaryan, Suren; Dewulf, Wim

    2018-06-01

    The recent emergence of advanced manufacturing techniques such as additive manufacturing and an increased demand on the integrity of components have motivated research on the application of x-ray computed tomography (CT) for dimensional quality control. While CT has shown significant empirical potential for this purpose, there is a need for metrological research to accelerate the acceptance of CT as a measuring instrument. The accuracy in CT-based measurements is vulnerable to the instrument geometrical configuration during data acquisition, namely the relative position and orientation of x-ray source, rotation stage, and detector. Consistency between the actual instrument geometry and the corresponding parameters used in the reconstruction algorithm is critical. Currently available procedures provide users with only estimates of geometrical parameters. Quantification and propagation of uncertainty in the measured geometrical parameters must be considered to provide a complete uncertainty analysis and to establish confidence intervals for CT dimensional measurements. In this paper, we propose a computationally inexpensive model to approximate the influence of errors in CT geometrical parameters on dimensional measurement results. We use surface points extracted from a computer-aided design (CAD) model to model discrepancies in the radiographic image coordinates assigned to the projected edges between an aligned system and a system with misalignments. The efficacy of the proposed method was confirmed on simulated and experimental data in the presence of various geometrical uncertainty contributors.
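
    The modelling idea (comparing the radiographic coordinates of projected surface points between an aligned and a misaligned geometry) can be sketched in 2D with a pinhole fan-beam projection. The geometry, the sampled surface points, and the misalignment magnitudes below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

SOD, SDD = 200.0, 800.0   # source-object / source-detector distances (mm)

# Stand-ins for surface points extracted from a CAD model: lateral position x
# and along-beam offset dy relative to the rotation axis (both in mm)
pts = np.column_stack([rng.uniform(-10.0, 10.0, 500),
                       rng.uniform(-10.0, 10.0, 500)])

def project(points, sdd, det_offset):
    """Pinhole fan-beam projection: u = sdd * x / (SOD + dy) + det_offset."""
    x, dy = points[:, 0], points[:, 1]
    return sdd * x / (SOD + dy) + det_offset

u_aligned = project(pts, SDD, det_offset=0.0)
# Misaligned system: 1 mm error in SDD plus a 0.5 mm detector shift (assumed)
u_misaligned = project(pts, SDD + 1.0, det_offset=0.5)

# Discrepancy in radiographic image coordinates between the two geometries
disc = u_misaligned - u_aligned
max_disc = np.abs(disc).max()
```

    Evaluating such discrepancies directly on projected points is far cheaper than re-running a full reconstruction for every perturbed geometry, which is the motivation for the computationally inexpensive model.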

  12. Uncertainty in reactive transport geochemical modelling

    International Nuclear Information System (INIS)

    Oedegaard-Jensen, A.; Ekberg, C.

    2005-01-01

    Full text of publication follows: Geochemical modelling is one way of predicting the transport of, e.g., radionuclides in a rock formation. In a rock formation there will be fractures in which water and dissolved species can be transported. The composition of the water and the rock can either increase or decrease the mobility of the transported entities. When doing simulations on the mobility or transport of different species, one has to know the exact water composition, the exact flow rates in the fracture and in the surrounding rock, the porosity, and which minerals the rock is composed of. The problem with simulations on rocks is that the rock itself is not uniform, with larger fractures in some areas and smaller ones in others, which can give different water flows; the rock composition can also differ from area to area. In addition to this variance in the rock, there are also problems with measuring the physical parameters used in a simulation. All measurements will perturb the rock, and this perturbation will result in more or less correct values of the parameters of interest. The analytical methods used are also encumbered with uncertainties, which in this case are added to the uncertainty from the perturbation of the analysed parameters. When doing simulations, the effect of the uncertainties must be taken into account. As computers are getting faster and faster, the complexity of simulated systems increases, which also increases the uncertainty in the results from the simulations. In this paper we show how the uncertainty in the different parameters affects the solubility and mobility of different species. Small uncertainties in the input parameters can result in large uncertainties in the end. (authors)
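
    A minimal sketch of how small input uncertainties grow large in geochemical predictions: Monte Carlo propagation of uncertainty in a solubility product and in pH through a simple hydroxide solubility relation. The species and all numerical values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 50_000

# Hypothetical sparingly soluble hydroxide M(OH)2 with [M2+] = Ksp / [OH-]^2.
# Input uncertainties (log10 Ksp and pH) are illustrative, not measured values.
log_ksp = rng.normal(-15.0, 0.3, N)    # +/- 0.3 in log10 Ksp
ph = rng.normal(8.0, 0.2, N)           # +/- 0.2 pH units

log_oh = ph - 14.0                     # log10 [OH-] via pOH = 14 - pH
log_solubility = log_ksp - 2.0 * log_oh
solubility = 10.0**log_solubility      # mol/L

# Modest input uncertainties fan out to orders of magnitude in the output
spread = solubility.max() / solubility.min()
```

    The squared dependence on [OH-] doubles the pH uncertainty in log space, so a few tenths of a unit in the inputs already spans several decades of predicted solubility.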

  13. Problems due to icing of overhead lines - Part II

    International Nuclear Information System (INIS)

    Havard, D.G.; Pon, C.J.; Krishnasamy, S.G.

    1985-01-01

    A companion paper describes uncertainties in overhead line design due to the variability of ice and wind loads. This paper reviews two other effects due to icing; conductor galloping and torsional instability, which require further study. (author)

  14. Assignment of uncertainties to scientific data

    International Nuclear Information System (INIS)

    Froehner, F.H.

    1994-01-01

    Long-standing problems of uncertainty assignment to scientific data came into sharp focus in recent years when uncertainty information ('covariance files') had to be added to application-oriented large libraries of evaluated nuclear data such as ENDF and JEF. Questions arose about the best way to express uncertainties, the meaning of statistical and systematic errors, the origin of correlations and the construction of covariance matrices, the combination of uncertain data from different sources, the general usefulness of results that are strictly valid only for Gaussian or only for linear statistical models, etc. Conventional statistical theory is often unable to give unambiguous answers, and tends to fail when statistics are poor so that prior information becomes crucial. Modern probability theory, on the other hand, incorporating decision-theoretic and group-theoretic results, is shown to provide straight and unique answers to such questions, and to deal easily with prior information and small samples. (author). 10 refs
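
    One concrete instance of combining uncertain data from different sources with correlations, as discussed above, is the generalized least-squares (inverse-variance) combination of two measurements whose covariance matrix includes a shared systematic component. The numbers below are illustrative.

```python
import numpy as np

# Two measurements of the same quantity with stated uncertainties (illustrative)
x = np.array([10.2, 9.8])
V = np.array([[0.04, 0.01],    # variances 0.2^2 and 0.3^2 on the diagonal,
              [0.01, 0.09]])   # off-diagonal from a shared systematic error

# Generalized least-squares (inverse-variance) combination:
#   x_hat = (1^T V^-1 x) / (1^T V^-1 1),   var(x_hat) = 1 / (1^T V^-1 1)
ones = np.ones(2)
Vinv = np.linalg.inv(V)
w = Vinv @ ones / (ones @ Vinv @ ones)   # weights sum to 1 by construction
x_hat = w @ x
var_hat = 1.0 / (ones @ Vinv @ ones)
```

    The combined variance is smaller than that of either input, but the positive correlation limits the gain relative to the uncorrelated case, which is precisely why covariance files matter for evaluated data.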

  15. Uncertainties as Barriers for Knowledge Sharing with Enterprise Social Media

    DEFF Research Database (Denmark)

    Trier, Matthias; Fung, Magdalene; Hansen, Abigail

    2017-01-01

    Uncertainties can become a barrier to participants' adoption. There is only limited existing research studying the types of uncertainties that employees perceive and their impact on knowledge transfer via social media. To address this gap, this article presents a qualitative interview-based study of the adoption of the Enterprise Social Media tool Yammer for knowledge sharing in a large global organization. We identify and categorize nine uncertainties that were perceived as barriers by the respondents. The study revealed that the uncertainty types play an important role in affecting employees' participation.

  16. A Research on Uncertainty Evaluation in Verification and Calibration on LSC facility

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung-Jin; Park, Eung-Seop; Kim, Hee-Gang [Yeong Gwang NPP Supervisory Center for Environment Radiation and Safety, Yeonggwang (Korea, Republic of); Han, Sang-Jun [Chosun Univ., Gwangju (Korea, Republic of)

    2007-10-15

    Compared with the environmental samples collected around a nuclear power plant, a geometry difference exists when a Liquid Scintillation Counter is calibrated using a solid H-3 standard source of 200,000 DPM (disintegrations per minute), and this difference introduces an uncertainty. This paper therefore investigates the root cause of the uncertainty due to the geometry difference using a Quantulus 1220 instrument and H-3 standard sources in solid and liquid form, with a Teflon vial used as the measurement cell. The main factors judged capable of causing uncertainty from the geometry difference are the plastic cell inside the Teflon vial, the difference in activity, and the difference in configuration of the H-3 standard sources; these factors are evaluated through experiment and measurement.

  17. A Research on Uncertainty Evaluation in Verification and Calibration on LSC facility

    International Nuclear Information System (INIS)

    Lee, Seung-Jin; Park, Eung-Seop; Kim, Hee-Gang; Han, Sang-Jun

    2007-01-01

    Compared with the environmental samples collected around a nuclear power plant, a geometry difference exists when a Liquid Scintillation Counter is calibrated using a solid H-3 standard source of 200,000 DPM (disintegrations per minute), and this difference introduces an uncertainty. This paper therefore investigates the root cause of the uncertainty due to the geometry difference using a Quantulus 1220 instrument and H-3 standard sources in solid and liquid form, with a Teflon vial used as the measurement cell. The main factors judged capable of causing uncertainty from the geometry difference are the plastic cell inside the Teflon vial, the difference in activity, and the difference in configuration of the H-3 standard sources; these factors are evaluated through experiment and measurement.

  18. Reducing the top quark mass uncertainty with jet grooming

    Science.gov (United States)

    Andreassen, Anders; Schwartz, Matthew D.

    2017-10-01

    The measurement of the top quark mass has large systematic uncertainties coming from the Monte Carlo simulations that are used to match theory and experiment. We explore how much that uncertainty can be reduced by using jet grooming procedures. Using the ATLAS A14 tunes of pythia, we estimate the uncertainty from the choice of tuning parameters in what is meant by the Monte Carlo mass to be around 530 MeV without any corrections. This uncertainty can be reduced by 60% to 200 MeV by calibrating to the W mass and by 70% to 140 MeV by additionally applying soft-drop jet grooming (or to 170 MeV using trimming). At e+e- colliders, the associated uncertainty is around 110 MeV, reducing to 50 MeV after calibrating to the W mass. By analyzing the tuning parameters, we conclude that the importance of jet grooming after calibrating to the W mass is to reduce sensitivity to the underlying event.

  19. Evidence-based quantification of uncertainties induced via simulation-based modeling

    International Nuclear Information System (INIS)

    Riley, Matthew E.

    2015-01-01

    The quantification of uncertainties in simulation-based modeling traditionally focuses upon quantifying uncertainties in the parameters input into the model, referred to as parametric uncertainties. Often neglected in such an approach are the uncertainties induced by the modeling process itself. These uncertainties are often due to a lack of information regarding the problem or the models considered, and could theoretically be reduced through the introduction of additional data. Because of the nature of this epistemic uncertainty, traditional probabilistic frameworks utilized for the quantification of uncertainties are not necessarily applicable to quantify the uncertainties induced in the modeling process itself. This work develops and utilizes a methodology, incorporating aspects of Dempster-Shafer theory and Bayesian model averaging, to quantify uncertainties of all forms for simulation-based modeling problems. The approach expands upon classical parametric uncertainty approaches, allowing for the quantification of modeling-induced uncertainties as well, ultimately providing bounds on classical probability without the loss of epistemic generality. The approach is demonstrated on two different simulation-based modeling problems: the computation of the natural frequency of a simple two-degree-of-freedom nonlinear spring-mass system and the calculation of the flutter velocity coefficient for the AGARD 445.6 wing given a subset of commercially available modeling choices. - Highlights: • Modeling-induced uncertainties are often mishandled or ignored in the literature. • Modeling-induced uncertainties are epistemic in nature. • Probabilistic representations of modeling-induced uncertainties are restrictive. • Evidence theory and Bayesian model averaging are integrated. • Developed approach is applicable for simulation-based modeling problems
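
    Dempster's rule of combination, one of the evidence-theory ingredients mentioned above, can be sketched for a two-element frame of discernment. The mass assignments are invented for illustration, and this small sketch is not the paper's integrated approach.

```python
from itertools import product

# Frame of discernment for a simple reliability question (illustrative)
FRAME = frozenset({"ok", "fail"})

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions given as {frozenset: mass}."""
    raw = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            raw[inter] = raw.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb       # mass assigned to contradictory pairs
    k = 1.0 - conflict                # normalization constant
    return {s: m / k for s, m in raw.items()}

def belief_plausibility(m, hypothesis):
    bel = sum(v for s, v in m.items() if s <= hypothesis)   # subsets only
    pl = sum(v for s, v in m.items() if s & hypothesis)     # any overlap
    return bel, pl

# Two evidence sources; mass on the full frame encodes epistemic ignorance
m1 = {frozenset({"ok"}): 0.6, FRAME: 0.4}
m2 = {frozenset({"ok"}): 0.5, frozenset({"fail"}): 0.2, FRAME: 0.3}

m12 = dempster_combine(m1, m2)
bel, pl = belief_plausibility(m12, frozenset({"ok"}))
```

    The interval [bel, pl] brackets the classical probability of the hypothesis, which is the "bounds on classical probability" behavior the abstract refers to.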

  20. Benchmarking observational uncertainties for hydrology (Invited)

    Science.gov (United States)

    McMillan, H. K.; Krueger, T.; Freer, J. E.; Westerberg, I.

    2013-12-01

    There is a pressing need for authoritative and concise information on the expected error distributions and magnitudes in hydrological data, to understand its information content. Many studies have discussed how to incorporate uncertainty information into model calibration and implementation, and shown how model results can be biased if uncertainty is not appropriately characterised. However, it is not always possible (for example due to financial or time constraints) to make detailed studies of uncertainty for every research study. Instead, we propose that the hydrological community could benefit greatly from sharing information on likely uncertainty characteristics and the main factors that control the resulting magnitude. In this presentation, we review the current knowledge of uncertainty for a number of key hydrological variables: rainfall, flow and water quality (suspended solids, nitrogen, phosphorus). We collated information on the specifics of the data measurement (data type, temporal and spatial resolution), error characteristics measured (e.g. standard error, confidence bounds) and error magnitude. Our results were primarily split by data type. Rainfall uncertainty was controlled most strongly by spatial scale, flow uncertainty was controlled by flow state (low, high) and gauging method. Water quality presented a more complex picture with many component errors. For all variables, it was easy to find examples where relative error magnitude exceeded 40%. We discuss some of the recent developments in hydrology which increase the need for guidance on typical error magnitudes, in particular when doing comparative/regionalisation and multi-objective analysis. Increased sharing of data, comparisons between multiple catchments, and storage in national/international databases can mean that data-users are far removed from data collection, but require good uncertainty information to reduce bias in comparisons or catchment regionalisation studies. Recently it has

  1. Uncertainty visualization in HARDI based on ensembles of ODFs

    KAUST Repository

    Jiao, Fangxiang; Phillips, Jeff M.; Gur, Yaniv; Johnson, Chris R.

    2012-01-01

    In this paper, we propose a new and accurate technique for uncertainty analysis and uncertainty visualization based on fiber orientation distribution function (ODF) glyphs, associated with high angular resolution diffusion imaging (HARDI). Our visualization applies volume rendering techniques to an ensemble of 3D ODF glyphs, which we call SIP functions of diffusion shapes, to capture their variability due to underlying uncertainty. This rendering elucidates the complex heteroscedastic structural variation in these shapes. Furthermore, we quantify the extent of this variation by measuring the fraction of the volume of these shapes, which is consistent across all noise levels, the certain volume ratio. Our uncertainty analysis and visualization framework is then applied to synthetic data, as well as to HARDI human-brain data, to study the impact of various image acquisition parameters and background noise levels on the diffusion shapes. © 2012 IEEE.
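
    One plausible reading of the certain volume ratio (the fraction of shape volume consistent across all noise levels) can be sketched on voxelized shapes. Here nested spheres stand in for the ODF glyph rendered at different noise levels, and the intersection-over-union definition is an assumption for illustration rather than necessarily the paper's exact formula.

```python
import numpy as np

# Voxel grid on [-1, 1]^3; nested spheres stand in for one ODF glyph rendered
# at three background-noise levels (the radii are hypothetical)
axes = np.linspace(-1.0, 1.0, 64)
X, Y, Z = np.meshgrid(axes, axes, axes, indexing="ij")
r = np.sqrt(X**2 + Y**2 + Z**2)

radii = [0.8, 0.75, 0.7]
shapes = np.stack([r <= rad for rad in radii])   # boolean occupancy per level

certain = shapes.all(axis=0)   # voxels occupied at every noise level
union = shapes.any(axis=0)     # voxels occupied at any noise level

# Certain volume ratio: consistent volume as a fraction of the total volume
cvr = certain.sum() / union.sum()
```

    A ratio near 1 indicates a glyph whose shape is robust to noise; heteroscedastic variation across the ensemble pushes the ratio down.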

  3. Quantification of margins and uncertainties: Alternative representations of epistemic uncertainty

    International Nuclear Information System (INIS)

    Helton, Jon C.; Johnson, Jay D.

    2011-01-01

    In 2001, the National Nuclear Security Administration of the U.S. Department of Energy in conjunction with the national security laboratories (i.e., Los Alamos National Laboratory, Lawrence Livermore National Laboratory and Sandia National Laboratories) initiated development of a process designated Quantification of Margins and Uncertainties (QMU) for the use of risk assessment methodologies in the certification of the reliability and safety of the nation's nuclear weapons stockpile. A previous presentation, 'Quantification of Margins and Uncertainties: Conceptual and Computational Basis,' describes the basic ideas that underlie QMU and illustrates these ideas with two notional examples that employ probability for the representation of aleatory and epistemic uncertainty. The current presentation introduces and illustrates the use of interval analysis, possibility theory and evidence theory as alternatives to the use of probability theory for the representation of epistemic uncertainty in QMU-type analyses. The following topics are considered: the mathematical structure of alternative representations of uncertainty, alternative representations of epistemic uncertainty in QMU analyses involving only epistemic uncertainty, and alternative representations of epistemic uncertainty in QMU analyses involving a separation of aleatory and epistemic uncertainty. Analyses involving interval analysis, possibility theory and evidence theory are illustrated with the same two notional examples used in the presentation indicated above to illustrate the use of probability to represent aleatory and epistemic uncertainty in QMU analyses.

  4. Uncertainty analysis guide

    International Nuclear Information System (INIS)

    Andres, T.H.

    2002-05-01

    This guide applies to the estimation of uncertainty in quantities calculated by scientific, analysis and design computer programs that fall within the scope of AECL's software quality assurance (SQA) manual. The guide weaves together rational approaches from the SQA manual and three other diverse sources: (a) the CSAU (Code Scaling, Applicability, and Uncertainty) evaluation methodology; (b) the ISO Guide for the Expression of Uncertainty in Measurement; and (c) the SVA (Systems Variability Analysis) method of risk analysis. This report describes the manner by which random and systematic uncertainties in calculated quantities can be estimated and expressed. Random uncertainty in model output can be attributed to uncertainties of inputs. The propagation of these uncertainties through a computer model can be represented in a variety of ways, including exact calculations, series approximations and Monte Carlo methods. Systematic uncertainties emerge from the development of the computer model itself, through simplifications and conservatisms, for example. These must be estimated and combined with random uncertainties to determine the combined uncertainty in a model output. This report also addresses the method by which uncertainties should be employed in code validation, in order to determine whether experiments and simulations agree, and whether or not a code satisfies the required tolerance for its application. (author)
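The Monte Carlo route to propagating input uncertainties mentioned in the guide can be sketched in a few lines. The decay model, the nominal rate constant, and the assumed normal input distribution below are hypothetical stand-ins, not taken from the report:

```python
import math
import random
import statistics

random.seed(1)

# Hypothetical model: first-order decay response at fixed time t
def model(k, t):
    return math.exp(-k * t)

# Uncertain input: rate constant k ~ Normal(0.5, 0.05); t held at 2.0
N = 10_000
outputs = sorted(model(random.gauss(0.5, 0.05), 2.0) for _ in range(N))

mean = statistics.mean(outputs)
sd = statistics.stdev(outputs)  # random (input-driven) uncertainty in the output
lo, hi = outputs[int(0.025 * N)], outputs[int(0.975 * N)]
print(f"output = {mean:.3f} +/- {sd:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
```

Systematic (model-form) uncertainty would then be estimated separately and combined with this spread, as the guide describes.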

  6. Towards Thermodynamics with Generalized Uncertainty Principle

    International Nuclear Information System (INIS)

    Moussa, Mohamed; Farag Ali, Ahmed

    2014-01-01

    Various frameworks of quantum gravity predict a modification of the Heisenberg uncertainty principle into a so-called generalized uncertainty principle (GUP). Introducing the quantum gravity effect makes a considerable change in the density of states inside the volume of the phase space, which changes the statistical and thermodynamical properties of any physical system. In this paper we investigate the modification of the thermodynamic properties of ideal gases and a photon gas. The partition function is calculated, and from it we find a considerable growth in the thermodynamical functions of the systems considered. The growth may happen due to an additional repulsive force between the constituents of the gases arising from the existence of the GUP, hence predicting a considerable increase in the entropy of the system. Besides, by applying the GUP to an ideal gas in a trapping potential, it is found that the GUP implies a minimum measurable value of the thermal wavelength of particles, which agrees with the discrete nature of space that has been derived in previous studies from the GUP
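For reference, a commonly quoted form of the GUP (one of several variants in the literature; the abstract does not specify which is used) adds a term quadratic in the momentum uncertainty to the Heisenberg relation:

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}\left(1 + \beta\,(\Delta p)^2\right)
\qquad\Longrightarrow\qquad
(\Delta x)_{\min} = \hbar\sqrt{\beta},
```

where β is the GUP deformation parameter; the implied minimal measurable length is what leads to the discreteness of space mentioned above.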

  7. African anthropogenic combustion emission inventory: specificities and uncertainties

    Science.gov (United States)

    Sekou, K.; Liousse, C.; Eric-michel, A.; Veronique, Y.; Thierno, D.; Roblou, L.; Toure, E. N.; Julien, B.

    2015-12-01

    Fossil fuel and biofuel emissions of gases and particles in Africa are expected to significantly increase in the near future, particularly due to the growth of African cities. In addition, large African savannah fires occur each year during the dry season, mainly for socio-economic purposes. In this study, we will present the most recent developments of African anthropogenic combustion emission inventories, stressing African specificities. (1) A regional fossil fuel and biofuel inventory for gases and particulates will be presented for Africa at a resolution of 0.25° x 0.25° from 1990 to 2012. For this purpose, the original database of Liousse et al. (2014) has been used after modification of emission factors and updating of regional fuel consumption, including new emitter categories (waste burning, flaring) and new activity sectors (i.e. disaggregation of transport into sub-sectors, including two-wheel vehicles). In terms of emission factors, new measured values will be presented and compared to the literature, with a focus on aerosols. They result from measurement campaigns organized in the framework of the DACCIWA European program for each kind of Africa-specific anthropogenic source in 2015, in Abidjan (Ivory Coast), Cotonou (Benin) and in the Laboratoire d'Aérologie combustion chamber. Finally, a more detailed spatial distribution of emissions will be proposed at a country level to better take into account road distributions and population densities. (2) Large uncertainties still remain in biomass burning emission inventory estimates, especially over Africa, between different datasets such as GFED and AMMABB. Sensitivity tests will be presented to investigate uncertainties in the emission inventories, applying the methodologies used for the AMMABB and GFED inventories respectively. Then, the relative importance of each source (fossil fuel, biofuel and biomass burning inventories) on the budgets of carbon monoxide, nitrogen oxides, sulfur dioxide, black and organic carbon, and volatile

  8. Uncertainty quantification an accelerated course with advanced applications in computational engineering

    CERN Document Server

    Soize, Christian

    2017-01-01

    This book presents the fundamental notions and advanced mathematical tools for the stochastic modeling of uncertainties and their quantification in large-scale computational models in science and engineering. In particular, it focuses on parametric and non-parametric uncertainties, with applications from the structural dynamics and vibroacoustics of complex mechanical systems, and from the micromechanics and multiscale mechanics of heterogeneous materials. Resulting from a course developed by the author, the book begins with a description of the fundamental mathematical tools of probability and statistics that are directly useful for uncertainty quantification. It proceeds with a careful description of some basic and advanced methods for constructing stochastic models of uncertainties, paying particular attention to the problem of calibrating and identifying a stochastic model of uncertainty when experimental data are available. This book is intended to be a graduate-level textbook for stu...

  9. Analysis of uncertainty in modeling perceived risks

    International Nuclear Information System (INIS)

    Melnyk, R.; Sandquist, G.M.

    2005-01-01

    Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)

  10. Uncertainty studies and risk assessment for CO2 storage in geological formations

    International Nuclear Information System (INIS)

    Walter, Lena Sophie

    2013-01-01

    Carbon capture and storage (CCS) in deep geological formations is one possible option to mitigate the greenhouse gas effect by reducing CO2 emissions into the atmosphere. The assessment of the risks related to CO2 storage is an important task. Events such as CO2 leakage and brine displacement could result in hazards for human health and the environment. In this thesis, a systematic and comprehensive risk assessment concept is presented to investigate various levels of uncertainties and to assess risks using numerical simulations. Depending on the risk and the processes to be assessed, very complex models, large model domains, large time scales, and many simulation runs for estimating probabilities are required. To reduce the resulting high computational costs, a model reduction technique (the arbitrary polynomial chaos expansion) and a method for model coupling in space are applied. The different levels of uncertainties are: statistical uncertainty in parameter distributions; scenario uncertainty, e.g. different geological features; and recognized ignorance due to assumptions in the conceptual model set-up. Recognized ignorance and scenario uncertainty are investigated by simulating well-defined model set-ups and scenarios. According to damage values, which are defined as a model output, the set-ups and scenarios can be compared and ranked. For statistical uncertainty, probabilities can be determined by running Monte Carlo simulations with the reduced model. The results are presented in various ways: e.g., mean damage, probability density function, cumulative distribution function, or an overall risk value obtained by multiplying the damage with the probability. If the model output (damage) cannot be compared to provided criteria (e.g. water quality criteria), analytical approximations are presented to translate the damage into comparable values. The overall concept is applied to the risks related to brine displacement and infiltration into drinking water

  11. Cost uncertainty for different levels of technology maturity

    International Nuclear Information System (INIS)

    DeMuth, S.F.; Franklin, A.L.

    1996-01-01

    It is difficult at best to apply a single methodology for estimating cost uncertainties related to technologies of differing maturity. While highly mature technologies may have significant performance and manufacturing cost data available, less well developed technologies may be defined in only conceptual terms. Regardless of the degree of technical maturity, often a cost estimate relating to application of the technology may be required to justify continued funding for development. Yet, a cost estimate without its associated uncertainty lacks the information required to assess the economic risk. For this reason, it is important for the developer to provide some type of uncertainty along with a cost estimate. This study demonstrates how different methodologies for estimating uncertainties can be applied to cost estimates for technologies of different maturities. For a less well developed technology an uncertainty analysis of the cost estimate can be based on a sensitivity analysis; whereas, an uncertainty analysis of the cost estimate for a well developed technology can be based on an error propagation technique from classical statistics. It was decided to demonstrate these uncertainty estimation techniques with (1) an investigation of the additional cost of remediation due to beyond baseline, nearly complete, waste heel retrieval from underground storage tanks (USTs) at Hanford; and (2) the cost related to the use of crystalline silico-titanate (CST) rather than the baseline CS100 ion exchange resin for cesium separation from UST waste at Hanford
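The classical error-propagation technique mentioned for mature technologies can be illustrated with a first-order (Taylor-series) variance formula. The cost model and every figure below are invented for illustration, not taken from the Hanford estimates:

```python
import math

# Hypothetical mature-technology cost model: C = n_units * unit_cost + fixed_cost
n_units, sigma_n = 120.0, 5.0      # number of units and its 1-sigma uncertainty
unit_cost, sigma_u = 40.0, 2.0     # unit cost and its 1-sigma uncertainty
fixed_cost, sigma_f = 1000.0, 50.0 # fixed cost and its 1-sigma uncertainty

C = n_units * unit_cost + fixed_cost

# First-order propagation: var(C) = (dC/dn)^2 var(n) + (dC/du)^2 var(u) + var(f)
var_C = (unit_cost * sigma_n) ** 2 + (n_units * sigma_u) ** 2 + sigma_f ** 2
sigma_C = math.sqrt(var_C)
print(f"C = {C:.0f} +/- {sigma_C:.0f}")
```

For a less mature technology, where the partial derivatives themselves are poorly known, the sensitivity-analysis route described in the abstract would replace this formula.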

  12. Analysis of the performance of a H-Darrieus rotor under uncertainty using Polynomial Chaos Expansion

    International Nuclear Information System (INIS)

    Daróczy, László; Janiga, Gábor; Thévenin, Dominique

    2016-01-01

    Due to the growing importance of wind energy, improving the efficiency of energy conversion is essential. Horizontal Axis Wind Turbines are the most widespread, but H-Darrieus turbines are becoming popular as well due to their simple design and easier integration. Given the already high efficiency of existing wind turbines, further improvements require numerical optimization. One important aspect is to find a better configuration that is also robust, i.e., a configuration that retains its performance under uncertainties. For this purpose, forward uncertainty propagation has to be applied. In the present work, an Uncertainty Quantification (UQ) method, Polynomial Chaos Expansion, is applied to transient, turbulent flow simulations of a variable-speed H-Darrieus turbine, taking into account uncertainty in the preset pitch angle and in the angular velocity. The resulting uncertainty of the performance coefficient and of the quasi-periodic torque curve is quantified. In the presence of stall, the instantaneous torque coefficients tend to show asymmetric distributions, meaning that error bars cannot be correctly reconstructed using only the mean value and standard deviation. The expected performance was always found to be smaller than in computations without UQ techniques, corresponding to up to 10% of relative losses for λ = 2.5. - Highlights: • Uncertainty Quantification/Polynomial Chaos Expansion successfully applied to H-rotor. • Accounting simultaneously for uncertainty in pitch angle and angular velocity. • Performance coefficient decreases by up to 10% when accounting for uncertainty. • For low tip-speed-ratio, high-order polynomials are needed. • Polynomial order 4 is sufficient to reconstruct distribution at higher TSR.
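A minimal sketch of non-intrusive Polynomial Chaos Expansion in one uncertain variable, assuming a standard-normal input and a Hermite basis; the smooth response function standing in for the turbine's performance coefficient is invented for illustration:

```python
import math
import random
import statistics

random.seed(5)

def He(k, x):
    # Probabilists' Hermite polynomials via He_{n+1} = x*He_n - n*He_{n-1}
    if k == 0:
        return 1.0
    h_prev, h = 1.0, x
    for n in range(1, k):
        h_prev, h = h, x * h - n * h_prev
    return h

# Hypothetical smooth response to a standardized uncertain input (e.g. pitch angle)
def response(x):
    return math.sin(1.0 + 0.3 * x)

# Non-intrusive PCE: estimate coefficients c_k = E[f(X) He_k(X)] / k! by Monte Carlo
K = 4
xs = [random.gauss(0, 1) for _ in range(50_000)]
fs = [response(x) for x in xs]
coeffs = [statistics.mean(f * He(k, x) for f, x in zip(fs, xs)) / math.factorial(k)
          for k in range(K + 1)]

# Orthogonality of the basis gives mean and variance directly from the coefficients
mean_pce = coeffs[0]
var_pce = sum(math.factorial(k) * coeffs[k] ** 2 for k in range(1, K + 1))
print(f"mean = {mean_pce:.4f}, variance = {var_pce:.5f}")
```

In the paper the expansion is built over two inputs (pitch angle and angular velocity) around CFD runs; the one-dimensional analytic response above only shows the mechanics, including why order 4 suffices for smooth responses.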

  13. A New Framework for Quantifying Lidar Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Newman, Jennifer, F.; Clifton, Andrew; Bonin, Timothy A.; Churchfield, Matthew J.

    2017-03-24

    As wind turbine sizes increase and wind energy expands to more complex and remote sites, remote sensing devices such as lidars are expected to play a key role in wind resource assessment and power performance testing. The switch to remote sensing devices represents a paradigm shift in the way the wind industry typically obtains and interprets measurement data for wind energy. For example, the measurement techniques and sources of uncertainty for a remote sensing device are vastly different from those associated with a cup anemometer on a meteorological tower. Current IEC standards discuss uncertainty due to mounting, calibration, and classification of the remote sensing device, among other parameters. Values of the uncertainty are typically given as a function of the mean wind speed measured by a reference device. However, real-world experience has shown that lidar performance is highly dependent on atmospheric conditions, such as wind shear, turbulence, and aerosol content. At present, these conditions are not directly incorporated into the estimated uncertainty of a lidar device. In this presentation, we propose the development of a new lidar uncertainty framework that adapts to current flow conditions and more accurately represents the actual uncertainty inherent in lidar measurements under different conditions. In this new framework, sources of uncertainty are identified for estimation of the line-of-sight wind speed and reconstruction of the three-dimensional wind field. These sources are then related to physical processes caused by the atmosphere and lidar operating conditions. The framework is applied to lidar data from an operational wind farm to assess the ability of the framework to predict errors in lidar-measured wind speed.

  14. Uncertainty quantification for proton–proton fusion in chiral effective field theory

    Energy Technology Data Exchange (ETDEWEB)

    Acharya, B. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Carlsson, B.D. [Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Ekström, A. [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States); Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Forssén, C. [Department of Physics, Chalmers University of Technology, SE-412 96 Göteborg (Sweden); Platter, L., E-mail: lplatter@utk.edu [Department of Physics and Astronomy, University of Tennessee, Knoxville, TN 37996 (United States); Physics Division, Oak Ridge National Laboratory, Oak Ridge, TN 37831 (United States)

    2016-09-10

    We compute the S-factor of the proton–proton (pp) fusion reaction using chiral effective field theory (χEFT) up to next-to-next-to-leading order (NNLO) and perform a rigorous uncertainty analysis of the results. We quantify the uncertainties due to (i) the computational method used to compute the pp cross section in momentum space, (ii) the statistical uncertainties in the low-energy coupling constants of χEFT, (iii) the systematic uncertainty due to the χEFT cutoff, and (iv) systematic variations in the database used to calibrate the nucleon–nucleon interaction. We also examine the robustness of the polynomial extrapolation procedure, which is commonly used to extract the threshold S-factor and its energy-derivatives. By performing a statistical analysis of the polynomial fit of the energy-dependent S-factor at several different energy intervals, we eliminate a systematic uncertainty that can arise from the choice of the fit interval in our calculations. In addition, we explore the statistical correlations between the S-factor and few-nucleon observables such as the binding energies and point-proton radii of ²,³H and ³He as well as the D-state probability and quadrupole moment of ²H, and the β-decay of ³H. We find that, with the state-of-the-art optimization of the nuclear Hamiltonian, the statistical uncertainty in the threshold S-factor cannot be reduced beyond 0.7%.

  15. Uncertainties in the Norwegian greenhouse gas emission inventory

    Energy Technology Data Exchange (ETDEWEB)

    Flugsrud, Ketil; Hoem, Britta

    2011-11-15

    The national greenhouse gas (GHG) emission inventory is compiled from estimates based on emission factors and activity data and from direct measurements by plants. All these data and parameters will contribute to the overall inventory uncertainty. The uncertainties and probability distributions of the inventory input parameters have been assessed based on available data and expert judgements. Finally, the level and trend uncertainties of the national GHG emission inventory have been estimated using Monte Carlo simulation. The methods used in the analysis correspond to an IPCC tier 2 method, as described in the IPCC Good Practice Guidance (IPCC 2000). Analyses have been made both excluding and including the sector LULUCF (land use, land-use change and forestry). The uncertainty analysis performed in 2011 is an update of the uncertainty analyses performed for the greenhouse gas inventory in 2006 and 2000. During the project we have been in contact with experts, and have collected information about uncertainty from them. The main focus has been on the source categories where changes have occurred since the last uncertainty analysis was performed in 2006. This includes new methodology for several source categories (for example for solvents and road traffic) as well as revised uncertainty estimates. For the installations included in the emission trading system, new information from the annual ETS reports about uncertainty in activity data and CO2 emission factor (and N2O emission factor for nitric acid production) has been used. This has improved the quality of the uncertainty estimates for the energy and manufacturing sectors. The results show that the uncertainty level in the total calculated greenhouse gas emissions for 2009 is around 4 per cent. When including the LULUCF sector, the total uncertainty is around 17 per cent in 2009. The uncertainty estimate is lower now than previous analyses have shown.
This is partly due to considerable work done to improve
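A tier 2 (Monte Carlo) combination of activity-data and emission-factor uncertainties, as described above, can be sketched as follows. The source categories, values, and relative uncertainties are invented for illustration and are not taken from the Norwegian inventory:

```python
import random
import statistics

random.seed(7)

# Hypothetical categories: (activity data, emission factor, rel. 1-sigma on each)
sources = {
    "energy":      (5000.0, 2.5, 0.03, 0.05),
    "agriculture": (800.0,  4.0, 0.10, 0.30),
    "waste":       (300.0,  1.5, 0.15, 0.40),
}

def sample_total():
    # One Monte Carlo draw of the inventory: perturb AD and EF independently
    total = 0.0
    for ad, ef, r_ad, r_ef in sources.values():
        total += ad * random.gauss(1, r_ad) * ef * random.gauss(1, r_ef)
    return total

draws = [sample_total() for _ in range(20_000)]
mean = statistics.mean(draws)
half_width = 1.96 * statistics.stdev(draws)  # ~95% interval half-width
pct = 100 * half_width / mean
print(f"total = {mean:.0f} (units), 95% uncertainty ~ +/-{pct:.1f}%")
```

Categories with small shares but large relative uncertainties (here "waste") can still contribute visibly to the total, which is why the analysis revisits them when methodologies change.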

  16. Uncertainty estimation of Intensity-Duration-Frequency relationships: A regional analysis

    Science.gov (United States)

    Mélèse, Victor; Blanchet, Juliette; Molinié, Gilles

    2018-03-01

    We propose in this article a regional study of uncertainties in IDF curves derived from point-rainfall maxima. We develop two generalized extreme value models based on the simple scaling assumption, first in the frequentist framework and second in the Bayesian framework. Within the frequentist framework, uncertainties are obtained i) from the Gaussian density stemming from the asymptotic normality theorem of the maximum likelihood and ii) with a bootstrap procedure. Within the Bayesian framework, uncertainties are obtained from the posterior densities. We confront these two frameworks on the same database covering a large region of 100,000 km2 in southern France with contrasting rainfall regimes, in order to be able to draw conclusions that are not specific to the data. The two frameworks are applied to 405 hourly stations with data back to the 1980s, accumulated in the range 3 h-120 h. We show that i) the Bayesian framework is more robust than the frequentist one to the starting point of the estimation procedure, ii) the posterior and the bootstrap densities are able to better adjust uncertainty estimation to the data than the Gaussian density, and iii) the bootstrap density gives unreasonable confidence intervals, in particular for return levels associated with large return periods. We therefore recommend the Bayesian framework for computing uncertainty.
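The bootstrap branch of the frequentist framework can be illustrated for a single return-level estimate. The article fits simple-scaling GEV models by maximum likelihood; to stay self-contained, the sketch below substitutes a Gumbel fit by the method of moments on synthetic annual maxima:

```python
import math
import random
import statistics

random.seed(42)

def gumbel_return_level(sample, T):
    # Gumbel fit by method of moments, then the T-year return level
    m, s = statistics.mean(sample), statistics.stdev(sample)
    beta = s * math.sqrt(6) / math.pi
    mu = m - 0.5772 * beta  # Euler-Mascheroni correction to the location
    return mu - beta * math.log(-math.log(1 - 1 / T))

# Synthetic 40-year record of annual rainfall maxima (mm)
maxima = [50 + 15 * random.gammavariate(2, 1) for _ in range(40)]

rl100 = gumbel_return_level(maxima, T=100)

# Bootstrap: resample years with replacement, refit, collect return levels
boot = sorted(
    gumbel_return_level(random.choices(maxima, k=len(maxima)), T=100)
    for _ in range(2000)
)
lo, hi = boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))]
print(f"100-yr level = {rl100:.0f} mm, 95% CI [{lo:.0f}, {hi:.0f}] mm")
```

The widening (and possible distortion) of such intervals at large return periods is exactly where the article finds the bootstrap unreliable relative to posterior densities.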

  17. Generalized Likelihood Uncertainty Estimation (GLUE) Using Multi-Optimization Algorithm as Sampling Method

    Science.gov (United States)

    Wang, Z.

    2015-12-01

    For decades, distributed and lumped hydrological models have furthered our understanding of hydrological systems. The development of large-scale, high-resolution hydrological simulation has refined spatial descriptions of hydrological behavior. This trend, however, is accompanied by growing model complexity and numbers of parameters, which brings new challenges for uncertainty quantification. Generalized Likelihood Uncertainty Estimation (GLUE), which couples Monte Carlo sampling with Bayesian estimation, has been widely used in uncertainty analysis for hydrological models. However, the stochastic sampling of prior parameters adopted by GLUE is inefficient, especially in high-dimensional parameter spaces. Heuristic optimization algorithms based on iterative evolution show better convergence speed and optimality-searching performance. In light of these features, this study adopted genetic algorithms, differential evolution, and the shuffled complex evolution algorithm to search the parameter space and obtain parameter sets with large likelihoods. Based on this multi-algorithm sampling, hydrological model uncertainty analysis is conducted within the typical GLUE framework. To demonstrate the superiority of the new method, two hydrological models of different complexity are examined. The results show that the adaptive method tends to be efficient in sampling and effective in uncertainty analysis, providing an alternative path for uncertainty quantification.
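The GLUE procedure itself is compact enough to sketch. The toy linear-reservoir model, the NSE likelihood threshold of 0.7, and the uniform prior below are illustrative choices, and plain uniform sampling stands in for the heuristic optimizers discussed in the abstract:

```python
import random
import statistics

random.seed(0)

# Toy rainfall-runoff model: single linear reservoir, outflow q_t = k * storage
def model(k, inflows):
    s, q = 0.0, []
    for x in inflows:
        s += x
        out = k * s
        s -= out
        q.append(out)
    return q

inflows = [random.expovariate(1.0) for _ in range(200)]
obs = model(0.3, inflows)  # synthetic "observations", true k = 0.3

def nse(sim, obs):
    # Nash-Sutcliffe efficiency, a common GLUE likelihood measure
    mo = statistics.mean(obs)
    return 1 - (sum((s - o) ** 2 for s, o in zip(sim, obs))
                / sum((o - mo) ** 2 for o in obs))

# GLUE: sample the prior, keep "behavioural" parameter sets above a threshold
behavioural = []
for _ in range(5000):
    k = random.uniform(0.05, 0.95)
    like = nse(model(k, inflows), obs)
    if like > 0.7:
        behavioural.append((k, like))

ks = sorted(k for k, _ in behavioural)
print(f"{len(behavioural)} behavioural sets, k range [{ks[0]:.2f}, {ks[-1]:.2f}]")
```

The study's contribution is to replace the uniform draw above with evolutionary samplers that concentrate draws in the high-likelihood region, reducing wasted model runs.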

  18. Uncertainty Principles on Two Step Nilpotent Lie Groups

    Indian Academy of Sciences (India)

    Abstract. We extend an uncertainty principle due to Cowling and Price to two-step nilpotent Lie groups, which generalizes a classical theorem of Hardy. We also prove an analogue of the Heisenberg inequality on two-step nilpotent Lie groups.
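For reference, the classical Hardy theorem on ℝ that the abstract generalizes can be stated as follows (with the Fourier transform normalized as \(\hat f(\xi)=\int f(x)\,e^{-2\pi i x\xi}\,dx\)):

```latex
|f(x)| \le C\, e^{-a\pi x^2}
\quad\text{and}\quad
|\hat f(\xi)| \le C\, e^{-b\pi \xi^2}
\quad\text{with}\quad ab > 1
\;\Longrightarrow\; f \equiv 0,
```

and in the borderline case \(ab = 1\), \(f\) is a constant multiple of the Gaussian \(e^{-a\pi x^2}\). The Cowling-Price version relaxes these pointwise Gaussian bounds to integrability conditions.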

  19. Uncertainties in Future Regional Sea Level Trends: How to Deal with the Internal Climate Variability?

    Science.gov (United States)

    Becker, M.; Karpytchev, M.; Hu, A.; Deser, C.; Lennartz-Sassinek, S.

    2017-12-01

    Today, climate models (CMs) are the main tools for forecasting sea level rise (SLR) at global and regional scales. CM forecasts are accompanied by inherent uncertainties. Understanding and reducing these uncertainties is becoming a matter of increasing urgency in order to provide robust estimates of the impact of SLR on coastal societies, which need sustainable choices of climate adaptation strategy. These CM uncertainties are linked to structural model formulation, initial conditions, emission scenario and internal variability. The internal variability is due to complex non-linear interactions within the Earth's climate system and can induce diverse quasi-periodic oscillatory modes and long-term persistence. To quantify the effects of internal variability, most studies have used multi-model ensembles or sea level projections from a single model run with perturbed initial conditions. However, large ensembles are often unavailable or too small, and are computationally expensive. In this study, we use a power-law scaling of sea level fluctuations, as observed in many other geophysical signals and natural systems, to characterize the internal climate variability. Within this statistical framework, we (1) use the pre-industrial control run of the National Center for Atmospheric Research Community Climate System Model (NCAR-CCSM) to test the robustness of the power-law scaling hypothesis; (2) employ power-law statistics as a tool for assessing the spread of 21st-century regional sea level projections in NCAR-CCSM due to internal climate variability; (3) compare the uncertainties in predicted sea level changes obtained from NCAR-CCSM multi-member ensemble simulations with estimates derived for power-law processes; and (4) explore the sensitivity of spatial patterns of the internal variability and its effects on regional sea level projections.

  20. Number of deaths due to lung diseases: How large is the problem?

    International Nuclear Information System (INIS)

    Wagener, D.K.

    1990-01-01

    The importance of lung disease as an indicator of environmentally induced adverse health effects has been recognized by its inclusion among the Health Objectives for the Nation. The 1990 Health Objectives for the Nation (US Department of Health and Human Services, 1986) includes an objective that there should be virtually no new cases among newly exposed workers for four preventable occupational lung diseases: asbestosis, byssinosis, silicosis, and coal workers' pneumoconiosis. This brief communication describes two types of cause-of-death statistics, underlying cause and multiple cause, and demonstrates the differences between the two statistics using lung disease deaths among adult men. The choice of statistic has a large impact on estimated lung disease mortality rates. It may also have a large effect on the estimated mortality rates of other chronic diseases thought to be environmentally mediated. Issues of comorbidity and the way causes of death are reported become important in the interpretation of these statistics. The choice of which statistic to use when comparing data from a study population with national statistics may greatly affect the interpretation of the study findings

  1. Uncertainty of Energy Consumption Assessment of Domestic Buildings

    DEFF Research Database (Denmark)

    Brohus, Henrik; Heiselberg, Per; Simonsen, A.

    2009-01-01

    In order to assess the influence of energy reduction initiatives, to determine the expected annual cost, to calculate life cycle cost, emission impact, etc. it is crucial to be able to assess the energy consumption reasonably accurate. The present work undertakes a theoretical and empirical study...... of the uncertainty of energy consumption assessment of domestic buildings. The calculated energy consumption of a number of almost identical domestic buildings in Denmark is compared with the measured energy consumption. Furthermore, the uncertainty is determined by means of stochastic modelling based on input...... to correspond reasonably well; however, it is also found that significant differences may occur between calculated and measured energy consumption due to the spread and due to the fact that the result can only be determined with a certain probability. It is found that occupants' behaviour is the major...

  2. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numerical algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
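The central idea of a numerical routine that reports its own uncertainty can be illustrated with plain Monte Carlo integration, where the standard error of the mean plays the role of the uncertainty estimate. This is a minimal sketch of the concept only; the paper's methods are Bayesian and considerably more sophisticated.

```python
import numpy as np

def mc_integrate(f, a, b, n=10_000, seed=0):
    """Estimate the integral of f over [a, b] and return a standard
    error alongside it: a numerical routine that reports its own
    uncertainty, in the spirit of probabilistic numerics."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, n)
    y = (b - a) * f(x)
    est = y.mean()
    se = y.std(ddof=1) / np.sqrt(n)  # uncertainty of the estimate itself
    return est, se

est, se = mc_integrate(np.sin, 0.0, np.pi)  # exact integral is 2
```

A downstream computation can use `se` to decide whether more samples (more compute time) are warranted, which is exactly the kind of error diagnosis and control the abstract describes.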

  3. Quantifying phenomenological importance in best-estimate plus uncertainty analyses

    International Nuclear Information System (INIS)

    Martin, Robert P.

    2009-01-01

    This paper describes a general methodology for quantifying the importance of specific phenomenological elements to analysis measures evaluated from non-parametric best-estimate plus uncertainty evaluation methodologies. The principal objective of an importance analysis is to reveal those uncertainty contributors having the greatest influence on key analysis measures. This characterization supports the credibility of the uncertainty analysis, the applicability of the analytical tools, and even the generic evaluation methodology through the validation of the engineering judgments that guided the evaluation methodology development. A demonstration of the importance analysis is provided using data from a sample problem considered in the development of AREVA's Realistic LBLOCA methodology. The results are presented against the original large-break LOCA Phenomena Identification and Ranking Table developed by the Technical Program Group responsible for authoring the Code Scaling, Applicability and Uncertainty methodology. (author)

  4. The neurobiology of uncertainty: implications for statistical learning.

    Science.gov (United States)

    Hasson, Uri

    2017-01-05

    The capacity for assessing the degree of uncertainty in the environment relies on estimating statistics of temporally unfolding inputs. This, in turn, allows calibration of predictive and bottom-up processing, and signalling changes in temporally unfolding environmental features. In the last decade, several studies have examined how the brain codes for and responds to input uncertainty. Initial neurobiological experiments implicated frontoparietal and hippocampal systems, based largely on paradigms that manipulated distributional features of visual stimuli. However, later work in the auditory domain pointed to different systems, whose activation profiles have interesting implications for computational and neurobiological models of statistical learning (SL). This review begins by briefly recapping the historical development of ideas pertaining to the sensitivity to uncertainty in temporally unfolding inputs. It then discusses several issues at the interface of studies of uncertainty and SL. Next, it presents several current treatments of the neurobiology of uncertainty and reviews recent findings that point to principles that serve as important constraints on future neurobiological theories of uncertainty, and relatedly, SL. This review suggests it may be useful to establish closer links between neurobiological research on uncertainty and SL, considering particularly mechanisms sensitive to local and global structure in inputs, the degree of input uncertainty, the complexity of the system generating the input, learning mechanisms that operate on different temporal scales and the use of learnt information for online prediction. This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).

  5. Estimation of errors due to inhomogeneous distribution of radionuclides in lungs

    International Nuclear Information System (INIS)

    Pelled, O.; German, U.; Pollak, G.; Alfassi, Z.B.

    2006-01-01

    The uncertainty in determining the activity of uranium contamination, due to a real inhomogeneous distribution being treated as homogeneous, can reach more than one order of magnitude when a single detector from a set of four covering most of the lungs is used. Using the information from several detectors may improve the accuracy, as obtained by summing the responses of the three or four detectors. Even with this improvement, however, the errors remain very large: up to almost a factor of 10 when the analysis is based on the 92 keV energy peak, and up to a factor of 7 for the 185 keV peak.

  6. Application of code scaling applicability and uncertainty methodology to the large break loss of coolant

    International Nuclear Information System (INIS)

    Young, M.Y.; Bajorek, S.M.; Nissley, M.E.

    1998-01-01

    In the late 1980s, after completion of an extensive research program, the United States Nuclear Regulatory Commission (USNRC) amended its regulations (10CFR50.46) to allow the use of realistic physical models to analyze the loss of coolant accident (LOCA) in light water reactors. Prior to this time, the evaluation of this accident was subject to a prescriptive set of rules (appendix K of the regulations) requiring conservative models and assumptions to be applied simultaneously, leading to very pessimistic estimates of the impact of this accident on the reactor core. The rule change therefore promised to provide significant benefits to owners of power reactors, allowing them to increase output. In response to the rule change, a method called code scaling, applicability and uncertainty (CSAU) was developed to apply realistic methods, while properly taking into account data uncertainty, uncertainty in physical modeling and plant variability. The method was claimed to be structured, traceable, and practical, but was met with some criticism when first demonstrated. In 1996, the USNRC approved a methodology, based on CSAU, developed by a group led by Westinghouse. The lessons learned in this application of CSAU will be summarized. Some of the issues raised concerning the validity and completeness of the CSAU methodology will also be discussed. (orig.)

  7. Ground Motion Uncertainty and Variability (single-station sigma): Insights from Euroseistest, Greece

    Science.gov (United States)

    Ktenidou, O. J.; Roumelioti, Z.; Abrahamson, N. A.; Cotton, F.; Pitilakis, K.

    2014-12-01

    Despite recent improvements in networks and data, the global aleatory uncertainty (sigma) in GMPEs is still large. One reason is the ergodic approach, whereby we combine data in space to make up for the lack of data in time. By estimating the systematic site response, we can make site-specific GMPEs and use a lower, site-specific uncertainty: single-station sigma. In this study we use the EUROSEISTEST database (http://euroseisdb.civil.auth.gr), which has two distinct advantages: good existing knowledge of site conditions at all stations, and careful relocation of the recorded events. Constraining the site and source parameters as well as we can, we minimise the within-event and between-event components of the global, ergodic sigma. Following that, knowledge of the site response from empirical and theoretical approaches permits us to move on to single-station sigma. The variability per site is not clearly correlated with site class. We show that in some cases knowledge of Vs30 is not sufficient, and that site-specific data are needed to capture the response, possibly due to 2D/3D effects from complex geometry. Our values of single-station sigma are low compared to the literature. This may be due to the good ray coverage we have in all directions for small, nearby records. Indeed, our single-station sigma values are similar to published single-path values, which means that they may correspond to a fully, rather than partially, non-ergodic approach. We find larger ground motion variability at short distances and small magnitudes. This may be related to the uncertainty in depth affecting nearby records more, or to stress drop causing trade-offs between the source and site terms at small magnitudes.
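The decomposition behind single-station sigma can be sketched with synthetic residuals: removing each station's mean residual (the systematic site term) leaves a smaller, site-corrected variability. All numbers here are hypothetical and the setup is deliberately simplified relative to the study's mixed-effects treatment.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical GMPE residuals for 3 stations, 50 records each:
# a fixed site term per station plus random within-site scatter.
site_terms = np.array([0.4, -0.2, 0.1])
resid = site_terms[:, None] + rng.normal(0.0, 0.3, size=(3, 50))

# The ergodic measure pools all residuals; the single-station measure
# first removes each station's mean residual (the systematic site term).
sigma_ergodic = resid.std(ddof=1)
phi_ss = (resid - resid.mean(axis=1, keepdims=True)).std(ddof=1)
```

Because the site terms are systematic rather than random, `phi_ss` comes out smaller than `sigma_ergodic`, which is the motivation for the non-ergodic approach.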

  8. Costs of travel time uncertainty and benefits of travel time information: Conceptual model and numerical examples

    NARCIS (Netherlands)

    Ettema, D.F.; Timmermans, H.J.P.

    2006-01-01

    A negative effect of congestion that tends to be overlooked is travel time uncertainty. Travel time uncertainty causes scheduling costs due to early or late arrival. The negative effects of travel time uncertainty can be reduced by providing travellers with travel time information, which improves

  9. Individual uncertainty and the uncertainty of science: The impact of perceived conflict and general self-efficacy on the perception of tentativeness and credibility of scientific information

    Directory of Open Access Journals (Sweden)

    Danny eFlemming

    2015-12-01

    Full Text Available We examined in two empirical studies how situational and personal aspects of uncertainty influence laypeople's understanding of the uncertainty of scientific information, with a focus on the detection of tentativeness and the perception of scientific credibility. In the first study (N = 48), we investigated the impact of a perceived conflict due to contradicting information as a situational, text-inherent aspect of uncertainty. The aim of the second study (N = 61) was to explore the role of general self-efficacy as an intra-personal uncertainty factor. In Study 1, participants read one of two versions of an introductory text in a between-group design. This text provided them with an overview of the neurosurgical procedure of deep brain stimulation (DBS). The text expressed a positive attitude toward DBS in one experimental condition or focused on the negative aspects of this method in the other condition. Then participants in both conditions read the same text, which dealt with a study of DBS as an experimental treatment in a small sample of patients with major depression. Perceived conflict between the two texts was found to increase the perception of tentativeness and to decrease the perception of scientific credibility, indicating that text-inherent aspects have significant effects on critical appraisal. The results of Study 2 demonstrated that participants with higher general self-efficacy detected the tentativeness to a lesser degree and assumed a higher level of scientific credibility, indicating a more naïve understanding of scientific information. This appears to contradict large parts of previous findings that showed positive effects of high self-efficacy on learning. Both studies showed that perceived tentativeness and perceived scientific credibility of medical information contradicted each other. We conclude that there is a need for supporting laypeople in understanding the uncertainty of scientific information and that

  10. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    NARCIS (Netherlands)

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  11. Quantifying the Effects of Historical Land Cover Conversion Uncertainty on Global Carbon and Climate Estimates

    Science.gov (United States)

    Di Vittorio, A. V.; Mao, J.; Shi, X.; Chini, L.; Hurtt, G.; Collins, W. D.

    2018-01-01

    Previous studies have examined land use change as a driver of global change, but the translation of land use change into land cover conversion has been largely unconstrained. Here we quantify the effects of land cover conversion uncertainty on the global carbon and climate system using the integrated Earth System Model. Our experiments use identical land use change data and vary land cover conversions to quantify associated uncertainty in carbon and climate estimates. Land cover conversion uncertainty is large, constitutes a 5 ppmv range in estimated atmospheric CO2 in 2004, and generates carbon uncertainty that is equivalent to 80% of the net effects of CO2 and climate and 124% of the effects of nitrogen deposition during 1850-2004. Additionally, land cover uncertainty generates differences in local surface temperature of over 1°C. We conclude that future studies addressing land use, carbon, and climate need to constrain and reduce land cover conversion uncertainties.

  12. Insights into water managers' perception and handling of uncertainties - a study of the role of uncertainty in practitioners' planning and decision-making

    Science.gov (United States)

    Höllermann, Britta; Evers, Mariele

    2017-04-01

    Planning and decision-making under uncertainty is common in water management due to climate variability, simplified models, societal developments and planning restrictions, just to name a few. Dealing with uncertainty can be approached from two sides, each affecting the process and form of communication: either improve the knowledge base by reducing uncertainties, or apply risk-based approaches that acknowledge uncertainties throughout the management process. The current understanding is that science focusses more strongly on the former approach, while policy and practice more actively apply a risk-based approach to handle incomplete and/or ambiguous information. The focus of this study is on how water managers perceive and handle uncertainties at the knowledge/decision interface in their daily planning and decision-making routines, how they evaluate the role of uncertainties for their decisions, and how they integrate this information into the decision-making process. Expert interviews and questionnaires among practitioners and scientists provided an insight into their perspectives on uncertainty handling, allowing a comparison of strategies between science and practice as well as between different types of practitioners. Our results confirmed the practitioners' bottom-up approach, working upwards from potential measures, instead of the impact-assessment-downwards approach common in science. This science-practice gap may hinder effective integration and acknowledgement of uncertainty in final decisions. Additionally, the implementation of an adaptive and flexible management approach acknowledging uncertainties is often stalled by rigid regulations favouring a predict-and-control attitude. However, the study showed that practitioners' level of uncertainty recognition varies with their affiliation to type of employer and business unit, hence affecting the degree of the science-practice gap with respect to uncertainty recognition. The level of working

  13. Reliability ensemble averaging of 21st century projections of terrestrial net primary productivity reduces global and regional uncertainties

    Science.gov (United States)

    Exbrayat, Jean-François; Bloom, A. Anthony; Falloon, Pete; Ito, Akihiko; Smallman, T. Luke; Williams, Mathew

    2018-02-01

    Multi-model averaging techniques provide opportunities to extract additional information from large ensembles of simulations. In particular, present-day model skill can be used to evaluate their potential performance in future climate simulations. Multi-model averaging methods have been used extensively in climate and hydrological sciences, but they have not been used to constrain projected plant productivity responses to climate change, which is a major uncertainty in Earth system modelling. Here, we use three global observationally orientated estimates of current net primary productivity (NPP) to perform a reliability ensemble averaging (REA) method using 30 global simulations of the 21st century change in NPP based on the Inter-Sectoral Impact Model Intercomparison Project (ISIMIP) business as usual emissions scenario. We find that the three REA methods support an increase in global NPP by the end of the 21st century (2095-2099) compared to 2001-2005, which is 2-3 % stronger than the ensemble ISIMIP mean value of 24.2 Pg C y-1. Using REA also leads to a 45-68 % reduction in the global uncertainty of 21st century NPP projection, which strengthens confidence in the resilience of the CO2 fertilization effect to climate change. This reduction in uncertainty is especially clear for boreal ecosystems although it may be an artefact due to the lack of representation of nutrient limitations on NPP in most models. Conversely, the large uncertainty that remains on the sign of the response of NPP in semi-arid regions points to the need for better observations and model development in these regions.
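The skill-weighted averaging at the heart of REA can be sketched as follows. The inverse-bias weighting below is an illustrative stand-in, not the exact REA formulation used in the paper, and all numbers are hypothetical.

```python
import numpy as np

def rea_average(projections, present_day, observed):
    """Skill-weighted ensemble mean: models whose present-day value is
    closer to the observation receive larger weights. Inverse-bias
    weights are a simple stand-in for the full REA weighting."""
    proj = np.asarray(projections, float)
    bias = np.abs(np.asarray(present_day, float) - observed)
    w = 1.0 / np.maximum(bias, 1e-12)  # guard against zero bias
    w /= w.sum()
    mean = np.sum(w * proj)
    # Weighted spread: the (reduced) projection uncertainty.
    spread = np.sqrt(np.sum(w * (proj - mean) ** 2))
    return mean, spread

proj = np.array([30.0, 25.0, 20.0])     # hypothetical 21st-century NPP changes
present = np.array([55.0, 60.0, 70.0])  # hypothetical present-day NPP per model
mean, spread = rea_average(proj, present, observed=56.0)
```

Down-weighting the model with the largest present-day bias pulls the ensemble mean toward the more skilful members and shrinks the spread relative to the unweighted ensemble, mirroring the uncertainty reduction reported in the abstract.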

  14. Starling flock networks manage uncertainty in consensus at low cost.

    Directory of Open Access Journals (Sweden)

    George F Young

    Full Text Available Flocks of starlings exhibit a remarkable ability to maintain cohesion as a group in highly uncertain environments and with limited, noisy information. Recent work demonstrated that individual starlings within large flocks respond to a fixed number of nearest neighbors, but until now it was not understood why this number is seven. We analyze robustness to uncertainty of consensus in empirical data from multiple starling flocks and show that the flock interaction networks with six or seven neighbors optimize the trade-off between group cohesion and individual effort. We can distinguish these numbers of neighbors from fewer or greater numbers using our systems-theoretic approach to measuring robustness of interaction networks as a function of the network structure, i.e., who is sensing whom. The metric quantifies the disagreement within the network due to disturbances and noise during consensus behavior and can be evaluated over a parameterized family of hypothesized sensing strategies (here the parameter is number of neighbors). We use this approach to further show that for the range of flocks studied the optimal number of neighbors does not depend on the number of birds within a flock; rather, it depends on the shape, notably the thickness, of the flock. The results suggest that robustness to uncertainty may have been a factor in the evolution of flocking for starlings. More generally, our results elucidate the role of the interaction network on uncertainty management in collective behavior, and motivate the application of our approach to other biological networks.

  15. Starling Flock Networks Manage Uncertainty in Consensus at Low Cost

    Science.gov (United States)

    Young, George F.; Scardovi, Luca; Cavagna, Andrea; Giardina, Irene; Leonard, Naomi E.

    2013-01-01

    Flocks of starlings exhibit a remarkable ability to maintain cohesion as a group in highly uncertain environments and with limited, noisy information. Recent work demonstrated that individual starlings within large flocks respond to a fixed number of nearest neighbors, but until now it was not understood why this number is seven. We analyze robustness to uncertainty of consensus in empirical data from multiple starling flocks and show that the flock interaction networks with six or seven neighbors optimize the trade-off between group cohesion and individual effort. We can distinguish these numbers of neighbors from fewer or greater numbers using our systems-theoretic approach to measuring robustness of interaction networks as a function of the network structure, i.e., who is sensing whom. The metric quantifies the disagreement within the network due to disturbances and noise during consensus behavior and can be evaluated over a parameterized family of hypothesized sensing strategies (here the parameter is number of neighbors). We use this approach to further show that for the range of flocks studied the optimal number of neighbors does not depend on the number of birds within a flock; rather, it depends on the shape, notably the thickness, of the flock. The results suggest that robustness to uncertainty may have been a factor in the evolution of flocking for starlings. More generally, our results elucidate the role of the interaction network on uncertainty management in collective behavior, and motivate the application of our approach to other biological networks. PMID:23382667
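The disagreement metric described in these two records can be sketched for the simplified undirected case, where the steady-state disagreement of noisy linear consensus is proportional to the trace of the Laplacian pseudoinverse. The ring graphs below are an illustrative stand-in for the directed, flock-derived sensing networks analyzed in the paper.

```python
import numpy as np

def disagreement(L_graph):
    """Steady-state disagreement of noisy linear consensus
    dx = -L x dt + noise on an undirected graph: proportional to the
    trace of the Laplacian pseudoinverse (unit noise intensity)."""
    return 0.5 * np.trace(np.linalg.pinv(L_graph))

def ring_laplacian(n, k):
    """Laplacian of a ring of n nodes, each sensing its k nearest
    neighbours (k/2 on each side)."""
    A = np.zeros((n, n))
    for i in range(n):
        for d in range(1, k // 2 + 1):
            A[i, (i + d) % n] = A[i, (i - d) % n] = 1.0
    return np.diag(A.sum(axis=1)) - A

# More neighbours -> better-connected network -> less disagreement,
# but at greater individual sensing effort.
d2 = disagreement(ring_laplacian(20, 2))
d6 = disagreement(ring_laplacian(20, 6))
```

The trade-off studied in the paper is between this shrinking disagreement and the growing effort of tracking more neighbours; the empirical result is that six to seven neighbours balance the two.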

  16. Structural reliability in context of statistical uncertainties and modelling discrepancies

    International Nuclear Information System (INIS)

    Pendola, Maurice

    2000-01-01

    Structural reliability methods have been largely improved during the last years and have shown their ability to deal with uncertainties during the design stage or to optimize the functioning and maintenance of industrial installations. They are based on a mechanical modeling of the structural behavior according to the considered failure modes and on a probabilistic representation of the input parameters of this modeling. In practice, only limited statistical information is available to build the probabilistic representation, and different sophistication levels of the mechanical modeling may be introduced. Thus, besides the physical randomness, other uncertainties occur in such analyses. The aim of this work is threefold: 1. first, to propose a methodology able to characterize the statistical uncertainties due to the limited number of data, in order to take them into account in the reliability analyses. The obtained reliability index measures the confidence in the structure given the statistical information available. 2. Then, to show a methodology leading to reliability results evaluated from a particular mechanical modeling but using a less sophisticated one. The objective is to decrease the computational effort required by the reference modeling. 3. Finally, to propose partial safety factors that evolve as a function of the number of statistical data available and of the sophistication level of the mechanical modeling that is used. The concepts are illustrated in the case of a welded pipe and in the case of a natural draught cooling tower. The results show the value of the methodologies in an industrial context. [fr

  17. Climate change impacts on extreme events in the United States: an uncertainty analysis

    Science.gov (United States)

    Extreme weather and climate events, such as heat waves, droughts and severe precipitation events, have substantial impacts on ecosystems and the economy. However, future climate simulations display large uncertainty in mean changes. As a result, the uncertainty in future changes ...

  18. Added Value of uncertainty Estimates of SOurce term and Meteorology (AVESOME)

    DEFF Research Database (Denmark)

    Sørensen, Jens Havskov; Schönfeldt, Fredrik; Sigg, Robert

    In the early phase of a nuclear accident, two large sources of uncertainty exist: one related to the source term and one associated with the meteorological data. Operational methods are being developed in AVESOME for quantitative estimation of uncertainties in atmospheric dispersion prediction. ... e.g. at national meteorological services, the proposed methodology is feasible for real-time use, thereby adding value to decision support. In the recent NKS-B projects MUD, FAUNA and MESO, the implications of meteorological uncertainties for nuclear emergency preparedness and management have been studied. ... uncertainty in atmospheric dispersion model forecasting stemming from both the source term and the meteorological data is examined. Ways to implement the uncertainties of forecasting in DSSs, and the impacts on real-time emergency management, are described. The proposed methodology allows for efficient real...

  19. A statistical approach to determining the uncertainty of peat thickness

    Directory of Open Access Journals (Sweden)

    J. Torppa

    2011-06-01

    Full Text Available This paper presents statistical studies of peat thickness to define its expected maximum variation Δdm(Δr) as a function of separation distance Δr. The aim was to provide an estimate of the observational uncertainty in peat depth due to positioning error, and of the prediction uncertainty of the computed model. The data were GPS position and ground penetrating radar depth measurements of six mires in different parts of Finland. The calculated observational uncertainty for Finnish mires in general, caused for example by a 20 m positioning error, is 43 cm in depth with 95% confidence. The peat depth statistics differed among the six mires, and it is recommended that the mire-specific function Δdm(Δr) be defined for each individual mire to obtain the best estimate of observational uncertainty. Knowledge of the observational error and of the function Δdm(Δr) should be used in peat depth modelling to define the uncertainty of depth predictions.
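A mire-specific Δdm(Δr) can be estimated empirically by binning observation pairs by separation distance and taking the maximum absolute depth difference in each bin. This is a hedged sketch on a toy transect; the paper's statistical treatment of the expected maximum variation is more involved.

```python
import numpy as np

def max_depth_variation(xy, depth, r_bins):
    """Empirical maximum absolute depth difference per separation-distance
    bin: a simple stand-in for the mire-specific function dm(r)."""
    xy = np.asarray(xy, float)
    depth = np.asarray(depth, float)
    # All pairwise separations and absolute depth differences.
    sep = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    dif = np.abs(depth[:, None] - depth[None, :])
    iu = np.triu_indices(len(depth), k=1)  # count each pair once
    sep, dif = sep[iu], dif[iu]
    out = []
    for lo, hi in zip(r_bins[:-1], r_bins[1:]):
        m = (sep >= lo) & (sep < hi)
        out.append(dif[m].max() if m.any() else np.nan)
    return np.array(out)

# Toy transect: depth increases smoothly along a 90 m line.
xy = np.column_stack([np.arange(0.0, 100.0, 10.0), np.zeros(10)])
depth = 1.0 + 0.02 * xy[:, 0]
dm = max_depth_variation(xy, depth, np.array([0.0, 25.0, 50.0, 100.0]))
```

Evaluating `dm` at a bin containing the positioning error (here e.g. 20 m) gives the corresponding depth uncertainty, which is how the function translates a GPS error into an observational depth error.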

  20. The uncertainty of reference standards--a guide to understanding factors impacting uncertainty, uncertainty calculations, and vendor certifications.

    Science.gov (United States)

    Gates, Kevin; Chang, Ning; Dilek, Isil; Jian, Huahua; Pogue, Sherri; Sreenivasan, Uma

    2009-10-01

    Certified solution standards are widely used in forensic toxicological, clinical/diagnostic, and environmental testing. Typically, these standards are purchased as ampouled solutions with a certified concentration. Vendors present concentration and uncertainty differently on their Certificates of Analysis. Understanding the factors that impact uncertainty, and which factors have been considered in the vendor's assignment of uncertainty, is critical to understanding the accuracy of the standard and the impact on testing results. Understanding these variables is also important for laboratories seeking to comply with ISO/IEC 17025 requirements and for those preparing reference solutions from neat materials at the bench. The impact of uncertainty associated with the neat material purity (including residual water, residual solvent, and inorganic content), mass measurement (weighing techniques), and solvent addition (solution density) on the overall uncertainty of the certified concentration is described, along with uncertainty calculations.
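For independent contributions such as purity, weighing and solvent addition, the standard first-order (GUM-style) combination of relative uncertainties is a root-sum-of-squares; a minimal sketch with illustrative numbers:

```python
import math

def combined_uncertainty(conc, rel_us):
    """Root-sum-of-squares combination of independent relative standard
    uncertainties (GUM first-order rule for a product/quotient model)."""
    rel_c = math.sqrt(sum(u ** 2 for u in rel_us))
    return conc * rel_c, rel_c

# Hypothetical 1.000 mg/mL standard: 0.3% purity, 0.05% mass, 0.1% volume.
u_abs, u_rel = combined_uncertainty(1.000, [0.003, 0.0005, 0.001])
```

Note how the purity term dominates: the combined relative uncertainty is barely larger than the purity contribution alone, which is why residual water, residual solvent and inorganic content in the neat material matter so much.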

  1. Heisenberg's principle of uncertainty and the uncertainty relations

    International Nuclear Information System (INIS)

    Redei, Miklos

    1987-01-01

    The usual verbal form of the Heisenberg uncertainty principle and the usual mathematical formulation (the so-called uncertainty theorem) are not equivalent. The meaning of the concept 'uncertainty' is not unambiguous, and different interpretations are used in the literature. Recently, renewed interest has arisen in reinterpreting and reformulating the precise meaning of Heisenberg's principle and in finding an adequate mathematical form. The suggested new theorems are surveyed and critically analyzed. (D.Gy.) 20 refs
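For reference, the mathematical forms usually contrasted with the verbal principle are Kennard's position-momentum inequality and Robertson's generalization to arbitrary observables:

```latex
% Kennard (1927): position-momentum form
\sigma_x \,\sigma_p \;\ge\; \frac{\hbar}{2}

% Robertson (1929): arbitrary observables A, B
\sigma_A \,\sigma_B \;\ge\; \frac{1}{2}\,\bigl|\langle[\hat A,\hat B]\rangle\bigr|
```

The debate surveyed in the record concerns how well these standard-deviation inequalities capture the 'uncertainty' of Heisenberg's original, more informal argument.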

  2. Measurement Uncertainty

    Science.gov (United States)

    Koch, Michael

    Measurement uncertainty is one of the key issues in quality assurance. It became increasingly important for analytical chemistry laboratories with the accreditation to ISO/IEC 17025. The uncertainty of a measurement is the most important criterion for the decision whether a measurement result is fit for purpose. It also delivers help for the decision whether a specification limit is exceeded or not. Estimation of measurement uncertainty often is not trivial. Several strategies have been developed for this purpose that will shortly be described in this chapter. In addition the different possibilities to take into account the uncertainty in compliance assessment are explained.

  3. Parton shower and NLO-matching uncertainties in Higgs boson pair production

    Science.gov (United States)

    Jones, Stephen; Kuttimalai, Silvan

    2018-02-01

    We perform a detailed study of NLO parton shower matching uncertainties in Higgs boson pair production through gluon fusion at the LHC based on a generic and process independent implementation of NLO subtraction and parton shower matching schemes for loop-induced processes in the Sherpa event generator. We take into account the full top-quark mass dependence in the two-loop virtual corrections and compare the results to an effective theory approximation. In the full calculation, our findings suggest large parton shower matching uncertainties that are absent in the effective theory approximation. We observe large uncertainties even in regions of phase space where fixed-order calculations are theoretically well motivated and parton shower effects expected to be small. We compare our results to NLO matched parton shower simulations and analytic resummation results that are available in the literature.

  4. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao

    2016-05-27

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel heights uncertainties are then mainly due to uncertainties in the 95% percentile of the droplet size and in the entrainment parameters.

  5. Uncertainty analysis technique of dynamic response and cumulative damage properties of piping system

    International Nuclear Information System (INIS)

    Suzuki, Kohei; Aoki, Shigeru; Hara, Fumio; Hanaoka, Masaaki; Yamashita, Tadashi.

    1982-01-01

    It is a technologically important subject to establish a method of uncertainty analysis that statistically examines the variation of the earthquake response and damage properties of equipment and piping systems due to changes in the input load and in the parameters of the structural system, for evaluating the aseismic capability and dynamic structural reliability of these systems. The uncertainty in the response and damage properties when equipment and piping systems are subjected to excessive vibration load depends mainly on the irregularity of the acting input load, such as the unsteady vibration of earthquakes, and on structural uncertainty in forms and dimensions. This study is a basic step toward establishing a method for evaluating, with a simple model, the uncertainty in the cumulative damage property at resonant vibration of a piping system due to the dispersion of structural parameters. First, piping models of simple form were broken by resonant vibration, and the uncertainty in the cumulative damage property was evaluated. Next, a response analysis using an elasto-plastic mechanics model was performed by numerical simulation. Finally, a method of uncertainty analysis for response and damage properties by the perturbation method utilizing equivalent linearization was proposed, and its propriety was proved. (Kako, I.)

  6. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao; Iskandarani, Mohamed; Srinivasan, Ashwanth; Thacker, W. Carlisle; Winokur, Justin; Knio, Omar

    2016-01-01

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas to oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel height uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
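
The surrogate-plus-ANOVA workflow described above can be sketched with a toy problem: fit an orthonormal (Legendre) polynomial chaos expansion to ensemble runs by least squares, then read variance contributions directly off the squared coefficients. The two-input "model" below is a stand-in for illustration, not the plume model itself.

```python
import numpy as np
from numpy.polynomial import legendre as L

rng = np.random.default_rng(0)

# Toy "model": output depends on two uncertain inputs in [-1, 1]
def model(x1, x2):
    return 1.0 + 2.0 * x1 + 0.5 * x2**2 + 0.3 * x1 * x2

# P_k(x) * sqrt(2k+1) is orthonormal under the uniform measure on [-1, 1]
def phi(k, x):
    c = np.zeros(k + 1)
    c[k] = 1.0
    return L.legval(x, c) * np.sqrt(2 * k + 1)

# Ensemble of model runs (the "training" designs)
n = 400
x1, x2 = rng.uniform(-1, 1, n), rng.uniform(-1, 1, n)
y = model(x1, x2)

# Tensor basis up to total degree 2, coefficients by least squares
terms = [(i, j) for i in range(3) for j in range(3) if i + j <= 2]
A = np.column_stack([phi(i, x1) * phi(j, x2) for i, j in terms])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Orthonormality => output variance decomposes over non-constant terms
var_total = sum(c**2 for (i, j), c in zip(terms, coef) if (i, j) != (0, 0))
S1 = sum(c**2 for (i, j), c in zip(terms, coef) if i > 0 and j == 0) / var_total
print(f"first-order sensitivity of input 1: {S1:.3f}")
```

Because the toy model lies in the span of the basis, the fit is exact and the variance split matches the analytical decomposition.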

  7. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    Energy Technology Data Exchange (ETDEWEB)

    Vanhanen, R., E-mail: risto.vanhanen@aalto.fi

    2015-03-15

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under the multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of ¹⁶O is problematic due to lack of correlation between total and elastic reactions.

  8. Uncertainty analysis of infinite homogeneous lead and sodium cooled fast reactors at beginning of life

    International Nuclear Information System (INIS)

    Vanhanen, R.

    2015-01-01

    The objective of the present work is to estimate breeding ratio, radiation damage rate and minor actinide transmutation rate of infinite homogeneous lead and sodium cooled fast reactors. Uncertainty analysis is performed taking into account uncertainty in nuclear data and composition of the reactors. We use the recently released ENDF/B-VII.1 nuclear data library and restrict the work to the beginning of reactor life. We work under the multigroup approximation. The Bondarenko method is used to acquire effective cross sections for the homogeneous reactor. Modeling error and numerical error are estimated. The adjoint sensitivity analysis is performed to calculate generalized adjoint fluxes for the responses. The generalized adjoint fluxes are used to calculate first order sensitivities of the responses to model parameters. The acquired sensitivities are used to propagate uncertainties in the input data to find out uncertainties in the responses. We show that the uncertainty in model parameters is the dominant source of uncertainty, followed by modeling error, input data precision and numerical error. The uncertainty due to composition of the reactor is low. We identify main sources of uncertainty and note that the low-fidelity evaluation of ¹⁶O is problematic due to lack of correlation between total and elastic reactions.
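
The final propagation step described here — first-order sensitivities combined with the input-data covariance — is the standard "sandwich rule": var(R) = sᵀ C s. A minimal sketch with hypothetical sensitivities and covariances:

```python
import numpy as np

# First-order ("sandwich rule") uncertainty propagation: given the
# sensitivities s_i = dR/dp_i of a response R to model parameters p_i
# and the parameter covariance matrix C, the response variance is
#   var(R) = s^T C s
s = np.array([0.8, -0.3, 0.1])        # hypothetical relative sensitivities
sigma = np.array([0.05, 0.10, 0.20])  # parameter standard deviations
corr = np.array([[1.0, 0.5, 0.0],     # hypothetical correlation matrix
                 [0.5, 1.0, 0.0],
                 [0.0, 0.0, 1.0]])
C = np.outer(sigma, sigma) * corr     # covariance from sigmas and correlations

var_R = s @ C @ s
print(f"relative standard uncertainty of response: {np.sqrt(var_R):.4f}")
```

Note how the off-diagonal correlation between the first two parameters reduces the total variance here, because their sensitivities have opposite signs.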

  9. Offshore wind farms for hydrogen production subject to uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Kassem, Nabil [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Energy Processes

    2002-07-01

    Wind power is a source of clean, nonpolluting electricity, which is fully competitive with fossil fuel and nuclear power generation if installed at favorable wind sites. Major technical growth has been in Europe, where government policies and high conventional energy costs favor the use of wind power. As part of its strategy, the EU Commission has launched a target to increase the installed capacity of wind power from 7 GWe in 1998 to 40 GWe by the year 2012. Wind power is an intermittent electricity generator, thus it does not provide electric power on an 'as needed' basis. Off-peak power generated from offshore wind farms can be utilized for hydrogen production using water electrolysis. Like electricity, hydrogen is a secondary energy carrier, which will pave the way for future sustainable energy systems. It is environmentally friendly and versatile, with great potential in stationary and mobile power applications. Water electrolysis is a well-established technology, which depends on the availability of cheap electrical power. Offshore wind farms have longer lifetimes due to lower mechanical fatigue loads, yet to be economic they have to be of sizes greater than 150 MW, using large turbines (> 1.5 MW). The major challenge in wind energy assessment is how accurately the wind speed, and hence the wind energy, can be predicted. Wind power is therefore subject to a great deal of uncertainty, which should be accounted for in order to provide meaningful and reliable estimates of performance and economic figures-of-merit. Failure to account for uncertainties would result in deterministic estimates that tend to overstate performance and underestimate costs. This study uses methods of risk analysis to evaluate the simultaneous effect of multiple input uncertainties, and provides a Life Cycle Assessment (LCA) of the economic viability of offshore wind systems for hydrogen production subject to technical and economical uncertainties (Published in summary form only)

  10. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, C. J.

    2017-12-01

    How to derive and present uncertainty in climate data records (CDRs) has been debated within the European Space Agency Climate Change Initiative, in search of common principles applicable across a range of essential climate variables. Various points of consensus have been reached, including the importance of improving provision of uncertainty information and the benefit of adopting international norms of metrology for language around the distinct concepts of uncertainty and error. Providing an estimate of standard uncertainty per datum (or the means to readily calculate it) emerged as baseline good practice, and should be highly relevant to users of CDRs when the uncertainty in data is variable (the usual case). Given this baseline, the role of quality flags is clarified as being complementary to and not repetitive of uncertainty information. Data with high uncertainty are not poor quality if a valid estimate of the uncertainty is available. For CDRs and their applications, the error correlation properties across spatio-temporal scales present important challenges that are not fully solved. Error effects that are negligible in the uncertainty of a single pixel may dominate uncertainty in the large-scale and long-term. A further principle is that uncertainty estimates should themselves be validated. The concepts of estimating and propagating uncertainty are generally acknowledged in geophysical sciences, but less widely practised in Earth observation and development of CDRs. Uncertainty in a CDR depends in part (and usually significantly) on the error covariance of the radiances and auxiliary data used in the retrieval. Typically, error covariance information is not available in the fundamental CDR (FCDR) (i.e., with the level-1 radiances), since provision of adequate level-1 uncertainty information is not yet standard practice. Those deriving CDRs thus cannot propagate the radiance uncertainty to their geophysical products. 
The FIDUCEO project (www.fiduceo.eu) is

  11. Proposed standardized definitions for vertical resolution and uncertainty in the NDACC lidar ozone and temperature algorithms - Part 2: Ozone DIAL uncertainty budget

    Science.gov (United States)

    Leblanc, Thierry; Sica, Robert J.; van Gijsel, Joanna A. E.; Godin-Beekmann, Sophie; Haefele, Alexander; Trickl, Thomas; Payen, Guillaume; Liberti, Gianluigi

    2016-08-01

    A standardized approach for the definition, propagation, and reporting of uncertainty in the ozone differential absorption lidar data products contributing to the Network for the Detection for Atmospheric Composition Change (NDACC) database is proposed. One essential aspect of the proposed approach is the propagation in parallel of all independent uncertainty components through the data processing chain before they are combined together to form the ozone combined standard uncertainty. The independent uncertainty components contributing to the overall budget include random noise associated with signal detection, uncertainty due to saturation correction, background noise extraction, the absorption cross sections of O3, NO2, SO2, and O2, the molecular extinction cross sections, and the number densities of the air, NO2, and SO2. The expression of the individual uncertainty components and their step-by-step propagation through the ozone differential absorption lidar (DIAL) processing chain are thoroughly estimated. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which requires knowledge of the covariance matrix when the lidar signal is vertically filtered. In addition, the covariance terms must be taken into account if the same detection hardware is shared by the lidar receiver channels at the absorbed and non-absorbed wavelengths. The ozone uncertainty budget is presented as much as possible in a generic form (i.e., as a function of instrument performance and wavelength) so that all NDACC ozone DIAL investigators across the network can estimate, for their own instrument and in a straightforward manner, the expected impact of each reviewed uncertainty component. 
In addition, two actual examples of full uncertainty budget are provided, using nighttime measurements from the tropospheric ozone DIAL located at the Jet Propulsion Laboratory (JPL) Table Mountain Facility, California, and nighttime measurements from the JPL

  12. Practical aspects of the uncertainty and traceability of spectrochemical measurement results by electrothermal atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Duta, S.; Robouch, P.; Barbu, L.; Taylor, P.

    2007-01-01

    The determination of trace element concentrations in water by electrothermal atomic absorption spectrometry (ETAAS) is a common and well-established technique in many chemical testing laboratories. However, the evaluation of measurement uncertainty is not systematically implemented. The paper presents an easy step-by-step example leading to the evaluation of the combined standard uncertainty of copper determination in water using ETAAS. The major contributors to the overall measurement uncertainty are identified: the amount of copper in the water sample (which depends mainly on the absorbance measurements), the certified reference material, and the auto-sampler volume measurements. Practical aspects of how the traceability of the copper concentration in water can be established and demonstrated are also pointed out.

  13. Practical aspects of the uncertainty and traceability of spectrochemical measurement results by electrothermal atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Duta, S. [Institute for Reference Materials and Measurements, Joint Research Centre, European Commission, Retieseweg 111, B-2440 Geel (Belgium); National Institute of Metrology, 042122 Vitan Barzesti 11, sector 4 Bucharest (Romania)], E-mail: steluta.duta@inm.ro; Robouch, P. [Institute for Reference Materials and Measurements, Joint Research Centre, European Commission, Retieseweg 111, B-2440 Geel (Belgium)], E-mail: Piotr.Robouch@ec.europa.eu; Barbu, L. [Coca-Cola Entreprise, Analytical Department, Bucharest (Romania); Taylor, P. [Institute for Reference Materials and Measurements, Joint Research Centre, European Commission, Retieseweg 111, B-2440 Geel (Belgium)], E-mail: Philip.Taylor@ec.europa.eu

    2007-04-15

    The determination of trace element concentrations in water by electrothermal atomic absorption spectrometry (ETAAS) is a common and well-established technique in many chemical testing laboratories. However, the evaluation of measurement uncertainty is not systematically implemented. The paper presents an easy step-by-step example leading to the evaluation of the combined standard uncertainty of copper determination in water using ETAAS. The major contributors to the overall measurement uncertainty are identified: the amount of copper in the water sample (which depends mainly on the absorbance measurements), the certified reference material, and the auto-sampler volume measurements. Practical aspects of how the traceability of the copper concentration in water can be established and demonstrated are also pointed out.
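
For a multiplicative measurement model of this kind, the GUM recipe combines the relative standard uncertainties of independent contributors in quadrature. The component values below are illustrative, not taken from the paper:

```python
import math

# Combined standard uncertainty by the GUM quadrature rule: for a
# multiplicative measurement model, relative standard uncertainties of
# independent input quantities add in quadrature.
components = {
    "absorbance / calibration": 0.021,       # illustrative relative u
    "certified reference material": 0.010,
    "auto-sampler volume": 0.008,
}
u_rel = math.sqrt(sum(u**2 for u in components.values()))

c = 25.0  # hypothetical measured copper concentration, ug/L
print(f"combined standard uncertainty: {c * u_rel:.2f} ug/L")
print(f"expanded uncertainty (k=2):    {2 * c * u_rel:.2f} ug/L")
```

The dominant contributor (absorbance) sets the floor: halving the two smaller components would barely change the combined value.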

  14. Effect of Generalized Uncertainty Principle on Main-Sequence Stars and White Dwarfs

    Directory of Open Access Journals (Sweden)

    Mohamed Moussa

    2015-01-01

    This paper addresses the effect of the generalized uncertainty principle, emerging from different approaches to quantum gravity at the Planck scale, on the thermodynamic properties of photon gases, nonrelativistic ideal gases, and degenerate fermions. Modifications in pressure, particle number, and energy density are calculated. Astrophysical objects such as main-sequence stars and white dwarfs are examined and discussed as an application. A modification in the Lane-Emden equation due to a change in the polytropic relation caused by the presence of quantum gravity is investigated. The applicable range of the quantum gravity parameters is estimated. The bounds on the perturbed parameters are relatively large, but they may be considered reasonable values in the astrophysical regime.

  15. Uncertainties in real-world decisions on medical technologies.

    Science.gov (United States)

    Lu, C Y

    2014-08-01

    Patients, clinicians, payers and policy makers face substantial uncertainties in their respective healthcare decisions as they attempt to achieve maximum value, or the greatest level of benefit possible at a given cost. Uncertainties largely come from incomplete information at the time that decisions must be made. This is true in all areas of medicine because evidence from clinical trials is often incongruent with real-world patient care. This article highlights key uncertainties around the (comparative) benefits and harms of medical technologies. Initiatives and strategies such as comparative effectiveness research and coverage with evidence development may help to generate reliable and relevant evidence for decisions on coverage and treatment. These efforts could result in better decisions that improve patient outcomes and better use of scarce medical resources. © 2014 John Wiley & Sons Ltd.

  16. Simulating space-time uncertainty in continental-scale gridded precipitation fields for agrometeorological modelling

    NARCIS (Netherlands)

    Wit, de A.J.W.; Bruin, de S.

    2006-01-01

    Previous analyses of the effects of uncertainty in precipitation fields on the output of EU Crop Growth Monitoring System (CGMS) demonstrated that the influence on simulated crop yield was limited at national scale, but considerable at local and regional scales. We aim to propagate uncertainty due

  17. Entropy-power uncertainty relations: towards a tight inequality for all Gaussian pure states

    International Nuclear Information System (INIS)

    Hertz, Anaelle; Jabbour, Michael G; Cerf, Nicolas J

    2017-01-01

    We show that a proper expression of the uncertainty relation for a pair of canonically-conjugate continuous variables relies on entropy power, a standard notion in Shannon information theory for real-valued signals. The resulting entropy-power uncertainty relation is equivalent to the entropic formulation of the uncertainty relation due to Bialynicki-Birula and Mycielski, but can be further extended to rotated variables. Hence, based on a reasonable assumption, we give a partial proof of a tighter form of the entropy-power uncertainty relation taking correlations into account and provide extensive numerical evidence of its validity. Interestingly, it implies the generalized (rotation-invariant) Schrödinger–Robertson uncertainty relation exactly as the original entropy-power uncertainty relation implies Heisenberg relation. It is saturated for all Gaussian pure states, in contrast with hitherto known entropic formulations of the uncertainty principle. (paper)
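
The entropy-power formulation can be stated compactly. The following restates the standard relations behind the abstract (Shannon's entropy power and the Bialynicki-Birula–Mycielski inequality); it is a summary of known results, not of the paper's new tighter form:

```latex
% Entropy power of a continuous variable with differential entropy h:
N(x) \;=\; \frac{1}{2\pi e}\, e^{2h(x)}
% Entropy-power uncertainty relation for conjugate quadratures x, p
% (equivalent to h(x) + h(p) \ge \ln(\pi e \hbar)):
N(x)\, N(p) \;\ge\; \frac{\hbar^{2}}{4}
% Since N(x) \le \sigma_x^2 for any distribution (equality iff Gaussian),
% this implies the Heisenberg relation:
\sigma_x\, \sigma_p \;\ge\; \frac{\hbar}{2}
```

Saturation of the entropy-power relation by all Gaussian pure states is what distinguishes it from the variance-based Heisenberg bound, which is saturated only by minimum-uncertainty states.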

  18. W nano-fuzzes: A metastable state formed due to large-flux He⁺ irradiation at an elevated temperature

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Yunfeng; Liu, Lu; Lu, Bing; Ni, Weiyuan; Liu, Dongping, E-mail: dongping.liu@dlnu.edu.cn

    2016-12-15

    W nano-fuzzes have been formed by large-flux, low-energy (200 eV) He⁺ irradiation at a W surface temperature of 1480 °C. The microscopic evolution of W nano-fuzzes during annealing or low-energy (200 eV) He⁺ bombardment has been observed using scanning electron microscopy and thermal desorption spectroscopy. Our measurements show that both annealing and He⁺ bombardment can significantly alter the structure of W nano-fuzzes. W nano-fuzzes are thermally unstable due to He release during annealing, and they are easily sputtered during He⁺ bombardment. The current study shows that W nano-fuzzes act as a metastable state during low-energy, large-flux He⁺ irradiation at an elevated temperature. - Highlights: • The microscopic evolution of W nano-fuzzes during annealing or He⁺ irradiation has been measured. • W nano-fuzzes are thermally unstable due to He release during annealing. • He is released from the top layer of W fuzzes by annealing. • Metastable W nano-fuzzes are formed due to He⁺ irradiation at an elevated temperature.

  19. Reliability of Coulomb stress changes inferred from correlated uncertainties of finite-fault source models

    KAUST Repository

    Woessner, J.

    2012-07-14

    Static stress transfer is one physical mechanism to explain triggered seismicity. Coseismic stress-change calculations strongly depend on the parameterization of the causative finite-fault source model. These models are uncertain due to uncertainties in input data, model assumptions, and modeling procedures. However, fault model uncertainties have usually been ignored in stress-triggering studies and have not been propagated to assess the reliability of Coulomb failure stress change (ΔCFS) calculations. We show how these uncertainties can be used to provide confidence intervals for co-seismic ΔCFS-values. We demonstrate this for the MW = 5.9 June 2000 Kleifarvatn earthquake in southwest Iceland and systematically map these uncertainties. A set of 2500 candidate source models from the full posterior fault-parameter distribution was used to compute 2500 ΔCFS maps. We assess the reliability of the ΔCFS-values from the coefficient of variation (CV) and deem ΔCFS-values to be reliable where they are at least twice as large as the standard deviation (CV ≤ 0.5). Unreliable ΔCFS-values are found near the causative fault and between lobes of positive and negative stress change, where a small change in fault strike causes ΔCFS-values to change sign. The most reliable ΔCFS-values are found away from the source fault in the middle of positive and negative ΔCFS-lobes, a likely general pattern. Using the reliability criterion, our results support the static stress-triggering hypothesis. Nevertheless, our analysis also suggests that results from previous stress-triggering studies not considering source model uncertainties may have led to a biased interpretation of the importance of static stress-triggering.
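
The reliability criterion can be reproduced on any ensemble of stress-change maps: compute the coefficient of variation per pixel and keep values where CV ≤ 0.5 (|mean| at least twice the ensemble standard deviation). The synthetic ensemble below stands in for the 2500 candidate-model ΔCFS maps:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic ensemble of stress-change maps: a smooth "signal" crossing
# zero (mimicking positive/negative lobes) plus independent model noise.
n_models, ny, nx = 500, 20, 20
signal = np.linspace(-1.0, 1.0, nx)[None, :] * np.ones((ny, 1))
ensemble = signal[None, :, :] + 0.2 * rng.standard_normal((n_models, ny, nx))

mean = ensemble.mean(axis=0)
std = ensemble.std(axis=0)
cv = std / np.abs(mean)
reliable = cv <= 0.5            # mask of reliable stress-change values

print(f"fraction of map deemed reliable: {reliable.mean():.2f}")
```

As in the paper, the unreliable pixels cluster where the signal changes sign, since |mean| shrinks toward zero there while the ensemble spread does not.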

  20. Robustness for slope stability modelling under deep uncertainty

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2015-04-01

    Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.

  1. A review on the CIRCE methodology to quantify the uncertainty of the physical models of a code

    International Nuclear Information System (INIS)

    Jeon, Seong Su; Hong, Soon Joon; Bang, Young Seok

    2012-01-01

    In the field of nuclear engineering, recent regulatory audit calculations of large break loss of coolant accidents (LBLOCA) have been performed with best estimate codes such as MARS, RELAP5 and CATHARE. Since a credible regulatory audit calculation is very important in the evaluation of the safety of a nuclear power plant (NPP), there has been much research to develop rules and methodologies for the use of best estimate codes. One of the major points is to develop the best estimate plus uncertainty (BEPU) method for uncertainty analysis. As a representative BEPU method, the NRC proposes the CSAU (Code Scaling, Applicability and Uncertainty) methodology, which clearly identifies the different steps necessary for an uncertainty analysis. The general idea is 1) to determine all the sources of uncertainty in the code, also called basic uncertainties, 2) to quantify them, and 3) to combine them in order to obtain the final uncertainty for the studied application. Using an uncertainty analysis such as the CSAU methodology, an uncertainty band for the code response (calculation result) important from the safety point of view is calculated and the safety margin of the NPP is quantified. An example of such a response is the peak cladding temperature (PCT) for a LBLOCA. However, there is a problem in the uncertainty analysis with best estimate codes. Generally, it is very difficult to determine the uncertainties due to the empiricism of closure laws (also called correlations or constitutive relationships). So far the only proposed approach is based on expert judgment. In this case, the uncertainty ranges of important parameters can be wide and inaccurate, so that the confidence level of the BEPU calculation results can be decreased. In order to solve this problem, CEA (France) recently proposed a statistical method of data analysis, called CIRCE. The CIRCE method is intended to quantify the uncertainties of the correlations of a code. It may replace expert judgment.

  2. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    Science.gov (United States)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values on hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model, HEC-RAS is selected and linked to Monte-Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high quality DEM for input data uncertainty minimisation and to improve determination accuracy on stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy to represent the estimated roughness values. Finally, Latin Hypercube Sampling has been used for generation of different sets of Manning roughness values and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on a binary wet-dry reasoning with the use of Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
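
The sampling step can be sketched without the hydraulic model: a hand-rolled Latin Hypercube generator draws one stratified uniform sample per equal-probability bin, which is then transformed to a roughness distribution. The lognormal Manning's n parameters here are illustrative, not the distribution fitted in the paper:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(2)

# Latin Hypercube Sampling: one uniform draw per equal-probability
# stratum, randomly permuted across strata.
def latin_hypercube(n, rng):
    return (rng.permutation(n) + rng.uniform(0.0, 1.0, n)) / n

n_runs = 200
u = latin_hypercube(n_runs, rng)
z = np.array([NormalDist().inv_cdf(float(ui)) for ui in u])
manning_n = 0.035 * np.exp(0.2 * z)   # illustrative lognormal Manning's n

# Each sample would drive one hydraulic-model run; here we just
# summarize the generated roughness set.
print(f"median n: {np.median(manning_n):.3f}")
print(f"90 % interval: [{np.quantile(manning_n, 0.05):.3f}, "
      f"{np.quantile(manning_n, 0.95):.3f}]")
```

Compared with plain Monte Carlo, the stratification guarantees that even 200 samples cover the tails of the roughness distribution evenly.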

  3. Uncertainty characterization of HOAPS 3.3 latent heat-flux-related parameters

    Science.gov (United States)

    Liman, Julian; Schröder, Marc; Fennig, Karsten; Andersson, Axel; Hollmann, Rainer

    2018-03-01

    Latent heat flux (LHF) is one of the main contributors to the global energy budget. As the density of in situ LHF measurements over the global oceans is generally poor, the potential of remotely sensed LHF for meteorological applications is enormous. However, to date none of the available satellite products have included estimates of systematic, random, and sampling uncertainties, all of which are essential for assessing their quality. Here, the challenge is taken on by matching LHF-related pixel-level data of the Hamburg Ocean Atmosphere Parameters and Fluxes from Satellite (HOAPS) climatology (version 3.3) to in situ measurements originating from a high-quality data archive of buoys and selected ships. Assuming the ground reference to be bias-free, this allows for deriving instantaneous systematic uncertainties as a function of four atmospheric predictor variables. The approach is regionally independent and therefore overcomes the issue of sparse in situ data densities over large oceanic areas. Likewise, random uncertainties are derived, which include not only a retrieval component but also contributions from in situ measurement noise and the collocation procedure. A recently published random uncertainty decomposition approach is applied to isolate the random retrieval uncertainty of all LHF-related HOAPS parameters. It makes use of two combinations of independent data triplets of both satellite and in situ data, which are analysed in terms of their pairwise variances of differences. Instantaneous uncertainties are finally aggregated, allowing for uncertainty characterizations on monthly to multi-annual timescales. Results show that systematic LHF uncertainties range between 15 and 50 W m-2 with a global mean of 25 W m-2. Local maxima are mainly found over the subtropical ocean basins as well as along the western boundary currents. Investigations indicate that contributions from qa (U) to the overall LHF uncertainty are on the order of 60 % (25 %). From an
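
The "pairwise variances of differences" idea can be illustrated with a triple-collocation-style sketch (an analogy to, not a reproduction of, the decomposition used for HOAPS): given three collocated estimates of the same quantity with independent random errors, the pairwise difference variances isolate each data set's own error variance. All numbers are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Three collocated estimates x, y, z of a common truth, each with
# independent zero-mean random error. Then
#   var(x - y) = sx^2 + sy^2   (and cyclic permutations),
# so the error variance of x alone is
#   sx^2 = (Vxy + Vxz - Vyz) / 2
n = 200_000
truth = rng.normal(100.0, 30.0, n)       # e.g. LHF in W m-2
x = truth + rng.normal(0.0, 8.0, n)      # satellite retrieval
y = truth + rng.normal(0.0, 5.0, n)      # in situ (buoy) measurement
z = truth + rng.normal(0.0, 10.0, n)     # second independent estimate

Vxy, Vxz, Vyz = (x - y).var(), (x - z).var(), (y - z).var()
sx = np.sqrt((Vxy + Vxz - Vyz) / 2)
print(f"estimated random uncertainty of x: {sx:.2f}")
```

The recovered value approaches the true 8 W m-2 as the sample grows, without ever observing the truth directly.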

  4. Assessing flood forecast uncertainty with fuzzy arithmetic

    Directory of Open Access Journals (Sweden)

    de Bruyn Bertrand

    2016-01-01

    Providing forecasts of flow rates and water levels during floods has to be associated with uncertainty estimates. The sources of forecast uncertainty are plural. For hydrological (rainfall-runoff) forecasts performed using a deterministic hydrological model with basic physics, two main sources can be identified. The first obvious source is the forcing data: rainfall forecast data are supplied in real time by meteorological forecasting services to the Flood Forecasting Service within a range between a lowest and a highest predicted discharge. These two values define an uncertainty interval for the rainfall variable provided on a given watershed. The second source of uncertainty is related to the complexity of the modeled system (the catchment impacted by the hydro-meteorological phenomenon), the number of variables that may describe the problem, and their spatial and temporal variability. The model simplifies the system by reducing the number of variables to a few parameters; thus it contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed flow rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and water levels) in a forecast, based on the possible rainfalls provided by the forcing and on the model uncertainty. The model uncertainty is here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic. This method allows the prediction uncertainty range to be evaluated. The Flood Forecasting Service of the Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed's area is 310 km². Its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. This method presents the advantage of being easily implemented.
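
At a fixed membership level (α-cut), a fuzzy number reduces to an interval, and fuzzy arithmetic reduces to interval arithmetic. A minimal sketch of combining a rainfall interval with a model-uncertainty interval; all numbers and the toy rainfall-to-discharge response are hypothetical:

```python
# Interval (alpha-cut) arithmetic: a fuzzy number at a given alpha-cut
# is an interval [lo, hi], and arithmetic propagates the bounds.
def add(a, b):
    return (a[0] + b[0], a[1] + b[1])

def mul(a, b):   # interval product, valid for non-negative intervals
    return (a[0] * b[0], a[1] * b[1])

rain = (20.0, 35.0)        # forecast rainfall interval, mm (forcing)
runoff_coeff = (0.4, 0.6)  # model-uncertainty interval on the response
base_flow = (5.0, 8.0)     # baseline discharge interval, m3/s

# Crude toy response: Q = base_flow + runoff_coeff * k * rain
k = 1.2                    # hypothetical unit-conversion factor
storm = mul(runoff_coeff, (k * rain[0], k * rain[1]))
forecast = add(base_flow, storm)
print(f"forecast discharge interval: "
      f"[{forecast[0]:.1f}, {forecast[1]:.1f}] m3/s")
```

Repeating the computation over several α-cuts (e.g. α = 0, 0.5, 1) rebuilds the full fuzzy forecast rather than a single interval.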
Moreover, it permits to be carried out

  5. Multi-objective optimization under uncertainty for sheet metal forming

    Directory of Open Access Journals (Sweden)

    Lafon Pascal

    2016-01-01

    Aleatory uncertainties in material properties, blank thickness and friction conditions are inherent and irreducible variabilities in sheet metal forming. Optimal design configurations, which are obtained by conventional design optimization methods, are not always able to meet the desired targets due to the effect of uncertainties. This paper proposes a multi-objective robust design optimization that aims to tackle this problem. Results obtained on a U-shape draw bending benchmark show that the spring-back effect can be controlled by optimizing process parameters.

  6. Risk, unexpected uncertainty, and estimation uncertainty: Bayesian learning in unstable settings.

    Directory of Open Access Journals (Sweden)

    Elise Payzan-LeNestour

    Full Text Available Recently, evidence has emerged that humans approach learning using Bayesian updating rather than (model-free) reinforcement algorithms in a six-arm restless bandit problem. Here, we investigate what this implies for human appreciation of uncertainty. In our task, a Bayesian learner distinguishes three equally salient levels of uncertainty. First, the Bayesian perceives irreducible uncertainty or risk: even knowing the payoff probabilities of a given arm, the outcome remains uncertain. Second, there is (parameter) estimation uncertainty or ambiguity: payoff probabilities are unknown and need to be estimated. Third, the outcome probabilities of the arms change: the sudden jumps are referred to as unexpected uncertainty. We document how the three levels of uncertainty evolved during the course of our experiment and how they affected the learning rate. We then zoom in on estimation uncertainty, which has been suggested to be a driving force in exploration, in spite of evidence of widespread aversion to ambiguity. Our data corroborate the latter. We discuss neural evidence that foreshadowed the ability of humans to distinguish between the three levels of uncertainty. Finally, we investigate the boundaries of human capacity to implement Bayesian learning. We repeat the experiment with different instructions, reflecting varying levels of structural uncertainty. Under this fourth notion of uncertainty, choices were no better explained by Bayesian updating than by (model-free) reinforcement learning. Exit questionnaires revealed that participants remained unaware of the presence of unexpected uncertainty and failed to acquire the right model with which to implement Bayesian updating.
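For readers unfamiliar with the distinction, risk versus estimation uncertainty can be illustrated with a toy Beta-Bernoulli update on a single (non-restless) arm. This is an illustrative sketch, not the authors' task; capturing unexpected uncertainty (the sudden jumps) would additionally require change-point detection or forgetting:

```python
# Toy Beta-Bernoulli sketch (assumed observations, not the study's data):
# estimation uncertainty = posterior variance, shrinking with observations;
# risk = outcome variance, irreducible even with a known payoff probability.

alpha, beta = 1.0, 1.0   # uniform Beta prior over the arm's payoff probability

observations = [1, 0, 1, 1, 0, 1, 1, 1]   # assumed payoffs from pulling the arm

for x in observations:
    alpha += x
    beta += 1 - x

n = alpha + beta
p_hat = alpha / n                                 # posterior mean payoff probability
estimation_var = alpha * beta / (n**2 * (n + 1))  # posterior variance -> 0 with data
risk_var = p_hat * (1 - p_hat)                    # Bernoulli outcome variance: risk

print(f"posterior mean {p_hat:.2f}, "
      f"estimation var {estimation_var:.4f}, risk {risk_var:.4f}")
```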

  7. UNCERTAINTY IN THE PROCESS INTEGRATION FOR THE BIOREFINERIES DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Meilyn González Cortés

    2015-07-01

    Full Text Available This paper presents how design approaches with a high level of flexibility can reduce the additional costs of strategies that apply overdesign factors to cover parameters whose uncertainty impacts the economic feasibility of a project. The elements with associated uncertainties that are important in configuring process integration under a biorefinery scheme are: the raw material, the raw material conversion technologies, and the variety of products that can be obtained. The analysis shows that raw materials and products with potential in a biorefinery scheme are subject to external uncertainties such as availability, demand, and market prices. The impact of these external uncertainties on the biorefinery can be determined, and for product prices, minimum and maximum limits can be identified as intervals that should be considered in the project's economic evaluation and in the sensitivity analysis under varied conditions.

  8. arXiv Uncertainties in WIMP Dark Matter Scattering Revisited

    CERN Document Server

    Ellis, John; Olive, Keith A.

    We revisit the uncertainties in the calculation of spin-independent scattering matrix elements for the scattering of WIMP dark matter particles on nuclear matter. In addition to discussing the uncertainties due to limitations in our knowledge of the nucleonic matrix elements of the light quark scalar densities, we also discuss the importance of heavy quark scalar densities, and comment on uncertainties in quark mass ratios. We analyze estimates of the light-quark densities made over the past decade using lattice calculations and/or phenomenological inputs. We find an uncertainty in the combination that is larger than has been assumed in some phenomenological analyses, and a range of that is smaller but compatible with earlier estimates. We also analyze the importance of the {\cal O}(\alpha_s^3) calculations of the heavy-quark matrix elements that are now available, which provide an important refinement of the calculation of the spin-independent scattering cross section. We use for illustration a benchmar...

  9. Manufacturing Data Uncertainties Propagation Method in Burn-Up Problems

    Directory of Open Access Journals (Sweden)

    Thomas Frosio

    2017-01-01

    Full Text Available A nuclear data-based uncertainty propagation methodology is extended to enable propagation of manufacturing/technological data (TD) uncertainties in a burn-up calculation problem, taking into account correlation terms between Boltzmann and Bateman terms. The methodology is applied to reactivity and power distributions in a Material Testing Reactor benchmark. Due to the inherent statistical behavior of manufacturing tolerances, a Monte Carlo sampling method is used to determine output perturbations on integral quantities. A global sensitivity analysis (GSA) is performed for each manufacturing parameter and allows the influential parameters, whose tolerances need to be better controlled, to be identified and ranked. We show that the overall impact of some TD uncertainties, such as uranium enrichment or fuel plate thickness, on the reactivity is negligible because the different core areas induce compensating effects on this global quantity. However, local quantities, such as power distributions, are strongly impacted by TD uncertainty propagation. For isotopic concentrations, no clear trends appear in the results.

  10. Uncertainty Monitoring by Young Children in a Computerized Task

    Directory of Open Access Journals (Sweden)

    Michael J. Beran

    2012-01-01

    Full Text Available Adult humans show sophisticated metacognitive abilities, including the ability to monitor uncertainty. Unfortunately, most measures of uncertainty monitoring are limited to use with adults due to their general complexity and dependence on explicit verbalization. However, recent research with nonhuman animals has successfully developed measures of uncertainty monitoring that are simple and do not require explicit verbalization. The purpose of this study was to investigate metacognition in young children using uncertainty monitoring tests developed for nonhumans. Children judged whether stimuli were more pink or blue—stimuli nearest the pink-blue midpoint were the most uncertain and the most difficult to classify. Children also had an option to acknowledge difficulty and gain the necessary information for correct classification. As predicted, children most often asked for help on the most difficult stimuli. This result confirms that some metacognitive abilities appear early in cognitive development. The tasks of animal metacognition research clearly have substantial utility for exploring the early developmental roots of human metacognition.

  11. Water shortage risk assessment considering large-scale regional transfers: a copula-based uncertainty case study in Lunan, China.

    Science.gov (United States)

    Gao, Xueping; Liu, Yinzhu; Sun, Bowen

    2018-06-05

    The risk of water shortage caused by uncertainties, such as frequent drought, varied precipitation, multiple water resources, and different water demands, brings new challenges to water transfer projects. Uncertainties exist in both the transferred water and the local surface water; therefore, the relationship between them should be thoroughly studied to prevent water shortage. For more effective water management, an uncertainty-based water shortage risk assessment model (UWSRAM) is developed to study the combined effect of multiple water resources and analyze the degree of shortage under uncertainty. The UWSRAM combines copula-based Monte Carlo stochastic simulation with a chance-constrained programming-stochastic multiobjective optimization model, using the Lunan water-receiving area in China as an example. Statistical copula functions are employed to estimate the joint probability of available transferred water and local surface water and to sample from the multivariate probability distribution; these samples are used as inputs for the optimization model. The approach reveals the distribution of water shortage, emphasizes the importance of improving and updating the management of transferred water and local surface water, and examines their combined influence on water shortage risk. The possible available water and shortages can be calculated with the UWSRAM, along with the corresponding allocation measures under different water availability levels and violation probabilities. The UWSRAM is valuable for understanding the overall multi-source water availability and degree of water shortage, adapting to the uncertainty surrounding water resources, establishing effective water resource planning policies for managers, and achieving sustainable development.

  12. Implementation of unscented transform to estimate the uncertainty of a liquid flow standard system

    Energy Technology Data Exchange (ETDEWEB)

    Chun, Sejong; Choi, Hae-Man; Yoon, Byung-Ro; Kang, Woong [Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2017-03-15

    First-order partial derivatives of a mathematical model are an essential part of evaluating the measurement uncertainty of a liquid flow standard system according to the Guide to the expression of uncertainty in measurement (GUM). Although the GUM provides a straightforward method to evaluate the measurement uncertainty of volume flow rate, the first-order partial derivatives can be complicated. The mathematical model of volume flow rate in a liquid flow standard system has a cross-correlation between liquid density and the buoyancy correction factor, which can make derivation of the first-order partial derivatives difficult. Monte Carlo simulation can be used as an alternative method to circumvent this difficulty. However, Monte Carlo simulation requires large computational resources for a correct simulation, because it must address the completeness issue of whether an ideal or a real operator conducts the experiment to evaluate the measurement uncertainty. Thus, Monte Carlo simulation needs a large number of samples to ensure that the uncertainty evaluation is as close to the GUM as possible. The unscented transform alleviates this problem because it can be regarded as a Monte Carlo simulation with an infinite number of samples; in this sense, it evaluates the uncertainty with respect to the ideal operator. Thus, the unscented transform can yield the same measurement uncertainty as the GUM.
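The unscented transform itself is easy to sketch in one dimension: a handful of deterministically chosen sigma points stands in for the infinite Monte Carlo sample. The nonlinear function below is an assumed example, not the flow-standard model:

```python
# Minimal 1-D unscented transform sketch (2n+1 = 3 sigma points).
# The function f is an assumed nonlinearity, not the paper's flow model.
import math

def unscented_transform(mu, sigma, f, lam=2.0):
    """Propagate a 1-D Gaussian N(mu, sigma^2) through f via sigma points."""
    n = 1
    spread = math.sqrt(n + lam) * sigma
    points = [mu, mu + spread, mu - spread]
    weights = [lam / (n + lam), 1 / (2 * (n + lam)), 1 / (2 * (n + lam))]
    ys = [f(x) for x in points]
    mean = sum(w * y for w, y in zip(weights, ys))
    var = sum(w * (y - mean) ** 2 for w, y in zip(weights, ys))
    return mean, var

f = lambda x: x * x   # assumed nonlinear measurement function
mean, var = unscented_transform(mu=2.0, sigma=0.1, f=f)
print(f"UT mean {mean:.4f} vs analytic E[x^2] = {2.0**2 + 0.1**2:.4f}")
```

For this quadratic the three sigma points reproduce the analytic mean exactly, with no sampling noise and no partial derivatives, which is the appeal described in the abstract.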

  13. Assessing concentration uncertainty estimates from passive microwave sea ice products

    Science.gov (United States)

    Meier, W.; Brucker, L.; Miller, J. A.

    2017-12-01

    Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.

  14. Essential information: Uncertainty and optimal control of Ebola outbreaks.

    Science.gov (United States)

    Li, Shou-Li; Bjørnstad, Ottar N; Ferrari, Matthew J; Mummah, Riley; Runge, Michael C; Fonnesbeck, Christopher J; Tildesley, Michael J; Probert, William J M; Shea, Katriona

    2017-05-30

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.

  15. Essential information: Uncertainty and optimal control of Ebola outbreaks

    Science.gov (United States)

    Li, Shou-Li; Bjornstad, Ottar; Ferrari, Matthew J.; Mummah, Riley; Runge, Michael C.; Fonnesbeck, Christopher J.; Tildesley, Michael J.; Probert, William J. M.; Shea, Katriona

    2017-01-01

    Early resolution of uncertainty during an epidemic outbreak can lead to rapid and efficient decision making, provided that the uncertainty affects prioritization of actions. The wide range in caseload projections for the 2014 Ebola outbreak caused great concern and debate about the utility of models. By coding and running 37 published Ebola models with five candidate interventions, we found that, despite this large variation in caseload projection, the ranking of management options was relatively consistent. Reducing funeral transmission and reducing community transmission were generally ranked as the two best options. Value of information (VoI) analyses show that caseloads could be reduced by 11% by resolving all model-specific uncertainties, with information about model structure accounting for 82% of this reduction and uncertainty about caseload only accounting for 12%. Our study shows that the uncertainty that is of most interest epidemiologically may not be the same as the uncertainty that is most relevant for management. If the goal is to improve management outcomes, then the focus of study should be to identify and resolve those uncertainties that most hinder the choice of an optimal intervention. Our study further shows that simplifying multiple alternative models into a smaller number of relevant groups (here, with shared structure) could streamline the decision-making process and may allow for a better integration of epidemiological modeling and decision making for policy.

  16. Renormalisation scale uncertainty in the DIS 2+1 jet cross-section

    International Nuclear Information System (INIS)

    Ingelman, G.

    1994-05-01

    The Deep Inelastic Scattering 2+1 jet cross-section is a useful observable for precision tests of QCD, e.g. for measuring the strong coupling constant α_s. A consistent analysis requires a good understanding of the theoretical uncertainties, and one of the fundamental ones in QCD is due to the renormalisation scheme and scale ambiguity. Different methods, which have been proposed to resolve the scale ambiguity, are applied to the 2+1 jet cross-section and the uncertainty is estimated. It is shown that the uncertainty can be made smaller by choosing the jet definition in a suitable way. (orig.)

  17. Uncertainties in criticality analysis which affect the storage and transportation of LWR fuel

    International Nuclear Information System (INIS)

    Napolitani, D.G.

    1989-01-01

    Satisfying the design criteria for subcriticality with uncertainties affects: the capacity of LWR storage arrays, maximum allowable enrichment, minimum allowable burnup and economics of various storage options. There are uncertainties due to: calculational method, data libraries, geometric limitations, modelling bias, the number and quality of benchmarks performed and mechanical uncertainties in the array. Yankee Atomic Electric Co. (YAEC) has developed and benchmarked methods to handle: high density storage rack designs, pin consolidation, low density moderation and burnup credit. The uncertainties associated with such criticality analysis are quantified on the basis of clean criticals, power reactor criticals and intercomparison of independent analysis methods

  18. An Efficient Deterministic Approach to Model-based Prediction Uncertainty

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the...

  19. Uncertainty and sensitivity analysis of biokinetic models for radiopharmaceuticals used in nuclear medicine

    International Nuclear Information System (INIS)

    Li, W. B.; Hoeschen, C.

    2010-01-01

    Mathematical models for the kinetics of radiopharmaceuticals in humans were developed and are used by the International Commission on Radiological Protection and the Medical Internal Radiation Dose (MIRD) Committee to estimate the radiation absorbed dose for patients in nuclear medicine. However, because the residence times used were derived from different subjects, partially even with different ethnic backgrounds, a large variation in the model parameters propagates to a high uncertainty in the dose estimates. In this work, a method was developed for analysing the uncertainty and sensitivity of the biokinetic models that are used to calculate the residence times. The biokinetic model of 18F-FDG (FDG) developed by the MIRD Committee was analysed by this method. The sources of uncertainty of all model parameters were evaluated based on the experiments. The Latin hypercube sampling technique was used to sample the parameters for model input, and kinetic modelling of FDG in humans was performed. The sensitivity of the model parameters was determined by combining the model input and output, using regression and partial correlation analysis. The transfer rate parameter from plasma to the fast compartment of other tissue has the greatest influence on the residence time of plasma. Optimisation of biokinetic data acquisition in clinical practice by exploiting the parameter sensitivities obtained in this study is discussed. (authors)
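Latin hypercube sampling can be sketched generically: each parameter's range is split into equal-probability strata, one sample is drawn per stratum, and the strata are shuffled independently per parameter. The parameter names and ranges below are placeholders, not the MIRD FDG model values:

```python
# Generic Latin hypercube sketch (stdlib only). The biokinetic transfer-rate
# names and ranges are hypothetical placeholders, not fitted values.
import random

random.seed(42)

def latin_hypercube(n_samples, bounds):
    """One stratified sample per equal-width bin, per parameter."""
    columns = []
    for low, high in bounds:
        width = (high - low) / n_samples
        strata = [low + (i + random.random()) * width for i in range(n_samples)]
        random.shuffle(strata)        # decouple the parameters' bin orders
        columns.append(strata)
    return list(zip(*columns))        # rows are parameter vectors

# assumed ranges for two hypothetical transfer-rate parameters (1/min)
bounds = [(0.05, 0.15), (0.001, 0.01)]
samples = latin_hypercube(10, bounds)
for k1, k2 in samples[:3]:
    print(f"k_plasma_to_tissue={k1:.3f}, k_loss={k2:.4f}")
```

Unlike plain random sampling, every stratum of every parameter is covered exactly once, which is why far fewer model runs are needed for the regression and partial correlation analysis.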

  20. Uncertainty studies and risk assessment for CO{sub 2} storage in geological formations

    Energy Technology Data Exchange (ETDEWEB)

    Walter, Lena Sophie

    2013-07-01

    Carbon capture and storage (CCS) in deep geological formations is one possible option to mitigate the greenhouse gas effect by reducing CO{sub 2} emissions into the atmosphere. The assessment of the risks related to CO{sub 2} storage is an important task. Events such as CO{sub 2} leakage and brine displacement could result in hazards for human health and the environment. In this thesis, a systematic and comprehensive risk assessment concept is presented to investigate various levels of uncertainties and to assess risks using numerical simulations. Depending on the risk and the processes, which should be assessed, very complex models, large model domains, large time scales, and many simulations runs for estimating probabilities are required. To reduce the resulting high computational costs, a model reduction technique (the arbitrary polynomial chaos expansion) and a method for model coupling in space are applied. The different levels of uncertainties are: statistical uncertainty in parameter distributions, scenario uncertainty, e.g. different geological features, and recognized ignorance due to assumptions in the conceptual model set-up. Recognized ignorance and scenario uncertainty are investigated by simulating well defined model set-ups and scenarios. According to damage values, which are defined as a model output, the set-ups and scenarios can be compared and ranked. For statistical uncertainty probabilities can be determined by running Monte Carlo simulations with the reduced model. The results are presented in various ways: e.g., mean damage, probability density function, cumulative distribution function, or an overall risk value by multiplying the damage with the probability. If the model output (damage) cannot be compared to provided criteria (e.g. water quality criteria), analytical approximations are presented to translate the damage into comparable values. The overall concept is applied for the risks related to brine displacement and infiltration into

  1. [Influence of Uncertainty and Uncertainty Appraisal on Self-management in Hemodialysis Patients].

    Science.gov (United States)

    Jang, Hyung Suk; Lee, Chang Suk; Yang, Young Hee

    2015-04-01

    This study was done to examine the relation of uncertainty, uncertainty appraisal, and self-management in patients undergoing hemodialysis, and to identify factors influencing self-management. A convenience sample of 92 patients receiving hemodialysis was selected. Data were collected using a structured questionnaire and medical records. The collected data were analyzed using descriptive statistics, t-test, ANOVA, Pearson correlations and multiple regression analysis with the SPSS/WIN 20.0 program. The participants showed a moderate level of uncertainty with the highest score being for ambiguity among the four uncertainty subdomains. Scores for uncertainty danger or opportunity appraisals were under the mid points. The participants were found to perform a high level of self-management such as diet control, management of arteriovenous fistula, exercise, medication, physical management, measurements of body weight and blood pressure, and social activity. The self-management of participants undergoing hemodialysis showed a significant relationship with uncertainty and uncertainty appraisal. The significant factors influencing self-management were uncertainty, uncertainty opportunity appraisal, hemodialysis duration, and having a spouse. These variables explained 32.8% of the variance in self-management. The results suggest that intervention programs to reduce the level of uncertainty and to increase the level of uncertainty opportunity appraisal among patients would improve the self-management of hemodialysis patients.

  2. A pseudo-statistical approach to treat choice uncertainty: the example of partitioning allocation methods

    NARCIS (Netherlands)

    Mendoza Beltran, A.; Heijungs, R.; Guinée, J.; Tukker, A.

    2016-01-01

    Purpose: Despite efforts to treat uncertainty due to methodological choices in life cycle assessment (LCA), such as standardization, one-at-a-time (OAT) sensitivity analysis, and analytical and statistical methods, no method exists that propagates this source of uncertainty for all relevant processes

  3. Model structures amplify uncertainty in predicted soil carbon responses to climate change.

    Science.gov (United States)

    Shi, Zheng; Crowell, Sean; Luo, Yiqi; Moore, Berrien

    2018-06-04

    Large model uncertainty in projected future soil carbon (C) dynamics has been well documented. However, our understanding of the sources of this uncertainty is limited. Here we quantify the uncertainties arising from model parameters, structures and their interactions, and how those uncertainties propagate through different models to projections of future soil carbon stocks. Both the vertically resolved model and the microbially explicit model project much greater uncertainties in response to climate change than the conventional soil C model, with both positive and negative C-climate feedbacks, whereas the conventional model consistently predicts a positive soil C-climate feedback. Our findings suggest that diverse model structures are necessary to increase confidence in soil C projections. However, the larger uncertainty in the complex models also suggests that we need to strike a balance between model complexity and the need to include diverse model structures in order to forecast soil C dynamics with high confidence and low uncertainty.

  4. Validation/Uncertainty Quantification for Large Eddy Simulations of the heat flux in the Tangentially Fired Oxy-Coal Alstom Boiler Simulation Facility

    Energy Technology Data Exchange (ETDEWEB)

    Smith, P.J.; Eddings, E.G.; Ring, T.; Thornock, J.; Draper, T.; Isaac, B.; Rezeai, D.; Toth, P.; Wu, Y.; Kelly, K.

    2014-08-01

    The objective of this task is to produce predictive capability with quantified uncertainty bounds for the heat flux in commercial-scale, tangentially fired, oxy-coal boilers. Validation data came from the Alstom Boiler Simulation Facility (BSF) for tangentially fired, oxy-coal operation. This task brings together experimental data collected under Alstom’s DOE project for measuring oxy-firing performance parameters in the BSF with this University of Utah project for large eddy simulation (LES) and validation/uncertainty quantification (V/UQ). The Utah work includes V/UQ with measurements in the single-burner facility where advanced strategies for O2 injection can be more easily controlled and data more easily obtained. Highlights of the work include:
    • Simulations of Alstom’s 15 megawatt (MW) BSF, exploring the uncertainty in thermal boundary conditions. A V/UQ analysis showed consistency between experimental results and simulation results, identifying uncertainty bounds on the quantities of interest for this system (Subtask 9.1).
    • A simulation study of the University of Utah’s oxy-fuel combustor (OFC) focused on heat flux (Subtask 9.2). A V/UQ analysis was used to show consistency between experimental and simulation results.
    • Measurement of heat flux and temperature with new optical diagnostic techniques and comparison with conventional measurements (Subtask 9.3). Various optical diagnostics systems were created to provide experimental data to the simulation team. The final configuration utilized a mid-wave infrared (MWIR) camera to measure heat flux and temperature, which was synchronized with a high-speed, visible camera to utilize two-color pyrometry to measure temperature and soot concentration.
    • Collection of heat flux and temperature measurements in the University of Utah’s OFC for use in subtasks 9.2 and 9.3 (Subtask 9.4). Several replicates were carried out to better assess the experimental error. Experiments were specifically designed for the

  5. Practical low dose limits for passive personal dosemeters and the implications for uncertainties close to the limit of detection

    International Nuclear Information System (INIS)

    Gilvin, P. J.; Perks, C. A.

    2011-01-01

    Recent years have seen the increasing use of passive dosemeters that have high sensitivities and, in laboratory conditions, detection limits of <10 μSv. However, in real operational use the detection limits will be markedly higher, because a large fraction of the accrued dose will be due to natural background, and this must be subtracted in order to obtain the desired occupational dose. No matter how well known the natural background is, the measurement uncertainty on doses of a few tens of microsieverts will be large. Individual monitoring services need to recognise this and manage the expectations of their clients by providing sufficient information. (authors)

  6. Improvement of Statistical Decisions under Parametric Uncertainty

    Science.gov (United States)

    Nechval, Nicholas A.; Nechval, Konstantin N.; Purgailis, Maris; Berzins, Gundars; Rozevskis, Uldis

    2011-10-01

    A large number of problems in production planning and scheduling, location, transportation, finance, and engineering design require that decisions be made in the presence of uncertainty. Decision-making under uncertainty is a central problem in statistical inference, and has been formally studied in virtually all approaches to inference. The aim of the present paper is to show how the invariant embedding technique, the idea of which belongs to the authors, may be employed in the particular case of finding the improved statistical decisions under parametric uncertainty. This technique represents a simple and computationally attractive statistical method based on the constructive use of the invariance principle in mathematical statistics. Unlike the Bayesian approach, an invariant embedding technique is independent of the choice of priors. It allows one to eliminate unknown parameters from the problem and to find the best invariant decision rule, which has smaller risk than any of the well-known decision rules. To illustrate the proposed technique, application examples are given.

  7. Uncertainty in Simulating Wheat Yields Under Climate Change

    Energy Technology Data Exchange (ETDEWEB)

    Asseng, S.; Ewert, F.; Rosenzweig, C.; Jones, J.W.; Hatfield, Jerry; Ruane, Alex; Boote, K. J.; Thorburn, Peter; Rotter, R.P.; Cammarano, D.; Brisson, N.; Basso, B.; Martre, P.; Aggarwal, P.K.; Angulo, C.; Bertuzzi, P.; Biernath, C.; Challinor, AJ; Doltra, J.; Gayler, S.; Goldberg, R.; Grant, Robert; Heng, L.; Hooker, J.; Hunt, L.A.; Ingwersen, J.; Izaurralde, Roberto C.; Kersebaum, K.C.; Mueller, C.; Naresh Kumar, S.; Nendel, C.; O' Leary, G.O.; Olesen, JE; Osborne, T.; Palosuo, T.; Priesack, E.; Ripoche, D.; Semenov, M.A.; Shcherbak, I.; Steduto, P.; Stockle, Claudio O.; Stratonovitch, P.; Streck, T.; Supit, I.; Tao, F.; Travasso, M.; Waha, K.; Wallach, D.; White, J.W.; Williams, J.R.; Wolf, J.

    2013-09-01

    Anticipating the impacts of climate change on crop yields is critical for assessing future food security. Process-based crop simulation models are the most commonly used tools in such assessments [1,2]. Analysis of uncertainties in future greenhouse gas emissions and their impacts on future climate change has been increasingly described in the literature [3,4], while assessments of the uncertainty in crop responses to climate change are very rare. Systematic and objective comparisons across impact studies are difficult, and thus have not been fully realized [5]. Here we present the largest coordinated and standardized crop model intercomparison for climate change impacts on wheat production to date. We found that several individual crop models are able to reproduce measured grain yields under current diverse environments, particularly if sufficient details are provided to execute them. However, simulated climate change impacts can vary across models due to differences in model structures and algorithms. The crop-model component of uncertainty in climate change impact assessments was considerably larger than the climate-model component from Global Climate Models (GCMs). Model responses to high temperatures and temperature-by-CO2 interactions are identified as major sources of simulated impact uncertainties. Significant reductions in impact uncertainties through model improvements in these areas, and improved quantification of uncertainty through multi-model ensembles, are urgently needed for a more reliable translation of climate change scenarios into agricultural impacts, in order to develop adaptation strategies and aid policymaking.

  8. Sensitivity and uncertainty analysis of reactivities for UO2 and MOX fueled PWR cells

    Energy Technology Data Exchange (ETDEWEB)

    Foad, Basma [Research Institute of Nuclear Engineering, University of Fukui, Kanawa-cho 1-2-4, Tsuruga-shi, Fukui-ken, 914-0055 (Japan); Egypt Nuclear and Radiological Regulatory Authority, 3 Ahmad El Zomar St., Nasr City, Cairo, 11787 (Egypt); Takeda, Toshikazu [Research Institute of Nuclear Engineering, University of Fukui, Kanawa-cho 1-2-4, Tsuruga-shi, Fukui-ken, 914-0055 (Japan)

    2015-12-31

    The purpose of this paper is to apply our improved method for calculating sensitivities and uncertainties of reactivity responses to UO{sub 2} and MOX fueled pressurized water reactor cells. The improved method has been used to calculate sensitivity coefficients relative to infinite dilution cross-sections, where the self-shielding effect is taken into account. Two types of reactivities are considered: Doppler reactivity and coolant void reactivity. For each type of reactivity, the sensitivities are calculated for small and large perturbations. The results have demonstrated that the reactivity responses have larger relative uncertainty than eigenvalue responses. In addition, the uncertainty of the coolant void reactivity is much greater than that of the Doppler reactivity, especially for large perturbations. The sensitivity coefficients and uncertainties of both reactivities were verified by comparison with SCALE code results using the ENDF/B-VII library, and good agreement was found.

  9. Removal of Asperger's syndrome from the DSM V: community response to uncertainty.

    Science.gov (United States)

    Parsloe, Sarah M; Babrow, Austin S

    2016-01-01

    The May 2013 release of the new version of the Diagnostic and Statistical Manual of Mental Disorders (DSM V) subsumed Asperger's syndrome under the wider diagnostic label of autism spectrum disorder (ASD). The revision has created much uncertainty in the community affected by this condition. This study uses problematic integration theory and thematic analysis to investigate how participants in Wrong Planet, a large online community associated with autism and Asperger's syndrome, have constructed these uncertainties. The analysis illuminates uncertainties concerning both the likelihood of diagnosis and value of diagnosis, and it details specific issues within these two general areas of uncertainty. The article concludes with both conceptual and practical implications.

  10. Communicating spatial uncertainty to non-experts using R

    Science.gov (United States)

    Luzzi, Damiano; Sawicka, Kasia; Heuvelink, Gerard; de Bruin, Sytze

    2016-04-01

    Effective visualisation methods are important for the efficient use of uncertainty information by various groups of users. Uncertainty propagation analysis is often used with spatial environmental models to quantify the uncertainty within the information. A challenge arises when trying to effectively communicate the uncertainty information to non-experts (not statisticians) in a wide range of cases. Due to the growing popularity and applicability of the open source programming language R, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. The package implements Monte Carlo algorithms for uncertainty propagation, the output of which is represented by an ensemble of model outputs (i.e. a sample from a probability distribution). Numerous visualisation methods exist that aim to present such spatial uncertainty information statically, dynamically and interactively. To provide the most universal visualisation tools for non-experts, we conducted a survey of 20 university students and assessed the effectiveness of selected static and interactive methods for visualising uncertainty in spatial variables such as DEM and land cover. The static methods included adjacent maps and glyphs for continuous variables. Both allow for displaying maps with information about the ensemble mean, variance/standard deviation and prediction intervals. Adjacent maps were also used for categorical data, displaying maps of the most probable class, as well as its associated probability. The interactive methods included a graphical user interface, which in addition to displaying the previously mentioned variables also allowed for comparison of joint uncertainties at multiple locations. The survey indicated that users could understand the basics of the uncertainty information displayed in the static maps, with the interactive interface allowing for more in-depth information. Subsequently, the R
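The core of the Monte Carlo uncertainty propagation that such a package performs can be sketched in a few lines; the toy model and distribution parameters below are invented for illustration and are not taken from the R package described above:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(dem, coeff):
    # Hypothetical toy model: a quantity derived from elevation
    return coeff * np.sqrt(np.maximum(dem, 0.0))

n_mc = 2000
dem_mean, dem_sd = 150.0, 10.0      # uncertain DEM value (m), assumed Gaussian
coeff_mean, coeff_sd = 0.8, 0.05    # uncertain model parameter, assumed Gaussian

# Sample the uncertain inputs and run the model for each realisation
dem_samples = rng.normal(dem_mean, dem_sd, n_mc)
coeff_samples = rng.normal(coeff_mean, coeff_sd, n_mc)
outputs = model(dem_samples, coeff_samples)

# The ensemble of outputs is the propagated uncertainty: summarise it
mean, sd = outputs.mean(), outputs.std(ddof=1)
lo, hi = np.percentile(outputs, [2.5, 97.5])
```

The ensemble statistics (mean, standard deviation, prediction interval) are exactly the per-cell quantities that the adjacent-map and glyph visualisations mentioned above would display.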

  11. SENSIT: a cross-section and design sensitivity and uncertainty analysis code. [In FORTRAN for CDC-7600, IBM 360

    Energy Technology Data Exchange (ETDEWEB)

    Gerstl, S.A.W.

    1980-01-01

    SENSIT computes the sensitivity and uncertainty of a calculated integral response (such as a dose rate) due to input cross sections and their uncertainties. Sensitivity profiles are computed for neutron and gamma-ray reaction cross sections of standard multigroup cross section sets and for secondary energy distributions (SEDs) of multigroup scattering matrices. In the design sensitivity mode, SENSIT computes changes in an integral response due to design changes and gives the appropriate sensitivity coefficients. Cross section uncertainty analyses are performed for three types of input data uncertainties: cross-section covariance matrices for pairs of multigroup reaction cross sections, spectral shape uncertainty parameters for secondary energy distributions (integral SED uncertainties), and covariance matrices for energy-dependent response functions. For all three types of data uncertainties SENSIT computes the resulting variance and estimated standard deviation in an integral response of interest, on the basis of generalized perturbation theory. SENSIT attempts to be more comprehensive than earlier sensitivity analysis codes, such as SWANLAKE.
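The cross-section covariance part of such an analysis rests on the first-order "sandwich rule" of generalized perturbation theory: the relative variance of an integral response is S·C·S for a sensitivity profile S and a covariance matrix C. A minimal sketch with invented numbers (not actual SENSIT data or output):

```python
import numpy as np

# Relative sensitivity profile of an integral response to 3 cross-section groups
S = np.array([0.6, 0.3, 0.1])

# Relative covariance matrix of the cross sections (illustrative values)
C = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

# Sandwich rule: relative variance and standard deviation of the response
var_R = S @ C @ S
std_R = np.sqrt(var_R)
```

The same contraction generalizes to many energy groups and reactions; the off-diagonal covariance terms are what make a full matrix, rather than per-group variances, necessary.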

  12. Phenomenological uncertainty analysis of early containment failure at severe accident of nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Su Won

    2011-02-15

    Severe accidents involve inherently large uncertainty due to the wide range of accident conditions, and experiments, validation and practical application are extremely difficult because of the high temperatures and pressures involved. Although domestic and international research programmes have been carried out, the references used in Korean nuclear plants were foreign data from the 1980s, and safety analyses such as the probabilistic safety assessment have not applied the newest methodology. Moreover, the containment pressure used to identify the probability of containment failure in Level 2 PSA is applied as a point value taken from thermal-hydraulic analysis. In this paper, uncertainty analysis methods for the severe accident phenomena influencing early containment failure were developed, an uncertainty analysis applying them to Korean nuclear plants was performed using the MELCOR code, and the distribution of containment pressure is presented as a result of the uncertainty analysis. Early containment failure was selected among the various containment failure modes because it is an important contributor to the Large Early Release Frequency (LERF), which is used as a representative criterion for decision-making in nuclear power plants. Important phenomena of early containment failure during severe accidents were identified based on previous research, and a seven-step methodology to evaluate the uncertainty was developed. A MELCOR input model for severe accident analysis reflecting natural circulation flow was developed, and station blackout, a representative initiating event for early containment failure, was selected as the accident scenario. By reviewing the internal models and correlations of the MELCOR code relevant to the important phenomena of early containment failure, the factors that could affect the uncertainty were identified, and the major factors were finally selected through a sensitivity analysis. In order to determine total number of MELCOR calculations which can

  13. Asymmetric Uncertainty Expression for High Gradient Aerodynamics

    Science.gov (United States)

    Pinier, Jeremy T

    2012-01-01

    When the physics of the flow around an aircraft changes very abruptly either in time or space (e.g., flow separation/reattachment, boundary layer transition, unsteadiness, shocks, etc.), measurements performed in a simulated environment such as a wind tunnel test or a computational simulation will most likely mispredict the exact location of where (or when) the change in physics happens. There are many reasons for this, including the error introduced by simulating a real system at a smaller scale and at non-ideal conditions, or the error due to turbulence models in a computational simulation. The uncertainty analysis principles that have been developed and are being implemented today do not fully account for uncertainty in the knowledge of the location of abrupt physics changes or sharp gradients, leading to a potentially underestimated uncertainty in those areas. To address this problem, this paper proposes a new asymmetric aerodynamic uncertainty expression containing an extra term to account for a phase uncertainty, the magnitude of which is emphasized in high-gradient aerodynamic regions. Additionally, based on previous work, a method for dispersing aerodynamic data within asymmetric uncertainty bounds in a more realistic way has been developed for use within Monte Carlo-type analyses.

  14. Uncertainty in CH4 and N2O emission estimates from a managed fen meadow using EC measurements

    International Nuclear Information System (INIS)

    Kroon, P.S.; Hensen, A.; Van 't Veen, W.H.; Vermeulen, A.T.; Jonker, H.

    2009-02-01

    The overall uncertainty in annual flux estimates derived from chamber measurements may be as high as 50% due to the temporal and spatial variability in the fluxes. As even a large number of chamber plots still cover typically less than 1% of the total field area, the field-scale integrated emission necessarily remains a matter of speculation. High frequency micrometeorological methods are a good option for obtaining integrated estimates on a hectare scale with continuous coverage in time. Instrumentation is now becoming available that meets the requirements for CH4 and N2O eddy covariance (EC) measurements. A system consisting of a quantum cascade laser (QCL) spectrometer and a sonic anemometer has recently been proven to be suitable for performing EC measurements. This study analyses the EC flux measurements of CH4 and N2O and their corrections, such as calibration, the Webb correction, and corrections for high and low frequency losses, and assesses the magnitude of the uncertainties associated with the precision of the measurement instruments, the measurement set-up and the methodology. The uncertainty of a single EC flux measurement and of daily, monthly and 3-monthly average EC fluxes is estimated. In addition, the cumulative emissions of C-CH4 and N-N2O and their uncertainties are determined over several fertilizing events at a dairy farm site in the Netherlands. These fertilizing events are selected from the continuous EC flux measurements from August 2006 to September 2008. The EC flux uncertainties are compared with the overall uncertainty in annual flux estimates derived from chamber measurements. It will be shown that EC flux measurements can decrease the overall uncertainty in annual flux estimates.

  15. Uncertainty in CH4 and N2O emission estimates from a managed fen meadow using EC measurements

    Energy Technology Data Exchange (ETDEWEB)

    Kroon, P.S.; Hensen, A.; Van ' t Veen, W.H.; Vermeulen, A.T. [ECN Biomass, Coal and Environment, Petten (Netherlands); Jonker, H. [Delft University of Technology, Delft (Netherlands)

    2009-02-15

    The overall uncertainty in annual flux estimates derived from chamber measurements may be as high as 50% due to the temporal and spatial variability in the fluxes. As even a large number of chamber plots still cover typically less than 1% of the total field area, the field-scale integrated emission necessarily remains a matter of speculation. High frequency micrometeorological methods are a good option for obtaining integrated estimates on a hectare scale with continuous coverage in time. Instrumentation is now becoming available that meets the requirements for CH4 and N2O eddy covariance (EC) measurements. A system consisting of a quantum cascade laser (QCL) spectrometer and a sonic anemometer has recently been proven to be suitable for performing EC measurements. This study analyses the EC flux measurements of CH4 and N2O and their corrections, such as calibration, the Webb correction, and corrections for high and low frequency losses, and assesses the magnitude of the uncertainties associated with the precision of the measurement instruments, the measurement set-up and the methodology. The uncertainty of a single EC flux measurement and of daily, monthly and 3-monthly average EC fluxes is estimated. In addition, the cumulative emissions of C-CH4 and N-N2O and their uncertainties are determined over several fertilizing events at a dairy farm site in the Netherlands. These fertilizing events are selected from the continuous EC flux measurements from August 2006 to September 2008. The EC flux uncertainties are compared with the overall uncertainty in annual flux estimates derived from chamber measurements. It will be shown that EC flux measurements can decrease the overall uncertainty in annual flux estimates.

  16. Evaluation method for uncertainty of effective delayed neutron fraction βeff

    International Nuclear Information System (INIS)

    Zukeran, Atsushi

    1999-01-01

    The uncertainty of the effective delayed neutron fraction βeff is evaluated in terms of three quantities: the uncertainties of the basic delayed neutron constants, the energy dependence of the delayed neutron yield ν_d^m, and the uncertainties of the fission cross sections of the fuel elements. The uncertainty of βeff due to the delayed neutron yield is expressed by a linearized formula assuming that the delayed neutron yield does not depend on the incident energy, and the energy dependence is supplemented by using the detailed energy dependence proposed by D'Angelo and Filip. The third quantity, the uncertainty of the fission cross sections, is evaluated on the basis of the generalized perturbation theory in relation to reaction rate ratios such as central spectral indices or average reaction rate ratios. The resultant uncertainty of βeff is about 4 to 5%, in which the primary factor is the delayed neutron yield and the secondary one is the fission cross section uncertainty, especially for 238U. The energy dependence of ν_d^m systematically reduces the magnitude of βeff by about 1.4% to 1.7%, depending on the model of the energy vs. ν_d^m correlation curve. (author)

  17. Quantifying remarks to the question of uncertainties of the 'general dose assessment fundamentals'

    International Nuclear Information System (INIS)

    Brenk, H.D.; Vogt, K.J.

    1982-12-01

    Dose prediction models are always subject to uncertainties due to a number of factors, including deficiencies in the model structure and uncertainties in the model input parameter values. In lieu of validation experiments, the evaluation of these uncertainties is restricted to scientific judgement. Several attempts have been made in the literature to evaluate the uncertainties of the current dose assessment models resulting from uncertainties of the model input parameter values using stochastic approaches. Less attention, however, has been paid to potential sources of systematic over- and underestimation of the predicted doses due to deficiencies in the model structure. The present study addresses this aspect with regard to dose assessment models currently used for regulatory purposes. The influence of a number of basic simplifications and conservative assumptions has been investigated. Our systematic approach is exemplified by a comparison of doses evaluated on the basis of the regulatory guide model and a more realistic model, respectively. This is done for 3 critical exposure pathways. As a result of this comparison it can be concluded that the currently used regulatory-type models include significant safety factors, resulting in a systematic overprediction of dose to man of up to two orders of magnitude. For this reason there are some indications that these models usually more than compensate for the bulk of the stochastic uncertainties caused by the variability of the input parameter values. (orig.) [de]

  18. Instrument uncertainty predictions

    International Nuclear Information System (INIS)

    Coutts, D.A.

    1991-07-01

    The accuracy of measurements and correlations should normally be provided for most experimental activities. The uncertainty is a measure of the accuracy of a stated value or equation. The uncertainty term reflects a combination of instrument errors, modeling limitations, and phenomena understanding deficiencies. This report provides several methodologies to estimate an instrument's uncertainty when used in experimental work. Methods are shown to predict both the pre-test and post-test uncertainty.

  19. Estimating the uncertainty of damage costs of pollution: A simple transparent method and typical results

    International Nuclear Information System (INIS)

    Spadaro, Joseph V.; Rabl, Ari

    2008-01-01

    Whereas the uncertainty of environmental impacts and damage costs is usually estimated by means of a Monte Carlo calculation, this paper shows that most (and in many cases all) of the uncertainty calculation involves products and/or sums of products and can be accomplished with an analytic solution which is simple and transparent. We present our own assessment of the component uncertainties and calculate the total uncertainty for the impacts and damage costs of the classical air pollutants; results of a Monte Carlo calculation for the dispersion part are also shown. The distribution of the damage costs is approximately lognormal and can be characterized in terms of the geometric mean μg and geometric standard deviation σg, implying that the confidence interval is multiplicative. We find that for the classical air pollutants σg is approximately 3 and the 68% confidence interval is [μg/σg, μg·σg]. Because the lognormal distribution is highly skewed for large σg, the median is significantly smaller than the mean. We also consider the case where several lognormally distributed damage costs are added, for example to obtain the total damage cost due to all the air pollutants emitted by a power plant, and we find that the relative error of the sum can be significantly smaller than the relative errors of the summands. Even though the distribution for such sums is not exactly lognormal, we present a simple lognormal approximation that is quite adequate for most applications.
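The analytic shortcut described here follows from the fact that, for a product of independent lognormal factors, the logarithmic variances add. A small sketch with illustrative geometric standard deviations (the three factor values are invented, not the paper's component uncertainties):

```python
import numpy as np

# Geometric standard deviations of the multiplicative factors in a
# damage-cost product (illustrative values only, e.g. dispersion,
# exposure-response, monetary valuation)
sg_factors = [1.5, 2.0, 1.8]

# For a product of independent lognormals the log-variances add
ln_sg_total = np.sqrt(sum(np.log(s) ** 2 for s in sg_factors))
sg_total = np.exp(ln_sg_total)

# Multiplicative 68% confidence interval around the geometric mean
mu_g = 1.0
ci_68 = (mu_g / sg_total, mu_g * sg_total)
```

With these inputs the combined geometric standard deviation comes out close to the value of about 3 quoted in the abstract, illustrating how a few moderately uncertain factors compound multiplicatively.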

  20. Modeling Multibody Systems with Uncertainties. Part I: Theoretical and Computational Aspects

    International Nuclear Information System (INIS)

    Sandu, Adrian; Sandu, Corina; Ahmadian, Mehdi

    2006-01-01

    This study explores the use of generalized polynomial chaos theory for modeling complex nonlinear multibody dynamic systems in the presence of parametric and external uncertainty. The polynomial chaos framework has been chosen because it offers an efficient computational approach for the large, nonlinear multibody models of engineering systems of interest, where the number of uncertain parameters is relatively small, while the magnitude of uncertainties can be very large (e.g., vehicle-soil interaction). The proposed methodology allows the quantification of uncertainty distributions in both time and frequency domains, and enables simulations of multibody systems to produce results with 'error bars'. The first part of this study presents the theoretical and computational aspects of the polynomial chaos methodology. Both unconstrained and constrained formulations of multibody dynamics are considered. Direct stochastic collocation is proposed as a less expensive alternative to the traditional Galerkin approach. It is established that stochastic collocation is equivalent to a stochastic response surface approach. We show that multi-dimensional basis functions are constructed as tensor products of one-dimensional basis functions and discuss the treatment of polynomial and trigonometric nonlinearities. Parametric uncertainties are modeled by finite-support probability densities. Stochastic forcings are discretized using truncated Karhunen-Loeve expansions. The companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part II: Numerical Applications' illustrates the use of the proposed methodology on a selected set of test problems. The overall conclusion is that despite its limitations, polynomial chaos is a powerful approach for the simulation of multibody systems with uncertainties.
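Stochastic collocation, proposed in the abstract as a cheaper alternative to Galerkin projection, evaluates the deterministic model at quadrature nodes of the input distribution and reconstructs output statistics from the weighted results. A one-dimensional sketch (the oscillator model and numbers are invented for illustration):

```python
import numpy as np

# Uncertain stiffness k ~ N(mu, sigma^2) of a unit-mass oscillator (hypothetical)
mu_k, sigma_k = 100.0, 10.0

def natural_frequency(k, m=1.0):
    return np.sqrt(k / m) / (2.0 * np.pi)

# Probabilists' Gauss-Hermite nodes/weights for a Gaussian input
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
k_nodes = mu_k + sigma_k * nodes
w = weights / np.sqrt(2.0 * np.pi)   # normalise weights to sum to 1

# Collocation: run the deterministic model only at the quadrature nodes
f_vals = natural_frequency(k_nodes)
f_mean = np.sum(w * f_vals)
f_var = np.sum(w * (f_vals - f_mean) ** 2)
```

Seven model runs suffice here, versus thousands for plain Monte Carlo; this is the efficiency argument for collocation when the number of uncertain parameters is small.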

  1. Uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor

    International Nuclear Information System (INIS)

    Ghione, Alberto; Noel, Brigitte; Vinai, Paolo; Demazière, Christophe

    2017-01-01

    Highlights: • A station blackout scenario in the Jules Horowitz Reactor is analyzed using CATHARE. • Input and model uncertainties relevant to the transient are considered. • A statistical methodology for the propagation of the uncertainties is applied. • No safety criteria are exceeded and sufficiently large safety margins are estimated. • The most influential uncertainties are determined with a sensitivity analysis. - Abstract: An uncertainty and sensitivity analysis for the simulation of a station blackout scenario in the Jules Horowitz Reactor (JHR) is presented. The JHR is a new material testing reactor under construction at CEA on the Cadarache site, France. The thermal-hydraulic system code CATHARE is applied to investigate the response of the reactor system to the scenario. The uncertainty and sensitivity study was based on a statistical methodology for code uncertainty propagation, using the 'Uncertainty and Sensitivity' platform URANIE. Accordingly, the input uncertainties relevant to the transient were identified, quantified, and propagated to the code output. The results show that the safety criteria are not exceeded and that sufficiently large safety margins exist. In addition, the most influential input uncertainties on the safety parameters were found by making use of a sensitivity analysis.

  2. Framing risk and uncertainty in social science articles on climate change, 1995–2012

    OpenAIRE

    Shaw, Chris; Hellsten, Iina; Nerlich, Brigitte

    2016-01-01

    The issue of climate change is intimately linked to notions of risk and uncertainty, concepts that pose challenges to climate science, climate change communication, and science-society interactions. While a large majority of climate scientists are increasingly certain about the causes of climate change and the risks posed by its impacts (see IPCC, 2013 and 2014), public perception of climate change is still largely framed by uncertainty, especially regarding impacts (Poortinga et al., 2011). ...

  3. Signal detection in global mean temperatures after “Paris”: an uncertainty and sensitivity analysis

    Directory of Open Access Journals (Sweden)

    H. Visser

    2018-02-01

    In December 2015, 195 countries agreed in Paris to hold the increase in global mean surface temperature (GMST) well below 2.0 °C above pre-industrial levels and to pursue efforts to limit the temperature increase to 1.5 °C. Since large financial flows will be needed to keep GMSTs below these targets, it is important to know how GMST has progressed since pre-industrial times. However, the Paris Agreement is not conclusive as regards methods to calculate it. Should trend progression be deduced from GCM simulations or from instrumental records by (statistical) trend methods? Which simulations or GMST datasets should be chosen, and which trend models? What is pre-industrial and, finally, are the Paris targets formulated for total warming, originating from both natural and anthropogenic forcing, or do they refer to anthropogenic warming only? To find answers to these questions we performed an uncertainty and sensitivity analysis in which datasets and model choices were varied. For all cases we evaluated trend progression along with uncertainty information. To do so, we analysed four trend approaches and applied them to the five leading observational GMST products. We find GMST progression to be largely independent of the various trend model approaches. However, GMST progression is significantly influenced by the choice of GMST dataset. Uncertainties due to natural variability are largest in size. As a parallel path, we calculated GMST progression from an ensemble of 42 GCM simulations. The mean progression derived from GCM-based GMSTs appears to lie in the range of the trend-dataset combinations. A difference between the two approaches is the width of the uncertainty bands: GCM simulations show a much wider spread. Finally, we discuss various choices for pre-industrial baselines and the role of warming definitions. Based on these findings we propose an estimate for signal progression in GMSTs since pre-industrial times.
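The statistical-trend branch of such an analysis can be illustrated with an ordinary least-squares trend and its standard error on a synthetic anomaly series (the series below is invented for illustration; it is not one of the five GMST products):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1960, 2018)
# Synthetic anomaly series: linear warming plus "natural variability" noise
gmst = 0.017 * (years - 1960) + rng.normal(0.0, 0.1, years.size)

# OLS fit of intercept and slope against centred years
X = np.vstack([np.ones_like(years, dtype=float), years - years.mean()]).T
beta, res, *_ = np.linalg.lstsq(X, gmst, rcond=None)

# Standard error of the slope from the residual variance
resid = gmst - X @ beta
s2 = resid @ resid / (years.size - 2)
se_slope = np.sqrt(s2 / np.sum((years - years.mean()) ** 2))

trend_per_decade = beta[1] * 10
```

The slope uncertainty estimated from the residuals is the kind of quantity that, in the study above, would be compared against the spread induced by dataset choice and by natural variability.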

  4. Exploring entropic uncertainty relation in the Heisenberg XX model with inhomogeneous magnetic field

    Science.gov (United States)

    Huang, Ai-Jun; Wang, Dong; Wang, Jia-Ming; Shi, Jia-Dong; Sun, Wen-Yang; Ye, Liu

    2017-08-01

    In this work, we investigate the quantum-memory-assisted entropic uncertainty relation in a two-qubit Heisenberg XX model with inhomogeneous magnetic field. It has been found that larger coupling strength J between the two spin-chain qubits can effectively reduce the entropic uncertainty. Besides, we observe the mechanics of how the inhomogeneous field influences the uncertainty, and find out that when the inhomogeneous field parameter b1. Intriguingly, the entropic uncertainty can shrink to zero when the coupling coefficients are relatively large, while the entropic uncertainty only reduces to 1 with the increase of the homogeneous magnetic field. Additionally, we observe the purity of the state and Bell non-locality and obtain that the entropic uncertainty is anticorrelated with both the purity and Bell non-locality of the evolution state.

  5. LDRD Final Report: Capabilities for Uncertainty in Predictive Science.

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, Eric Todd; Eldred, Michael S; Salinger, Andrew G.; Webster, Clayton G.

    2008-10-01

    Predictive simulation of systems comprised of numerous interconnected, tightly coupled components promises to help solve many problems of scientific and national interest. However, predictive simulation of such systems is extremely challenging due to the coupling of a diverse set of physical and biological length and time scales. This report investigates uncertainty quantification methods for such systems that attempt to exploit their structure to gain computational efficiency. The traditional layering of uncertainty quantification around nonlinear solution processes is inverted to allow for heterogeneous uncertainty quantification methods to be applied to each component in a coupled system. Moreover, this approach allows stochastic dimension reduction techniques to be applied at each coupling interface. The mathematical feasibility of these ideas is investigated in this report, and mathematical formulations for the resulting stochastically coupled nonlinear systems are developed.

  6. Effect of user interpretation on uncertainty estimates: examples from the air-to-milk transfer of radiocesium

    International Nuclear Information System (INIS)

    Kirchner, G.; Ring Peterson, S.; Bergstroem, U.; Bushell, S.; Davis, P.; Filistovic, V.; Hinton, T.G.; Krajewski, P.; Riesen, T.; Uijt de Haag, P.

    1998-01-01

    An important source of uncertainty in predictions of numerical simulation codes of environmental transport processes arises from the assumptions made by the user when interpreting the model and the scenario to be assessed. This type of uncertainty was examined systematically in this study and was compared with uncertainty due to varying parameter values in a code. Three terrestrial food chain codes that are driven by deposition of radionuclides from the atmosphere were used by up to ten participants to predict total deposition of 137Cs and concentrations on pasture and in milk for two release scenarios. Collective uncertainty among the predictions of the ten users for concentrations in milk calculated for one scenario by one code was a factor of 2000, while the largest individual uncertainty was 20 times lower. Choice of parameter values contributed most to user-induced uncertainty, followed by scenario interpretation. Due to the significant disparity in predictions, it is recommended that assessments should not be carried out by a single code user alone. (Copyright (c) 1998 Elsevier Science B.V., Amsterdam. All rights reserved.)

  7. How should epistemic uncertainty in modelling water resources management problems shape evaluations of their operations?

    Science.gov (United States)

    Dobson, B.; Pianosi, F.; Reed, P. M.; Wagener, T.

    2017-12-01

    In previous work, we have found that water supply companies are typically hesitant to use reservoir operation tools to inform their release decisions. We believe that this is, in part, due to a lack of faith in the fidelity of the optimization exercise with regards to its ability to represent the real world. In an attempt to quantify this, recent literature has studied the impact on performance from uncertainty arising in: forcing (e.g. reservoir inflows), parameters (e.g. parameters for the estimation of evaporation rate) and objectives (e.g. worst first percentile or worst case). We suggest that there is also epistemic uncertainty in the choices made during model creation, for example in the formulation of an evaporation model or aggregating regional storages. We create `rival framings' (a methodology originally developed to demonstrate the impact of uncertainty arising from alternate objective formulations), each with different modelling choices, and determine their performance impacts. We identify the Pareto approximate set of policies for several candidate formulations and then make them compete with one another in a large ensemble re-evaluation in each other's modelled spaces. This enables us to distinguish the impacts of different structural changes in the model used to evaluate system performance in an effort to generalize the validity of the optimized performance expectations.

  8. Modeling of methane bubbles released from large sea-floor area: Condition required for methane emission to the atmosphere

    OpenAIRE

    Yamamoto, A.; Yamanaka, Y.; Tajika, E.

    2009-01-01

    Massive methane release from sea-floor sediments due to decomposition of methane hydrate, and thermal decomposition of organic matter by volcanic outgassing, is a potential contributor to global warming. However, the degree of global warming has not been estimated, owing to uncertainty over the proportion of the methane flux from the sea-floor that reaches the atmosphere. Massive methane release from a large sea-floor area would result in methane-saturated seawater, thus some methane would reach the atm...

  9. Automated cleaning and uncertainty attribution of archival bathymetry based on a priori knowledge

    Science.gov (United States)

    Ladner, Rodney Wade; Elmore, Paul; Perkins, A. Louise; Bourgeois, Brian; Avera, Will

    2017-09-01

    Hydrographic offices hold large valuable historic bathymetric data sets, many of which were collected using older generation survey systems that contain little or no metadata and/or uncertainty estimates. These bathymetric data sets generally contain large outlier (errant) data points to clean, yet standard practice does not include rigorous automated procedures for systematic cleaning of these historical data sets and their subsequent conversion into reusable data formats. In this paper, we propose an automated method for this task. We utilize statistically diverse threshold tests, including a robust least trimmed squared method, to clean the data. We use LOESS weighted regression residuals together with a Student-t distribution to attribute uncertainty for each retained sounding; the resulting uncertainty values compare favorably with native estimates of uncertainty from co-located data sets which we use to estimate a point-wise goodness-of-fit measure. Storing a cleansed validated data set augmented with uncertainty in a re-usable format provides the details of this analysis for subsequent users. Our test results indicate that the method significantly improves the quality of the data set while concurrently providing confidence interval estimates and point-wise goodness-of-fit estimates as referenced to current hydrographic practices.
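
The cleaning-and-attribution pipeline described above can be sketched in a few lines. This is a minimal illustration only: it substitutes a median/MAD threshold test for the paper's least-trimmed-squares and LOESS machinery, and all names and numbers are hypothetical.

```python
import math
import statistics

def clean_soundings(depths, k=3.5):
    """Flag outlier soundings with a robust median/MAD threshold test
    (a simple stand-in for the least-trimmed-squares test in the paper)."""
    med = statistics.median(depths)
    mad = statistics.median(abs(d - med) for d in depths)
    scale = 1.4826 * mad or 1e-9  # consistent with sigma for normal data
    return [d for d in depths if abs(d - med) / scale <= k]

def sounding_uncertainty(residuals, t_crit=2.262):
    """Attribute a per-sounding uncertainty from regression residuals,
    as a Student-t style half-interval (t_crit here is for ~9 dof, 95%)."""
    s = statistics.stdev(residuals)
    return t_crit * s / math.sqrt(len(residuals))

raw = [102.1, 101.8, 102.4, 350.0, 101.9, 102.0, 101.7]  # one gross outlier
kept = clean_soundings(raw)
print(kept)  # the 350 m errant sounding is removed
```

Storing the cleaned depths together with the attributed half-intervals in a reusable format is then a plain serialization step.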

  10. Reduction and Uncertainty Analysis of Chemical Mechanisms Based on Local and Global Sensitivities

    Science.gov (United States)

    Esposito, Gaetano

    Numerical simulations of critical reacting flow phenomena in hypersonic propulsion devices require accurate representation of finite-rate chemical kinetics. The chemical kinetic models available for hydrocarbon fuel combustion are rather large, involving hundreds of species and thousands of reactions. As a consequence, they cannot be used in multi-dimensional computational fluid dynamic calculations in the foreseeable future due to the prohibitive computational cost. In addition to the computational difficulties, it is also known that some fundamental chemical kinetic parameters of detailed models carry a significant level of uncertainty, due to the limited experimental data available and to poor understanding of the interactions among kinetic parameters. In the present investigation, local and global sensitivity analysis techniques are employed to develop a systematic approach to reducing and analyzing detailed chemical kinetic models. Unlike previous studies in which skeletal model reduction was based on the separate analysis of simple cases, in this work a novel strategy based on Principal Component Analysis of local sensitivity values is presented. This new approach is capable of simultaneously taking into account all the relevant canonical combustion configurations over different composition, temperature, and pressure conditions. Moreover, the procedure developed in this work represents the first documented inclusion of non-premixed extinction phenomena, which are of great relevance in hypersonic combustors, in an automated reduction algorithm. The application of the skeletal reduction to a detailed kinetic model consisting of 111 species in 784 reactions is demonstrated. The resulting reduced skeletal model of 37–38 species showed that the global ignition/propagation/extinction phenomena of ethylene-air mixtures can be predicted within an accuracy of 2% of the full detailed model. The problems of both understanding non-linear interactions between kinetic parameters and

  11. Estimation of sampling error uncertainties in observed surface air temperature change in China

    Science.gov (United States)

    Hua, Wei; Shen, Samuel S. P.; Weithmann, Alexander; Wang, Huijun

    2017-08-01

    This study examines the sampling error uncertainties in the monthly surface air temperature (SAT) change in China over recent decades, focusing on the uncertainties of gridded data, national averages, and linear trends. Results indicate that large sampling error variances appear in the station-sparse areas of northern and western China, with the maximum value exceeding 2.0 K², while small sampling error variances are found in the station-dense areas of southern and eastern China, with most grid values being less than 0.05 K². In general, negative temperature anomalies existed in each month prior to the 1980s, and a warming began thereafter, which accelerated in the early and mid-1990s. An increasing trend in the SAT series was observed for each month of the year, with the largest temperature increase and highest uncertainty of 0.51 ± 0.29 K (10 year)⁻¹ occurring in February and the weakest trend and smallest uncertainty of 0.13 ± 0.07 K (10 year)⁻¹ in August. The sampling error uncertainties in the national average annual mean SAT series are not sufficiently large to alter the conclusion of the persistent warming in China. In addition, the sampling error uncertainties in the SAT series show a clear variation compared with other uncertainty estimation methods, which is a plausible reason for the inconsistent variations between our estimate and other studies during this period.
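
Trend-with-uncertainty figures like those quoted above (e.g. 0.51 ± 0.29 K per decade) pair a least-squares trend with a standard-error estimate. A minimal sketch of that first computation, with made-up noise-free data, is:

```python
import math

def trend_with_uncertainty(t, y):
    """Ordinary least-squares linear trend and its standard error --
    the usual first step before sampling-error variance is added on top."""
    n = len(t)
    tbar = sum(t) / n
    ybar = sum(y) / n
    sxx = sum((ti - tbar) ** 2 for ti in t)
    slope = sum((ti - tbar) * (yi - ybar) for ti, yi in zip(t, y)) / sxx
    resid = [yi - ybar - slope * (ti - tbar) for ti, yi in zip(t, y)]
    se = math.sqrt(sum(r * r for r in resid) / (n - 2) / sxx)
    return slope, se

t = list(range(30))              # 30 hypothetical years
y = [0.05 * ti for ti in t]      # exact 0.05 K/yr warming, no noise
slope, se = trend_with_uncertainty(t, y)  # slope ≈ 0.05, se ≈ 0
```

With real station data the residuals are non-zero and `se` becomes the ± term of the trend.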

  12. Uncertainty information in climate data records from Earth observation

    Science.gov (United States)

    Merchant, Christopher J.; Paul, Frank; Popp, Thomas; Ablain, Michael; Bontemps, Sophie; Defourny, Pierre; Hollmann, Rainer; Lavergne, Thomas; Laeng, Alexandra; de Leeuw, Gerrit; Mittaz, Jonathan; Poulsen, Caroline; Povey, Adam C.; Reuter, Max; Sathyendranath, Shubha; Sandven, Stein; Sofieva, Viktoria F.; Wagner, Wolfgang

    2017-07-01

    The question of how to derive and present uncertainty information in climate data records (CDRs) has received sustained attention within the European Space Agency Climate Change Initiative (CCI), a programme to generate CDRs addressing a range of essential climate variables (ECVs) from satellite data. Here, we review the nature, mathematics, practicalities, and communication of uncertainty information in CDRs from Earth observations. This review paper argues that CDRs derived from satellite-based Earth observation (EO) should include rigorous uncertainty information to support the application of the data in contexts such as policy, climate modelling, and numerical weather prediction reanalysis. Uncertainty, error, and quality are distinct concepts, and the case is made that CDR products should follow international metrological norms for presenting quantified uncertainty. As a baseline for good practice, total standard uncertainty should be quantified per datum in a CDR, meaning that uncertainty estimates should clearly discriminate more and less certain data. In this case, flags for data quality should not duplicate uncertainty information, but instead describe complementary information (such as the confidence in the uncertainty estimate provided or indicators of conditions violating the retrieval assumptions). The paper discusses the many sources of error in CDRs, noting that different errors may be correlated across a wide range of timescales and space scales. Error effects that contribute negligibly to the total uncertainty in a single-satellite measurement can be the dominant sources of uncertainty in a CDR on the large space scales and long timescales that are highly relevant for some climate applications. For this reason, identifying and characterizing the relevant sources of uncertainty for CDRs is particularly challenging. The characterization of uncertainty caused by a given error effect involves assessing the magnitude of the effect, the shape of the

  13. Uncertainty and measurement

    International Nuclear Information System (INIS)

    Landsberg, P.T.

    1990-01-01

    This paper explores how the quantum mechanics uncertainty relation can be considered to result from measurements. A distinction is drawn between the uncertainties obtained by scrutinising experiments and the standard deviation type of uncertainty definition used in quantum formalism. (UK)

  14. Influence of resonance parameters' correlations on the resonance integral uncertainty; 55Mn case

    International Nuclear Information System (INIS)

    Zerovnik, Gasper; Trkov, Andrej; Capote, Roberto; Rochman, Dimitri

    2011-01-01

    For nuclides with a large number of resonances, the covariance matrix of resonance parameters can become very large and expensive to process in terms of computation time. By converting the covariance matrix of resonance parameters into covariance matrices of background cross sections in a more or less coarse group structure, a considerable amount of computer time and memory can be saved. The question is how important the information discarded in the process is. First, the uncertainty of the 55Mn resonance integral was estimated in the narrow resonance approximation for different levels of self-shielding using the Bondarenko method, by random sampling of resonance parameters according to their covariance matrices from two different 55Mn evaluations: one from the Nuclear Research and Consultancy Group NRG (with large uncertainties but no correlations between resonances), the other from Oak Ridge National Laboratory (with smaller uncertainties but a full covariance matrix). We found that if all (or at least a significant part of) the resonance parameters are correlated, the resonance integral uncertainty greatly depends on the level of self-shielding. Second, it was shown that the commonly used 640-group SAND-II representation cannot describe the increase of the resonance integral uncertainty. A much finer energy mesh for the background covariance matrix would have to be used to take the resonance structure into account explicitly, but then the objective of a more compact data representation is lost.
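
The effect of parameter correlations on an integral quantity can be illustrated by random sampling, in the spirit of the Bondarenko-style sampling described above. The two "resonance contributions" and their uncertainties below are invented for illustration; the point is only that fully correlated uncertainties add linearly while uncorrelated ones add in quadrature.

```python
import math
import random

random.seed(1)

# Two hypothetical resonance contributions to a resonance integral,
# each carrying a 10% relative uncertainty.
means = (40.0, 60.0)
sigmas = (4.0, 6.0)

def sampled_integral(rho):
    """Sample the two contributions with correlation rho and sum them."""
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    return (means[0] + sigmas[0] * z1) + (means[1] + sigmas[1] * z2)

def std(samples):
    m = sum(samples) / len(samples)
    return math.sqrt(sum((x - m) ** 2 for x in samples) / (len(samples) - 1))

n = 20000
uncorr = std([sampled_integral(0.0) for _ in range(n)])
corr = std([sampled_integral(1.0) for _ in range(n)])
print(uncorr)  # ~ sqrt(4^2 + 6^2) ≈ 7.2
print(corr)    # ~ 4 + 6 = 10
```

Discarding the correlations and keeping only per-group variances would, in this toy case, understate the integral uncertainty by nearly 30%.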

  15. Massive vector particles tunneling from black holes influenced by the generalized uncertainty principle

    Directory of Open Access Journals (Sweden)

    Xiang-Qian Li

    2016-12-01

    Full Text Available This study considers the generalized uncertainty principle, which incorporates the central idea of large extra dimensions, to investigate the processes involved when massive spin-1 particles tunnel from Reissner–Nordstrom and Kerr black holes under the effects of quantum gravity. For the black hole, the quantum gravity correction decelerates the increase in temperature. Up to O(1/Mf²), the corrected temperatures are affected by the mass and angular momentum of the emitted vector bosons. In addition, the temperature of the Kerr black hole becomes uneven due to rotation. When the mass of the black hole approaches the order of the higher dimensional Planck mass Mf, it stops radiating and yields a black hole remnant.

  16. Communicating uncertainties in assessments of future sea level rise

    Science.gov (United States)

    Wikman-Svahn, P.

    2013-12-01

    How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels more than 60 meters if completely melted. There has been much confusion among policymakers on how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has largely focused on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been placed on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between on the one hand the value-free ideal of science and on the other hand usefulness for practical applications in society. 
This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing

  17. On the uncertainties in effective dose estimates of adult CT head scans

    International Nuclear Information System (INIS)

    Gregory, Kent J.; Bibbo, Giovanni; Pattison, John E.

    2008-01-01

    Estimates of the effective dose to adult patients from computed tomography (CT) head scanning can be calculated using a number of different methods. These estimates can be used for a variety of purposes, such as improving scanning protocols, comparing different CT imaging centers, and weighing the benefits of the scan against the risk of radiation-induced cancer. The question arises: What is the uncertainty in these effective dose estimates? This study calculates the uncertainty of effective dose estimates produced by three computer programs (CT-EXPO, CTDosimetry, and ImpactDose) and one method that makes use of dose-length product (DLP) values. Uncertainties were calculated in accordance with an internationally recognized uncertainty analysis guide. For each of the four methods, the smallest and largest overall uncertainties (stated at the 95% confidence interval) were: 20%-31% (CT-EXPO), 15%-28% (CTDosimetry), 20%-36% (ImpactDose), and 22%-32% (DLP), respectively. The overall uncertainties for each method vary due to differences in the uncertainties of factors used in each method. The smallest uncertainties apply when the CT dose index for the scanner has been measured using a calibrated pencil ionization chamber.
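
The quadrature combination prescribed by the internationally recognized uncertainty analysis guide (the GUM) is straightforward to reproduce. The component values below are hypothetical, chosen only to land in the same ballpark as the 20%–36% overall uncertainties reported above.

```python
import math

def combined_relative_uncertainty(components):
    """GUM-style combination: for a product/quotient measurement model,
    relative standard uncertainties add in quadrature."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical relative standard uncertainties (as fractions) of the
# factors entering a DLP-based effective-dose estimate.
u_ctdi = 0.05        # calibrated pencil-chamber CTDI measurement
u_length = 0.03      # scan length
u_conversion = 0.10  # DLP-to-effective-dose conversion coefficient

u_total = combined_relative_uncertainty([u_ctdi, u_length, u_conversion])
expanded = 2.0 * u_total  # k = 2 expanded uncertainty, ~95% confidence
print(round(expanded, 2))  # → 0.23 (about 23%)
```

Note how the largest component dominates the quadrature sum, which is why a calibrated CTDI measurement shrinks the overall figure.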

  18. Large Civil Tiltrotor (LCTR2) Interior Noise Predictions due to Turbulent Boundary Layer Excitation

    Science.gov (United States)

    Grosveld, Ferdinand W.

    2013-01-01

    The Large Civil Tiltrotor (LCTR2) is a conceptual vehicle that has a design goal to transport 90 passengers over a distance of 1800 km at a speed of 556 km/hr. In this study noise predictions were made in the notional LCTR2 cabin due to Cockburn/Robertson and Efimtsov turbulent boundary layer (TBL) excitation models. A narrowband hybrid Finite Element (FE) analysis was performed for the low frequencies (6-141 Hz) and a Statistical Energy Analysis (SEA) was conducted for the high frequency one-third octave bands (125- 8000 Hz). It is shown that the interior sound pressure level distribution in the low frequencies is governed by interactions between individual structural and acoustic modes. The spatially averaged predicted interior sound pressure levels for the low frequency hybrid FE and the high frequency SEA analyses, due to the Efimtsov turbulent boundary layer excitation, were within 1 dB in the common 125 Hz one-third octave band. The averaged interior noise levels for the LCTR2 cabin were predicted lower than the levels in a comparable Bombardier Q400 aircraft cabin during cruise flight due to the higher cruise altitude and lower Mach number of the LCTR2. LCTR2 cabin noise due to TBL excitation during cruise flight was found not unacceptable for crew or passengers when predictions were compared to an acoustic survey on a Q400 aircraft.

  19. Sensitivity analysis on uncertainty variables affecting the NPP's LUEC with probabilistic approach

    International Nuclear Information System (INIS)

    Nuryanti; Akhmad Hidayatno; Erlinda Muslim

    2013-01-01

    One thing that is quite crucial to review prior to any investment decision on a nuclear power plant (NPP) project is the calculation of project economics, including the Levelized Unit Electricity Cost (LUEC). Infrastructure projects such as NPPs are vulnerable to a number of uncertainty variables. Information on the uncertainty variables to which the LUEC is most sensitive is necessary so that cost overruns can be avoided. Therefore, this study performed a sensitivity analysis on the variables that affect the LUEC with a probabilistic approach. The analysis was done using the Monte Carlo technique, which simulates the relationship between the uncertainty variables and their impact on the LUEC. The sensitivity analysis shows significant changes in the LUEC of the AP1000 and OPR due to the sensitivity of the investment cost and capacity factor, while LUEC changes due to the sensitivity of the U3O8 price are not significant. (author)
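
A Monte Carlo sensitivity screening of this kind can be sketched by re-running the simulation with one variable frozen at a time and comparing variances. Everything below (the cost model, the ranges, the coefficients) is invented for illustration and is not the study's actual model.

```python
import random
import statistics

random.seed(0)

# Hypothetical uniform input ranges for a toy LUEC model.
RANGES = {"inv": (4000.0, 6000.0),   # overnight investment cost, $/kW
          "cf": (0.80, 0.95),        # capacity factor
          "fuel": (80.0, 120.0)}     # U3O8 price, $/kg

def luec(inv, cf, fuel):
    """Toy levelized cost: annualized capital plus fixed O&M over
    generated energy, plus a small fuel-price term ($/kWh)."""
    energy = cf * 8760.0  # kWh per year per kW of capacity
    return (0.08 * inv + 50.0) / energy + 2.0e-4 * fuel

def luec_variance(frozen=None, n=5000):
    """Monte Carlo variance of LUEC; freezing one input at its midpoint
    reveals how much of the spread that input drives."""
    out = []
    for _ in range(n):
        draw = {k: random.uniform(*v) for k, v in RANGES.items()}
        if frozen:
            draw[frozen] = sum(RANGES[frozen]) / 2.0
        out.append(luec(**draw))
    return statistics.variance(out)

total = luec_variance()
shares = {k: 1.0 - luec_variance(k) / total for k in RANGES}
# With these toy numbers, the investment-cost share dominates and the
# fuel-price share is small, mirroring the study's qualitative finding.
```

The variance share of each frozen variable plays the role of a crude first-order sensitivity index.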

  20. Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations, so it is very important how the uncertainty distributions are determined before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. A different data set with two influential uncertainty parameters for the FEBA tests is chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses, such as the cladding temperature or pressure drop, are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of the physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experiment tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed

  1. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data using a lognormal distribution is proposed. The criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method of the cross section with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was performed with both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.
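
The moment-matching step behind such lognormal sampling can be sketched as follows; the mean and standard deviation below are invented, and this is not the study's actual sampling code.

```python
import math
import random

random.seed(42)

def lognormal_params(mean, std):
    """Convert a cross section's mean and standard deviation into the
    (mu, sigma) of a lognormal with the same first two moments."""
    sigma2 = math.log(1.0 + (std / mean) ** 2)
    mu = math.log(mean) - 0.5 * sigma2
    return mu, math.sqrt(sigma2)

mean, std = 2.0, 1.2   # barn; 60% relative uncertainty (hypothetical)
mu, sigma = lognormal_params(mean, std)
samples = [random.lognormvariate(mu, sigma) for _ in range(50000)]

# Every sample is strictly positive; normal sampling with these moments
# would produce negative cross sections roughly 4.8% of the time.
print(min(samples) > 0)             # True
print(sum(samples) / len(samples))  # ≈ 2.0
```

Because the lognormal is positive by construction, no post-hoc truncation or rejection is needed, and the first two moments of the sampled data match the evaluation.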

  2. A Preliminary Study on Sensitivity and Uncertainty Analysis with Statistic Method: Uncertainty Analysis with Cross Section Sampling from Lognormal Distribution

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2013-01-01

    The uncertainty evaluation with the statistical method is performed by repeating the transport calculation while sampling the directly perturbed nuclear data. Hence, a reliable uncertainty result can be obtained by analyzing the results of the numerous transport calculations. One known problem in uncertainty analysis with the statistical approach is that sampling cross sections from a normal (Gaussian) distribution with a relatively large standard deviation leads to sampling errors, such as the sampling of negative cross sections. Some correction methods have been noted; however, these methods can distort the distribution of the sampled cross sections. In this study, a method for sampling the nuclear data using a lognormal distribution is proposed. The criticality calculations with the sampled nuclear data are then performed, and the results are compared with those from the normal distribution conventionally used in previous studies. The statistical sampling method of the cross section with the lognormal distribution was proposed to increase the sampling accuracy without negative sampling errors. Also, a stochastic cross-section sampling and writing program was developed. For the sensitivity and uncertainty analysis, the cross-section sampling was performed with both the normal and lognormal distributions. The uncertainties caused by the covariance of (n,γ) cross sections were evaluated by solving the GODIVA problem. The results show that the sampling method with the lognormal distribution can efficiently solve the negative sampling problem referred to in previous studies. It is expected that this study will contribute to increasing the accuracy of sampling-based uncertainty analysis.

  3. Uncertainty in the Future of Seasonal Snowpack over North America.

    Science.gov (United States)

    McCrary, R. R.; Mearns, L.

    2017-12-01

    The uncertainty in future changes in seasonal snowpack (snow water equivalent, SWE) and snow cover extent (SCE) for North America is explored using the North American Regional Climate Change Assessment Program (NARCCAP) suite of regional climate models (RCMs) and their driving CMIP3 global circulation models (GCMs). The higher resolution of the NARCCAP RCMs is found to add significant value to the details of future projections of SWE in topographically complex regions such as the Pacific Northwest and the Rocky Mountains. The NARCCAP models also add detailed information regarding changes in the southernmost extent of snow cover. Eleven of the 12 NARCCAP ensemble members contributed SWE output, which we use to explore the uncertainty in future snowpack at higher resolution. In this study, we quantify the uncertainty in future projections by looking at the spread of the interquartile range of the different models. By mid-century the RCMs consistently predict that winter SWE amounts will decrease over most of North America. The only exception to this is in northern Canada, where increased moisture supply leads to increases in SWE in all but one of the RCMs. While the models generally agree on the sign of the change in SWE, there is considerable spread in the magnitude (absolute and percent) of the change. The RCMs also agree that the number of days with measurable snow on the ground is projected to decrease, with snow accumulation occurring later in the fall/winter and melting starting earlier in the spring/summer. As with SWE amount, spread across the models is large for changes in the timing of the snow season and can vary by over a month between models. While most of the NARCCAP models project a total loss of measurable snow along the southernmost edge of their historical range, there is considerable uncertainty about where this will occur within the ensemble due to the bias in snow cover extent in the historical simulations. We explore methods to increase our

  4. Temperature field due to time-dependent heat sources in a large rectangular grid - Derivation of analytical solution

    International Nuclear Information System (INIS)

    Claesson, J.; Probert, T.

    1996-01-01

    The temperature field in rock due to a large rectangular grid of heat-releasing canisters containing nuclear waste is studied. The solution is divided by superposition into different parts: a global temperature field due to the large rectangular canister area, while a local field accounts for the remaining heat source problem. The global field is reduced to a single integral. The local field is also solved analytically, using solutions for a finite line heat source and for an infinite grid of point sources. The local solution is reduced to three parts, each of which depends on two spatial coordinates only. The temperatures at the envelope of a canister are given by a single thermal resistance, which is given by an explicit formula. The results are illustrated by a few numerical examples dealing with the KBS-3 concept for storage of nuclear waste. 8 refs
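
The superposition idea can be illustrated with the classical continuous point-source solution, T = q·erfc(r/(2√(κt)))/(4πλr), in an infinite medium. The sketch below treats each canister as a constant-strength point source, which ignores the decay of the heat output, and all parameter values are merely KBS-3-flavoured guesses, not the paper's data.

```python
import math

def point_source_temp(q, lam, kappa, r, t):
    """Temperature rise (K) at distance r from a continuous point source
    of strength q (W) in an infinite medium of conductivity lam (W/m/K)
    and diffusivity kappa (m^2/s), after time t (s)."""
    return q / (4.0 * math.pi * lam * r) * math.erfc(r / (2.0 * math.sqrt(kappa * t)))

def grid_temp(x, y, z, q, lam, kappa, t, nx=10, ny=10, spacing=6.0):
    """Superpose an nx-by-ny grid of canisters treated as point sources."""
    total = 0.0
    for i in range(nx):
        for j in range(ny):
            r = math.sqrt((x - i * spacing) ** 2 + (y - j * spacing) ** 2 + z ** 2)
            total += point_source_temp(q, lam, kappa, r, t)
    return total

# Hypothetical numbers: 1 kW per canister, granite-like lam = 3 W/m/K,
# kappa = 1.5e-6 m^2/s, evaluated 30 years after deposition, 5 m below
# the centre of the grid.
t30 = 30.0 * 3.15e7
temp = grid_temp(x=27.0, y=27.0, z=5.0, q=1000.0, lam=3.0, kappa=1.5e-6, t=t30)
```

The analytical treatment in the paper replaces this brute-force sum with closed-form global and local fields, but the superposed answer is the same in principle.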

  5. Quantifying chemical uncertainties in simulations of the ISM

    Science.gov (United States)

    Glover, Simon

    2018-06-01

    The ever-increasing power of large parallel computers now makes it possible to include increasingly sophisticated chemical models in three-dimensional simulations of the interstellar medium (ISM). This allows us to study the role that chemistry plays in the thermal balance of a realistically-structured, turbulent ISM, as well as enabling us to generate detailed synthetic observations of important atomic or molecular tracers. However, one major constraint on the accuracy of these models is the accuracy with which the input chemical rate coefficients are known. Uncertainties in these chemical rate coefficients inevitably introduce uncertainties into the model predictions. In this talk, I will review some of the methods we can use to quantify these uncertainties and to identify the key reactions where improved chemical data is most urgently required. I will also discuss a few examples, ranging from the local ISM to the high-redshift universe.

  6. Managing uncertainty in flood protection planning with climate projections

    Directory of Open Access Journals (Sweden)

    B. Dittes

    2018-04-01

    Full Text Available Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. The recommended planning is

  7. Managing uncertainty in flood protection planning with climate projections

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Schoppa, Lukas; Straub, Daniel

    2018-04-01

    Technical flood protection is a necessary part of integrated strategies to protect riverine settlements from extreme floods. Many technical flood protection measures, such as dikes and protection walls, are costly to adapt after their initial construction. This poses a challenge to decision makers as there is large uncertainty in how the required protection level will change during the measure lifetime, which is typically many decades long. Flood protection requirements should account for multiple future uncertain factors: socioeconomic, e.g., whether the population and with it the damage potential grows or falls; technological, e.g., possible advancements in flood protection; and climatic, e.g., whether extreme discharge will become more frequent or not. This paper focuses on climatic uncertainty. Specifically, we devise methodology to account for uncertainty associated with the use of discharge projections, ultimately leading to planning implications. For planning purposes, we categorize uncertainties as either visible, if they can be quantified from available catchment data, or hidden, if they cannot be quantified from catchment data and must be estimated, e.g., from the literature. It is vital to consider the hidden uncertainty, since in practical applications only a limited amount of information (e.g., a finite projection ensemble) is available. We use a Bayesian approach to quantify the visible uncertainties and combine them with an estimate of the hidden uncertainties to learn a joint probability distribution of the parameters of extreme discharge. The methodology is integrated into an optimization framework and applied to a pre-alpine case study to give a quantitative, cost-optimal recommendation on the required amount of flood protection. The results show that hidden uncertainty ought to be considered in planning, but the larger the uncertainty already present, the smaller the impact of adding more. 
The recommended planning is robust to moderate changes in
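The split between visible uncertainty (quantifiable from the projection ensemble) and hidden uncertainty (estimated externally) can be illustrated with a minimal numerical sketch. All figures below, including the hidden standard deviation and the six-member ensemble, are invented assumptions for illustration, not values from the study:

```python
import numpy as np

# Ensemble of 100-year design-discharge estimates (m^3/s) from six
# hypothetical climate projections -- the "visible" uncertainty.
ensemble = np.array([820.0, 870.0, 790.0, 910.0, 845.0, 880.0])

mu_visible = ensemble.mean()
var_visible = ensemble.var(ddof=1)      # quantified from available data

# "Hidden" uncertainty cannot be seen in the finite ensemble; it is
# estimated externally (e.g. from literature) and added as extra variance.
sd_hidden = 60.0                        # assumed value
var_total = var_visible + sd_hidden**2

# Size protection for the 95th percentile of the combined distribution
# (z = 1.645 under a normal approximation).
design_discharge = mu_visible + 1.645 * np.sqrt(var_total)
print(round(design_discharge, 1))
```

Adding the hidden variance widens the combined distribution and raises the recommended protection level relative to using the ensemble spread alone.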

  8. Uncertainty estimation and ensemble forecast with a chemistry-transport model - Application to air-quality modeling and simulation

    International Nuclear Information System (INIS)

    Mallet, Vivien

    2005-01-01

    The thesis deals with the evaluation of a chemistry-transport model, not primarily through classical comparisons to observations, but through the estimation of its a priori uncertainties due to input data, model formulation and numerical approximations. These three uncertainty sources are studied on the basis of Monte Carlo simulations, multi-model simulations and inter-comparisons of numerical schemes, respectively. A high uncertainty is found in output ozone concentrations. Ensemble forecasting offers a way to overcome the limitations imposed by this uncertainty: by combining several models (up to forty-eight) on the basis of past observations, the forecast can be significantly improved. This work has also led to the development of the innovative modelling system Polyphemus. (author) [fr
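The model-combination step can be sketched as a least-squares weighting of ensemble members against past observations. The synthetic data and three-model setup below are illustrative assumptions, not the Polyphemus configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "past observations" of ozone and forecasts from three
# imperfect models (biased, damped, and noisy, respectively).
obs = rng.normal(60.0, 10.0, size=200)
models = np.stack([
    obs + rng.normal(5.0, 4.0, size=200),
    0.8 * obs + rng.normal(0.0, 6.0, size=200),
    obs + rng.normal(-3.0, 5.0, size=200),
])

# Least-squares weights for the linear combination best fitting past obs.
w, *_ = np.linalg.lstsq(models.T, obs, rcond=None)
combined = models.T @ w

rmse = lambda x: np.sqrt(np.mean((x - obs) ** 2))
print([round(rmse(m), 2) for m in models], round(rmse(combined), 2))
```

Because each single model is itself a feasible linear combination, the in-sample error of the weighted ensemble can never exceed that of the best individual member.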

  9. Sensitivity coefficients of reactor parameters in fast critical assemblies and uncertainty analysis

    International Nuclear Information System (INIS)

    Aoyama, Takafumi; Suzuki, Takayuki; Takeda, Toshikazu; Hasegawa, Akira; Kikuchi, Yasuyuki.

    1986-02-01

    Sensitivity coefficients of reactor parameters in several fast critical assemblies to various cross sections were calculated in a 16-group structure by means of the SAGEP code, based on the generalized perturbation theory. The sensitivity coefficients were tabulated and their differences discussed. Furthermore, the uncertainties in the calculated reactor parameters due to cross-section uncertainty were estimated using the sensitivity coefficients and cross-section covariance data. (author)
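The propagation step combining sensitivity coefficients with covariance data is the standard "sandwich" rule, variance = s' C s. A sketch with invented three-group sensitivities and an assumed covariance matrix (not SAGEP outputs):

```python
import numpy as np

# Sensitivity coefficients of a reactor parameter (e.g. k_eff) to three
# group cross sections: relative change of the parameter per relative
# change of each cross section. Values are illustrative.
s = np.array([0.45, -0.20, 0.10])

# Relative covariance matrix of the cross sections: standard deviations
# of 3%, 5%, 8% with assumed inter-group correlations.
sd = np.array([0.03, 0.05, 0.08])
corr = np.array([[1.0, 0.3, 0.0],
                 [0.3, 1.0, 0.2],
                 [0.0, 0.2, 1.0]])
cov = np.outer(sd, sd) * corr

# "Sandwich" rule: variance of the parameter = s' C s.
var_param = s @ cov @ s
print(f"relative uncertainty: {np.sqrt(var_param):.4f}")
```

In a real 16-group evaluation the same expression applies with a 16-component sensitivity vector per parameter and the evaluated covariance library in place of the toy matrix.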

  10. Uncertainties in predicting species distributions under climate change: a case study using Tetranychus evansi (Acari: Tetranychidae), a widespread agricultural pest.

    Science.gov (United States)

    Meynard, Christine N; Migeon, Alain; Navajas, Maria

    2013-01-01

    Many species are shifting their distributions due to climate change and to increasing international trade that allows dispersal of individuals across the globe. In the case of agricultural pests, such range shifts may heavily impact agriculture. Species distribution modelling may help to predict potential changes in pest distributions. However, these modelling strategies are subject to large uncertainties coming from different sources. Here we used the case of the tomato red spider mite (Tetranychus evansi), an invasive pest that affects some of the most important agricultural crops worldwide, to show how uncertainty may affect forecasts of the potential range of the species. We explored three aspects of uncertainty: (1) species prevalence; (2) modelling method; and (3) variability in environmental responses between mites belonging to two invasive clades of T. evansi. Consensus techniques were used to forecast the potential range of the species under current and two different climate change scenarios for 2080, and variance between model projections was mapped to identify regions of high uncertainty. We revealed large predictive variations linked to all factors, although prevalence had a greater influence than the statistical model once the best modelling strategies were selected. The major areas threatened under current conditions include tropical countries in South America and Africa, and temperate regions in North America, the Mediterranean basin and Australia. Under future scenarios, the threat shifts towards northern Europe and some other temperate regions in the Americas, whereas tropical regions in Africa present a reduced risk. Analysis of niche overlap suggests that the current differential distribution of mites of the two clades of T. evansi can be partially attributed to environmental niche differentiation. Overall this study shows how consensus strategies and analysis of niche overlap can be used jointly to draw conclusions on invasive threat
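A consensus forecast of this kind reduces, in essence, to averaging model outputs cell by cell and mapping their spread. A toy sketch with random suitability maps; grid sizes, thresholds, and the beta-distributed inputs are arbitrary assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Suitability maps (0-1) for the same region from five distribution models.
n_models, ny, nx = 5, 20, 30
maps = rng.beta(2.0, 2.0, size=(n_models, ny, nx))

consensus = maps.mean(axis=0)   # consensus forecast per grid cell
spread = maps.var(axis=0)       # high values flag regions of disagreement

# Cells flagged as threatened with some confidence: high consensus
# suitability AND low inter-model variance.
threatened = (consensus > 0.6) & (spread < 0.05)
print(consensus.shape, int(threatened.sum()))
```

Mapping `spread` alongside `consensus` is what lets regions of high predictive uncertainty be identified explicitly rather than hidden inside a single averaged map.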

  11. Uncertainties in predicting species distributions under climate change: a case study using Tetranychus evansi (Acari: Tetranychidae), a widespread agricultural pest.

    Directory of Open Access Journals (Sweden)

    Christine N Meynard

    Full Text Available Many species are shifting their distributions due to climate change and to increasing international trade that allows dispersal of individuals across the globe. In the case of agricultural pests, such range shifts may heavily impact agriculture. Species distribution modelling may help to predict potential changes in pest distributions. However, these modelling strategies are subject to large uncertainties coming from different sources. Here we used the case of the tomato red spider mite (Tetranychus evansi), an invasive pest that affects some of the most important agricultural crops worldwide, to show how uncertainty may affect forecasts of the potential range of the species. We explored three aspects of uncertainty: (1) species prevalence; (2) modelling method; and (3) variability in environmental responses between mites belonging to two invasive clades of T. evansi. Consensus techniques were used to forecast the potential range of the species under current and two different climate change scenarios for 2080, and variance between model projections was mapped to identify regions of high uncertainty. We revealed large predictive variations linked to all factors, although prevalence had a greater influence than the statistical model once the best modelling strategies were selected. The major areas threatened under current conditions include tropical countries in South America and Africa, and temperate regions in North America, the Mediterranean basin and Australia. Under future scenarios, the threat shifts towards northern Europe and some other temperate regions in the Americas, whereas tropical regions in Africa present a reduced risk. Analysis of niche overlap suggests that the current differential distribution of mites of the two clades of T. evansi can be partially attributed to environmental niche differentiation. Overall this study shows how consensus strategies and analysis of niche overlap can be used jointly to draw conclusions on invasive

  12. Uncertainty as Knowledge: Constraints on Policy Choices Provided by Analysis of Uncertainty

    Science.gov (United States)

    Lewandowsky, S.; Risbey, J.; Smithson, M.; Newell, B. R.

    2012-12-01

    Uncertainty forms an integral part of climate science, and it is often cited in connection with arguments against mitigative action. We argue that an analysis of uncertainty must consider existing knowledge as well as uncertainty, and the two must be evaluated with respect to the outcomes and risks associated with possible policy options. Although risk judgments are inherently subjective, an analysis of the role of uncertainty within the climate system yields two constraints that are robust to a broad range of assumptions. Those constraints are that (a) greater uncertainty about the climate system is necessarily associated with greater expected damages from warming, and (b) greater uncertainty translates into a greater risk of the failure of mitigation efforts. These ordinal constraints are unaffected by subjective or cultural risk-perception factors, they are independent of the discount rate, and they are independent of the magnitude of the estimate for climate sensitivity. The constraints mean that any appeal to uncertainty must imply a stronger, rather than weaker, need to cut greenhouse gas emissions than in the absence of uncertainty.
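Constraint (a) can be made concrete with a stylized convex damage function: by Jensen's inequality, widening the uncertainty about warming raises expected damage even at fixed mean warming. The quadratic damage function and the numbers below are illustrative assumptions, not the authors' model:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stylized convex damage function of warming T (degrees C).
damage = lambda t: t ** 2

# Hold mean warming fixed and widen the uncertainty about it.
mean_warming = 3.0
results = {}
for sd in (0.5, 1.0, 2.0):           # increasing uncertainty
    T = rng.normal(mean_warming, sd, size=100_000)
    results[sd] = damage(T).mean()   # expected damage
    print(sd, round(results[sd], 2))
```

For a quadratic damage function the effect is exact: E[T^2] = mean^2 + sd^2, so expected damage grows monotonically with the uncertainty, which is the ordinal constraint the abstract states.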

  13. Quantitative non-monotonic modeling of economic uncertainty by probability and possibility distributions

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2012-01-01

    ...to the understanding of similarities and differences of the two approaches as well as practical applications. The probability approach offers a good framework for representation of randomness and variability. Once the probability distributions of uncertain parameters and their correlations are known, the resulting uncertainty can be calculated. The possibility approach is particularly well suited for representation of uncertainty of a non-statistical nature, due to lack of knowledge, and requires less information than the probability approach. Based on the kind of uncertainty and knowledge present, these aspects are thoroughly discussed in the case of rectangular representation of uncertainty by the uniform probability distribution and the interval, respectively. Also triangular representations are dealt with and compared. Calculation of monotonic as well as non-monotonic functions of variables represented...
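The contrast between the interval (possibility) and uniform-distribution (probability) representations shows up clearly for a non-monotonic function, where the output range is not determined by the interval endpoints alone. A sketch with an invented function and interval:

```python
import numpy as np

rng = np.random.default_rng(3)

# A non-monotonic function of one uncertain variable x in [-1, 2].
f = lambda x: x * (x - 1.0)

# Interval (possibility) view: the exact output range must account for
# the interior minimum at x = 0.5, not just the endpoints f(-1), f(2).
xs = np.linspace(-1.0, 2.0, 10_001)
interval_result = (f(xs).min(), f(xs).max())   # (-0.25, 2.0)

# Probability view: x ~ Uniform(-1, 2) yields a full output distribution,
# not just bounds -- at the price of assuming a distribution.
samples = f(rng.uniform(-1.0, 2.0, size=100_000))
print(interval_result, round(samples.mean(), 3))
```

The interval result carries only the bounds, while the probabilistic result adds a mean and distribution shape; which is appropriate depends on whether the uncertainty reflects genuine variability or mere lack of knowledge, as the abstract argues.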

  14. Model structural uncertainty quantification and hydrogeophysical data integration using airborne electromagnetic data (Invited)

    DEFF Research Database (Denmark)

    Minsley, Burke; Christensen, Nikolaj Kruse; Christensen, Steen

    ...of airborne electromagnetic (AEM) data to estimate large-scale model structural geometry, i.e. the spatial distribution of different lithological units based on assumed or estimated resistivity-lithology relationships, and the uncertainty in those structures given imperfect measurements. Geophysically derived estimates of model structural uncertainty are then combined with hydrologic observations to assess the impact of model structural error on hydrologic calibration and prediction errors. Using a synthetic numerical model, we describe a sequential hydrogeophysical approach that: (1) uses Bayesian Markov chain Monte Carlo (McMC) methods to produce a robust estimate of uncertainty in electrical resistivity parameter values, (2) combines geophysical parameter uncertainty estimates with borehole observations of lithology to produce probabilistic estimates of model structural uncertainty over the entire AEM...
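Step (1) can be sketched with a minimal Metropolis sampler for a single log-resistivity parameter. The one-dimensional toy posterior below (one noisy measurement, flat prior) is an assumption standing in for the full AEM inversion:

```python
import numpy as np

rng = np.random.default_rng(4)

# One noisy AEM-like observation of log10(resistivity) and its noise level.
obs, noise_sd = 2.0, 0.3
# Log-posterior under a flat prior: Gaussian likelihood only.
log_post = lambda m: -0.5 * ((obs - m) / noise_sd) ** 2

chain, m = [], 0.0
for _ in range(20_000):
    prop = m + rng.normal(0.0, 0.2)              # random-walk proposal
    # Metropolis accept/reject on the log-posterior ratio.
    if np.log(rng.uniform()) < log_post(prop) - log_post(m):
        m = prop
    chain.append(m)

post = np.array(chain[5_000:])                   # discard burn-in
print(round(post.mean(), 2), round(post.std(), 2))
```

The retained samples approximate the posterior (here centred near 2.0 with spread near 0.3); in the real workflow the parameter is a resistivity model per sounding and the likelihood comes from the AEM forward model.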

  15. Uncertainty analysis of the nonideal competitive adsorption-donnan model: effects of dissolved organic matter variability on predicted metal speciation in soil solution.

    Science.gov (United States)

    Groenenberg, Jan E; Koopmans, Gerwin F; Comans, Rob N J

    2010-02-15

    Ion binding models such as the nonideal competitive adsorption-Donnan model (NICA-Donnan) and model VI successfully describe laboratory data of proton and metal binding to purified humic substances (HS). In this study model performance was tested in more complex natural systems. The speciation predicted with the NICA-Donnan model and the associated uncertainty were compared with independent measurements in soil solution extracts, including the free metal ion activity and fulvic (FA) and humic acid (HA) fractions of dissolved organic matter (DOM). Potentially important sources of uncertainty are the DOM composition and the variation in binding properties of HS. HS fractions of DOM in soil solution extracts varied between 14 and 63% and consisted mainly of FA. Moreover, binding parameters optimized for individual FA samples show substantial variation. Monte Carlo simulations show that uncertainties in predicted metal speciation, for metals with a high affinity for FA (Cu, Pb), are largely due to the natural variation in binding properties (i.e., the affinity) of FA. Predictions for metals with a lower affinity (Cd) are more prone to uncertainties in the fraction FA in DOM and the maximum site density (i.e., the capacity) of the FA. Based on these findings, suggestions are provided to reduce uncertainties in model predictions.
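The one-at-a-time comparison of affinity versus capacity uncertainty can be sketched with a toy single-site binding model, a simplified stand-in for NICA-Donnan; all parameter values and distributions below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy single-site model: amount of metal bound to fulvic acid (FA)
# given affinity logK and site density (capacity) Q.
def bound_metal(logK, Q, free_metal=1e-8):
    Kc = 10.0 ** logK * free_metal
    return Q * Kc / (1.0 + Kc)

n = 50_000
logK = rng.normal(7.5, 0.5, size=n)   # natural variation in FA affinity
Q = rng.normal(5.0, 0.8, size=n)      # variation in FA site density

out_all = bound_metal(logK, Q)        # both sources vary (Monte Carlo)
out_K = bound_metal(logK, 5.0)        # only affinity varies
out_Q = bound_metal(7.5, Q)           # only capacity varies

print(round(out_K.std(), 3), round(out_Q.std(), 3), round(out_all.std(), 3))
```

Comparing the one-at-a-time spreads against the full Monte Carlo spread shows which parameter's natural variation dominates the prediction uncertainty, mirroring the affinity-versus-capacity contrast the study draws between high- and low-affinity metals.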

  16. Climate change impact on streamflow in large-scale river basins: projections and their uncertainties sourced from GCMs and RCP scenarios

    Science.gov (United States)

    Nasonova, Olga N.; Gusev, Yeugeniy M.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    Climate change impact on river runoff was investigated within the framework of the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP2) using a physically-based land surface model Soil Water - Atmosphere - Plants (SWAP) (developed in the Institute of Water Problems of the Russian Academy of Sciences) and meteorological projections (for 2006-2099) simulated by five General Circulation Models (GCMs) (including GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M) for each of four Representative Concentration Pathway (RCP) scenarios (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Eleven large-scale river basins were used in this study. First of all, SWAP was calibrated and validated against monthly values of measured river runoff, making use of forcing data from the WATCH data set, and all GCM projections were bias-corrected to the WATCH data. Then, for each basin, 20 projections of possible changes in river runoff during the 21st century were simulated by SWAP. Analysis of the obtained hydrological projections allowed us to estimate the uncertainties resulting from the application of different GCMs and RCP scenarios. On average, the contribution of different GCMs to the uncertainty of the projected river runoff is nearly twice as large as the contribution of RCP scenarios. At the same time, the contribution of GCMs slightly decreases with time.
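The GCM-versus-RCP comparison amounts to averaging the spread across one axis of a model-by-scenario matrix. The runoff-change numbers below are invented to illustrate the bookkeeping, not results from the study:

```python
import numpy as np

# Projected runoff change (%) for one basin: rows = 5 GCMs, cols = 4 RCPs.
runoff = np.array([
    [ 5.0,  8.0, 10.0, 14.0],
    [-2.0,  0.0,  1.0,  3.0],
    [12.0, 15.0, 18.0, 24.0],
    [ 3.0,  5.0,  6.0,  9.0],
    [ 7.0, 10.0, 12.0, 16.0],
])

# Spread due to GCMs: std across models, averaged over scenarios.
gcm_spread = runoff.std(axis=0, ddof=1).mean()
# Spread due to RCPs: std across scenarios, averaged over models.
rcp_spread = runoff.std(axis=1, ddof=1).mean()

print(round(gcm_spread, 2), round(rcp_spread, 2))
print("GCM/RCP spread ratio:", round(gcm_spread / rcp_spread, 2))
```

With these illustrative numbers the GCM spread comes out close to twice the RCP spread, the same qualitative relationship the abstract reports; the full study repeats the computation per basin and per time window.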

  17. Uncertainty in projected impacts of climate change on biodiversity

    DEFF Research Database (Denmark)

    Garcia, Raquel A.

    Evidence for shifts in the phenologies and distributions of species over recent decades has often been attributed to climate change. The prospect of greater and faster changes in climate during the 21st century has spurred a stream of studies anticipating future biodiversity impacts. Yet, uncertainty is inherent to both projected climate changes and their effects on biodiversity, and needs to be understood before projections can be used. This thesis seeks to elucidate some of the uncertainties clouding assessments of biodiversity impacts from climate change, and explores ways to address them... models, are shown to be affected by multiple uncertainties. Different model algorithms produce different outputs, as do alternative future climate models and scenarios of future emissions of greenhouse gases. Another uncertainty arises due to omission of species with small sample sizes, which...

  18. Design of Adaptive Policy Pathways under Deep Uncertainties

    Science.gov (United States)

    Babovic, Vladan

    2013-04-01

    The design of large-scale engineering and infrastructural systems today is growing in complexity. Designers need to consider sociotechnical uncertainties, intricacies, and processes in the long-term strategic deployment and operations of these systems. In this context, water and spatial management is increasingly challenged not only by climate-associated changes such as sea level rise and increased spatio-temporal variability of precipitation, but also by pressures due to population growth and a particularly accelerating rate of urbanisation. Furthermore, high investment costs and the long-term nature of water-related infrastructure projects require a long-term planning perspective, sometimes extending over many decades. Adaptation to such changes is not only determined by what is known or anticipated at present, but also by what will be experienced and learned as the future unfolds, as well as by policy responses to social and water events. As a result, a pathway emerges. Instead of responding to 'surprises' and making decisions on an ad hoc basis, exploring adaptation pathways into the future provides indispensable support in water management decision-making. In this contribution, a structured approach for designing a dynamic adaptive policy based on the concepts of adaptive policy making and adaptation pathways is introduced. Such an approach provides flexibility which allows change over time in response to how the future unfolds, what is learned about the system, and changes in societal preferences. The introduced flexibility provides means for dealing with the complexities of adaptation under deep uncertainties. It enables engineering systems to change in the face of uncertainty to reduce impacts from downside scenarios while capitalizing on upside opportunities. This contribution presents a comprehensive framework for the development and deployment of adaptive policy pathways, and demonstrates its performance under deep uncertainties on a case study related to urban

  19. Scenario-based approach for flexible resource loading under uncertainty

    NARCIS (Netherlands)

    Wullink, G.; Gademann, A.J.R.M.; Hans, E.W.; Harten, van A.

    2004-01-01

    Order acceptance decisions in manufacture-to-order environments are often made based on incomplete or uncertain information. To quote reliable due dates in order processing, manage resource capacity adequately and take uncertainty into account, the paper presents and analyses models and tools for

  20. Large-Scale Ocean Circulation-Cloud Interactions Reduce the Pace of Transient Climate Change

    Science.gov (United States)

    Trossman, D. S.; Palter, J. B.; Merlis, T. M.; Huang, Y.; Xia, Y.

    2016-01-01

    Changes to the large-scale oceanic circulation are thought to slow the pace of transient climate change due, in part, to their influence on radiative feedbacks. Here we evaluate the interactions between CO2-forced perturbations to the large-scale ocean circulation and the radiative cloud feedback in a climate model. Both the change of the ocean circulation and the radiative cloud feedback strongly influence the magnitude and spatial pattern of surface and ocean warming. Changes in the ocean circulation reduce the amount of transient global warming caused by the radiative cloud feedback by helping to maintain low cloud coverage in the face of global warming. The radiative cloud feedback is key in affecting atmospheric meridional heat transport changes and is the dominant radiative feedback mechanism that responds to ocean circulation change. Uncertainty in the simulated ocean circulation changes due to CO2 forcing may contribute a large share of the spread in the radiative cloud feedback among climate models.