WorldWideScience

Sample records for model uncertainty assessment

  1. Statistical assessment of predictive modeling uncertainty

    Science.gov (United States)

    Barzaghi, Riccardo; Marotta, Anna Maria

    2017-04-01

    When the results of geophysical models are compared with data, the uncertainties of the model are typically disregarded. We propose a method for defining the uncertainty of a geophysical model based on a numerical procedure that estimates the empirical auto- and cross-covariances of model-estimated quantities. These empirical values are then fitted by proper covariance functions and used to compute the covariance matrix associated with the model predictions. The method is tested using a geophysical finite element model in the Mediterranean region. Using a novel χ2 analysis in which both data and model uncertainties are taken into account, the model-estimated tectonic strain pattern due to the Africa-Eurasia convergence in the area extending from the Calabrian Arc to the Alpine domain is compared with the pattern estimated from GPS velocities, accounting for the model uncertainty through its covariance structure and for the observational uncertainty through the covariance of the GPS estimates. The results indicate that including the estimated model covariance in the testing procedure leads to lower observed χ2 values with better statistical significance, and may sharpen the identification of the best-fitting geophysical models.
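
    The test statistic described here has a compact form: with a residual vector r between model-predicted and GPS-derived quantities, χ2 = rᵀ(C_data + C_model)⁻¹ r, so adding the model covariance inflates the total covariance and lowers χ2. A minimal numerical sketch with invented values (not the paper's data):

```python
import numpy as np

# Hypothetical residuals between model-predicted and GPS-derived strain rates.
r = np.array([0.8, -1.1, 0.3])

# Covariance of the GPS estimates and of the model predictions (illustrative).
C_data = np.diag([0.5, 0.6, 0.4])
C_model = np.array([[0.3, 0.1, 0.0],
                    [0.1, 0.4, 0.1],
                    [0.0, 0.1, 0.2]])

# Ignoring model uncertainty: chi2 against the data covariance only.
chi2_data_only = r @ np.linalg.solve(C_data, r)

# Including model covariance: the combined covariance is larger, so chi2 drops.
chi2_combined = r @ np.linalg.solve(C_data + C_model, r)

print(f"chi2 (data cov only): {chi2_data_only:.2f}")
print(f"chi2 (data + model):  {chi2_combined:.2f}")
```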

  2. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The purpose of this paper is to give an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially, an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  3. Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds

    Science.gov (United States)

    We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...

  4. Uncertainties in environmental radiological assessment models and their implications

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible.

  5. Spatial uncertainty assessment in modelling reference evapotranspiration at regional scale

    Directory of Open Access Journals (Sweden)

    G. Buttafuoco

    2010-07-01

    Evapotranspiration is one of the major components of the water balance and has been identified as a key factor in hydrological modelling. For this reason, several methods have been developed to calculate the reference evapotranspiration (ET0). In modelling reference evapotranspiration it is inevitable that both the model and the input data will carry some uncertainty. Whatever model is used, errors in the input will propagate to the calculated ET0. Neglecting information about estimation uncertainty, however, may lead to improper decision-making and water resources management. One geostatistical approach to spatial analysis is stochastic simulation, which draws alternative, equally probable realizations of a regionalized variable. Differences between the realizations provide a measure of spatial uncertainty and allow an error propagation analysis to be carried out. Among the evapotranspiration models, the Hargreaves-Samani model was used.

    The aim of this paper was to assess the spatial uncertainty of a monthly reference evapotranspiration model resulting from the uncertainties in the input attributes (mainly temperature) at regional scale. A case study was presented for the Calabria region (southern Italy). Temperature data were jointly simulated by conditional turning-bands simulation with elevation as external drift, and 500 realizations were generated.

    The ET0 was then estimated for each of the 500 realizations of the input variables, and the ensemble of model outputs was used to infer the reference evapotranspiration probability distribution function. This approach made it possible to delineate the areas characterized by greater uncertainty and to improve supplementary sampling strategies and ET0 predictions.
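
    The propagation step can be sketched directly: each simulated temperature field is pushed through the Hargreaves-Samani equation (commonly written ET0 = 0.0023 Ra (Tmean + 17.8) √(Tmax − Tmin)), and per-cell statistics over the 500 outputs map the spatial uncertainty. A toy sketch with synthetic realizations standing in for the turning-bands output; all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Assume 500 equally probable temperature realizations at 1000 grid cells,
# e.g. from a conditional turning-bands simulation (here: synthetic stand-ins).
n_real, n_cells = 500, 1000
t_mean = rng.normal(18.0, 1.5, size=(n_real, n_cells))            # deg C
t_range = np.abs(rng.normal(10.0, 2.0, size=(n_real, n_cells)))   # Tmax - Tmin

ra = 12.0  # extraterrestrial radiation, mm/day equivalent (illustrative)

# Hargreaves-Samani reference evapotranspiration for every realization/cell.
et0 = 0.0023 * ra * (t_mean + 17.8) * np.sqrt(t_range)

# Ensemble statistics per cell: the spread maps the spatial uncertainty.
et0_mean = et0.mean(axis=0)
et0_std = et0.std(axis=0)

print(f"cell 0: ET0 = {et0_mean[0]:.2f} +/- {et0_std[0]:.2f} mm/day")
# Cells with the largest std would be priority targets for extra sampling.
print(f"max ensemble std across cells: {et0_std.max():.2f} mm/day")
```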

  6. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one...... "preferred" GIA model has been used, without any consideration of the possible errors involved. Lacking a rigorous assessment of systematic errors in GIA modeling, the reliability of the results is uncertain. GIA sensitivity and uncertainties associated with the viscosity models have been explored......, such as time-evolving shorelines and paleo-coastlines. In this study we quantify these uncertainties and their propagation in GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimates in ice mass balance in Antarctica and Greenland...

  7. Assessing uncertainties in solute transport models: Upper Narew case study

    Science.gov (United States)

    Osuch, M.; Romanowicz, R.; Napiórkowski, J. J.

    2009-04-01

    This paper evaluates uncertainties in two solute transport models based on tracer experiment data from the Upper River Narew. Data Based Mechanistic and transient storage models were applied to Rhodamine WT tracer observations. We focus on the analysis of uncertainty and the sensitivity of model predictions to varying physical parameters, such as dispersion and channel geometry. An advection-dispersion model with dead zones (Transient Storage model) adequately describes the transport of pollutants in a single-channel river with multiple storage. The applied transient storage model is deterministic; it assumes that observations are free of errors and that the model structure perfectly describes the process of transport of conservative pollutants. In order to take the model and observation errors into account, an uncertainty analysis is required. In this study we used a combination of the Generalized Likelihood Uncertainty Estimation technique (GLUE) and variance-based Global Sensitivity Analysis (GSA). The combination is straightforward, as the same samples (Sobol samples) were generated for the GLUE analysis and for the sensitivity assessment. Additionally, the results of the sensitivity analysis were used to specify the best parameter ranges and their prior distributions for the evaluation of predictive model uncertainty using the GLUE methodology. Apart from predictions of pollutant transport trajectories, two ecological indicators were also studied (time over the threshold concentration and maximum concentration). In particular, a sensitivity analysis of the length of the "over the threshold" period shows an interesting multi-modal dependence on model parameters. This behavior is a result of the direct influence of parameters on different parts of the dynamic response of the system. As an alternative to the transient storage model, a Data Based Mechanistic approach was tested. Here, the model is identified and the parameters are estimated from available time series data using
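
    The GLUE/GSA combination hinges on reusing one sample: the same Sobol draws feed both the sensitivity assessment and the likelihood weighting. A schematic sketch with a two-parameter toy model standing in for the transient storage model (all names, ranges, and the model itself are invented):

```python
import numpy as np
from scipy.stats import qmc

# One Sobol sample serves both GSA and GLUE (toy 2-parameter model).
sampler = qmc.Sobol(d=2, scramble=True, seed=1)
u = sampler.random(n=2**10)
# Scale to prior ranges: dispersion D and storage-exchange rate k (invented).
D = 10.0 + 90.0 * u[:, 0]
k = 1e-4 + 9e-4 * u[:, 1]

def toy_model(D, k, t=np.linspace(1.0, 10.0, 20)):
    """Stand-in for a transient storage model: peak-concentration decay."""
    return np.outer(1.0 / np.sqrt(D), np.ones_like(t)) * np.exp(-k[:, None] * t * 1e3)

obs = toy_model(np.array([50.0]), np.array([5e-4]))[0]  # synthetic "truth"
sim = toy_model(D, k)

# GLUE: informal likelihood from the sum of squared errors; keep behavioural runs.
sse = ((sim - obs) ** 2).sum(axis=1)
likelihood = np.exp(-sse / sse.min())
behavioural = likelihood > 0.1 * likelihood.max()

# Likelihood-weighted predictive mean from the behavioural ensemble.
w = likelihood[behavioural] / likelihood[behavioural].sum()
pred_mean = (w[:, None] * sim[behavioural]).sum(axis=0)
print(f"{behavioural.sum()} behavioural runs; peak prediction {pred_mean[0]:.3f}")
```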

  8. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation...... uncertainties can be implemented in probabilistic reliability assessments....

  9. Assessing and propagating uncertainty in model inputs in corsim

    Energy Technology Data Exchange (ETDEWEB)

    Molina, G.; Bayarri, M. J.; Berger, J. O.

    2001-07-01

    CORSIM is a large simulator for vehicular traffic and is being studied with respect to its ability to successfully model and predict the behavior of traffic in a 36-block section of Chicago. Inputs to the simulator include information about street configuration, driver behavior, traffic light timing, turning probabilities at each corner, and distributions of traffic ingress into the system. This work is described in more detail in the article "Fast Simulators for Assessment and Propagation of Model Uncertainty", also in these proceedings. The focus of this conference poster is on the computational aspects of this problem. In particular, we address the description of the full conditional distributions needed for implementation of the MCMC algorithm and, specifically, how the constraints can be incorporated; details concerning the run time and convergence of the MCMC algorithm; and utilisation of the MCMC output for prediction and uncertainty analysis concerning the CORSIM computer model. As the latter is the ultimate goal, it is worth emphasizing that the incorporation of all uncertainty concerning inputs can significantly affect the model predictions. (Author)

  10. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    Science.gov (United States)

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and the Bayesian methods with a Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions obtained from Bayesian uncertainty quantification with the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of least squares regression and of Bayesian methods with a Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood
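
    The contrast between the two likelihoods can be sketched in a few lines: a simplified generalized likelihood in the spirit of Schoups and Vrugt (2010) first decorrelates residuals with an AR(1) filter and lets the error standard deviation grow with the simulated value, while the Gaussian likelihood assumes i.i.d. errors. A sketch with synthetic residuals (the full scheme also handles skew and kurtosis, omitted here; all values invented):

```python
import numpy as np

def gaussian_loglik(res, sigma):
    """i.i.d. Gaussian log-likelihood: the standard least-squares assumption."""
    return -0.5 * np.sum(res**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

def generalized_loglik(res, sim, phi, s0, s1):
    """Simplified generalized likelihood: AR(1)-correlated, heteroscedastic
    Gaussian errors (a reduced form, not the full Schoups-Vrugt scheme)."""
    eta = res[1:] - phi * res[:-1]          # decorrelate with an AR(1) filter
    sigma = s0 + s1 * np.abs(sim[1:])       # error sd grows with simulation
    return -0.5 * np.sum(eta**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

rng = np.random.default_rng(0)
sim = np.linspace(1.0, 10.0, 200)           # hypothetical simulated series
res = np.zeros(200)
for t in range(1, 200):                     # correlated, heteroscedastic noise
    res[t] = 0.7 * res[t - 1] + rng.normal(0, 0.05 * sim[t])

print(f"Gaussian loglik:    {gaussian_loglik(res, res.std()):.1f}")
print(f"generalized loglik: {generalized_loglik(res, sim, 0.7, 0.01, 0.05):.1f}")
```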

  11. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High-temperature gas-cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross-section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into cross-section generation effects for the super-cell problem. The two-dimensional deterministic code New ESC-based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross-section evaluation, and the results obtained were compared to the three-dimensional stochastic SCALE module KENO-VI. The NEWT cross-section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  12. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid for both "base case" model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  13. Assessment of Solution Uncertainties in Single-Column Modeling Frameworks.

    Science.gov (United States)

    Hack, James J.; Pedretti, John A.

    2000-01-01

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.

  14. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation...... data is collected from published scientific research. The bias, the root-mean-square error, and the scatter index are considered for the significant wave height as well as for the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can...
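
    The three statistics named here are directly computable from paired model/observation series: bias = mean(m − o), RMSE = √(mean((m − o)²)), and scatter index = RMSE divided by the observed mean. A minimal sketch for significant wave height with synthetic data:

```python
import numpy as np

# Paired significant wave heights: model output vs buoy observations (synthetic).
rng = np.random.default_rng(8)
obs = rng.gamma(shape=2.0, scale=1.0, size=500)        # Hs observations, m
model = 1.05 * obs + rng.normal(0.0, 0.25, size=500)   # model with slight bias

bias = np.mean(model - obs)
rmse = np.sqrt(np.mean((model - obs) ** 2))
scatter_index = rmse / np.mean(obs)

print(f"bias = {bias:+.3f} m, RMSE = {rmse:.3f} m, SI = {scatter_index:.3f}")
# Per-site statistics like these are what feed a wave-model uncertainty
# estimate for use in probabilistic reliability assessment.
```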

  15. Assessment of model uncertainty during the river export modelling of pesticides and transformation products

    Science.gov (United States)

    Gassmann, Matthias; Olsson, Oliver; Kümmerer, Klaus

    2013-04-01

    The modelling of organic pollutants in the environment is burdened by a load of uncertainties. Not only parameter values are uncertain, but often also the mass and timing of pesticide application. By introducing transformation products (TPs) into modelling, further uncertainty is likely, arising from the dependence of these substances on their parent compounds and from the introduction of new model parameters. The purpose of this study was to investigate the behaviour of a parsimonious catchment-scale model for the assessment of river concentrations of the insecticide Chlorpyrifos (CP) and two of its TPs, Chlorpyrifos Oxon (CPO) and 3,5,6-trichloro-2-pyridinol (TCP), under the influence of uncertain input parameter values. Parameter uncertainty and pesticide application uncertainty in particular were investigated by Global Sensitivity Analysis (GSA) and the Generalized Likelihood Uncertainty Estimation (GLUE) method, based on Monte-Carlo sampling. GSA revealed that half-lives and sorption parameters, as well as half-lives and transformation parameters, were correlated with each other; that is, the concepts of modelling sorption and degradation/transformation were correlated. It may thus be difficult in modelling studies to optimize parameter values for these modules. Furthermore, we could show that erroneous pesticide application mass and timing were compensated for during Monte-Carlo sampling by changing the half-life of CP. However, introducing TCP into the calculation of the objective function enhanced the identifiability of the pesticide application mass. The GLUE analysis showed that CP and TCP were modelled successfully, but CPO modelling failed, with high uncertainty and insensitive parameters. We assume a structural error of the model, which was especially important for the CPO assessment. This shows that a chemical and some of its TPs can be modelled successfully by a specific model structure, but for other TPs, the model

  16. Uncertainty Assessment in Long Term Urban Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    the probability of system failures (defined as either flooding or surcharge of manholes or combined sewer overflow); (2) an application of the Generalized Likelihood Uncertainty Estimation methodology in which an event based stochastic calibration is performed; and (3) long term Monte Carlo simulations...... with the purpose of estimating the uncertainties on the extreme event statistics of maximum water levels and combined sewer overflow volumes in drainage systems. The thesis concludes that the uncertainties on both maximum water levels and combined sewer overflow volumes are considerable, especially on the large...

  17. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one...... in the literature. However, at least two major sources of errors remain. The first is associated with the ice models, spatial distribution of ice and history of melting (this is especially the case of Antarctica), the second with the numerical implementation of model features relevant to sea level modeling...... GIA modeling. GIA errors are also important in the far field of previously glaciated areas and in the time evolution of global indicators. In this regard we also account for other possible error sources which can impact global indicators, like the sea level history related to GIA. The thermal......

  18. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    Science.gov (United States)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large-scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc., are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Is detailed attention to numerics only warranted for complicated engineering models? Would numerical errors not be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated
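
    The start-of-step versus end-of-step point can be made concrete with a linear reservoir dS/dt = P − kS: evaluating the outflow flux at the old storage (explicit Euler) or at the new storage (implicit Euler) yields visibly different trajectories at a daily step, a purely numerical difference that calibration may then absorb into the parameters. A minimal sketch with invented parameter values:

```python
import numpy as np

k, P, dt, n = 0.8, 2.0, 1.0, 15   # decay rate (1/d), inflow (mm/d), daily step
S_exp = np.zeros(n); S_imp = np.zeros(n)
S_exp[0] = S_imp[0] = 10.0

for t in range(1, n):
    # Start-of-step flux (explicit Euler): outflow uses the old storage.
    S_exp[t] = S_exp[t - 1] + dt * (P - k * S_exp[t - 1])
    # End-of-step flux (implicit Euler): solve S = S_old + dt*(P - k*S).
    S_imp[t] = (S_imp[t - 1] + dt * P) / (1.0 + dt * k)

exact = P / k + (10.0 - P / k) * np.exp(-k * np.arange(n))  # analytic solution
print("step  explicit  implicit  exact")
for t in (1, 5, 14):
    print(f"{t:4d}  {S_exp[t]:8.3f}  {S_imp[t]:8.3f}  {exact[t]:6.3f}")
```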

  1. Assessment of conceptual model uncertainty for the regional aquifer Pampa del Tamarugal – North Chile

    Directory of Open Access Journals (Sweden)

    R. Rojas

    2009-09-01

    In this work we assess the uncertainty in modelling the groundwater flow of the Pampa del Tamarugal Aquifer (PTA), North Chile, using a novel and fully integrated multi-model approach aimed at explicitly accounting for uncertainties arising from the definition of alternative conceptual models. The approach integrates the Generalized Likelihood Uncertainty Estimation (GLUE) and Bayesian Model Averaging (BMA) methods. For each member of an ensemble M of potential conceptualizations, the model weights used in BMA for multi-model aggregation are obtained from GLUE-based likelihood values. These model weights are based on model performance, thus reflecting how well a conceptualization reproduces an observed dataset D. GLUE-based cumulative predictive distributions for each member of M are then aggregated, obtaining predictive distributions that account for conceptual model uncertainties. For the PTA we propose an ensemble of eight alternative conceptualizations covering all major features of groundwater flow models independently developed in past studies and including two recharge mechanisms that have been a source of debate for several years. Results showed that accounting for heterogeneities in the hydraulic conductivity field (a) reduced the uncertainty in the estimation of parameters and state variables, and (b) increased the corresponding model weights used for multi-model aggregation. This was more noticeable when the hydraulic conductivity field was conditioned on available hydraulic conductivity measurements. The contribution of conceptual model uncertainty to the predictive uncertainty varied between 6% and 64% for groundwater head estimations and between 16% and 79% for groundwater flow estimations. These results clearly illustrate the relevance of conceptual model uncertainty.
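
    The GLUE-BMA aggregation amounts to a likelihood-weighted mixture of predictive distributions. A schematic sketch with three stand-in conceptualizations and invented numbers (not the PTA ensemble of eight):

```python
import numpy as np

rng = np.random.default_rng(7)

# Three stand-in conceptualizations, each with GLUE-behavioural head predictions.
preds = {
    "homogeneous K":        rng.normal(102.0, 2.5, 400),
    "heterogeneous K":      rng.normal(100.5, 1.2, 400),
    "conditioned heter. K": rng.normal(100.2, 0.8, 400),
}
# Summed GLUE likelihoods per model, standing in for fit to the dataset D.
glue_like = np.array([0.4, 1.1, 2.0])
w = glue_like / glue_like.sum()   # BMA weights from model performance

# Aggregate: sample each model's predictive distribution with its weight.
samples = np.concatenate([
    rng.choice(p, size=int(round(wi * 10000))) for p, wi in zip(preds.values(), w)
])
lo, hi = np.percentile(samples, [5, 95])
print(f"weights: {np.round(w, 2)}; 90% multi-model head interval: [{lo:.1f}, {hi:.1f}] m")
```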

  2. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; van den Berg, Stéphanie Martine

    2017-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  3. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one...... expansion of the oceans and other sources of water during the deglaciation, different from the one coming from ice sheets, account for further sea level rise since the LGM, and the water stored in the atmosphere, groundwater and lakes accounts for a negative contribution to it. In this way we aim to assess which GIA...

  4. Modeling uncertainty in risk assessment: an integrated approach with fuzzy set theory and Monte Carlo simulation.

    Science.gov (United States)

    Arunraj, N S; Mandal, Saptarshi; Maiti, J

    2013-06-01

    Modeling uncertainty during risk assessment is a vital component of effective decision making. Unfortunately, most risk assessment studies lack a proper uncertainty analysis. The development of tools and techniques for capturing uncertainty in risk assessment is ongoing, and there has been substantial growth in this respect in health risk assessment. In this study, cross-disciplinary approaches to uncertainty analysis are identified, and a modified approach suitable for industrial safety risk assessment is proposed using fuzzy set theory and Monte Carlo simulation. The proposed method is applied to a benzene extraction unit (BEU) of a chemical plant. The case study results show that the proposed method provides a better measure of uncertainty than existing methods: unlike traditional risk analysis, it takes both variability and uncertainty of information into account in the risk calculation, and instead of a single risk value it provides an interval of risk values for a given percentile of risk. The implications of these results in terms of risk control and regulatory compliance are also discussed.
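
    The hybrid idea can be sketched as follows: inputs with statistical data are sampled by Monte Carlo, inputs known only through expert judgement are treated as fuzzy numbers, and an α-cut of the fuzzy inputs turns each risk percentile into an interval rather than a single value. A toy sketch (all distributions and numbers invented):

```python
import numpy as np

rng = np.random.default_rng(3)

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (a, m, b) at membership alpha."""
    a, m, b = tri
    return a + alpha * (m - a), b - alpha * (b - m)

# Failure frequency: statistical data -> Monte Carlo (lognormal, illustrative).
freq = rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=20000)  # events/yr

# Consequence: expert judgement -> triangular fuzzy number (cost units).
cons_fuzzy = (50.0, 120.0, 300.0)

alpha = 0.8  # fairly strong membership requirement
c_lo, c_hi = alpha_cut(cons_fuzzy, alpha)

# Interval-valued risk at the 95th percentile of frequency.
f95 = np.percentile(freq, 95)
print(f"95th-pct risk interval at alpha={alpha}: "
      f"[{f95 * c_lo:.3f}, {f95 * c_hi:.3f}] cost/yr")
```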

  5. Uncertainty Assessment for Numerical Modeling of Dune and Backshore Evolution Under Sea-Level Rise Scenarios

    Science.gov (United States)

    Dai, H.; Ye, M.; Niedoroda, A. W.; Kish, S.; Donoghue, J. F.; Saha, B.

    2010-12-01

    Beach dunes play an essential role in the evolution of barrier island shapes and coastlines. The dunes protect the beaches and beach ecology by absorbing energy from storms, and they provide sediment to the beaches or backshores when erosion occurs. While a number of models have been developed to simulate the evolution of dunes and backshores, few of them have comprehensively addressed dune growth, dune erosion, and backshore changes. Based on the assumption that dune shapes are stationary, we develop a new model that can estimate dune and backshore evolution (including both growth and erosion) under the influence of storms for different sea-level rise scenarios. The modeling results are inherently uncertain due to unknown storm variability and sea-level rise scenarios. The storm uncertainty, characterized as parametric uncertainty, and its propagation to the modeling results are assessed using the Monte Carlo (MC) method. A total of 1500 realizations of storm magnitude, frequency, and track through a barrier island are generated and used for the MC simulation. The numerical modeling and uncertainty analysis are conducted for a synthetic barrier island with physical features and hurricane exposure similar to those of Santa Rosa Island in northwest Florida. Uncertainty in the simulated dune height, dune width, and backshore position is assessed for five sea-level rise scenarios. The parametric uncertainty differs between sea-level rise scenarios. For a given scenario, the uncertainty of dune height is the largest, and it is mainly caused by uncertainty in storm magnitude. This uncertainty analysis provides guidelines for coastal management and the protection of coastal ecology.

  6. Parameter estimation and uncertainty assessment in hydrological modelling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena

    Rational and efficient water resources management presupposes insight into and understanding of the hydrological processes, as well as accurate accounts of the available water volumes in both surface water and groundwater reservoirs. For that purpose, hydrological models are an indispensable tool. Over the past 10......-20 years, much research has been devoted to hydrological processes, and especially to the implementation of this knowledge in numerical model systems. This has led to models of increasing complexity. At the same time, a range of different techniques for estimating model parameters and for assessing the uncertainty of model predictions...... to this have been the long computation times and extensive data requirements that characterize this type of model and that constitute a major problem for recursive application of the models. In addition, the complex models are usually not freely available in the same way as the simple rainfall...

  7. Uncertainty and Probability in Natural Hazard Assessment and Their Role in the Testability of Hazard Models

    Science.gov (United States)

    Marzocchi, Warner; Jordan, Thomas

    2014-05-01

    Probabilistic assessment has become a widely accepted procedure for quantitatively estimating natural hazards. In essence, probabilities are meant to quantify the ubiquitous and deep uncertainties that characterize the evolution of natural systems. However, notwithstanding the very wide use of the terms 'uncertainty' and 'probability' in natural hazards, the way in which they are linked, how they are estimated, and their scientific meaning are far from clear, as testified by the last Intergovernmental Panel on Climate Change (IPCC) report and by its subsequent review. The lack of a formal framework for interpreting uncertainty and probability coherently has paved the way for some of the strongest criticisms of hazard analysis; in fact, it has been argued that most natural hazard analyses are intrinsically 'unscientific'. For example, among the concerns is the use of expert opinion to characterize the so-called epistemic uncertainties; many have argued that such personal degrees of belief cannot be measured and, by implication, cannot be tested. The purpose of this talk is to confront and clarify the conceptual issues associated with the role of uncertainty and probability in natural hazard analysis and the conditions that make a hazard model testable and hence 'scientific'. Specifically, we show that testability of hazard models requires a suitable taxonomy of uncertainty embedded in a proper logical framework. This taxonomy of uncertainty is composed of aleatory variability, epistemic uncertainty, and ontological error. We discuss their differences, their link with probability, and their estimation using data, models, and subjective expert opinion. We show that these different uncertainties, and the testability of hazard models, can be unequivocally defined only for a well-defined experimental concept, that is, a concept external to the model under test. All these points are illustrated through simple examples related to probabilistic seismic hazard analysis.

  8. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Science.gov (United States)

    Sikorska, A. E.; Scheidegger, A.; Banasik, K.; Rieckermann, J.

    2012-04-01

    Urbanization and the resulting land-use change strongly affect the water cycle and runoff-processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute mostly to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
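
    The effect of a few discharge measurements can be illustrated with a grid-based Bayesian update of a single multiplicative bias parameter; the continuous-time autoregressive error model used in the paper is omitted here, and all numbers are invented:

```python
import numpy as np

# Prior for a peak-flow multiplier derived from base data (hypothetical):
# the uncalibrated model may overpredict peaks severalfold.
grid = np.linspace(0.1, 1.5, 300)              # true/predicted flow ratio
prior = np.exp(-0.5 * ((grid - 0.7) / 0.4) ** 2)
prior /= prior.sum()

predicted_peak = 70.0                           # m3/s, uncalibrated model
obs = np.array([9.5, 11.0, 10.2])               # few observed peaks, m3/s

# Likelihood of observations given each multiplier (20% measurement error).
lik = np.ones_like(grid)
for q in obs:
    lik *= np.exp(-0.5 * ((q - grid * predicted_peak) / (0.2 * q)) ** 2)

post = prior * lik
post /= post.sum()

def interval(p, level=0.9):
    """Central credible interval for the peak flow from a grid pmf."""
    cdf = np.cumsum(p)
    lo = grid[np.searchsorted(cdf, (1 - level) / 2)]
    hi = grid[np.searchsorted(cdf, 1 - (1 - level) / 2)]
    return lo * predicted_peak, hi * predicted_peak

print("90% peak interval, prior only:", np.round(interval(prior), 1))
print("90% peak interval, updated   :", np.round(interval(post), 1))
```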

  9. Using Predictive Uncertainty Analysis to Assess Hydrologic Model Performance for a Watershed in Oregon

    Science.gov (United States)

    Brannan, K. M.; Somor, A.

    2016-12-01

    A variety of statistics are used to assess watershed model performance, but these statistics do not directly answer the question: what is the uncertainty of my prediction? Understanding predictive uncertainty is important when using a watershed model to develop a Total Maximum Daily Load (TMDL). TMDLs are a key component of the US Clean Water Act and specify the amount of a pollutant that can enter a waterbody while the waterbody still meets water quality criteria. TMDL developers use watershed models to estimate pollutant loads from nonpoint sources of pollution. We are developing a TMDL for bacteria impairments in a watershed in the Coastal Range of Oregon. We set up an HSPF model of the watershed and used the calibration software PEST to estimate HSPF hydrologic parameters and then perform a predictive uncertainty analysis of stream flow. We used Monte Carlo simulation to run the model with 1,000 different parameter sets and assess predictive uncertainty. In order to reduce the chance of specious parameter sets, we accounted for the relationships among parameter values by using mathematically based regularization techniques and an estimate of the parameter covariance when generating random parameter sets. We used a novel approach to select flow data for the predictive uncertainty analysis: we set aside the flows recorded on days when bacteria samples were collected and did not use them in the estimation of the model parameters. We calculated a percent uncertainty for each flow observation based on the 1,000 model runs. We also used several methods to visualize results, with an emphasis on making the data accessible to both technical and general audiences. We will use the predictive uncertainty estimates in the next phase of our work, simulating bacteria fate and transport in the watershed.
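
    Generating parameter sets that honour the estimated parameter covariance, rather than sampling each parameter independently, can be sketched with a multivariate normal draw clipped to plausible ranges. The HSPF parameter names below are real, but the means, covariance, and bounds are invented; in practice they would come from the PEST run:

```python
import numpy as np

rng = np.random.default_rng(5)

# Calibrated HSPF-like parameter means and covariance (illustrative numbers).
names = ["LZSN", "UZSN", "INFILT"]
mean = np.array([6.0, 1.2, 0.08])
cov = np.array([[1.00, 0.30, -0.002],
                [0.30, 0.25, -0.001],
                [-0.002, -0.001, 0.0004]])
lower = np.array([2.0, 0.1, 0.01])
upper = np.array([15.0, 3.0, 0.5])

# 1,000 parameter sets honouring the covariance, clipped to plausible ranges.
sets = rng.multivariate_normal(mean, cov, size=1000)
sets = np.clip(sets, lower, upper)

# Correlated draws avoid "specious" combinations of parameter values.
corr = np.corrcoef(sets[:, 0], sets[:, 1])[0, 1]
print(f"sampled LZSN-UZSN correlation: {corr:.2f}")
# Each row would drive one model run; a percent uncertainty per flow
# observation then follows from the spread of the 1,000 simulated flows.
```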

  10. Measures of Model Uncertainty in the Assessment of Primary Stresses in Ship Structures

    DEFF Research Database (Denmark)

    Östergaard, Carsten; Dogliani, Mario; Guedes Soares, Carlos;

    1996-01-01

    The paper considers various models and methods commonly used for linear elastic stress analysis and assesses the uncertainty involved in their application to the analysis of the distribution of primary stresses in the hull of a containership example, through statistical evaluations of the results...

  11. A comparative assessment of efficient uncertainty analysis techniques for environmental fate and transport models: application to the FACT model

    Science.gov (United States)

    Balakrishnan, Suhrid; Roy, Amit; Ierapetritou, Marianthi G.; Flach, Gregory P.; Georgopoulos, Panos G.

    2005-06-01

    This work presents a comparative assessment of efficient uncertainty modeling techniques, including the Stochastic Response Surface Method (SRSM) and High Dimensional Model Representation (HDMR), and considers the improvement they achieve with respect to the conventional Monte Carlo technique. Given that traditional methods for characterizing uncertainty are very computationally demanding when applied in conjunction with complex environmental fate and transport models, this study aims to assess how accurately these efficient (and hence viable) techniques for uncertainty propagation can capture complex model output uncertainty. As part of this effort, the efficacy of HDMR, which has primarily been used in the past as a model reduction tool, is also demonstrated for uncertainty analysis. The application chosen to highlight the accuracy of these new techniques is the steady-state analysis of the groundwater flow in the Savannah River Site General Separations Area (GSA) using the subsurface Flow And Contaminant Transport (FACT) code. Uncertain inputs included three-dimensional hydraulic conductivity fields and a two-dimensional recharge rate field. The output variables under consideration were the simulated stream baseflows and hydraulic head values. Results show that the uncertainty analysis outcomes obtained using SRSM and HDMR are practically indistinguishable from those obtained using the conventional Monte Carlo method, while requiring orders of magnitude fewer model simulations.
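
    The efficiency argument can be shown in miniature: fit a cheap response surface to a few dozen runs of the expensive model, then propagate uncertainty through the surrogate. SRSM proper uses polynomial chaos expansions in transformed random variables; an ordinary quadratic stands in below, and the "expensive model" is a toy, not the FACT code:

```python
import numpy as np

rng = np.random.default_rng(4)

def expensive_model(k):
    """Toy stand-in for a full flow simulation: baseflow vs log-conductivity."""
    return 50.0 + 8.0 * k + 1.5 * k**2

# A handful of training runs of the "expensive" model.
k_train = rng.normal(0.0, 1.0, size=30)
y_train = expensive_model(k_train)

# Quadratic response surface (surrogate) fitted to the training runs.
coef = np.polyfit(k_train, y_train, deg=2)

# Uncertainty propagation: 10^5 surrogate evaluations vs direct Monte Carlo.
k_mc = rng.normal(0.0, 1.0, size=100_000)
y_surrogate = np.polyval(coef, k_mc)
y_direct = expensive_model(k_mc)       # affordable here, but not in general

print(f"surrogate mean/sd: {y_surrogate.mean():.2f} / {y_surrogate.std():.2f}")
print(f"direct    mean/sd: {y_direct.mean():.2f} / {y_direct.std():.2f}")
```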

  12. Assessment of uncertainties of the models used in thermal-hydraulic computer codes

    Science.gov (United States)

    Gricay, A. S.; Migrov, Yu. A.

    2015-09-01

    The article addresses the problem of determining the statistical characteristics of variable parameters (their variation range and distribution law) when analyzing the uncertainty and sensitivity of calculation results to uncertainty in input data. A comparative analysis of modern approaches to uncertainty in input data is presented. The need to develop an alternative method for estimating the uncertainty of model parameters used in thermal-hydraulic computer codes, in particular in the closing correlations of the loop thermal-hydraulics block, is shown. Such a method should involve a minimal degree of subjectivity and must be based on objective quantitative assessment criteria. The method includes three sequential stages: selecting experimental data satisfying the specified criteria, identifying the key closing correlation using a sensitivity analysis, and carrying out case calculations followed by statistical processing of the results. By using the method, one can estimate the uncertainty range of a variable parameter and establish its distribution law over that range, provided that the experimental information is sufficiently representative. Practical application of the method is demonstrated using, as an example, the problem of estimating the uncertainty of a parameter in the model describing the transition to post-burnout heat transfer used in the thermal-hydraulic computer code KORSAR. The study revealed the need to narrow the previously established uncertainty range of this parameter and to replace the uniform distribution law over that range with a Gaussian distribution law. The proposed method can be applied to different thermal-hydraulic computer codes, and in some cases its application can reduce the degree of conservatism in expert estimates of the uncertainties of the model parameters used in computer codes.

  13. Uncertainty and Sensitivity of Alternative Rn-222 Flux Density Models Used in Performance Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Greg J. Shott, Vefa Yucel, Lloyd Desotell

    2007-06-01

    Performance assessments for the Area 5 Radioactive Waste Management Site on the Nevada Test Site have used three different mathematical models to estimate Rn-222 flux density. This study describes the performance, uncertainty, and sensitivity of the three models, which include the U.S. Nuclear Regulatory Commission Regulatory Guide 3.64 analytical method and two numerical methods. The uncertainty of each model was determined by Monte Carlo simulation using Latin hypercube sampling. The global sensitivity was investigated using the Morris one-at-a-time screening method, sample-based correlation and regression methods, the variance-based extended Fourier amplitude sensitivity test, and Sobol's sensitivity indices. The models were found to produce similar estimates of the mean and median flux density, but to have different uncertainties and sensitivities. When the Rn-222 effective diffusion coefficient was estimated using five different published predictive models, the radon flux density models were found to be most sensitive to the effective diffusion coefficient model selected, the emanation coefficient, and the radionuclide inventory. Using a site-specific measured effective diffusion coefficient significantly reduced the output uncertainty; in that case, the models were most sensitive to the emanation coefficient and the radionuclide inventory.
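
    The sampling and the sample-based sensitivity measures are easy to sketch: draw a Latin hypercube over the uncertain inputs, run the model, and rank-correlate inputs with the output. The flux expression below is a toy stand-in, not the Regulatory Guide 3.64 model, and all ranges are invented:

```python
import numpy as np
from scipy.stats import qmc, spearmanr

# Latin hypercube sample over three uncertain inputs (illustrative ranges).
lhs = qmc.LatinHypercube(d=3, seed=2)
u = lhs.random(n=5000)
emanation = 0.1 + 0.3 * u[:, 0]            # emanation coefficient (-)
inventory = 10.0 ** (1.0 + 2.0 * u[:, 1])  # Ra-226 inventory, arbitrary units
d_eff = 10.0 ** (-6.5 + 1.5 * u[:, 2])     # effective diffusion coeff., m2/s

# Toy stand-in for a Rn-222 flux-density model.
flux = emanation * inventory * np.sqrt(d_eff)

# Sample-based global sensitivity: rank correlation of inputs with output.
for name, x in [("emanation", emanation), ("inventory", inventory),
                ("D_eff", d_eff)]:
    rho, _ = spearmanr(x, flux)
    print(f"{name:10s} Spearman rho = {rho:+.2f}")
```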

  14. Assessing uncertainty in pollutant wash-off modelling via model validation.

    Science.gov (United States)

    Haddad, Khaled; Egodawatta, Prasanna; Rahman, Ataur; Goonetilleke, Ashantha

    2014-11-01

    Stormwater pollution is linked to stream ecosystem degradation. In predicting stormwater pollution, various types of modelling techniques are adopted. The accuracy of predictions provided by these models depends on the data quality, appropriate estimation of model parameters, and the validation undertaken. It is well understood that available water quality datasets in urban areas span only relatively short time scales, unlike water quantity data, which limits the applicability of the developed models in engineering and ecological assessment of urban waterways. This paper presents the application of leave-one-out (LOO) and Monte Carlo cross validation (MCCV) procedures in a Monte Carlo framework for the validation and estimation of uncertainty associated with pollutant wash-off when models are developed using a limited dataset. It was found that the application of MCCV is likely to result in a more realistic measure of model coefficients than LOO. Most importantly, MCCV and LOO were found to be effective in model validation when dealing with a small sample size, which otherwise hinders detailed model validation and can undermine the effectiveness of stormwater quality management strategies.
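
    The difference between the two procedures is the split scheme: LOO holds out each observation exactly once, while MCCV repeats many random train/test splits, so the spread of fitted coefficients and holdout errors over the repetitions measures uncertainty. A minimal sketch on a small synthetic wash-off-style dataset:

```python
import numpy as np

rng = np.random.default_rng(9)

# Small wash-off-style dataset: rainfall intensity vs pollutant load (synthetic).
x = rng.uniform(5, 60, size=14)
y = 0.8 * x + rng.normal(0, 4, size=14)

def fit(xs, ys):
    """Least-squares slope/intercept."""
    return np.polyfit(xs, ys, deg=1)

# Monte Carlo cross validation: repeated random 70/30 splits.
coefs, errs = [], []
for _ in range(2000):
    idx = rng.permutation(len(x))
    tr, te = idx[:10], idx[10:]
    b = fit(x[tr], y[tr])
    coefs.append(b[0])
    errs.append(np.mean((np.polyval(b, x[te]) - y[te]) ** 2))

print(f"slope: {np.mean(coefs):.3f} +/- {np.std(coefs):.3f}")
print(f"holdout MSE: {np.mean(errs):.1f}")
# LOO would instead hold out each single point once (14 fits, no replication).
```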

  15. Uncertainty and Evaluation of Impacts Modeling at Regional Scales in Integrated Assessment: the Case of Buildings

    Science.gov (United States)

    Clarke, L.; Zhou, Y.; Eom, J.; Kyle, P.; Daly, D.

    2012-12-01

    Integrated assessment (IA) models have traditionally focused on the evaluation of climate mitigation strategies. However, in recent years, efforts to consider both impacts and mitigation simultaneously have expanded dramatically. Because climate impacts are inherently regional in scale, incorporating impacts into IA modeling - which is inherently global in character - raises a range of challenges beyond the already substantial challenges associated with modeling impacts. In particular, it raises questions about how best to evaluate and diagnose the resulting representations of impacts, and how to characterize the uncertainty surrounding the associated projections. This presentation will provide an overview of the challenges and uncertainties surrounding the modeling of climate impacts on building heating and cooling demands in an integrated assessment modeling framework, the Global Change Assessment Model (GCAM). The presentation will first discuss the issues associated with modeling building heating and cooling degree days in IA models. It will review research using spatially explicit climate and population information to inform a standard version of GCAM with fourteen geopolitical regions. It will discuss a new subregional version of GCAM in which building energy consumption is resolved at a fifty-state level. The presentation will also characterize efforts to link GCAM to more technologically resolved buildings models to gain insights about demands at higher temporal resolution. The second portion of the presentation will discuss the uncertainties associated with projections of building heating and cooling demands at various scales. Several key uncertainties are involved, including those surrounding the nature of changes to global and regional climates, with particular emphasis on the uncertainty in temperature projections. In addition, the linkage in this research between human and Earth systems means that the projections are

  16. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    DEFF Research Database (Denmark)

    Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.;

    2016-01-01

    to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models...... that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert...... with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based...
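
    The sequential belief assessment reduces, in the discrete case, to Bayesian updating over the candidate CSMs: each new data type contributes a likelihood for every model, and the beliefs are renormalized. A toy sketch with four CSMs and invented evidence likelihoods:

```python
import numpy as np

# Four CSMs: (source interpretation) x (geology interpretation).
csms = ["DNAPL+fractured", "DNAPL+unfractured",
        "no-DNAPL+fractured", "no-DNAPL+unfractured"]
belief = np.full(4, 0.25)   # uniform prior belief

# Invented likelihoods P(evidence | CSM) for two sequential data types.
evidence = {
    "high downgradient TCE": np.array([0.8, 0.5, 0.4, 0.1]),
    "core shows fracturing": np.array([0.9, 0.2, 0.9, 0.2]),
}

for name, lik in evidence.items():
    belief = belief * lik
    belief /= belief.sum()   # renormalize: the posterior becomes the next prior
    print(f"after '{name}':")
    for csm, b in zip(csms, belief):
        print(f"  P({csm}) = {b:.2f}")
```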

  1. Cost-Benefit Assessment of Inspection and Repair Planning for Ship Structures Considering Corrosion Model Uncertainty

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; TANG Wen-yong; ZHANG Sheng-kun

    2005-01-01

    Owing to the high costs and unnecessary inspections entailed by traditional inspection planning for ship structures, risk-based inspection and repair planning should be investigated to achieve the most cost-effective inspections. This paper proposes a cost-benefit assessment model for risk-based inspection and repair planning of ship structures subject to corrosion deterioration. The benefit-cost ratio is taken as an index for selecting the optimal inspection and repair strategy. The planning problem is formulated as an optimization problem in which the benefit-cost ratio over the expected lifetime is maximized subject to a constraint on the minimum acceptable reliability index. To account for the effect of corrosion model uncertainty on the cost-benefit assessment, two corrosion models, namely Paik's model and Guedes Soares' model, are adopted for analysis. A numerical example is presented to illustrate the proposed method, and sensitivity studies are also provided. The results indicate that the proposed risk-based cost-benefit analysis can effectively integrate economy with reliability in inspection and repair planning. The method achieves a balance between the risk cost and the total expected inspection and repair costs, and is very effective in selecting the optimal inspection and repair strategy. It is pointed out that corrosion model uncertainty and parametric uncertainty have a significant impact on the cost-benefit assessment of inspection and repair planning.
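
    The optimization structure is straightforward to state: over candidate inspection strategies, maximize the benefit-cost ratio subject to a minimum reliability index β. A toy enumeration with invented cost and reliability functions (not the Paik or Guedes Soares corrosion models):

```python
import numpy as np

intervals = np.arange(1, 11)            # candidate inspection intervals, years

# Invented stand-ins: frequent inspection raises cost and reliability.
def beta_min_over_life(dt):             # lifetime-minimum reliability index
    return 4.0 - 0.25 * dt

def expected_cost(dt):                  # inspections + repairs + risk cost
    return 100.0 / dt + 5.0 * dt

benefit = 500.0                          # expected lifetime benefit (fixed)
beta_target = 2.5                        # minimum acceptable reliability index

bcr = benefit / expected_cost(intervals)
feasible = beta_min_over_life(intervals) >= beta_target
best = intervals[feasible][np.argmax(bcr[feasible])]
print(f"optimal inspection interval: {best} years "
      f"(BCR = {benefit / expected_cost(best):.2f})")
```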

  2. Uncertainty quantification and reliability assessment in operational oil spill forecast modeling system.

    Science.gov (United States)

    Hou, Xianlong; Hodges, Ben R; Feng, Dongyu; Liu, Qixiao

    2017-03-15

    As oil transport increases in the Texas bays, the growing risk of ship collisions makes oil spill accidents more likely. To minimize ecological damage and optimize rapid response, emergency managers need to be informed of how fast and where oil will spread as soon as possible after a spill. State-of-the-art operational oil spill forecast modeling systems have moved oil spill response into a new stage. However, uncertainty in the predicted data inputs often compromises the reliability of the forecast result, potentially misdirecting contingency planning. Understanding forecast uncertainty and reliability therefore becomes essential. In this paper, Monte Carlo simulation is implemented to provide parameters for generating forecast probability maps. The oil spill forecast uncertainty is then quantified by comparing the forecast probability map with the associated hindcast simulation. A HyosPy-based simple statistical model is developed to assess the reliability of an oil spill forecast in terms of belief degree. The technologies developed in this study create a prototype for uncertainty and reliability analysis in numerical oil spill forecast modeling systems, helping emergency managers improve the capability of real-time operational oil spill response and impact assessment.
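
    The probability-map idea can be sketched in a few lines: run an ensemble of spill simulations under perturbed inputs and report, for each grid cell, the fraction of ensemble members that place oil there. The random-walk transport below is a placeholder for the actual hydrodynamic and spill models.

        import numpy as np

        rng = np.random.default_rng(1)
        n_members, n_steps, n_particles = 100, 48, 200
        grid = np.zeros((50, 50))

        for _ in range(n_members):
            # Placeholder transport model: advection plus random diffusion per particle.
            xy = np.full((n_particles, 2), 25.0)
            for _ in range(n_steps):
                xy += np.array([0.15, 0.05]) + rng.normal(0, 0.3, size=xy.shape)
            ix = np.clip(xy.astype(int), 0, 49)
            hit = np.zeros((50, 50), dtype=bool)
            hit[ix[:, 0], ix[:, 1]] = True       # cells touched by this member's final slick
            grid += hit

        prob_map = grid / n_members              # P(oil in cell) across the ensemble
        print(f"cells with P > 0.5: {(prob_map > 0.5).sum()}")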

  3. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Directory of Open Access Journals (Sweden)

    A. E. Sikorska

    2011-12-01

    Full Text Available Urbanization and the resulting land-use change strongly affect the water cycle and runoff-processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to about 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute mostly to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.

  4. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Directory of Open Access Journals (Sweden)

    A. E. Sikorska

    2012-04-01

    Full Text Available Urbanization and the resulting land-use change strongly affect the water cycle and runoff-processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute mostly to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
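
    A minimal sketch of the autoregressive error idea above, assuming a discrete AR(1) error whose correlation decays with the time gap (a simple stand-in for the continuous-time formulation): the residual model widens the prediction intervals with lead time. The decorrelation time and noise scale are illustrative, not the values inferred for the Warsaw catchment.

        import numpy as np

        def ar1_prediction_interval(sim, dt_hours, tau=24.0, sigma=0.2, z=1.96):
            """95% intervals from an AR(1) error model with decorrelation time tau (hours).

            sim: simulated discharge series; dt_hours: spacing of the series.
            """
            phi = np.exp(-dt_hours / tau)                  # lag-one correlation
            n = len(sim)
            # Marginal error std grows from 0 (last assimilated point) toward sigma.
            var = sigma**2 * (1 - phi ** (2 * np.arange(1, n + 1)))
            half_width = z * np.sqrt(var) * sim            # multiplicative error assumption
            return sim - half_width, sim + half_width

        sim = np.array([5.0, 7.5, 12.0, 9.0, 6.5])         # illustrative forecast (m3/s)
        lo, hi = ar1_prediction_interval(sim, dt_hours=1.0)
        print(np.round(lo, 2), np.round(hi, 2))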

  5. Evaluating uncertainty in simulation models

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not now exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
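
    The variance-based importance measure mentioned above can be sketched directly: for each input, compare the full prediction y with the restricted prediction ỹ obtained by freezing that input at a nominal value, and score inputs by the quadratic loss E[(y - ỹ)^2]. The three-input toy model is an illustrative assumption.

        import numpy as np

        def model(x1, x2, x3):                     # toy simulation model
            return x1 + 0.5 * x2**2 + 0.1 * x3

        rng = np.random.default_rng(2)
        n = 100_000
        X = rng.normal(0, 1, size=(n, 3))
        y = model(*X.T)

        for i in range(3):
            Xr = X.copy()
            Xr[:, i] = 0.0                         # restricted prediction: input i frozen
            y_tilde = model(*Xr.T)
            loss = np.mean((y - y_tilde) ** 2)     # quadratic loss of the difference
            print(f"input {i + 1}: E[(y - y~)^2] = {loss:.3f}")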

  6. A Bayesian belief network approach for assessing uncertainty in conceptual site models at contaminated sites

    Science.gov (United States)

    Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads

    2016-05-01

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information
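
    A minimal numerical sketch of the sequential belief update over the four CSMs: start from a uniform prior and apply Bayes' rule with the likelihood of each new piece of evidence under each model. The likelihood table is invented for illustration and does not reproduce the paper's BBN.

        import numpy as np

        csms = ["source+fractured", "source+unfractured",
                "no-source+fractured", "no-source+unfractured"]
        belief = np.full(4, 0.25)                  # uniform prior over the four CSMs

        # P(observation | CSM) for three investigation stages (illustrative numbers).
        likelihoods = np.array([
            [0.6, 0.3, 0.4, 0.2],                  # screening investigation
            [0.7, 0.2, 0.3, 0.1],                  # detailed investigation
            [0.8, 0.3, 0.2, 0.1],                  # expert consultation
        ])

        for stage, lik in zip(["screening", "detailed", "expert"], likelihoods):
            belief = belief * lik
            belief /= belief.sum()                 # Bayes update, renormalized
            print(stage, np.round(belief, 3))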

  8. Uncertainty propagation in a radionuclide transport model for performance assessment of a nuclear waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Dutfoy, A. [Electricite de France R and D Safety and Reliability Branch (EDF), 92 - Clamart (France)]; Bouton, M. [Electricite de France R and D National Hydraulic Lab. and Environment (EDF), 78 - Chatou (France)]

    2001-07-01

    Given the complexity of the phenomena involved, performance assessment of a nuclear waste disposal requires numerical modelling. Because many of the input parameters of the models are uncertain, analysis of uncertainties and their impact on the probabilistic outcome has become of major importance. This paper presents the EDF Research and Development Division methodology for propagating uncertainties arising from the parameters through models. This reliability approach provides two important quantitative results: an estimate of the probability that the outcome exceeds some specified threshold level (called the failure event), and a probabilistic sensitivity measure which quantifies the relative importance of each uncertain variable with respect to the probabilistic outcome. Such results could become an integral component of the decision process for the nuclear disposal. The reliability method proposed in this paper is applied to a radionuclide transport model. (authors)
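
    The two quantitative outputs described above can be illustrated with plain Monte Carlo standing in for the FORM/SORM-type reliability methods typically used in such studies: estimate the probability that a toy transport outcome exceeds a threshold, and rank the uncertain inputs with a simple probabilistic sensitivity measure. The input distributions, outcome function and threshold are all assumptions.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 200_000
        # Uncertain inputs of a toy transport model.
        k = rng.lognormal(-2.0, 0.5, n)                     # permeability-like factor
        R = rng.uniform(1.0, 5.0, n)                        # retardation factor
        q = rng.normal(1.0, 0.2, n)                         # source term

        outcome = q * k / R                                 # toy outcome (e.g., peak flux)
        threshold = 0.1
        failure = outcome > threshold                       # the "failure event"
        print(f"P(outcome > threshold) ~= {failure.mean():.4f}")

        for name, x in [("k", k), ("R", R), ("q", q)]:
            sens = np.corrcoef(x, failure.astype(float))[0, 1]
            print(f"sensitivity of failure to {name}: {sens:+.3f}")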

  9. A practical method to assess model sensitivity and parameter uncertainty in C cycle models

    Science.gov (United States)

    Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy

    2015-04-01

    The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells, to several hundred years for the exchange between the atmosphere and the deep ocean, and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists in finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the three following conditions hold: 1) there exists a solution, 2) the solution is unique and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. The inverse problem is often ill-posed, so a regularization method is required to replace the original problem with a well-posed one; a solution strategy then amounts to 1) constructing a solution x, 2) assessing the validity of the solution, and 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, EnKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed on the fact that parameters and initial stocks directly related to fast processes were best estimated with narrow confidence intervals, whereas those related to slow processes were poorly estimated with very large uncertainties. While other studies have tried to overcome this difficulty by adding complementary
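
    The regularization step can be made concrete with Tikhonov (ridge) regularization, which replaces the ill-posed least-squares problem for h(x) = y with the minimization of ||Hx - y||^2 + lambda*||x||^2 for a linearized model H. The matrix and lambda below are illustrative.

        import numpy as np

        rng = np.random.default_rng(4)
        n, m = 20, 15                                  # observations, state dimension
        # Ill-conditioned linearized model: columns with rapidly decaying scales.
        H = rng.normal(size=(n, m)) @ np.diag(1.0 / np.arange(1, m + 1))
        x_true = rng.normal(size=m)
        y = H @ x_true + rng.normal(0, 0.05, size=n)   # noisy observations

        lam = 1e-2                                     # regularization strength
        # Tikhonov solution: x = (H^T H + lam I)^(-1) H^T y
        x_hat = np.linalg.solve(H.T @ H + lam * np.eye(m), H.T @ y)
        x_ls = np.linalg.lstsq(H, y, rcond=None)[0]    # unregularized, for comparison

        print("regularized error :", np.linalg.norm(x_hat - x_true))
        print("plain LS error    :", np.linalg.norm(x_ls - x_true))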

  10. Assessing Uncertainties of Water Footprints Using an Ensemble of Crop Growth Models on Winter Wheat

    Directory of Open Access Journals (Sweden)

    Kurt Christian Kersebaum

    2016-12-01

    Full Text Available Crop productivity and water consumption form the basis for calculating the water footprint (WF) of a specific crop. Under current climate conditions, calculated evapotranspiration is related to observed crop yields to calculate the WF. The assessment of WF under future climate conditions requires the simulation of crop yields, adding further uncertainty. To assess the uncertainty of model-based assessments of WF, an ensemble of crop models was applied to data from five field experiments across Europe. Only limited data were provided for a rough calibration, which corresponds to the typical situation for regional assessments, where data availability is limited. Up to eight models were applied for wheat. The coefficient of variation for the simulated actual evapotranspiration between models was in the range of 13%–19%, which was higher than the inter-annual variability. Simulated yields showed a higher variability between models, in the range of 17%–39%. Models responded differently to elevated CO2 in a FACE (Free-Air Carbon Dioxide Enrichment) experiment, especially regarding the reduction of water consumption. The variability of the calculated WF between models was in the range of 15%–49%. Yield predictions contributed more to this variance than the estimation of water consumption. Transpiration accounts on average for 51%–68% of the total actual evapotranspiration.

  11. Uncertainty in Integrated Assessment Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Mort Webster

    2005-10-17

    The determination of climate policy is a decision under uncertainty. The uncertainty in future climate change impacts is large, as is the uncertainty in the costs of potential policies. Rational and economically efficient policy choices will therefore seek to balance the expected marginal costs with the expected marginal benefits. This approach requires that the risks of future climate change be assessed. The decision process need not be formal or quantitative for descriptions of the risks to be useful. Whatever the decision procedure, a useful starting point is to have as accurate a description of climate risks as possible. Given the goal of describing uncertainty in future climate change, we need to characterize the uncertainty in the main causes of uncertainty in climate impacts. One of the major drivers of uncertainty in future climate change is the uncertainty in future emissions, both of greenhouse gases and other radiatively important species such as sulfur dioxide. In turn, the drivers of uncertainty in emissions are uncertainties in the determinants of the rate of economic growth and in the technologies of production and how those technologies will change over time. This project uses historical experience and observations from a large number of countries to construct statistical descriptions of variability and correlation in labor productivity growth and in AEEI. The observed variability then provides a basis for constructing probability distributions for these drivers. The variance of uncertainty in growth rates can be further modified by expert judgment if it is believed that future variability will differ from the past. But often, expert judgment is more readily applied to projected median or expected paths through time. Analysis of past variance and covariance provides initial assumptions about future uncertainty for quantities that are less intuitive and difficult for experts to estimate, and these variances can be normalized and then applied to mean

  12. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
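
    The variance decomposition described above can be sketched compactly: run every combination of forcing ensemble member and parameter set, then split the total variance of a simulated flow metric into forcing, parameter and interaction terms using the standard two-way ANOVA identity. The synthetic response below is illustrative.

        import numpy as np

        rng = np.random.default_rng(5)
        n_forcing, n_params = 20, 30
        forcing_eff = rng.normal(0, 1.0, n_forcing)[:, None]   # forcing main effects
        param_eff = rng.normal(0, 0.5, n_params)[None, :]      # parameter main effects
        interaction = rng.normal(0, 0.3, (n_forcing, n_params))
        flow = 10.0 + forcing_eff + param_eff + interaction    # simulated streamflow metric

        grand = flow.mean()
        ss_f = n_params * ((flow.mean(axis=1) - grand) ** 2).sum()
        ss_p = n_forcing * ((flow.mean(axis=0) - grand) ** 2).sum()
        ss_tot = ((flow - grand) ** 2).sum()
        ss_int = ss_tot - ss_f - ss_p                          # remainder = interaction term
        for name, ss in [("forcing", ss_f), ("parameters", ss_p), ("interaction", ss_int)]:
            print(f"{name:12s}: {100 * ss / ss_tot:5.1f}% of total variance")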

  13. Epistemic uncertainty in the ranking and categorization of probabilistic safety assessment model elements: issues and findings.

    Science.gov (United States)

    Borgonovo, Emanuele

    2008-08-01

    In this work, we study the effect of epistemic uncertainty on the ranking and categorization of elements of probabilistic safety assessment (PSA) models. We show that, while in a deterministic setting a PSA element belongs to a given category univocally, in the presence of epistemic uncertainty a PSA element belongs to a given category only with a certain probability. We propose an approach to estimate these probabilities, showing that their knowledge allows one to appreciate "the sensitivity of component categorizations to uncertainties in the parameter values" (U.S. NRC Regulatory Guide 1.174). We investigate the meaning and utilization of an assignment method based on the expected value of importance measures. We discuss the problem of evaluating changes in quality assurance, maintenance activities prioritization, etc. in the presence of epistemic uncertainty. We show that the inclusion of epistemic uncertainty in the evaluation makes it necessary to evaluate changes through their effect on PSA model parameters. We propose a categorization of parameters based on the Fussell-Vesely and differential importance (DIM) measures. In addition, issues arise in the calculation of the expected value of the joint importance measure when evaluating changes affecting groups of components. We illustrate that the problem can be solved using DIM. A numerical application to a case study concludes the work.
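
    The category-membership probability can be illustrated with a toy PSA: sample basic-event probabilities from their epistemic distributions, compute a Fussell-Vesely importance for one component in each sample, and report how often it lands in a "high importance" category. The two-cut-set system and the 0.05 category boundary are assumptions.

        import numpy as np

        rng = np.random.default_rng(6)
        n = 50_000
        # Epistemic (lognormal) distributions for basic-event probabilities.
        pA = rng.lognormal(np.log(1e-3), 0.5, n)
        pB = rng.lognormal(np.log(2e-3), 0.5, n)
        pC = rng.lognormal(np.log(5e-3), 0.5, n)
        pD = rng.lognormal(np.log(1e-3), 0.5, n)

        top = pA * pB + pC * pD           # rare-event approximation, cut sets {A,B}, {C,D}
        fv_A = (pA * pB) / top            # Fussell-Vesely importance of component A
        category_high = fv_A > 0.05       # illustrative category boundary
        print(f"P(A is 'high importance') ~= {category_high.mean():.3f}")
        print(f"FV_A: mean {fv_A.mean():.3f}, 5-95% [{np.percentile(fv_A, 5):.3f}, "
              f"{np.percentile(fv_A, 95):.3f}]")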

  14. Bayesian Assessment of the Uncertainties of Estimates of a Conceptual Rainfall-Runoff Model Parameters

    Science.gov (United States)

    Silva, F. E. O. E.; Naghettini, M. D. C.; Fernandes, W.

    2014-12-01

    This paper evaluated the uncertainties associated with the estimation of the parameters of a conceptual rainfall-runoff model, through the use of Bayesian inference techniques by Monte Carlo simulation. The Pará River sub-basin, located in the upper São Francisco river basin in southeastern Brazil, was selected for the study. In this paper, we used the Rio Grande conceptual hydrologic model (EHR/UFMG, 2001) and the Markov chain Monte Carlo simulation method named DREAM (VRUGT, 2008a). Two probabilistic models for the residuals were analyzed: (i) the classical [Normal likelihood - r ~ N(0, σ²)]; and (ii) a generalized likelihood (SCHOUPS & VRUGT, 2010), in which it is assumed that the differences between observed and simulated flows are correlated, non-stationary, and distributed as a Skew Exponential Power density. The assumptions made for both models were checked to ensure that the estimation of uncertainties in the parameters was not biased. The results showed that the Bayesian approach proved adequate for the proposed objectives, reinforcing the importance of assessing the uncertainties associated with hydrological modeling.
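
    A stripped-down stand-in for the DREAM sampler: a single random-walk Metropolis chain targeting the posterior of a one-parameter toy model under the classical Normal residual likelihood (residual standard deviation assumed known for brevity). DREAM itself runs multiple interacting chains with adaptive proposals; this sketch only shows the accept/reject core.

        import numpy as np

        rng = np.random.default_rng(7)
        t = np.linspace(0, 10, 50)
        k_true = 0.4
        obs = np.exp(-k_true * t) + rng.normal(0, 0.05, t.size)   # synthetic "discharge"

        def log_post(k, sigma=0.05):
            if k <= 0:                                            # flat prior on k > 0
                return -np.inf
            r = obs - np.exp(-k * t)                              # residuals, assumed N(0, sigma^2)
            return -0.5 * np.sum((r / sigma) ** 2)

        k, lp = 1.0, log_post(1.0)
        samples = []
        for _ in range(20_000):
            k_new = k + rng.normal(0, 0.05)                       # random-walk proposal
            lp_new = log_post(k_new)
            if np.log(rng.uniform()) < lp_new - lp:               # Metropolis accept/reject
                k, lp = k_new, lp_new
            samples.append(k)

        post = np.array(samples[5000:])                           # discard burn-in
        print(f"k posterior: mean {post.mean():.3f}, sd {post.std():.3f}")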

  15. Modelling the Epistemic Uncertainty in the Vulnerability Assessment Component of an Earthquake Loss Model

    Science.gov (United States)

    Crowley, H.; Modica, A.

    2009-04-01

    Loss estimates have been shown in various studies to be highly sensitive to the methodology employed, the seismicity and ground-motion models, the vulnerability functions, and assumed replacement costs (e.g. Crowley et al., 2005; Molina and Lindholm, 2005; Grossi, 2000). It is clear that future loss models should explicitly account for these epistemic uncertainties. Indeed, a cause of frequent concern in the insurance and reinsurance industries is precisely the fact that, for certain regions and perils, available commercial catastrophe models often yield significantly different loss estimates. Of equal relevance to many users is the fact that updates of the models sometimes lead to very significant changes in the losses compared to the previous version of the software. In order to model the epistemic uncertainties that are inherent in loss models, a number of different approaches for the hazard, vulnerability, exposure and loss components should be clearly and transparently applied, with the shortcomings and benefits of each method clearly exposed by the developers, such that the end-users can begin to compare the results and the uncertainty in these results from different models. This paper looks at an application of a logic-tree type methodology to model the epistemic uncertainty in the vulnerability component of a loss model for Tunisia. Unlike in other countries that have been subjected to damaging earthquakes, there has not been a significant effort to undertake vulnerability studies for the building stock in Tunisia. Hence, when presented with the need to produce a loss model for a country like Tunisia, a number of different approaches can and should be applied to model the vulnerability. These include empirical procedures which utilise observed damage data, and mechanics-based methods where both the structural characteristics and response of the buildings are analytically modelled. Some preliminary applications of the methodology are presented and discussed.
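
    The logic-tree idea can be sketched as a weighted mixture of alternative vulnerability curves: each branch (say, empirical versus mechanics-based) receives a weight, the weighted mean gives the best estimate, and the spread between branches expresses the epistemic uncertainty. The curves and weights below are invented for illustration.

        import numpy as np

        intensity = np.linspace(5.0, 9.0, 9)                 # e.g., macroseismic intensity

        def frag_empirical(im):                              # branch 1: empirical curve
            return 1.0 / (1.0 + np.exp(-(im - 7.5) / 0.5))

        def frag_analytical(im):                             # branch 2: mechanics-based curve
            return 1.0 / (1.0 + np.exp(-(im - 7.0) / 0.7))

        branches = [(0.4, frag_empirical), (0.6, frag_analytical)]
        mean_dmg = sum(w * f(intensity) for w, f in branches)       # weighted best estimate
        spread = np.abs(frag_empirical(intensity) - frag_analytical(intensity))

        for im, m, s in zip(intensity, mean_dmg, spread):
            print(f"IM {im:.1f}: P(damage) = {m:.2f}  (branch spread {s:.2f})")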

  16. Assessing rainfall triggered landslide hazards through physically based models under uncertainty

    Science.gov (United States)

    Balin, D.; Metzger, R.; Fallot, J. M.; Reynard, E.

    2009-04-01

    Hazard and risk assessment require, besides good data, good simulation capabilities that allow prediction of events and their consequences. The present study introduces a landslide hazard assessment strategy based on the coupling of hydrological physically based models with slope stability models that is able to cope with uncertainty in input data and model parameters. The hydrological model used is based on the Water balance Simulation Model, WASIM-ETH (Schulla et al., 1997), a fully distributed hydrological model that has previously been used successfully in alpine regions to simulate runoff, snowmelt, glacier melt, and soil erosion, as well as the impact of climate change on these. The study region is the Vallon de Nant catchment (10 km2) in the Swiss Alps. A sensitivity analysis will be conducted in order to choose the discretization threshold, derived from a laser DEM model, at which the hydrological model yields the best compromise between performance and computation time. The hydrological model will be further coupled with slope stability methods (which use the topographic index and the soil moisture derived from the hydrological model) to simulate the spatial distribution of the initiation areas of different geomorphic processes such as debris flows and rainfall triggered landslides. To calibrate the WASIM-ETH model, the Bayesian Markov chain Monte Carlo approach is adopted (Balin, 2004, Schaefli et al., 2006). The model is used in a single- and a multi-objective frame to simulate discharge and soil moisture with uncertainty at representative locations. This information is further used to assess the potential initiation areas for rainfall triggered landslides and to study the impact of uncertain input data, model parameters and simulated responses (discharge and soil moisture) on the modelling of geomorphological processes.
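
    The coupling described above can be illustrated with the classical infinite-slope factor of safety, driven by soil-moisture draws that stand in for the hydrological model's posterior output. The geotechnical parameter values are illustrative assumptions.

        import numpy as np

        def factor_of_safety(slope_deg, wetness, z=2.0, c=5e3, phi_deg=30.0,
                             gamma=18e3, gamma_w=9.81e3):
            """Infinite-slope factor of safety; wetness = saturated fraction of soil depth."""
            beta = np.radians(slope_deg)
            phi = np.radians(phi_deg)
            resist = c + (gamma - wetness * gamma_w) * z * np.cos(beta) ** 2 * np.tan(phi)
            drive = gamma * z * np.sin(beta) * np.cos(beta)
            return resist / drive

        # Propagate soil-moisture uncertainty (e.g., posterior draws) into failure probability.
        rng = np.random.default_rng(12)
        wetness = np.clip(rng.normal(0.7, 0.15, 10_000), 0, 1)
        fs = factor_of_safety(slope_deg=35.0, wetness=wetness)
        print(f"P(FS < 1) = {(fs < 1).mean():.3f}")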

  17. Bayesian uncertainty assessment of rainfall-runoff models for small urban basins - the influence of the rating curve

    Science.gov (United States)

    Sikorska, A. E.; Scheidegger, A.; Banasik, K.; Rieckermann, J.

    2012-04-01

    Keywords: uncertainty assessment, rating curve uncertainties, Bayesian inference, rainfall-runoff models, small urban basins. In hydrological flood forecasting, the problem of quantitatively assessing predictive uncertainties has been widely recognized. Despite several important findings in recent years, which helped to distinguish the uncertainty contributions from input uncertainty (e.g., due to poor rainfall data), model structure deficits, parameter uncertainties and measurement errors, uncertainty analysis still remains a challenging task. This is especially true for small urbanized basins, where monitoring data are often poor. Among other things, measurement errors have generally been assumed to be significantly smaller than the other sources of uncertainty. It has also been shown that input error and model structure deficits contribute more to the predictive uncertainties than uncertainties regarding the model parameters (Sikorska et al., 2011). These assumptions, however, are only correct when the modeled output is directly measurable in the system. Unfortunately, river discharge usually cannot be directly measured but is converted from the measured water stage with a rating curve method. The uncertainty introduced by the rating curve was shown in recent studies (Di Baldassarre et al., 2011) to be potentially significant in flood forecasting. This is especially true when extrapolating a rating curve above the measured level, which is often the case in (urban) flooding. In this work, we therefore investigated how flood predictions for small urban basins are affected by the uncertainties associated with the rating curve. To this aim, we augmented the model structure of a conceptual rainfall-runoff model to include the applied rating curve. This enabled us not only to directly model measurable water levels instead of discharges, but also to propagate the uncertainty of the rating curve through the model. To compare the importance of the rating curve to the
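
    A minimal sketch of propagating rating-curve uncertainty: invert the power-law rating curve Q = a*(h - h0)^b to convert a simulated discharge into water level, sampling the curve parameters to carry their uncertainty into the predicted levels. The parameter distributions are illustrative.

        import numpy as np

        rng = np.random.default_rng(8)
        n = 10_000
        # Rating curve Q = a * (h - h0)^b with uncertain parameters (illustrative priors).
        a = rng.normal(12.0, 1.5, n)           # scale
        b = rng.normal(1.8, 0.1, n)            # exponent
        h0 = rng.normal(0.20, 0.03, n)         # cease-to-flow level (m)

        q_sim = 35.0                           # simulated discharge (m3/s), extrapolation range
        h = h0 + (q_sim / a) ** (1.0 / b)      # water level implied by each parameter draw

        print(f"predicted level: median {np.median(h):.2f} m, "
              f"90% interval [{np.percentile(h, 5):.2f}, {np.percentile(h, 95):.2f}] m")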

  18. Using HYSPLIT Generated Ensembles to Improve the Simulation of Plume Dispersion and Assess Model Uncertainty.

    Science.gov (United States)

    Chai, T.; Stein, A. F.; Ngan, F.

    2016-12-01

    Over the last few years, the use of dispersion model ensembles has become an increasingly attractive approach to study atmospheric transport in the lower troposphere. The HYSPLIT modeling system has a built-in capability to produce three different simulation ensembles. These ensembles have been constructed based on applied case studies using different sets of initial conditions and internal model physical parameters. They are not meant to be comprehensive and only account for some of the components of the concentration uncertainty. The first one, called the "Meteorological Grid" ensemble, is created by slightly offsetting the meteorological data to test the sensitivity of the advection calculation to the gradients in the meteorological data fields. The rationale for the shifting is to assess the effect that a limited spatial and temporal resolution meteorological data field has on the output concentration. The second, called the "Turbulence" ensemble, represents the uncertainty in the concentration calculation arising from the model's discrete characterization of the turbulent random motions of its Lagrangian particles. In this ensemble approach, the number of particles released is reduced and multiple simulations are run, each with a different random number seed. The third, the "Physics" ensemble, is constructed by varying key physical model parameters and model options such as the Lagrangian representation of the particles/puffs, Lagrangian timescales, and vertical and horizontal dispersion parameterizations. One of the biggest challenges in creating dispersion ensembles is developing the appropriate member selection process to get the most accurate results, quantify ensemble uncertainty, and use computing resources more efficiently by avoiding the use of redundant model information. In this work, we use the HYSPLIT modeling system to generate ensembles and evaluate them against the Cross-Appalachian Tracer Experiment (CAPTEX). Furthermore, we apply a reduction

  19. Sensitivity Analysis and Uncertainty Characterization of Subnational Building Energy Demand in an Integrated Assessment Model

    Science.gov (United States)

    Scott, M. J.; Daly, D.; McJeon, H.; Zhou, Y.; Clarke, L.; Rice, J.; Whitney, P.; Kim, S.

    2012-12-01

    Residential and commercial buildings are a major source of energy consumption and carbon dioxide emissions in the United States, accounting for 41% of energy consumption and 40% of carbon emissions in 2011. Integrated assessment models (IAMs) historically have been used to estimate the impact of energy consumption on greenhouse gas emissions at the national and international level. Increasingly they are being asked to evaluate mitigation and adaptation policies that have a subnational dimension. In the United States, for example, building energy codes are adopted and enforced at the state and local level. Adoption of more efficient appliances and building equipment is sometimes directed or actively promoted by subnational governmental entities for mitigation or adaptation to climate change. The presentation reports on new example results from the Global Change Assessment Model (GCAM) IAM, one of a flexibly-coupled suite of models of human and earth system interactions known as the integrated Regional Earth System Model (iRESM) system. iRESM can evaluate subnational climate policy in the context of the important uncertainties represented by national policy and the earth system. We have added a 50-state detailed U.S. building energy demand capability to GCAM that is sensitive to national climate policy, technology, regional population and economic growth, and climate. We are currently using GCAM in a prototype stakeholder-driven uncertainty characterization process to evaluate regional climate mitigation and adaptation options in a 14-state pilot region in the U.S. upper Midwest. The stakeholder-driven decision process involves several steps, beginning with identifying policy alternatives and decision criteria based on stakeholder outreach, identifying relevant potential uncertainties, then performing sensitivity analysis, characterizing the key uncertainties from the sensitivity analysis, and propagating and quantifying their impact on the relevant decisions. In the

  20. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology the model is conditioned on observation time series from two flow gauges as well as the occurrence...... if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it was shown to be quite difficult to get good fits of the whole time series....
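
    A compact sketch of the GLUE procedure named above: sample parameter sets, score each with an informal likelihood (here the Nash-Sutcliffe efficiency), keep the "behavioral" sets above a threshold, and derive likelihood-weighted prediction bounds. The toy model and the 0.7 threshold are illustrative.

        import numpy as np

        rng = np.random.default_rng(9)
        t = np.arange(60, dtype=float)
        obs = 5 + 3 * np.exp(-((t - 30) / 8.0) ** 2)          # "observed" hydrograph

        def model(peak, width):                               # toy runoff model
            return 5 + peak * np.exp(-((t - 30) / width) ** 2)

        n = 5000
        peaks = rng.uniform(0.5, 6.0, n)
        widths = rng.uniform(2.0, 20.0, n)
        sims = np.array([model(p, w) for p, w in zip(peaks, widths)])

        nse = 1 - ((sims - obs) ** 2).sum(axis=1) / ((obs - obs.mean()) ** 2).sum()
        behavioral = nse > 0.7                                # GLUE acceptance threshold
        beh, w = sims[behavioral], nse[behavioral] - 0.7      # informal likelihood weights
        w /= w.sum()

        def wquant(vals, weights, q):                         # likelihood-weighted quantile
            idx = np.argsort(vals)
            return vals[idx][np.searchsorted(np.cumsum(weights[idx]), q)]

        lo = [wquant(beh[:, i], w, 0.05) for i in range(len(t))]
        hi = [wquant(beh[:, i], w, 0.95) for i in range(len(t))]
        print(f"{behavioral.sum()} behavioral sets; 90% bounds at peak: "
              f"[{lo[30]:.2f}, {hi[30]:.2f}]")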

  1. A software tool to assess uncertainty in transient-storage model parameters using Monte Carlo simulations

    Science.gov (United States)

    Ward, Adam S.; Kelleher, Christa A.; Mason, Seth J. K.; Wagener, Thorsten; McIntyre, Neil; McGlynn, Brian L.; Runkel, Robert L.; Payn, Robert A.

    2017-01-01

    Researchers and practitioners alike often need to understand and characterize how water and solutes move through a stream in terms of the relative importance of in-stream and near-stream storage and transport processes. In-channel and subsurface storage processes are highly variable in space and time and difficult to measure. Storage estimates are commonly obtained using transient-storage models (TSMs) of the experimentally obtained solute-tracer test data. The TSM equations represent key transport and storage processes with a suite of numerical parameters. Parameter values are estimated via inverse modeling, in which parameter values are iteratively changed until model simulations closely match observed solute-tracer data. Several investigators have shown that TSM parameter estimates can be highly uncertain. When this is the case, parameter values cannot be used reliably to interpret stream-reach functioning. However, authors of most TSM studies do not evaluate or report parameter certainty. Here, we present a software tool linked to the One-dimensional Transport with Inflow and Storage (OTIS) model that enables researchers to conduct uncertainty analyses via Monte-Carlo parameter sampling and to visualize uncertainty and sensitivity results. We demonstrate application of our tool to 2 case studies and compare our results to output obtained from more traditional implementation of the OTIS model. We conclude by suggesting best practices for transient-storage modeling and recommend that future applications of TSMs include assessments of parameter certainty to support comparisons and more reliable interpretations of transport processes.

  2. A meta model-based methodology for an energy savings uncertainty assessment of building retrofitting

    Directory of Open Access Journals (Sweden)

    Caucheteux Antoine

    2016-01-01

    Full Text Available To reduce greenhouse gas emissions, energy retrofitting of the building stock offers significant potential for energy savings. In the design stage, energy savings are usually assessed through Building Energy Simulation (BES). The main difficulty is to first assess the energy efficiency of the existing building, in other words, to calibrate the model. As calibration is an underdetermined problem, there are many solutions for representing the building in simulation tools. In this paper, a method is proposed to assess not only energy savings but also their uncertainty. Meta models, built using experimental designs, are used to identify many acceptable calibrations: sets of parameters that provide the most accurate representation of the building are retained to calculate energy savings. The method was applied to an existing office building modeled with the TRNSYS BES. The meta model, using 13 parameters, is built with no more than 105 simulations. The evaluation of the meta model on thousands of new simulations gives a normalized mean bias error between the meta model and the BES of <4%. Energy savings are assessed based on six energy savings concepts, which indicate savings of 2–45% with a standard deviation ranging between 1.3% and 2.5%.
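
    The screening step can be sketched as follows: fit a cheap polynomial meta model to a modest experimental design of "expensive" simulation runs, then evaluate many candidate parameter sets against the metered consumption and keep every set within an acceptance band. The stand-in building model, the parameter ranges and the 4% band are assumptions, not the paper's TRNSYS setup.

        import numpy as np

        rng = np.random.default_rng(10)

        def bes(params):                        # stand-in for an expensive BES run
            u_wall, infiltration, setpoint = params.T
            return 80 + 60 * u_wall + 25 * infiltration + 9 * (setpoint - 19)

        # Experimental design: a modest number of "expensive" simulations.
        X = rng.uniform([0.2, 0.1, 18], [2.0, 1.0, 22], size=(105, 3))
        y = bes(X)

        # Quadratic polynomial meta model fitted by least squares.
        def features(X):
            return np.hstack([np.ones((len(X), 1)), X, X**2])
        coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)
        meta = lambda X: features(X) @ coef

        # Screen many cheap meta-model evaluations against measured consumption.
        measured = 160.0                        # kWh/m2/yr, illustrative
        cand = rng.uniform([0.2, 0.1, 18], [2.0, 1.0, 22], size=(100_000, 3))
        nmbe = (meta(cand) - measured) / measured
        acceptable = cand[np.abs(nmbe) < 0.04]  # keep calibrations within a 4% band
        print(f"{len(acceptable)} acceptable calibrations out of 100000")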

  3. Recent developments in predictive uncertainty assessment based on the model conditional processor approach

    Directory of Open Access Journals (Sweden)

    G. Coccia

    2011-10-01

    Full Text Available The work aims at discussing the role of predictive uncertainty in flood forecasting and flood emergency management, its relevance for improving the decision-making process, and the techniques to be used for its assessment.

    Real time flood forecasting requires taking predictive uncertainty into account for a number of reasons. Deterministic hydrological/hydraulic forecasts give useful information about real future events, but their predictions, as usually done in practice, cannot be taken as real future occurrences; rather, they should be used as pseudo-measurements of future occurrences in order to reduce the uncertainty of decision makers. Predictive Uncertainty (PU) is in fact defined as the probability of occurrence of a future value of a predictand (such as water level, discharge or water volume) conditional upon prior observations and knowledge as well as on all the information we can obtain on that specific future value from model forecasts. When dealing with commensurable quantities, as in the case of floods, PU must be quantified in terms of a probability distribution function which will be used by the emergency managers in their decision process in order to improve the quality and reliability of their decisions.

    After introducing the concept of PU, the presently available processors are introduced and discussed in terms of their benefits and limitations. In this work the Model Conditional Processor (MCP) has been extended to allow the use of two joint Truncated Normal Distributions (TNDs), in order to improve adaptation to low and high flows.

    The paper concludes by showing the results of the application of the MCP on two case studies, the Po river in Italy and the Baron Fork river, OK, USA. In the Po river case the data provided by the Civil Protection of the Emilia Romagna region have been used to implement an operational example, where the predicted variable is the observed water level. In the Baron Fork River
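
    The truncated-normal ingredient can be illustrated with scipy's truncnorm: below, a predictive density in normal space is assembled from two truncated normals joined at a threshold separating low and high flows. All parameter values are invented; the MCP proper derives them from the joint forecast-observation sample in normal space.

        import numpy as np
        from scipy.stats import truncnorm

        def tn(lo, hi, mu, sd):
            """Truncated normal on [lo, hi] with underlying mean mu and sd."""
            a, b = (lo - mu) / sd, (hi - mu) / sd
            return truncnorm(a, b, loc=mu, scale=sd)

        threshold = 0.0                 # normal-space level separating low and high flows
        low = tn(-np.inf, threshold, mu=-0.8, sd=0.5)   # conditional model for low flows
        high = tn(threshold, np.inf, mu=1.1, sd=0.7)    # conditional model for high flows
        w_low = 0.6                     # weight = P(predictand below threshold | forecast)

        def predictive_pdf(z):          # piecewise predictive density in normal space
            return w_low * low.pdf(z) + (1 - w_low) * high.pdf(z)

        zs = np.linspace(-3, 3, 7)
        print(np.round([predictive_pdf(z) for z in zs], 3))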

  4. Critical loads - assessment of uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Barkman, A.

    1998-10-01

    The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO2 concentrations in northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO2 concentrations. Modifying SO2 concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data

  5. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining...... the identifiability of the parameters and results in satisfactory multi-variable simulations and uncertainty estimates. However, the parameter uncertainty alone cannot explain the total uncertainty at all the sites, due to limitations in the distributed data included in the model calibration. The study also indicates...

  6. Uncertainty in runoff based on Global Climate Model precipitation and temperature data – Part 1: Assessment of Global Climate Models

    Directory of Open Access Journals (Sweden)

    T. A. McMahon

    2014-05-01

    Full Text Available Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between Global Climate Models (GCMs) and within a GCM. Uncertainty between GCM projections of future climate can be assessed through analysis of runs of a given scenario from a wide range of GCMs. Within-GCM uncertainty is the variability in GCM output that occurs when a scenario is run multiple times, with each run having slightly different, but equally plausible, initial conditions. The objective of this, the first of two complementary papers, is to reduce between-GCM uncertainty by identifying and removing poorly performing GCMs prior to the analysis presented in the second paper. Here we assess how well 46 runs from 22 Coupled Model Intercomparison Project phase 3 (CMIP3) GCMs are able to reproduce observed precipitation and temperature climatological statistics. The performance of each GCM in reproducing these statistics was ranked and better performing GCMs identified for later analyses. Observed global land surface precipitation and temperature data were drawn from the CRU 3.10 gridded dataset and re-sampled to the resolution of each GCM for comparison. Observed and GCM-based estimates of mean and standard deviation of annual precipitation, mean annual temperature, mean monthly precipitation and temperature and Köppen climate type were compared. The main metrics for assessing GCM performance were the Nash–Sutcliffe efficiency index and RMSE between modelled and observed long-term statistics. This information combined with a literature review of the performance of the CMIP3 models identified the following five models as the better performing models for the next phase of our analysis in assessing the uncertainty in runoff estimated from GCM projections of precipitation and temperature: HadCM3 (Hadley Centre for Climate Prediction and Research, MIROCM (Center for Climate System Research (The University of Tokyo, National
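
    The two ranking metrics named above are straightforward to compute; a sketch for one GCM's mean monthly precipitation climatology against observations (synthetic numbers, not CRU or CMIP3 data):

        import numpy as np

        def nse(obs, mod):
            """Nash-Sutcliffe efficiency: 1 is perfect, <0 is worse than the observed mean."""
            return 1 - np.sum((obs - mod) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def rmse(obs, mod):
            return np.sqrt(np.mean((obs - mod) ** 2))

        obs = np.array([60, 55, 70, 90, 110, 130, 140, 135, 115, 95, 75, 65.0])  # mm/month
        gcm = np.array([52, 58, 75, 84, 120, 126, 150, 128, 110, 90, 70, 60.0])

        print(f"NSE = {nse(obs, gcm):.3f}, RMSE = {rmse(obs, gcm):.1f} mm/month")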

  7. Distributed energy balance modeling of South Cascade Glacier, Washington and assessment of model uncertainty

    Science.gov (United States)

    Anslow, Faron S.; Hostetler, S.; Bidlake, W.R.; Clark, P.U.

    2008-01-01

    We have developed a physically based, distributed surface energy balance model to simulate glacier mass balance under meteorological and climatological forcing. Here we apply the model to estimate summer ablation on South Cascade Glacier, Washington, for the 2004 and 2005 mass balance seasons. To arrive at optimal mass balance simulations, we investigate and quantify model uncertainty associated with selecting from a range of physical parameter values that are not commonly measured in glaciological mass balance field studies. We optimize the performance of the model by varying values for atmospheric transmissivity, the albedo of surrounding topography, precipitation-elevation lapse rate, surface roughness for turbulent exchange of momentum, and snow albedo aging coefficient. Of these the snow aging parameter and precipitation lapse rates have the greatest influence on the modeled ablation. We examined model sensitivity to varying parameters by performing an additional 10^3 realizations with parameters randomly chosen over a ±5% range centered about the optimum values. The best fit suite of model parameters yielded a net balance of -1.69±0.38 m water equivalent (WE) for the 2004 water year and -2.10±0.30 m WE up to 11 September 2005. The 2004 result is within 3% of the measured value. These simulations account for 91% and 93% of the variance in measured ablation for the respective years.

  8. Hanford groundwater modeling: statistical methods for evaluating uncertainty and assessing sampling effectiveness

    Energy Technology Data Exchange (ETDEWEB)

    McLaughlin, D.B.

    1979-01-01

    This report is the first in a series of three documents which address the role of uncertainty in the Rockwell Hanford Operations groundwater model development and application program at Hanford Site. Groundwater data collection activities at Hanford are reviewed as they relate to Rockwell groundwater modeling. Methods of applying statistical and probability theory in quantifying the propagation of uncertainty from field measurements to model predictions are discussed. It is shown that measures of model accuracy or uncertainty provided by a statistical analysis can be useful in guiding model development and sampling network design. Recommendations are presented in the areas of model input data needs, parameter estimation data needs, and model verification and variance estimation data needs. 8 figures.

  9. Droplet number uncertainties associated with CCN: an assessment using observations and a global model adjoint

    Directory of Open Access Journals (Sweden)

    R. H. Moore

    2013-04-01

    Full Text Available We use the Global Modelling Initiative (GMI) chemical transport model with a cloud droplet parameterisation adjoint to quantify the sensitivity of cloud droplet number concentration to uncertainties in predicting CCN concentrations. Published CCN closure uncertainties for six different sets of simplifying compositional and mixing state assumptions are used as proxies for modelled CCN uncertainty arising from application of those scenarios. It is found that cloud droplet number concentrations (Nd) are fairly insensitive to the number concentration (Na) of aerosol which act as CCN over the continents (∂lnNd/∂lnNa ~ 10–30%), but the sensitivities exceed 70% in pristine regions such as the Alaskan Arctic and remote oceans. This means that CCN concentration uncertainties of 4–71% translate into only 1–23% uncertainty in cloud droplet number, on average. Since most of the anthropogenic indirect forcing is concentrated over the continents, this work shows that the application of Köhler theory and attendant simplifying assumptions in models is not a major source of uncertainty in predicting cloud droplet number or anthropogenic aerosol indirect forcing for the liquid, stratiform clouds simulated in these models. However, it does highlight the sensitivity of some remote areas to pollution brought into the region via long-range transport (e.g., biomass burning) or from seasonal biogenic sources (e.g., phytoplankton as a source of dimethylsulfide in the southern oceans). Since these transient processes are not captured well by the climatological emissions inventories employed by current large-scale models, the uncertainties in aerosol-cloud interactions during these events could be much larger than those uncovered here. This finding motivates additional measurements in these pristine regions, for which few observations exist, to quantify the impact (and associated uncertainty) of transient aerosol processes on cloud properties.

  10. Towards Robust Energy Systems Modeling: Examining Uncertainty in Fossil Fuel-Based Life Cycle Assessment Approaches

    Science.gov (United States)

    Venkatesh, Aranya

    Increasing concerns about the environmental impacts of fossil fuels used in the U.S. transportation and electricity sectors have spurred interest in alternate energy sources, such as natural gas and biofuels. Life cycle assessment (LCA) methods can be used to estimate the environmental impacts of incumbent energy sources and potential impact reductions achievable through the use of alternate energy sources. Some recent U.S. climate policies have used the results of LCAs to encourage the use of low carbon fuels to meet future energy demands in the U.S. However, the LCA methods used to estimate potential reductions in environmental impact have some drawbacks. First, the LCAs are predominantly based on deterministic approaches that do not account for any uncertainty inherent in life cycle data and methods. Such methods overstate the accuracy of the point estimate results, which could in turn lead to incorrect and (consequent) expensive decision-making. Second, system boundaries considered by most LCA studies tend to be limited (considered a manifestation of uncertainty in LCA). Although LCAs can estimate the benefits of transitioning to energy systems of lower environmental impact, they may not be able to characterize real world systems perfectly. Improved modeling of energy systems mechanisms can provide more accurate representations of reality and define more likely limits on potential environmental impact reductions. This dissertation quantitatively and qualitatively examines the limitations in LCA studies outlined previously. The first three research chapters address the uncertainty in life cycle greenhouse gas (GHG) emissions associated with petroleum-based fuels, natural gas and coal consumed in the U.S. The uncertainty in life cycle GHG emissions from fossil fuels was found to range between 13 and 18% of their respective mean values. For instance, the 90% confidence interval of the life cycle GHG emissions of average natural gas consumed in the U.S was found to

  11. High resolution multi model Climate change scenario over India including first uncertainty assessment

    Science.gov (United States)

    Kumar, P.; Wiltshire, A.; Asharaf, S.; Ahrens, B.; Lucas-Picher, P.; Christensen, J. H.; Gobiet, A.; Saeed, F.; Hagemann, S.; Jacob, D.

    2011-12-01

    This study presents possible regional climate change over South Asia (SA), with a focus on India, as simulated by three very-high-resolution regional climate models. The models are driven by the same lateral boundary conditions from two global models (ECHAM5-MPIOM and HadCM3) under the IPCC AR4 SRES A1B scenario at a horizontal resolution of ~25 km, except for one model, which is driven by only one GCM. The results are presented for two time slices, 2021-2050 and 2070-2099. The analysis concentrates on precipitation and temperature over land and focuses mainly on the monsoon season. Circulation parameters are also discussed. In general, all models show a clear signal of gradual wide-spread warming throughout the 21st century. The ensemble-mean warming evident at the end of the 2021-2050 period is 1-2 K, whereas it is 3-5 K at the end of the century. The projected pattern of precipitation change shows spatial variability. An increase in precipitation is noticed over peninsular and coastal areas, and no change or a decrease over areas away from the ocean. The influence of the driving GCM on the projected precipitation change simulated with each RCM is as strong as the variability among the RCMs driven with one GCM. Some results of a first uncertainty assessment are also presented.

  12. Model Uncertainty via the Integration of Hormesis and LNT as the Default in Cancer Risk Assessment

    Directory of Open Access Journals (Sweden)

    Edward J. Calabrese

    2015-12-01

    Full Text Available On June 23, 2015, the US Nuclear Regulatory Commission (NRC) issued a formal notice in the Federal Register that it would consider whether “it should amend its ‘Standards for Protection Against Radiation’ regulations from the linear non-threshold (LNT) model of radiation protection to the hormesis model.” The present commentary supports this recommendation based on the (1) flawed and deceptive history of the adoption of LNT by the US National Academy of Sciences (NAS) in 1956; (2) the documented capacity of hormesis to make more accurate predictions of biological responses for diverse biological end points in the low-dose zone; (3) the occurrence of extensive hormetic data from the peer-reviewed biomedical literature that revealed hormetic responses are highly generalizable, being independent of biological model, end point measured, inducing agent, level of biological organization, and mechanism; and (4) the integration of hormesis and LNT models via a model uncertainty methodology that optimizes public health responses at 10^-4. Thus, both LNT and hormesis can be integratively used for risk assessment purposes, and this integration defines the so-called “regulatory sweet spot.”

  13. Assessment of structural model and parameter uncertainty with a multi-model system for soil water balance models

    Science.gov (United States)

    Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz

    2016-04-01

    Water for agriculture is strongly limited in arid and semi-arid regions and is often of low quality in terms of salinity. The application of saline waters for irrigation increases the salt load in the rooting zone and has to be managed by leaching to maintain a healthy soil, i.e. to wash out salts by additional irrigation. Dynamic simulation models are helpful tools for calculating root zone water fluxes and soil salinity content in order to investigate best management practices. However, there is little information on structural and parameter uncertainty for simulations regarding the water and salt balance of saline irrigation. Hence, we established a multi-model system with four different models (AquaCrop, RZWQM, SWAP, Hydrus1D/UNSATCHEM) to analyze the structural and parameter uncertainty using the Generalized Likelihood Uncertainty Estimation (GLUE) method. Hydrus1D/UNSATCHEM and SWAP were set up with multiple sets of different implemented functions (e.g. matric and osmotic stress for root water uptake), which results in a broad range of different model structures. The simulations were evaluated against soil water and salinity content observations. The posterior distribution of the GLUE analysis gives behavioral parameter sets and reveals uncertainty intervals for parameter uncertainty. Throughout all of the model sets, most parameters accounting for the soil water balance show a low uncertainty; only one or two out of five to six parameters in each model set display a high uncertainty (e.g. pore-size distribution index in SWAP and Hydrus1D/UNSATCHEM). The differences between the models and model setups reveal the structural uncertainty. The highest structural uncertainty is observed for deep percolation fluxes between the model sets of Hydrus1D/UNSATCHEM (~200 mm) and RZWQM (~500 mm), which are more than twice as high for the latter. The model sets also show a high variation in uncertainty intervals for deep percolation, with an interquartile range (IQR) of

  14. A semi-empirical model to assess uncertainty of spatial patterns of erosion

    NARCIS (Netherlands)

    Sterk, G.; Vigiak, O.; Romanowicz, R.J.; Beven, K.J.

    2006-01-01

    Distributed erosion models are potentially good tools for locating soil sediment sources and guiding efficient Soil and Water Conservation (SWC) planning, but the uncertainty of model predictions may be high. In this study, the distribution of erosion within a catchment was predicted with a

  15. Droplet number prediction uncertainties from CCN: an integrated assessment using observations and a global adjoint model

    Directory of Open Access Journals (Sweden)

    R. H. Moore

    2012-08-01

    Full Text Available We use the Global Modeling Initiative (GMI) chemical transport model with a cloud droplet parameterization adjoint to quantify the sensitivity of cloud droplet number concentration to uncertainties in predicting CCN concentrations. Published CCN closure prediction uncertainties for six different sets of simplifying compositional and mixing state assumptions are used as proxies for modeled CCN uncertainty arising from application of those scenarios. It is found that cloud droplet number concentrations are fairly insensitive to CCN-active aerosol number concentrations over the continents (∂Nd/∂Na ~ 10–30%), but the sensitivities exceed 70% in pristine regions such as the Alaskan Arctic and remote oceans. Since most of the anthropogenic indirect forcing is concentrated over the continents, this work shows that the application of Köhler theory and attendant simplifying assumptions in models is not a major source of uncertainty in predicting cloud droplet number or anthropogenic aerosol indirect forcing for the liquid, stratiform clouds simulated in these models. However, it does highlight the sensitivity of some remote areas to pollution brought into the region via long-range transport (e.g. biomass burning) or from seasonal biogenic sources (e.g. phytoplankton as a source of dimethylsulfide in the southern oceans). Since these transient processes are not captured well by the climatological emissions inventories employed by current large-scale models, the uncertainties in aerosol-cloud interactions during these events could be much larger than those uncovered here. This finding motivates additional measurements in these pristine regions, which have received little attention to date, in order to quantify the impact of, and uncertainty associated with, transient processes in effecting changes in cloud properties.

  16. Quantitative assessment of uncertainties for a model of tropospheric ethene oxidation using the European Photoreactor (EUPHORE)

    Science.gov (United States)

    Zádor, Judit; Wagner, Volker; Wirtz, Klaus; Pilling, Michael J.

    Methods of uncertainty analysis were used for comparison of the Master Chemical Mechanism version 3 (MCMv3) with measurements made in the European Photoreactor (EUPHORE) at Valencia (Spain) to investigate model-measurement discrepancies and to obtain information on the importance of wall effects. Two EUPHORE smog chamber measurements of ethene oxidation, under high and low NOx conditions, were analysed by the following methods: (i) local uncertainty analysis, (ii) the global screening method of Morris and (iii) Monte Carlo (MC) analysis with Latin hypercube sampling. For both experiments, ozone (by 25% and 30%, respectively) and formaldehyde (by 34% and 40%, respectively) are significantly over-predicted by the model calculations, while the disagreement for other species is less substantial. According to the local uncertainty analysis and the Morris method, the most important contributor to ozone uncertainty under low NOx conditions is HOCH2CH2O2 + NO → HOCH2CH2O + NO2, while under high NOx conditions OH + NO2 → HNO3 is the main contributor. The MC simulations give an estimate of the 2σ uncertainty for ozone as ~20% in both scenarios at the end of the experiment. The results suggest systematic disagreement between measurements and model calculations, although the origin of this is not clear. It seems that chamber effects alone are not responsible for the observed discrepancies.
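
    Of the three methods listed, the Morris screening step is easy to show in miniature. Below is a hedged sketch on an invented three-factor algebraic function standing in for the mechanism's ozone response; grid levels, step size, and trajectory count follow the standard elementary-effects recipe, not the study's actual setup.

        # Morris elementary-effects screening on a toy model (hypothetical
        # stand-in for the chemical mechanism; not the study's code).
        import numpy as np

        rng = np.random.default_rng(0)

        def model(x):
            # Nonlinear in x0 and x1, weakly dependent on x2.
            return x[0] ** 2 + 2.0 * x[0] * x[1] + 0.1 * x[2]

        k, r, p = 3, 40, 4           # factors, trajectories, grid levels
        delta = p / (2.0 * (p - 1))  # standard Morris step size
        ee = [[] for _ in range(k)]
        for _ in range(r):
            x = rng.integers(0, p - 1, size=k) / (p - 1)  # random grid point
            for i in rng.permutation(k):     # move one factor at a time
                step = delta if x[i] + delta <= 1.0 else -delta
                y0 = model(x)
                x[i] += step
                ee[i].append((model(x) - y0) / step)

        for i in range(k):
            mu_star = np.mean(np.abs(ee[i]))   # overall importance
            sigma = np.std(ee[i])              # nonlinearity/interactions
            print(f"x{i}: mu* = {mu_star:.3f}, sigma = {sigma:.3f}")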

  17. Flood risk assessment and associated uncertainty

    Directory of Open Access Journals (Sweden)

    H. Apel

    2004-01-01

    Full Text Available Flood disaster mitigation strategies should be based on a comprehensive assessment of the flood risk combined with a thorough investigation of the uncertainties associated with the risk assessment procedure. Within the 'German Research Network of Natural Disasters' (DFNK), the working group 'Flood Risk Analysis' investigated the flood process chain from precipitation, runoff generation and concentration in the catchment, flood routing in the river network, possible failure of flood protection measures, inundation to economic damage. The working group represented each of these processes by deterministic, spatially distributed models at different scales. While these models provide the necessary understanding of the flood process chain, they are not suitable for risk and uncertainty analyses due to their complex nature and high CPU-time demand. We have therefore developed a stochastic flood risk model consisting of simplified model components associated with the components of the process chain. We parameterised these model components based on the results of the complex deterministic models and used them for the risk and uncertainty analysis in a Monte Carlo framework. The Monte Carlo framework is hierarchically structured in two layers representing two different sources of uncertainty, aleatory uncertainty (due to natural and anthropogenic variability) and epistemic uncertainty (due to incomplete knowledge of the system). The model allows us to calculate probabilities of occurrence for events of different magnitudes along with the expected economic damage in a target area in the first layer of the Monte Carlo framework, i.e. to assess the economic risks, and to derive uncertainty bounds associated with these risks in the second layer. It is also possible to identify the contributions of individual sources of uncertainty to the overall uncertainty. It could be shown that the uncertainty caused by epistemic sources significantly alters the results
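
    The hierarchical two-layer Monte Carlo structure described here can be sketched as a nested sampling loop; the distributions, the Gumbel flood-frequency form, and the damage numbers below are invented placeholders, not values from the DFNK study.

        # Two-layer Monte Carlo: epistemic uncertainty in the outer loop,
        # aleatory (natural) variability in the inner loop.
        import numpy as np

        rng = np.random.default_rng(1)
        n_epistemic, n_aleatory = 200, 2000

        expected_damage = np.empty(n_epistemic)
        for j in range(n_epistemic):
            # Outer layer: uncertain knowledge of the system, e.g. flood
            # frequency parameters and a damage-function slope.
            loc = rng.normal(1000.0, 100.0)   # m3/s
            scale = rng.normal(300.0, 50.0)   # m3/s
            slope = rng.normal(0.5, 0.1)      # damage units per 100 m3/s
            # Inner layer: natural variability of the annual peak discharge.
            q = rng.gumbel(loc, scale, n_aleatory)
            damage = np.maximum(q - 1500.0, 0.0) / 100.0 * slope
            expected_damage[j] = damage.mean()   # aleatory expectation = risk

        # Epistemic spread of the risk estimate gives the uncertainty bounds.
        print(np.percentile(expected_damage, [5, 50, 95]).round(3))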

  18. Uncertainties and assessments of chemistry-climate models of the stratosphere

    Directory of Open Access Journals (Sweden)

    J. Austin

    2003-01-01

    Full Text Available In recent years a number of chemistry-climate models have been developed with an emphasis on the stratosphere. Such models cover a wide range of time scales of integration and vary considerably in complexity. The results of specific diagnostics are here analysed to examine the differences amongst individual models and observations, and to assess the consistency of model predictions, with a particular focus on polar ozone. For example, many models indicate a significant cold bias in high latitudes, the "cold pole problem", particularly in the southern hemisphere during winter and spring. This is related to wave propagation from the troposphere, which can be improved by increasing model horizontal resolution and by using non-orographic gravity wave drag. As a result of the widely differing modelled polar temperatures, different amounts of polar stratospheric clouds are simulated, which in turn result in varying ozone values in the models. The results are also compared to determine the possible future behaviour of ozone, with an emphasis on the polar regions and mid-latitudes. All models predict eventual ozone recovery, but give a range of results concerning its timing and extent. Differences in the simulation of gravity waves and planetary waves as well as model resolution are likely major sources of uncertainty for this issue. In the Antarctic, the ozone hole has probably almost reached its maximum depth, although the vertical and horizontal extent of depletion may increase slightly further over the next few years. According to the model results, Antarctic ozone recovery could begin any year within the range 2001 to 2008. The limited number of models which have been integrated sufficiently far indicate that full recovery of ozone to 1980 levels may not occur in the Antarctic until about the year 2050. For the Arctic, most models indicate that small ozone losses may continue for a few more years and that recovery could begin any year within the range

  19. Uncertainty assessment of a dominant-process catchment model of dissolved phosphorus transfer

    Science.gov (United States)

    Dupas, Rémi; Salmon-Monviola, Jordy; Beven, Keith J.; Durand, Patrick; Haygarth, Philip M.; Hollaway, Michael J.; Gascuel-Odoux, Chantal

    2016-12-01

    We developed a parsimonious topography-based hydrologic model coupled with a soil biogeochemistry sub-model in order to improve understanding and prediction of soluble reactive phosphorus (SRP) transfer in agricultural headwater catchments. The model structure aims to capture the dominant hydrological and biogeochemical processes identified from multiscale observations in a research catchment (Kervidy-Naizin, 5 km2). Groundwater fluctuations, responsible for the connection of soil SRP production zones to the stream, were simulated with a fully distributed hydrologic model at 20 m resolution. The spatial variability of the soil phosphorus content and the temporal variability of soil moisture and temperature, which had previously been identified as key controlling factors of SRP solubilization in soils, were included as part of an empirical soil biogeochemistry sub-model. The modelling approach included an analysis of the information contained in the calibration data and propagation of uncertainty in model predictions using a generalized likelihood uncertainty estimation (GLUE) "limits of acceptability" framework. Overall, the model appeared to perform well given the uncertainty in the observational data, with a Nash-Sutcliffe efficiency on daily SRP loads between 0.1 and 0.8 for acceptable models. The role of hydrological connectivity via groundwater fluctuation and the role of increased SRP solubilization following dry/hot periods were captured well. We conclude that in the absence of near-continuous monitoring, the amount of information contained in the data is limited; hence, parsimonious models are more relevant than highly parameterized models. An analysis of uncertainty in the data is recommended for model calibration in order to provide reliable predictions.

  20. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    Science.gov (United States)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D. A.; Brogaard, Sara; van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-11-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator) using socio-economic data from the SSPs and climate data from the RCPs (representative concentration pathways). The simulated range of global cropland is 893-2380 Mha in 2100 (± 1 standard deviation), with the main uncertainties arising from differences in the socio-economic conditions prescribed by the SSP scenarios and the assumptions that underpin the translation of qualitative SSP storylines into quantitative model input parameters. Uncertainties in the assumptions for population growth, technological change and cropland degradation were found to be the most important for global cropland, while uncertainty in food consumption had less influence on the results. The uncertainties arising from climate variability and the differences between climate change scenarios do not strongly affect the range of global cropland futures. Some overlap occurred across all of the conditional probabilistic futures, except for those based on SSP3. We conclude that completely different socio-economic and climate change futures, although sharing low to medium population development, can result in very similar cropland areas on the aggregated global scale.

  1. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    DEFF Research Database (Denmark)

    Thomsen, Nanna Isbak; Troldborg, Mads; McKnight, Ursula S.

    2012-01-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen soil and water quality will decrease, threatening or destroying drinking water resources. The risk … the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We …

  2. Uncertainty quantification's role in modeling and simulation planning, and credibility assessment through the predictive capability maturity model

    Energy Technology Data Exchange (ETDEWEB)

    Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Witkowski, Walter R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-04-13

    The importance of credible, trustworthy numerical simulations is obvious especially when using the results for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation and uncertainty quantification, which are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification & validation procedure is to perform a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific application's intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent manner to perform such an assessment. Ideally, all stakeholders should be represented and contribute to an accurate credibility assessment. PIRTs and PCMMs are both briefly described below and the resulting assessments for an example project are given.

  3. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend their regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.

  4. Uncertainty and sensitivity assessments of an agricultural-hydrological model (RZWQM2) using the GLUE method

    Science.gov (United States)

    Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin

    2016-03-01

    Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models because of the many uncertainties in inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model output responses and the sensitivity of 25 parameters related to soil properties, nutrient transport and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, a combined likelihood (CL) function integrating information on water, nitrogen, and crop production was introduced in the GLUE analysis for the predictions of the following four model output responses: the total amount of water content (T-SWC) and nitrate nitrogen (T-NIT) within the 1-m soil profile, and the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). In the process of evaluating RZWQM2, measurements and meteorological data were obtained from a field experiment involving a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that the RZWQM2 model can be used to simulate crop growth and water-nitrogen migration and transformation in a wheat-maize rotation system. The uncertainty analysis using the GLUE method showed that T-NIT was sensitive to parameters related to the nitrification coefficient, maize growth characteristics during the seedling period, the wheat vernalization period, and the wheat photoperiod. Parameters for soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in the crop yield components. The prediction errors for RZWQM2 outputs with the CL function were relatively low and uniform compared with likelihood functions composed of an individual calibration criterion. This

  5. Information on Hydrologic Conceptual Models, Parameters, Uncertainty Analysis, and Data Sources for Dose Assessments at Decommissioning Sites

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.

    2000-02-28

    This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases.
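
    The Bayesian updating step mentioned in the methodology can be illustrated with a conjugate normal model: a generic database prior for log10 saturated hydraulic conductivity is updated with a few hypothetical site measurements. All numbers below are invented; the report itself does not prescribe these values.

        # Conjugate normal update of a generic parameter prior with site data.
        import numpy as np

        prior_mean, prior_var = -5.0, 1.0 ** 2    # generic log10(Ks [m/s]) prior
        meas_var = 0.3 ** 2                       # assumed measurement variance
        site_obs = np.array([-5.8, -6.1, -5.6])   # hypothetical site data

        n = site_obs.size
        post_var = 1.0 / (1.0 / prior_var + n / meas_var)
        post_mean = post_var * (prior_mean / prior_var + site_obs.sum() / meas_var)

        print(f"prior:     {prior_mean:.2f} +/- {prior_var ** 0.5:.2f}")
        print(f"posterior: {post_mean:.2f} +/- {post_var ** 0.5:.2f}")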

  6. Assessing Reliability of Cellulose Hydrolysis Models to Support Biofuel Process Design – Identifiability and Uncertainty Analysis

    DEFF Research Database (Denmark)

    Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist

    2010-01-01

    The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done …) … are not informative enough (sensitivities of 16 parameters were insignificant). This indicates that the NREL model has severe parameter uncertainty, likely to be the case for other hydrolysis models as well since similar kinetic expressions are used. To overcome this impasse, we have used the Monte Carlo procedure …

  7. Modeling uncertainty in coal resource assessments, with an application to a central area of the Gillette coal field, Wyoming

    Science.gov (United States)

    Olea, Ricardo A.; Luppens, James A.

    2014-01-01

    Standards for the public disclosure of mineral resources and reserves do not require the use of any specific methodology when it comes to estimating the reliability of the resources. Unbeknownst to most intended recipients of resource appraisals, such freedom commonly results in subjective opinions or estimations based on suboptimal approaches, such as use of distance methods. This report presents the results of a study of the third of three coal deposits in which drilling density has been increased one order of magnitude in three stages. Applying geostatistical simulation, the densest dataset was used to check the results obtained by modeling the sparser drillings. We have come up with two summary displays of results based on the same simulations, which individually and combined provide a better assessment of uncertainty than traditional qualitative resource classifications: (a) a display of cell 90 percent confidence interval versus cumulative cell tonnage, and (b) a histogram of total resources. The first graph allows classification of data into any number of bins with dividers to be decided by the assessor on the basis of a discriminating variable that is statistically accepted as a measure of uncertainty, thereby improving the quality and flexibility of the modeling. The second display expands the scope of the modeling by providing a quantitative measure of uncertainty for total tonnage, which is a fundamental concern for stockholders, geologists, and decision makers. Our approach allows us to correctly model uncertainty issues not possible to predict with distance methods, such as (a) different levels of uncertainty for individual beds with the same pattern and density of drill holes, (b) different local degrees of reduction of uncertainty with drilling densification reflecting fluctuation in the complexity of the geology, (c) average reduction in uncertainty at a disproportionately lesser rate than the reduction in area per drill hole, (d) the proportional
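
    The two summary displays can be reproduced in outline from any stack of simulated realizations. The sketch below uses synthetic correlated tonnage fields, since the Gillette data are not included here; only the mechanics of displays (a) and (b) are shown.

        # From geostatistical realizations to the two uncertainty displays.
        import numpy as np

        rng = np.random.default_rng(7)
        n_real, n_cells = 500, 400
        # Synthetic realizations of cell tonnage (kt): a shared per-realization
        # "geology" shift plus cell-level noise.
        base = 50.0 + 10.0 * rng.standard_normal((n_real, 1))
        cell_tons = np.maximum(base + 15.0 * rng.standard_normal((n_real, n_cells)), 0.0)

        # Display (a): cell 90% confidence-interval width vs cumulative tonnage.
        ci_width = (np.percentile(cell_tons, 95, axis=0)
                    - np.percentile(cell_tons, 5, axis=0))
        mean_tons = cell_tons.mean(axis=0)
        order = np.argsort(ci_width)            # most certain cells first
        half = order[: n_cells // 2]
        print(f"most-certain half of cells holds "
              f"{mean_tons[half].sum() / mean_tons.sum():.0%} of mean tonnage")

        # Display (b): histogram of total resources across realizations.
        totals = cell_tons.sum(axis=1)
        print(f"total resource: mean {totals.mean():.0f} kt, 90% interval "
              f"[{np.percentile(totals, 5):.0f}, {np.percentile(totals, 95):.0f}] kt")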

  8. Uncertainty assessment and implications for data acquisition in support of integrated hydrologic models

    OpenAIRE

    Brunner, Philip; Doherty, J.; Simmons, Craig T.

    2017-01-01

    The data set used for calibration of regional numerical models that simulate groundwater flow and vadose zone processes is often dominated by head observations. It is therefore to be expected that parameters describing vadose zone processes are poorly constrained. A number of studies on small spatial scales explored how additional data types used in calibration constrain vadose zone parameters or reduce predictive uncertainty. However, available studies focused on subsets of observati...

  9. Using the Community Land Model to Assess Uncertainty in Basin Scale GRACE-Based Groundwater Estimates

    Science.gov (United States)

    Swenson, S. C.; Lawrence, D. M.

    2015-12-01

    One method for interpreting the variability in total water storage observed by GRACE is to partition the integrated GRACE measurement into its component storage reservoirs based on information provided by hydrological models. Such models, often designed for use in coupled Earth System models, simulate the stocks and fluxes of moisture through the land surface and subsurface. One application of this method attempts to isolate groundwater changes by removing modeled surface water, snow, and soil moisture changes from GRACE total water storage estimates. Human impacts on groundwater variability can be estimated by further removing model estimates of climate-driven groundwater changes. Errors in modeled water storage components directly affect the residual groundwater estimates. Here we examine the influence of model structure and process representation on soil moisture and groundwater uncertainty using the Community Land Model, with a particular focus on basins in the western U.S.

  10. Monte Carlo approach to assess the uncertainty of wide-angle layered models: Application to the Santos Basin, Brazil

    Science.gov (United States)

    Loureiro, Afonso; Afilhado, Alexandra; Matias, Luís; Moulin, Maryline; Aslanian, Daniel

    2016-06-01

    In the Santos Basin (Brazil), two parallel wide-angle refraction profiles show different crustal structures. One shows a moderate crustal velocity gradient and a clear Moho with topography. The other has an anomalous velocity zone and no clear Moho reflections. This has large implications for the geological and geodynamical interpretation of the basin. Model uncertainties must be excluded as a source of these differences. We developed VMONTECARLO, a tool to assess the model uncertainty of layered velocity models using a Monte Carlo approach and simultaneous parameter perturbation using all picked refracted and reflected arrivals. It gives insights into the acceptable geological interpretations allowed by data and model uncertainty through velocity-depth plots that provide: a) the velocity-depth profile range that is consistent with the travel times; b) the random model that provides the best fit, keeping most of the observations covered by ray-tracing; c) insight into the dispersion of valid models; d) the main model features unequivocally required by the travel times, e.g., first-order versus second-order discontinuities, and velocity gradient magnitudes; e) parameter value probability distribution histograms. VMONTECARLO is seamlessly integrated into a RAYINVR-based modelling workflow and can be used to assess final models or sound the solution space for alternative models; it is also capable of evaluating forward models without the need for inversion, thus avoiding local minima that may trap inversion algorithms and providing information for models that are not yet well parametrised. Results for the Brazilian models show that the imaged structures are indeed geologically different and are not due to different interpretations of the same features within the model uncertainty bounds. These differences highlight the strong heterogeneity of the crust in the middle of the Santos Basin, where the rift is supposed to have failed.
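
    The core idea, simultaneous perturbation of all layer parameters with acceptance against travel-time picks, fits in a few lines. The sketch below is not the published VMONTECARLO code: it uses vertical-incidence reflection times through an invented three-layer crust instead of RAYINVR ray tracing, with made-up perturbation ranges and pick uncertainties.

        # Monte Carlo screening of layered velocity models against picks.
        import numpy as np

        rng = np.random.default_rng(3)
        true_v = np.array([2.0, 4.5, 6.8])   # km/s, layer velocities
        true_h = np.array([3.0, 10.0])       # km, layer thicknesses
        pick_sigma = 0.05                    # s, assumed pick uncertainty

        def twt(v, h):
            # Two-way vertical travel time to the base of each layer.
            return np.cumsum(2.0 * h / v[:-1])

        t_obs = twt(true_v, true_h) + rng.normal(0.0, pick_sigma, true_h.size)

        accepted = []
        for _ in range(20000):
            v = true_v * (1.0 + rng.uniform(-0.1, 0.1, true_v.size))  # +/-10%
            h = true_h * (1.0 + rng.uniform(-0.2, 0.2, true_h.size))  # +/-20%
            if np.all(np.abs(twt(v, h) - t_obs) < 2.0 * pick_sigma):
                accepted.append(np.concatenate((v, h)))

        accepted = np.array(accepted)
        print(f"{len(accepted)} acceptable models of 20000")
        print("parameter mins:", accepted.min(axis=0).round(2))
        print("parameter maxs:", accepted.max(axis=0).round(2))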

  11. Assessment of SFR Wire Wrap Simulation Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-09-30

    Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model's input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STARCCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results.

  12. Assessing trends and uncertainties in satellite-era ocean chlorophyll using space-time modeling

    Science.gov (United States)

    Hammond, Matthew L.; Beaulieu, Claudie; Sahu, Sujit K.; Henson, Stephanie A.

    2017-07-01

    The presence, magnitude, and even direction of long-term trends in phytoplankton abundance over the past few decades are still debated in the literature, primarily due to differences in the data sets and methodologies used. Recent work has suggested that the satellite chlorophyll record is not yet long enough to distinguish climate change trends from natural variability, despite the high density of coverage in both space and time. Previous work has typically focused on using linear models to determine the presence of trends, where each grid cell is considered independently from its neighbors. However, trends can be more thoroughly evaluated using a spatially resolved approach. Here a Bayesian hierarchical spatiotemporal model is fitted to quantify trends in ocean chlorophyll from September 1997 to December 2013. The approach used in this study explicitly accounts for the dependence between neighboring grid cells, which allows us to estimate trend by "borrowing strength" from the spatial correlation. By way of comparison, a model without spatial correlation is also fitted. This results in a notable loss of accuracy in model fit. Additionally, we find an order of magnitude smaller global trend, and larger uncertainty, when using the spatiotemporal model: -0.023 ± 0.12% yr⁻¹ as opposed to -0.38 ± 0.045% yr⁻¹ when the spatial correlation is not taken into account. The improvement in accuracy of trend estimates and the more complete account of their uncertainty emphasize the solution that space-time modeling offers for studying global long-term change.
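
    For contrast with the spatiotemporal model, the non-spatial baseline it is compared against, an independent least-squares trend in every grid cell, is easy to state. The sketch below runs it on synthetic chlorophyll anomalies (all values invented) and flags why its standard error is optimistic when cells are correlated.

        # Per-cell OLS trends: the no-spatial-correlation baseline.
        import numpy as np

        rng = np.random.default_rng(11)
        n_months, n_cells = 196, 1000          # ~Sep 1997 to Dec 2013
        t = np.arange(n_months) / 12.0         # time in years
        true_trend = -0.003                    # fractional change per year (invented)
        y = (true_trend * t)[:, None] + 0.05 * rng.standard_normal((n_months, n_cells))

        # OLS slope per cell: beta = cov(t, y) / var(t).
        tc = t - t.mean()
        slopes = tc @ (y - y.mean(axis=0)) / (tc @ tc)

        global_trend = slopes.mean()
        stderr = slopes.std(ddof=1) / np.sqrt(n_cells)  # too small if cells correlate
        print(f"global trend {100 * global_trend:+.3f} %/yr "
              f"+/- {100 * 1.96 * stderr:.3f} (naive)")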

  13. Assessing uncertainties in land cover projections.

    Science.gov (United States)

    Alexander, Peter; Prestele, Reinhard; Verburg, Peter H; Arneth, Almut; Baranzelli, Claudia; Batista E Silva, Filipe; Brown, Calum; Butler, Adam; Calvin, Katherine; Dendoncker, Nicolas; Doelman, Jonathan C; Dunford, Robert; Engström, Kerstin; Eitelberg, David; Fujimori, Shinichiro; Harrison, Paula A; Hasegawa, Tomoko; Havlik, Petr; Holzhauer, Sascha; Humpenöder, Florian; Jacobs-Crisioni, Chris; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Lavalle, Carlo; Lenton, Tim; Liu, Jiayi; Meiyappan, Prasanth; Popp, Alexander; Powell, Tom; Sands, Ronald D; Schaldach, Rüdiger; Stehfest, Elke; Steinbuks, Jevgenijs; Tabeau, Andrzej; van Meijl, Hans; Wise, Marshall A; Rounsevell, Mark D A

    2017-02-01

    Understanding uncertainties in land cover projections is critical to investigating land-based climate mitigation policies, assessing the potential of climate adaptation strategies and quantifying the impacts of land cover change on the climate system. Here, we identify and quantify uncertainties in global and European land cover projections over a diverse range of model types and scenarios, extending the analysis beyond the agro-economic models included in previous comparisons. The results from 75 simulations over 18 models are analysed and show a large range in land cover area projections, with the highest variability occurring in future cropland areas. We demonstrate systematic differences in land cover areas associated with the characteristics of the modelling approach, which is at least as great as the differences attributed to the scenario variations. The results lead us to conclude that a higher degree of uncertainty exists in land use projections than currently included in climate or earth system projections. To account for land use uncertainty, it is recommended to use a diverse set of models and approaches when assessing the potential impacts of land cover change on future climate. Additionally, further work is needed to better understand the assumptions driving land use model results and reveal the causes of uncertainty in more depth, to help reduce model uncertainty and improve the projections of land cover. © 2016 John Wiley & Sons Ltd.

  14. The MIT Integrated Global System Model: A facility for Assessing and Communicating Climate Change Uncertainty (Invited)

    Science.gov (United States)

    Prinn, R. G.

    2013-12-01

    The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and of physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that

  15. Uncertainty in tsunami sediment transport modeling

    Science.gov (United States)

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  16. Uncertainty analysis comes to integrated assessment models for climate change…and conversely

    NARCIS (Netherlands)

    Cooke, R.M.

    2012-01-01

    This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.

  18. Modifying climate change habitat models using tree species-specific assessments of model uncertainty and life history-factors

    Science.gov (United States)

    Stephen N. Matthews; Louis R. Iverson; Anantha M. Prasad; Matthew P. Peters; Paul G. Rodewald

    2011-01-01

    Species distribution models (SDMs) to evaluate trees' potential responses to climate change are essential for developing appropriate forest management strategies. However, there is a great need to better understand these models' limitations and evaluate their uncertainties. We have previously developed statistical models of suitable habitat, based on both...

  19. Assessing model uncertainties in climate projections of severe, mid-latitude windstorms using seamless approach

    Science.gov (United States)

    Trzeciak, T. M.; Knippertz, P.; Owen, J. S. R.

    2012-04-01

    Despite the enormous advances made in climate change research, robust projections of the position and strength of the North Atlantic storm track are not yet possible. In particular with respect to damaging windstorms, this uncertainty poses enormous risks for European societies and the (re-)insurance industry. Previous studies have addressed the problem of climate model uncertainty through statistical comparisons of simulations of the current climate with (re-)analysis data and found that there is large disagreement between different climate models, different ensemble members of the same model and observed climatologies of intense cyclones. The use of different horizontal and vertical resolutions, as well as different approaches to measuring storminess, further complicates comparison between the results from different studies. One weakness of such statistical evaluations lies in the difficulty of separating influences of the climate model's basic state, which is governed by slow processes such as ocean circulations or sea-ice transport, from the influence of fast processes such as energy fluxes from the ocean or latent heating on the development of the most intense storms. The former might generate a bias in storm counts through an incorrect occurrence frequency of storm-prone initial conditions, while the latter could generate a similar bias due to the lack of crucial dynamics of extreme cyclone intensification caused by over-simplistic model physics or insufficient horizontal resolution. Compensating effects between the two might conceal errors and suggest higher reliability than there really is. Therefore, separating sources of uncertainty is an important step towards a more reliable interpretation of climate projections and towards targeted improvements of future model generations. A possible way to separate influences of fast and slow processes in climate projections is through a "seamless" approach of hindcasting historical, severe storms with climate models

  20. Can Bayesian Belief Networks help tackling conceptual model uncertainties in contaminated site risk assessment?

    DEFF Research Database (Denmark)

    Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.;

    … models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The developed BBN combines data from desk studies and initial site investigations with expert opinion to assess which of the conceptual models are more … help inform future investigations at a contaminated site.

  1. An ensemble approach to assess hydrological models' contribution to uncertainties in the analysis of climate change impact on water resources

    Directory of Open Access Journals (Sweden)

    J. A. Velázquez

    2013-02-01

    Full Text Available Over the recent years, several research efforts investigated the impact of climate change on water resources for different regions of the world. The projection of future river flows is affected by different sources of uncertainty in the hydro-climatic modelling chain. One of the aims of the QBic3 project (Québec-Bavarian International Collaboration on Climate Change) is to assess the contribution to uncertainty of hydrological models by using an ensemble of hydrological models presenting a diversity of structural complexity (i.e., lumped, semi-distributed and distributed models). The study investigates two humid, mid-latitude catchments with natural flow conditions; one located in Southern Québec (Canada) and one in Southern Bavaria (Germany). Daily flow is simulated with four different hydrological models, forced by outputs from regional climate models driven by global climate models over a reference (1971–2000) and a future (2041–2070) period. The results show that, for our hydrological model ensemble, the choice of model strongly affects the climate change response of selected hydrological indicators, especially those related to low flows. Indicators related to high flows seem less sensitive to the choice of the hydrological model.

  2. On the use of hierarchical probabilistic models for characterizing and managing uncertainty in risk/safety assessment.

    Science.gov (United States)

    Kodell, Ralph L; Chen, James J

    2007-04-01

    A general probabilistically-based approach is proposed for both cancer and noncancer risk/safety assessments. The familiar framework of the original ADI/RfD formulation is used, substituting in the numerator a benchmark dose derived from a hierarchical pharmacokinetic/pharmacodynamic model and in the denominator a unitary uncertainty factor derived from a hierarchical animal/average human/sensitive human model. The empirical probability distributions of the numerator and denominator can be combined to produce an empirical human-equivalent distribution for an animal-derived benchmark dose in external-exposure units.
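
    The numerator/denominator construction can be emulated by dividing two empirical distributions sample by sample. The lognormal shapes and parameter values below are invented for illustration and are not taken from the paper.

        # Probabilistic ADI/RfD: a benchmark-dose distribution divided by a
        # unitary uncertainty-factor distribution (hypothetical parameters).
        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000
        bmd = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)  # mg/kg-day
        uf = rng.lognormal(mean=np.log(30.0), sigma=0.6, size=n)   # unitless factor

        hed = bmd / uf   # human-equivalent "safe dose" distribution

        # A protective reference value can be read off a lower percentile.
        print(f"median {np.median(hed):.3f} mg/kg-day, "
              f"5th percentile {np.percentile(hed, 5):.4f} mg/kg-day")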

  3. An ensemble approach to assess hydrological models' contribution to uncertainties in the analysis of climate change impact on water resources

    Directory of Open Access Journals (Sweden)

    J. A. Velázquez

    2012-06-01

    Full Text Available Over the recent years, several research efforts investigated the impact of climate change on water resources for different regions of the world. The projection of future river flows is affected by different sources of uncertainty in the hydro-climatic modelling chain. One of the aims of the QBic3 project (Québec-Bavarian International Collaboration on Climate Change) is to assess the contribution to uncertainty of hydrological models by using an ensemble of hydrological models presenting a diversity of structural complexity (i.e., lumped, semi-distributed and distributed models). The study investigates two humid, mid-latitude catchments with natural flow conditions; one located in Southern Québec (Canada) and one in Southern Bavaria (Germany). Daily flow is simulated with four different hydrological models, forced by outputs from regional climate models driven by a given number of GCM members over a reference (1971–2000) and a future (2041–2070) period. The results show that the choice of the hydrological model strongly affects the climate change response of selected hydrological indicators, especially those related to low flows. Indicators related to high flows seem less sensitive to the choice of the hydrological model. Therefore, the computationally less demanding models (usually simple, lumped and conceptual) give a significant level of trust for high and overall mean flows.

  4. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    Directory of Open Access Journals (Sweden)

    T. O. Sonnenborg

    2015-04-01

    Full Text Available Projections of climate change impact are associated with a cascade of uncertainties including CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is a result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in projected hydrological variables are evaluated and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projection of travel time and capture zones, the uncertainty on the climate models is more important for groundwater hydraulic heads and stream flow.
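
    A back-of-envelope version of the variance split behind this conclusion: average the 6 x 11 matrix of projected changes over one factor at a time and compare the spreads. The numbers below are synthetic, chosen only so that the geology effect dominates, as the abstract reports for travel time.

        # Crude variance decomposition for a geology x climate-model ensemble.
        import numpy as np

        rng = np.random.default_rng(9)
        n_geo, n_clim = 6, 11
        change = (rng.normal(0.0, 8.0, (n_geo, 1))          # geology effect
                  + rng.normal(0.0, 3.0, (1, n_clim))       # climate-model effect
                  + rng.normal(0.0, 1.0, (n_geo, n_clim)))  # interaction/noise

        var_geo = change.mean(axis=1).var(ddof=1)    # spread across geologies
        var_clim = change.mean(axis=0).var(ddof=1)   # spread across climate models
        total = change.var(ddof=1)
        print(f"geology: {var_geo / total:.0%}, "
              f"climate models: {var_clim / total:.0%} of total variance")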

  5. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  6. Assessing flood forecast uncertainty with fuzzy arithmetic

    Directory of Open Access Journals (Sweden)

    de Bruyn Bertrand

    2016-01-01

    Full Text Available Forecasts of flow rates and water levels during floods have to be accompanied by uncertainty estimates. There are several sources of forecast uncertainty. For hydrological (rainfall-runoff) forecasts performed with a deterministic hydrological model based on simple physics, two main sources can be identified. The first, obvious source is the forcing data: rainfall forecast data are supplied in real time by meteorological forecasting services to the Flood Forecasting Service within a range between a lowest and a highest predicted value; these two values define an uncertainty interval for the rainfall over a given watershed. The second source of uncertainty is related to the complexity of the modelled system (the catchment impacted by the hydro-meteorological phenomenon) and to the number of variables that may describe the problem and their spatial and temporal variability. The model simplifies the system by reducing the number of variables to a few parameters, and thus contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed flow rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and water levels) in a forecast, based on the possible rainfalls provided by the forcing and on the model uncertainty. The model uncertainty is here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic, which allows the prediction uncertainty range to be evaluated. The Flood Forecasting Service of the Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed's area is 310 km2 and its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. The proposed method has the advantage of being easy to implement. Moreover, it permits to be carried out
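
    The combination of a rainfall interval with a model-error range by fuzzy arithmetic can be sketched with triangular fuzzy numbers and alpha-cuts; the toy rainfall-runoff relation and every number below are invented, not the operational Oise model.

        # Triangular fuzzy numbers combined by interval arithmetic at alpha-cuts.
        def alpha_cut(tri, a):
            lo, mode, hi = tri
            return lo + a * (mode - lo), hi - a * (hi - mode)

        rain = (20.0, 35.0, 60.0)   # mm: forecast low / best / high
        err = (0.8, 1.0, 1.25)      # multiplicative model-uncertainty factor

        def runoff(p):
            # Toy model: mm of rain above a 10 mm loss -> m3/s peak flow.
            return 2.5 * max(p - 10.0, 0.0)

        for a in (0.0, 0.5, 1.0):   # membership levels
            (r_lo, r_hi), (e_lo, e_hi) = alpha_cut(rain, a), alpha_cut(err, a)
            # The toy model is monotone and the factors positive, so interval
            # endpoints map directly to output endpoints.
            q_lo, q_hi = runoff(r_lo) * e_lo, runoff(r_hi) * e_hi
            print(f"alpha={a:.1f}: peak flow in [{q_lo:.1f}, {q_hi:.1f}] m3/s")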

  7. Assessment of parameter uncertainty in hydrological model using a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis method

    Science.gov (United States)

    Zhang, Junlong; Li, Yongping; Huang, Guohe; Chen, Xi; Bao, Anming

    2016-07-01

    Without a realistic assessment of parameter uncertainty, decision makers may encounter difficulties in accurately describing hydrologic processes and assessing relationships between model parameters and watershed characteristics. In this study, a Markov-Chain-Monte-Carlo-based multilevel-factorial-analysis (MCMC-MFA) method is developed, which can not only generate samples of parameters from a well constructed Markov chain and assess parameter uncertainties with straightforward Bayesian inference, but also investigate the individual and interactive effects of multiple parameters on model output through measuring the specific variations of hydrological responses. A case study is conducted for addressing parameter uncertainties in the Kaidu watershed of northwest China. Effects of multiple parameters and their interactions are quantitatively investigated using the MCMC-MFA with a three-level factorial experiment (81 runs in total). A variance-based sensitivity analysis method is used to validate the results of the parameters' effects. Results disclose that (i) the soil conservation service runoff curve number for moisture condition II (CN2) and the fraction of snow volume corresponding to 50% snow cover (SNO50COV) are the most significant factors for hydrological responses, implying that infiltration-excess overland flow and snow water equivalent represent important water inputs to the hydrological system of the Kaidu watershed; (ii) saturated hydraulic conductivity (SOL_K) and the soil evaporation compensation factor (ESCO) have obvious effects on hydrological responses, implying that the processes of percolation and evaporation impact the hydrological processes of this watershed; (iii) the interactions of ESCO and SNO50COV as well as CN2 and SNO50COV have an obvious effect, implying that snow cover can impact the generation of runoff on the land surface and the extraction of soil evaporative demand in lower soil layers. These findings can help enhance the hydrological model

  8. Effect of formal and informal likelihood functions on uncertainty assessment in a single event rainfall-runoff model

    Science.gov (United States)

    Nourali, Mahrouz; Ghahraman, Bijan; Pourreza-Bilondi, Mohsen; Davary, Kamran

    2016-09-01

    In the present study, DREAM(ZS), Differential Evolution Adaptive Metropolis combined with both formal and informal likelihood functions, is used to investigate the uncertainty of parameters of the HEC-HMS model in the Tamar watershed, Golestan province, Iran. In order to assess the uncertainty of the 24 parameters used in HMS, three flood events were used for calibration and one flood event was used to validate the posterior distributions. Moreover, the performance of seven different likelihood functions (L1-L7) was assessed by means of the DREAM(ZS) approach. Four likelihood functions (L1-L4), namely Nash-Sutcliffe (NS) efficiency, normalized absolute error (NAE), index of agreement (IOA), and Chiew-McMahon efficiency (CM), are considered informal, whereas the remaining three (L5-L7) are formal. L5 focuses on the relationship between traditional least-squares fitting and Bayesian inference, and L6 is a heteroscedastic maximum likelihood error (HMLE) estimator. Finally, in likelihood function L7, serial dependence of residual errors is accounted for using a first-order autoregressive (AR) model of the residuals. According to the results, the sensitivities of the parameters depend strongly on the likelihood function and vary between likelihood functions. Most of the parameters were better defined by the formal likelihood functions L5 and L7 and showed a high sensitivity to model performance. Posterior cumulative distributions corresponding to the informal likelihood functions L1, L2, L3, L4 and the formal likelihood function L6 are approximately the same for most of the sub-basins, and these likelihood functions have almost the same effect on parameter sensitivity. The 95% total prediction uncertainty bounds bracketed most of the observed data. Considering all the statistical indicators and criteria of uncertainty assessment, including RMSE, KGE, NS, P-factor and R-factor, the results showed that the DREAM(ZS) algorithm performed better under formal likelihood functions L5 and L7
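
    The informal/formal distinction is easiest to see side by side. The sketch below evaluates one synthetic candidate run with a Nash-Sutcliffe measure (an L1-type informal likelihood) and with a Gaussian log-likelihood on AR(1)-decorrelated residuals (the idea behind L7); the series are invented, not the Tamar data.

        # Informal (NS) vs formal (Gaussian AR(1)) likelihood of one run.
        import numpy as np

        rng = np.random.default_rng(2)
        x = np.linspace(0.0, 6.0, 200)
        obs = 50.0 + 10.0 * np.sin(x) + rng.normal(0.0, 2.0, x.size)
        sim = 50.0 + 9.0 * np.sin(x)   # one candidate simulation

        def nash_sutcliffe(sim, obs):  # informal measure
            return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def gauss_ar1_loglik(sim, obs):  # formal measure
            e = obs - sim
            rho = np.corrcoef(e[:-1], e[1:])[0, 1]  # residual autocorrelation
            nu = e[1:] - rho * e[:-1]               # decorrelated innovations
            s2 = nu.var()
            return (-0.5 * nu.size * np.log(2.0 * np.pi * s2)
                    - np.sum(nu ** 2) / (2.0 * s2))

        print(f"NS = {nash_sutcliffe(sim, obs):.3f}, "
              f"AR(1) log-likelihood = {gauss_ar1_loglik(sim, obs):.1f}")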

  9. Impact of input data uncertainty on environmental exposure assessment models : A case study for electromagnetic field modelling from mobile phone base stations

    NARCIS (Netherlands)

    Beekhuizen, Johan; Heuvelink, Gerard B M; Huss, Anke; Bürgi, Alfred; Kromhout, Hans; Vermeulen, Roel

    2014-01-01

    BACKGROUND: With the increased availability of spatial data and computing power, spatial prediction approaches have become a standard tool for exposure assessment in environmental epidemiology. However, such models are largely dependent on accurate input data. Uncertainties in the input data can the

  11. Shall we upgrade one-dimensional secondary settler models used in WWTP simulators? – An assessment of model structure uncertainty and its propagation

    DEFF Research Database (Denmark)

    Plósz, Benedek; De Clercq, Jeriffa; Nopens, Ingmar;

    2011-01-01

    … results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer/winter sequence. The model prediction in terms of nitrogen removal, solids inventory in the bioreactors and solids retention time as a function …

  12. Uncertainty in Air Quality Modeling.

    Science.gov (United States)

    Fox, Douglas G.

    1984-01-01

    Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions. The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and strive also to quantify uncertainty. How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox, 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications. A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that

  13. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    data is collected from published scientific research. The bias, the root-mean-square error as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can...
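
    A minimal Python sketch of the three statistics named above, using common conventions (bias = model minus observed; scatter index = RMSE normalised by the observed mean); the wave heights are invented:

        import numpy as np

        def wave_model_stats(hs_obs, hs_mod):
            # Bias, RMSE and scatter index, e.g. for significant wave height.
            hs_obs, hs_mod = np.asarray(hs_obs), np.asarray(hs_mod)
            bias = np.mean(hs_mod - hs_obs)
            rmse = np.sqrt(np.mean((hs_mod - hs_obs) ** 2))
            return bias, rmse, rmse / np.mean(hs_obs)

        bias, rmse, si = wave_model_stats([1.2, 2.5, 3.1, 1.8], [1.4, 2.3, 3.5, 1.7])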

  14. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...

  15. Uncertainties in Nuclear Proliferation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2015-05-15

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict proliferation events. Such systematic approaches have shown the potential to provide warning to the international community and so prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and the fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing elaborate models based on hypotheses of time-dependent proliferation determinants, graph theory, and the like, it is important to analyze the uncertainty of current models in order to resolve these fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. The serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties when the same dataset is used, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on the knowledge offered by qualitative nuclear proliferation studies.

  16. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    Science.gov (United States)

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases is in most cases overcome by combining the output of urban drainage models with damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become standard practice in hydraulic research and application. Flood damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancement, pushed forward by increasing computer capacity. The details of the flood propagation process on the surface and of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly dependent on data availability; this remains the main bottleneck in expected flood damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the depth-damage function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly
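
    To make the damage-curve step concrete, a minimal Python sketch that interpolates an invented depth-damage curve and folds in a crude Gaussian error on the damage fraction; the curve, error level and exposed value are all hypothetical:

        import numpy as np

        # Illustrative depth-damage curve: relative damage vs flood depth (m).
        depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
        damage_frac = np.array([0.00, 0.15, 0.35, 0.65, 0.85])

        def expected_damage(depth, exposed_value, curve_err_sd=0.10, n=5000):
            # Interpolate the curve, then perturb the damage fraction to mimic
            # the intrinsic uncertainty of the fitted function.
            rng = np.random.default_rng(0)
            frac = np.interp(depth, depths, damage_frac)
            noisy = np.clip(frac + rng.normal(0.0, curve_err_sd, n), 0.0, 1.0)
            return exposed_value * noisy.mean(), exposed_value * noisy.std()

        mean_dmg, sd_dmg = expected_damage(depth=1.4, exposed_value=250_000.0)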

  17. Uncertainty assessment of water quality modeling for a small-scale urban catchment using the GLUE methodology: a case study in Shanghai, China.

    Science.gov (United States)

    Zhang, Wei; Li, Tian; Dai, Meihong

    2015-06-01

    There is often great uncertainty in water quality modeling for urban drainage systems because water quality variation in such systems is complex and affected by many factors. The stormwater management model (SWMM) was applied to a small-scale urban catchment with a simple and well-maintained stormwater drainage system without illicit connections. This was done to assess uncertainty in build-up and wash-off modeling of pollutants within the generalized likelihood uncertainty estimation (GLUE) methodology, based on a well-calibrated water quantity model. The results indicated great uncertainty in water quality modeling within the GLUE methodology. Comparison of uncertainties in the various pollutant build-up and wash-off models available in SWMM indicated that those uncertainties varied only slightly. This may be a consequence of the specific characteristics of the rainfall events and experimental sites used in the study. The uncertainty analysis of water quality parameters in SWMM is conducive to effectively evaluating model reliability, and provides an experience base for similar research and applications.
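
    The GLUE procedure itself is short enough to sketch. A minimal Python version for a hypothetical wash-off submodel (the model form, priors, behavioural threshold and data below are all invented; SWMM's actual build-up/wash-off formulations differ):

        import numpy as np

        rng = np.random.default_rng(1)

        def washoff(c1, c2, runoff):
            # Hypothetical stand-in for a wash-off submodel: load = c1 * runoff**c2.
            return c1 * runoff ** c2

        runoff = np.array([2.0, 5.0, 9.0, 4.0])     # event runoff series
        obs = np.array([4.1, 11.8, 24.0, 8.9])      # observed pollutant loads

        # 1. Sample parameters from broad uniform priors.
        c1 = rng.uniform(0.5, 5.0, 10_000)
        c2 = rng.uniform(0.5, 2.0, 10_000)
        sims = washoff(c1[:, None], c2[:, None], runoff)

        # 2. Informal likelihood (Nash-Sutcliffe); keep 'behavioural' runs only.
        nse = 1 - ((sims - obs) ** 2).sum(1) / ((obs - obs.mean()) ** 2).sum()
        keep = nse > 0.6
        weights = nse[keep]

        # 3. Likelihood-weighted 5-95% prediction bounds at each time step.
        def wquantile(v, w, q):
            order = np.argsort(v)
            cdf = np.cumsum(w[order]) / w.sum()
            return np.interp(q, cdf, v[order])

        bounds = [(wquantile(sims[keep, t], weights, 0.05),
                   wquantile(sims[keep, t], weights, 0.95)) for t in range(len(runoff))]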

  18. Assessment of model behavior and acceptable forcing data uncertainty in the context of land surface soil moisture estimation

    Science.gov (United States)

    Dumedah, Gift; Walker, Jeffrey P.

    2017-03-01

    The sources of uncertainty in land surface models are numerous and varied, from inaccuracies in forcing data to uncertainties in model structure and parameterizations. The majority of these uncertainties are strongly tied to the overall makeup of the model, but the input forcing data set is independent, with its accuracy usually defined by the monitoring or observation system. The impact of input forcing data on model estimation accuracy has been collectively acknowledged to be significant, yet its quantification, and the level of uncertainty that is acceptable in the context of the land surface model to obtain a competitive estimation, remain mostly unknown. A better understanding is needed of how models respond to input forcing data and of what changes in these forcing variables can be accommodated without degrading the model's optimal estimation. As a result, this study determines the level of forcing data uncertainty that is acceptable in the Joint UK Land Environment Simulator (JULES) to competitively estimate soil moisture in the Yanco area in south-eastern Australia. The study employs hydro-genomic mapping to examine the temporal evolution of model decision variables from an archive of values obtained from soil moisture data assimilation. The data assimilation (DA) was undertaken using the advanced Evolutionary Data Assimilation. Our findings show that the input forcing data have a significant impact on model output: 35% in root mean square error (RMSE) for soil moisture at 5 cm depth and 15% in RMSE at 15 cm depth. This specific quantification is crucial to illustrate the significance of input forcing data spread. The acceptable uncertainty, determined on the basis of the dominant pathway, has been validated and shown to be reliable for all forcing variables, so as to provide optimal soil moisture. These findings are crucial for DA in order to account for uncertainties that are meaningful from the model standpoint. Moreover, our results point to a proper

  19. Uncertainty in Life Cycle Assessment of Nanomaterials

    Science.gov (United States)

    Seager, T. P.; Linkov, I.

    Despite concerns regarding environmental fate and toxicology, engineered nanostructured material manufacturing is expanding at an increasingly rapid pace. In particular, the unique properties of single walled carbon nanotubes (SWCNT) have made them attractive in many areas, including high-tech power applications such as experimental batteries, fuel cells or electrical wiring. The intensity of research interest in SWCNT has raised questions regarding the life cycle environmental impact of nanotechnologies, including assessment of: worker and consumer safety, greenhouse gas emissions, toxicological risks associated with production or product emissions and the disposition of nanoproducts at end of life. However, development of appropriate nanotechnology assessment tools has lagged progress in the nanotechnologies themselves. In particular, current approaches to life cycle assessment (LCA), originally developed for application in mature manufacturing industries such as automobiles and chemicals, suffer from several shortcomings that make applicability to nanotechnologies problematic. Among these are uncertainties related to the variability of material properties, toxicity and risk, technology performance in the use phase, nanomaterial degradation and change during the product life cycle and the impact assessment stage of LCA. This chapter expounds upon the unique challenges presented by nanomaterials in general, specifies sources of uncertainty and variability in LCA of SWCNT for use in electric and hybrid vehicle batteries and makes recommendations for modeling and decision-making using LCA in a multi-criteria decision analysis framework under conditions of high uncertainty.

  20. Propagation-of-uncertainty from contact angle and streaming potential measurements to XDLVO model assessments of membrane-colloid interactions.

    Science.gov (United States)

    Muthu, Satish; Childress, Amy; Brant, Jonathan

    2014-08-15

    Membrane fouling is assessed from a fundamental standpoint within the context of the Derjaguin-Landau-Verwey-Overbeek (DLVO) model. The DLVO model requires that the properties of the membrane and foulant(s) be quantified. Membrane surface charge (zeta potential) and free energy values are characterized using streaming potential and contact angle measurements, respectively. Comparing theoretical assessments of membrane-colloid interactions between research groups requires that the variability of the measured inputs be established. The impact that such variability in input values has on the outcome of interfacial models must be quantified to determine an acceptable variance in inputs. An interlaboratory study was conducted to quantify the variability in streaming potential and contact angle measurements when using standard protocols. The propagation of uncertainty from these errors was evaluated in terms of its impact on the quantitative and qualitative conclusions drawn from extended DLVO (XDLVO) calculated interaction terms. The error introduced into XDLVO calculated values was of the same magnitude as the calculated free energy values at contact and at any given separation distance. For two independent laboratories to draw similar quantitative conclusions regarding membrane-foulant interfacial interactions, the standard error in contact angle values must be ⩽ 2.5°, while that for zeta potential values must be ⩽ 7 mV.
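
    A generic way to carry such measurement errors through to calculated interaction terms is first-order (delta-method) propagation. A Python sketch with a toy free-energy function standing in for the XDLVO expressions (the function and its coefficients are hypothetical; the 2.5° and 7 mV inputs echo the tolerances quoted above):

        import numpy as np

        def propagate(f, x0, sds, eps=1e-6):
            # Numerically estimate df/dx_i at x0 and combine independent input
            # standard deviations: sd_f ~ sqrt(sum((df/dx_i * sd_i)**2)).
            x0 = np.asarray(x0, dtype=float)
            grads = np.empty_like(x0)
            for i in range(x0.size):
                dx = np.zeros_like(x0)
                dx[i] = eps * max(abs(x0[i]), 1.0)
                grads[i] = (f(x0 + dx) - f(x0 - dx)) / (2 * dx[i])
            return np.sqrt(np.sum((grads * np.asarray(sds)) ** 2))

        # Toy interfacial free-energy term as a function of contact angle (deg)
        # and zeta potential (mV); not the actual XDLVO equations.
        g = lambda x: -80.0 * np.cos(np.radians(x[0])) + 0.5 * x[1]
        sd_out = propagate(g, x0=[60.0, -20.0], sds=[2.5, 7.0])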

  1. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...
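
    For reference, the baseline DC approach is only a few lines. A Python sketch (monthly factors and the DBS refinements are omitted; the series are invented):

        import numpy as np

        def delta_change(obs_baseline, gcm_baseline, gcm_future, multiplicative=True):
            # The climate model supplies only the change signal; the observed data
            # carry the variability. Multiplicative deltas are typical for
            # precipitation, additive deltas for temperature.
            if multiplicative:
                return np.asarray(obs_baseline) * (np.mean(gcm_future) / np.mean(gcm_baseline))
            return np.asarray(obs_baseline) + (np.mean(gcm_future) - np.mean(gcm_baseline))

        obs_p = np.array([0.0, 5.2, 1.1, 0.0, 12.4])     # observed precipitation
        gcm_now = np.array([0.1, 4.0, 2.0, 0.0, 9.0])    # climate model, baseline
        gcm_fut = np.array([0.0, 5.0, 2.6, 0.2, 10.1])   # climate model, scenario
        future_p = delta_change(obs_p, gcm_now, gcm_fut)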

  2. Sensitivity of water balance components to environmental changes in a mountainous watershed: uncertainty assessment based on models comparison

    Directory of Open Access Journals (Sweden)

    E. Morán-Tejeda

    2013-10-01

    Full Text Available This paper evaluates the response of stream flow and other components of the water balance to changes in climate and land-use in a Pyrenean watershed. It further provides a measure of uncertainty in water resources forecasts by comparing the performance of two hydrological models: the Soil and Water Assessment Tool (SWAT) and the Regional Hydro-Ecological Simulation System (RHESSys). Regional Climate Model outputs for the 2021–2050 time-frame, and hypothetical (but plausible) land-use scenarios considering re-vegetation and wildfire processes, were used as inputs to the models. Results indicate an overall decrease in river flows when the scenarios are considered, except for the post-fire vegetation scenario, in which stream flows are simulated to increase. However, the magnitude of these projections varies between the two models used, as SWAT tends to produce larger hydrological changes under climate change scenarios, and RHESSys shows more sensitivity to changes in land-cover. The final prediction will therefore depend largely on the combination of the land-use and climate scenarios, and on the model utilized.

  3. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS (Lund-Potsdam-Jena General Ecosystem Simulator).

  5. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...... are made between alternative modeling methods, and characteristics of the methods are discussed....
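
    Of the methods listed, interval analysis is the quickest to illustrate. A Python sketch of a discounted cash flow with interval (worst/best) inputs; the cash flows and rates are invented:

        def npv(cashflows, rate):
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        # Interval estimates of yearly cash flows; year 0 is the investment.
        cf_low = [-1000, 180, 190, 200, 210]
        cf_high = [-1000, 260, 280, 300, 320]
        rate_low, rate_high = 0.04, 0.08

        # With positive future cash flows, NPV is monotone in every input, so the
        # interval bounds come directly from the extreme corners.
        npv_low = npv(cf_low, rate_high)
        npv_high = npv(cf_high, rate_low)
        print(f"NPV lies in [{npv_low:.0f}, {npv_high:.0f}]")

    Here the resulting interval spans zero, which is exactly the kind of decision-relevant ambiguity that triple estimates and fuzzy numbers aim to represent more finely.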

  6. More Efficient Bayesian-based Optimization and Uncertainty Assessment of Hydrologic Model Parameters

    Science.gov (United States)

    2012-02-01

    is more objective, repeatable, and better capitalizes on the computational capacity of the modern computer) is an active area of research and... existence of multiple local optima, non-smooth objective function surfaces, and long valleys in parameter space that are a result of excessive parameter... outputs, structural aspects of the model, as well as its input dataset, model parameters that are adjustable through the calibration process, and the

  7. Improving efficiency of uncertainty analysis in complex Integrated Assessment models: The case of the RAINS emission module

    NARCIS (Netherlands)

    Gabbert, S.G.M.

    2006-01-01

    Ever since the Regional Acidification Information and Simulation Model (RAINS) has been constructed, the treatment of uncertainty has remained an issue of major interest. In a recent review of the model performed for the Clean Air for Europe (CAFE) programme of the European Commission, a more system

  8. Dam break modelling, risk assessment and uncertainty analysis for flood mitigation

    NARCIS (Netherlands)

    Zagonjolli, M.

    2007-01-01

    In this thesis a range of modelling techniques is explored to deal effectively with flood risk management. In particular, attention is paid to floods caused by failure of hydraulic structures such as dams and dikes. The methods considered here are applied for simulating dam and dike failure events,

  10. Avoiding climate change uncertainties in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick Arthur

    2013-01-01

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies...

  11. Uncertainty analysis in integrated assessment: the users' perspective

    NARCIS (Netherlands)

    Gabbert, S.G.M.; Ittersum, van M.K.; Kroeze, C.; Stalpers, S.I.P.; Ewert, F.; Alkan Olsson, J.

    2010-01-01

    Integrated Assessment (IA) models aim at providing information- and decision-support to complex problems. This paper argues that uncertainty analysis in IA models should be user-driven in order to strengthen science–policy interaction. We suggest an approach to uncertainty analysis that starts with

  12. Use of a process-based model for assessing the methane budgets of global terrestrial ecosystems and evaluation of uncertainty

    Directory of Open Access Journals (Sweden)

    A. Ito

    2012-02-01

    Full Text Available We assessed the global terrestrial budget of methane (CH4) by using a process-based biogeochemical model (VISIT) and inventory data for components of the budget that were not included in the model. Emissions from wetlands, paddy fields, biomass burning, and plants, as well as oxidative consumption by upland soils, were simulated by the model. Emissions from ruminant livestock and termites were evaluated by using an inventory approach. These CH4 flows were estimated for each of the model's 0.5° × 0.5° grid cells from 1901 to 2009, while accounting for atmospheric composition, meteorological factors, and land-use changes. Estimation uncertainties were examined through ensemble simulations using different parameterization schemes and input data (e.g., different wetland maps and emission factors). From 1996 to 2005, the average global terrestrial CH4 budget was estimated on the basis of 1152 simulations, and terrestrial ecosystems were found to be a net source of 308.3 ± 20.7 Tg CH4 yr-1. Wetland and ruminant livestock emissions were the primary sources. The results of our simulations indicate that sources and sinks are distributed highly heterogeneously over the Earth's land surface. Seasonal and interannual variability in the terrestrial budget was also assessed. The trend of increasing net emission from terrestrial sources and its relationship with temperature variability imply that terrestrial CH4 feedbacks will play an increasingly important role as a result of future climatic change.

  13. Dam break modelling, risk assessment and uncertainty analysis for flood mitigation

    OpenAIRE

    Zagonjolli, M.

    2007-01-01

    In this thesis a range of modelling techniques is explored to deal effectively with flood risk management. In particular, attention is paid to floods caused by failure of hydraulic structures such as dams and dikes. The methods considered here are applied for simulating dam and dike failure events, flood water routing in downstream areas, and flood risk reduction, providing a unified framework for addressing a variety of flood related events. Numerical, statistical and constraint based method...

  14. An integrated assessment modeling framework for uncertainty studies in global and regional climate change: the MIT IGSM-CAM (version 1.0)

    OpenAIRE

    Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.

    2013-01-01

    This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) inc...

  15. Model Uncertainty for Bilinear Hysteric Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive...... density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimension basic variable space then model...... uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used....

  16. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing inno...

  18. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high gro

  19. Modelling the exposure to chemicals for risk assessment: a comprehensive library of multimedia and PBPK models for integration, prediction, uncertainty and sensitivity analysis - the MERLIN-Expo tool.

    Science.gov (United States)

    Ciffroy, P; Alfonso, B; Altenpohl, A; Banjac, Z; Bierkens, J; Brochot, C; Critto, A; De Wilde, T; Fait, G; Fierens, T; Garratt, J; Giubilato, E; Grange, E; Johansson, E; Radomyski, A; Reschwann, K; Suciu, N; Tanaka, T; Tediosi, A; Van Holderbeke, M; Verdonck, F

    2016-10-15

    MERLIN-Expo is a library of models that was developed within the FP7 EU project 4FUN in order to provide an integrated tool for state-of-the-art exposure assessment for environment, biota and humans, allowing the detection of scientific uncertainties at each step of the exposure process. This paper describes the main features of the MERLIN-Expo tool. The main challenges in exposure modelling that MERLIN-Expo has tackled are: (i) the integration of multimedia (MM) models simulating the fate of chemicals in environmental media with physiologically based pharmacokinetic (PBPK) models simulating the fate of chemicals in the human body; MERLIN-Expo thus allows the determination of internal effective chemical concentrations; (ii) the incorporation of a set of functionalities for uncertainty/sensitivity analysis, from screening to variance-based approaches; the availability of such tools is intended to facilitate the treatment of these issues in future decision making; (iii) the integration of human and wildlife biota targets with common fate modelling in the environment. MERLIN-Expo is composed of a library of fate models dedicated to non-biological receptor media (surface waters, soils, outdoor air), biological media of concern for humans (several cultivated crops, mammals, milk, fish), wildlife biota (primary producers in rivers, invertebrates, fish) and humans. These models can be linked together to create flexible scenarios relevant for both human and wildlife biota exposure. Standardized documentation for each model and training material were prepared to support accurate use of the tool by end-users. One of the objectives of the 4FUN project was also to increase confidence in the applicability of the MERLIN-Expo tool through targeted realistic case studies. In particular, we aimed at demonstrating the feasibility of building complex realistic exposure scenarios and the accuracy of the

  20. Does the Core Contain Potassium?: An Assessment of the Uncertainties in Thermal and Dynamo Evolution Models

    Science.gov (United States)

    Nimmo, F.

    2006-12-01

    The long-term thermal evolution of the core, and the history of the geodynamo, are determined by the rate at which heat is extracted from the core, and the presence of any heat sources within the core [1,2]. Radioactive potassium may provide one such heat source: mineral physics results [3,4] are permissive but not definitive; cosmochemical constraints are weak [5]; and geoneutrino detection [6] does not yet have the required resolution. Theoretical models [1-2,7-9] can help to address whether or not potassium is present in the core. Since the evolution of the CMB heat flux is hard to calculate, a better approach is to assume that the entropy available to power the geodynamo has remained constant over time, and to infer the resulting heat flux [2]. Unfortunately, several important parameters, notably core thermal conductivity and the entropy production rate required to sustain the geodynamo, are uncertain. I have carried out a suite of models using a wide range of parameter values based on published results. In the absence of potassium, an ancient inner core [10] and a continuously active geodynamo are only possible if (1) the dissipation generated by the dynamo is small ... References: Gessmann and Wood, EPSL 200, 63-78, 2002. [5] Lassiter, G3, Q11012, 2004. [6] Araki et al., Nature 436, 499-503, 2005. [7] Lister, PEPI 140, 145-158, 2003. [8] Roberts et al., in Earth's Core and Lower Mantle, ed. Jones et al. [9] Nimmo et al., GJI 156, 363-376, 2004. [10] Brandon et al., EPSL 206, 411-426, 2003. [11] Hernlund et al., Nature 434, 882-886, 2005. [12] Zhong, JGR 111, B04409, 2006.

  1. Handling Unquantifiable Uncertainties in Landslide Modelling

    Science.gov (United States)

    Almeida, S.; Holcombe, E.; Pianosi, F.; Wagener, T.

    2015-12-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, there is no agreement on what probability distribution should be used to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform adequately under a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use and combine several GSA methods including the Method of Morris, Regional Sensitivity Analysis and CART, as well as advanced visualization tools. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
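
    A cut-down illustration of the screening idea: one-at-a-time elementary effects, in the spirit of the Method of Morris, applied to an infinite-slope factor of safety standing in for CHASM (all parameter values are invented; CHASM itself couples hydrology and stability and is far richer):

        import numpy as np

        def factor_of_safety(c, phi, gamma, z, u, beta):
            # Infinite-slope model: c cohesion (kPa), phi friction angle (rad),
            # gamma unit weight (kN/m3), z slip depth (m), u pore pressure (kPa),
            # beta slope angle (rad).
            resisting = c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)
            driving = gamma * z * np.sin(beta) * np.cos(beta)
            return resisting / driving

        base = dict(c=5.0, phi=np.radians(30), gamma=19.0, z=3.0,
                    u=10.0, beta=np.radians(35))

        # One-at-a-time screening: perturb each factor by +/-20% around the base.
        for name in base:
            lo, hi = dict(base), dict(base)
            lo[name] *= 0.8
            hi[name] *= 1.2
            effect = factor_of_safety(**hi) - factor_of_safety(**lo)
            print(f"{name}: elementary effect = {effect:+.3f}")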

  2. Assessing the impact of uncertainty on flood risk estimates with reliability analysis using 1-D and 2-D hydraulic models

    Directory of Open Access Journals (Sweden)

    L. Altarejos-García

    2012-07-01

    Full Text Available This paper addresses the use of reliability techniques such as Rosenblueth's Point-Estimate Method (PEM) as a practical alternative to more precise Monte Carlo approaches to obtain estimates of the mean and variance of the uncertain flood parameters water depth and velocity. These parameters define the flood severity, which is a concept used for decision-making in the context of flood risk assessment. The method proposed is particularly useful when the degree of complexity of the hydraulic models makes Monte Carlo inapplicable in terms of computing time, but when a measure of the variability of these parameters is still needed. The capacity of PEM, which is a special case of numerical quadrature based on orthogonal polynomials, to evaluate the first two moments of performance functions such as the water depth and velocity is demonstrated in the case of a single river reach using a 1-D HEC-RAS model. It is shown that in some cases, using a simple variable transformation, the statistical distributions of both water depth and velocity approximate the lognormal. As this distribution is fully defined by its mean and variance, PEM can be used to define the full probability distribution function of these flood parameters, thus allowing for probability estimations of flood severity. Then, an application of the method to the same river reach using a 2-D Shallow Water Equations (SWE) model is performed. Flood maps of the mean and standard deviation of water depth and velocity are obtained, and the uncertainty in the extension of flooded areas with different severity levels is assessed. It is recognized, though, that whenever application of the Monte Carlo method is practically feasible, it is the preferred approach.
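
    Rosenblueth's two-point estimate method is compact enough to state directly. A Python sketch for uncorrelated, symmetrically distributed inputs, with a toy power-law depth function standing in for the HEC-RAS/SWE run (all numbers invented):

        import numpy as np
        from itertools import product

        def rosenblueth_pem(f, means, sds):
            # Evaluate f at the 2**n corners mu +/- sigma and average with equal
            # weights to approximate the mean and variance of f.
            means, sds = np.asarray(means), np.asarray(sds)
            vals = np.array([f(means + np.array(signs) * sds)
                             for signs in product((-1.0, 1.0), repeat=len(means))])
            return vals.mean(), vals.var()

        # Toy 'hydraulic' response: depth as a power function of two uncertain
        # inputs (say, discharge and roughness).
        depth = lambda x: 0.3 * x[0] ** 0.4 * x[1] ** 0.6
        m, v = rosenblueth_pem(depth, means=[100.0, 0.035], sds=[15.0, 0.004])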

  3. Prediction uncertainty assessment of a systems biology model requires a sample of the full probability distribution of its parameters

    NARCIS (Netherlands)

    Mourik, van S.; Braak, ter C.J.F.; Stigter, J.D.; Molenaar, J.

    2014-01-01

    Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate

  4. Shall we upgrade one-dimensional secondary settler models used in WWTP simulators? - An assessment of model structure uncertainty and its propagation.

    Science.gov (United States)

    Plósz, Benedek Gy; De Clercq, Jeriffa; Nopens, Ingmar; Benedetti, Lorenzo; Vanrolleghem, Peter A

    2011-01-01

    In WWTP models, the accurate assessment of solids inventory in bioreactors equipped with solid-liquid separators, mostly described using one-dimensional (1-D) secondary settling tank (SST) models, is the most fundamental requirement of any calibration procedure. Scientific knowledge on characterising particulate organics in wastewater and on bacteria growth is well-established, whereas 1-D SST models and their impact on biomass concentration predictions are still poorly understood. A rigorous assessment of two 1-D SST models is thus presented: one based on hyperbolic (the widely used Takács-model) and one based on parabolic (the more recently presented Plósz-model) partial differential equations. The former model, using numerical approximation to yield realistic behaviour, is currently the most widely used by wastewater treatment process modellers. The latter is a convection-dispersion model that is solved in a numerically sound way. First, the explicit dispersion in the convection-dispersion model and the numerical dispersion for both SST models are calculated. Second, simulation results of effluent suspended solids concentration (XTSS,Eff), sludge recirculation stream (XTSS,RAS) and sludge blanket height (SBH) are used to demonstrate the distinct behaviour of the models. A thorough scenario analysis is carried out using SST feed flow rate, solids concentration, and overflow rate as degrees of freedom, spanning a broad loading spectrum. A comparison between the measurements and the simulation results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer
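
    The hyperbolic (Takács) model is built around a double-exponential settling velocity function; a Python sketch with typical literature parameter values (not necessarily those used in the paper):

        import numpy as np

        def takacs_settling_velocity(X, v0=474.0, v0_max=250.0,
                                     r_h=5.76e-4, r_p=2.86e-3, X_min=20.0):
            # Double-exponential settling velocity (Takács et al. type):
            # X in g/m3, velocities in m/d; X_min is the non-settleable fraction.
            Xs = np.maximum(X - X_min, 0.0)
            v = v0 * (np.exp(-r_h * Xs) - np.exp(-r_p * Xs))
            return np.clip(v, 0.0, v0_max)

        v = takacs_settling_velocity(np.array([50.0, 500.0, 3000.0, 8000.0]))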

  5. Invasive alien species in the food chain: Advancing risk assessment models to address climate change, economics and uncertainty

    Directory of Open Access Journals (Sweden)

    Darren Kriticos

    2013-09-01

    Full Text Available Pest risk maps illustrate where invasive alien arthropods, molluscs, pathogens, and weeds might become established, spread, and cause harm to natural and agricultural resources within a pest risk area. Such maps can be powerful tools to assist policymakers in matters of international trade, domestic quarantines, biosecurity surveillance, or pest-incursion responses. The International Pest Risk Mapping Workgroup (IPRMW) is a group of ecologists, economists, modellers, and practising risk analysts who are committed to improving the methods used to estimate risks posed by invasive alien species to agricultural and natural resources. The group also strives to improve communication about pest risks to biosecurity, production, and natural-resource-sector stakeholders so that risks can be better managed. The IPRMW previously identified ten activities to improve pest risk assessment procedures, among these were: “improve representations of uncertainty, … expand communications with decision-makers on the interpretation and use of risk maps, … increase international collaboration, … incorporate climate change, … [and] study how human and biological dimensions interact” (Venette et al. 2010).

  6. Managing uncertainty, ambiguity and ignorance in impact assessment by embedding evolutionary resilience, participatory modelling and adaptive management.

    Science.gov (United States)

    Bond, Alan; Morrison-Saunders, Angus; Gunn, Jill A E; Pope, Jenny; Retief, Francois

    2015-03-15

    In the context of continuing uncertainty, ambiguity and ignorance in impact assessment (IA) prediction, the case is made that existing IA processes are based on false 'normal' assumptions that science can solve problems and transfer knowledge into policy. Instead, a 'post-normal science' approach is needed that acknowledges the limits of current levels of scientific understanding. We argue that this can be achieved through embedding evolutionary resilience into IA; using participatory workshops; and emphasising adaptive management. The goal is an IA process capable of informing policy choices in the face of uncertain influences acting on socio-ecological systems. We propose a specific set of process steps to operationalise this post-normal science approach which draws on work undertaken by the Resilience Alliance. This process differs significantly from current models of IA, as it has a far greater focus on avoidance of, or adaptation to (through incorporating adaptive management subsequent to decisions), unwanted future scenarios rather than a focus on the identification of the implications of a single preferred vision. Implementing such a process would represent a culture change in IA practice as a lack of knowledge is assumed and explicit, and forms the basis of future planning activity, rather than being ignored.

  7. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
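
    Of the topics covered, random-walk Metropolis sampling is the easiest to sketch. A minimal Python example inferring two parameters of a toy linear model from synthetic data (nothing here is climate-specific; it only illustrates the algorithm):

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic data from y = a*x + b with Gaussian noise.
        x = np.linspace(0, 1, 50)
        y = 2.0 * x + 0.5 + rng.normal(0, 0.2, x.size)

        def log_post(theta, sigma=0.2):
            a, b = theta
            resid = y - (a * x + b)
            return -0.5 * np.sum(resid ** 2) / sigma ** 2   # flat priors

        theta = np.array([1.0, 0.0])
        lp = log_post(theta)
        chain = []
        for _ in range(20_000):
            prop = theta + rng.normal(0, 0.05, 2)
            lp_prop = log_post(prop)
            if np.log(rng.uniform()) < lp_prop - lp:        # Metropolis acceptance
                theta, lp = prop, lp_prop
            chain.append(theta)
        post = np.array(chain[5_000:])                      # discard burn-in
        print(post.mean(0), post.std(0))                    # parameter uncertainty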

  8. Uncertainty assessment using uncalibrated objects:

    DEFF Research Database (Denmark)

    Meneghello, R.; Savio, Enrico; Larsen, Erik;

    This report is made as a part of the project Easytrac, an EU project under the programme: Competitive and Sustainable Growth: Contract No: G6RD-CT-2000-00188, coordinated by UNIMETRIK S.A. (Spain). The project is concerned with low uncertainty calibrations on coordinate measuring machines....... The Centre for Geometrical Metrology (CGM) at the Technical University of Denmark takes care of free form measurements, in collaboration with DIMEG, University of Padova, Italy and Unilab Laboratori Industriali Srl, Italy. The present report describes the calibration of a bevel gear using the method...

  9. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  10. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  11. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
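
    The random-variable treatment of the rate parameters can be sketched in a few lines. A Python example using the common convention that an uncertainty factor UF bounds a rate within [A/UF, A*UF] (the nominal values and factors below are invented; the paper's exact distributions may differ):

        import numpy as np

        rng = np.random.default_rng(7)

        def sample_rate_constants(A, UF, n_samples):
            # A_i * UF_i**u with u ~ Uniform(-1, 1) keeps each sampled
            # pre-exponential within [A_i/UF_i, A_i*UF_i].
            u = rng.uniform(-1.0, 1.0, size=(n_samples, len(A)))
            return np.asarray(A) * np.asarray(UF) ** u

        A_nominal = [1.0e13, 3.2e9, 5.0e11]   # illustrative pre-exponential factors
        UF = [2.0, 3.16, 2.0]                 # illustrative uncertainty factors
        samples = sample_rate_constants(A_nominal, UF, 1000)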

  12. Enhance accuracy in Software cost and schedule estimation by using "Uncertainty Analysis and Assessment" in the system modeling process

    CERN Document Server

    Vasantrao, Kardile Vilas

    2011-01-01

    Accurate software cost and schedule estimates are essential for software project success. Although often referred to as a "black art" because of its complexity and uncertainty, software estimation is not as difficult or puzzling as people think; in fact, generating accurate estimates is straightforward once you understand the intensity of the uncertainty and have a framework for the modeling process. The key to successful software estimation is distilling academic information and real-world experience into a practical guide for working software professionals. Instead of arcane treatises and rigid modeling techniques, this guide highlights a proven set of procedures, understandable formulas, and heuristics that individuals and development teams can apply to their projects to achieve estimation proficiency and to choose appropriate development approaches. In the early stages of the software life cycle, project managers are poorly equipped to estimate effort, schedule, and cost, or to choose a development approach. This in tu...
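
    As one example of the kind of simple, well-known heuristic the text alludes to (not necessarily the author's own method), a three-point PERT estimate in Python:

        def pert_estimate(optimistic, most_likely, pessimistic):
            # Classic three-point (PERT/beta) estimate: a weighted mean and an
            # approximate standard deviation capturing estimation uncertainty.
            mean = (optimistic + 4 * most_likely + pessimistic) / 6
            sd = (pessimistic - optimistic) / 6
            return mean, sd

        effort, sd = pert_estimate(120, 200, 420)   # person-hours
        print(f"expected effort = {effort:.0f} +/- {sd:.0f} person-hours")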

  13. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
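
    A minimal illustration of the Polynomial Chaos idea in one dimension, fitting Hermite coefficients by regression and reading the output mean and variance directly off the coefficients (a textbook toy, far from the CLM-scale application described above):

        import math
        import numpy as np

        rng = np.random.default_rng(3)

        # Degree-3 polynomial chaos expansion of f(x) = exp(x), x ~ N(0,1), in
        # probabilists' Hermite polynomials He_k, fitted by least squares.
        xs = rng.standard_normal(2000)
        ys = np.exp(xs)
        V = np.polynomial.hermite_e.hermevander(xs, 3)     # columns He_0..He_3
        coef, *_ = np.linalg.lstsq(V, ys, rcond=None)

        # Orthogonality E[He_j * He_k] = k! * delta_jk gives the moments for free.
        mean = coef[0]                                     # exact: exp(0.5) ~ 1.649
        var = sum(math.factorial(k) * coef[k] ** 2 for k in range(1, 4))
        # var approaches exp(2) - exp(1) ~ 4.671 as the expansion degree grows.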

  14. Uncertainty assessment in quantitative risk analyses. Literature review; Usikkerhedsbeskrivelse i kvantitative risikoanalyser. Litteraturgennemgang

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    Literature on uncertainty assessment for risk-analytical purposes has been compiled. Databases Inspec, Compendex, Energy Science and Technology, Chemical Abstracts, Chemical Safety Newsbase, HSEline and MathSci were searched. Roughly 80 references have been selected from these databases and divided according to the following uncertainty classes: 1. Statistical uncertainty; 2. Data uncertainty; 3. Presumption uncertainty; 4. Uncertainty of consequence models; 5. Cognitive uncertainty. (EG)

  15. A review of uncertainty research in impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)

    2015-01-15

    This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA.

  16. Avoiding climate change uncertainties in Strategic Environmental Assessment

    DEFF Research Database (Denmark)

    Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick Arthur

    2013-01-01

    This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.

  17. Quantifying geological uncertainty in metamorphic phase equilibria modelling; a Monte Carlo assessment and implications for tectonic interpretations

    Directory of Open Access Journals (Sweden)

    Richard M. Palin

    2016-07-01

    Full Text Available Pseudosection modelling is rapidly becoming an essential part of a petrologist's toolkit and often forms the basis of interpreting the tectonothermal evolution of a rock sample, outcrop, or geological region. Of the several factors that can affect the accuracy and precision of such calculated phase diagrams, “geological” uncertainty related to natural petrographic variation at the hand sample- and/or thin section-scale is rarely considered. Such uncertainty influences the sample's bulk composition, which is the primary control on its equilibrium phase relationships and thus the interpreted pressure–temperature (P–T) conditions of formation. Two case study examples—a garnet–cordierite granofels and a garnet–staurolite–kyanite schist—are used to compare the relative importance that geological uncertainty has on bulk compositions determined via (1) X-ray fluorescence (XRF) or (2) point counting techniques. We show that only minor mineralogical variation at the thin-section scale propagates through the phase equilibria modelling procedure and affects the absolute P–T conditions at which key assemblages are stable. Absolute displacements of equilibria can approach ±1 kbar for only a moderate degree of modal proportion uncertainty, thus being essentially similar to the magnitudes reported for analytical uncertainties in conventional thermobarometry. Bulk compositions determined from multiple thin sections of a heterogeneous garnet–staurolite–kyanite schist show a wide range in major-element oxides, owing to notable variation in mineral proportions. Pseudosections constructed for individual point count-derived bulks accurately reproduce this variability on a case-by-case basis, though averaged proportions do not correlate with those calculated at equivalent peak P–T conditions for a whole-rock XRF-derived bulk composition. The main discrepancies relate to varying proportions of matrix phases (primarily mica relative to
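
    The point-counting contribution to geological uncertainty lends itself to a direct Monte Carlo sketch: resample modal proportions multinomially and propagate them to the bulk composition (every number below is invented for illustration):

        import numpy as np

        rng = np.random.default_rng(5)

        # Illustrative SiO2 weight fractions of three minerals and their modal
        # proportions from point counting.
        sio2 = np.array([0.365, 0.450, 0.272])
        modes = np.array([0.20, 0.55, 0.25])
        count_n = 1000                          # number of points counted

        # Resample the counts multinomially and propagate to bulk SiO2.
        samples = rng.multinomial(count_n, modes, size=10_000) / count_n
        bulk_sio2 = samples @ sio2
        print(f"bulk SiO2 = {bulk_sio2.mean():.3f} +/- {bulk_sio2.std():.3f}")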

  18. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  19. Sensitivity of model assessments of high-speed civil transport effects on stratospheric ozone resulting from uncertainties in the NOx production from lightning

    Science.gov (United States)

    Smyshlyaev, Sergei P.; Geller, Marvin A.; Yudin, Valery A.

    1999-11-01

    Lightning NOx production is one of the most important and most uncertain sources of reactive nitrogen in the atmosphere. To examine the role of NOx lightning production uncertainties in supersonic aircraft assessment studies, we have done a number of numerical calculations with the State University of New York at Stony Brook-Russian State Hydrometeorological Institute of Saint-Petersburg two-dimensional model. The amount of nitrogen oxides produced by lightning discharges was varied within its quoted uncertainty from 2 to 12 Tg N/yr. Different latitudinal, altitudinal, and seasonal distributions of lightning NOx production were considered. Results of these model calculations show that the assessment of supersonic aircraft impacts on the ozone layer is very sensitive to the strength of NOx production from lightning. The NOx produced by the high-speed civil transport leads to positive column ozone changes for lightning NOx production less than 4 Tg N/yr, and to a total ozone decrease for lightning NOx production more than 5 Tg N/yr for the same NOx emission scenario. For large lightning production the ozone response is mostly decreasing with increasing emission index, while for low lightning production the ozone response is mostly increasing with increasing emission index. Uncertainties in the global lightning NOx production strength may lead to uncertainties in column ozone of up to 4%. The uncertainties due to neglecting the seasonal variations of the lightning NOx production and its simplified latitude distribution are about 2 times smaller (1.5-2%). The type of altitude distribution for the lightning NOx production does not significantly impact the column ozone, but is very important for the assessment studies of aircraft perturbations of atmospheric ozone. Increased global lightning NOx production causes increased total ozone, but for assessment of the column ozone response to supersonic aircraft emissions, the increase of lightning NOx production leads to column ozone

  20. Uncertainties in risk assessment at USDOE facilities

    Energy Technology Data Exchange (ETDEWEB)

    Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.

    1994-01-01

    The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms “risk assessment” and “risk management” are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of “... the most significant data and uncertainties...” in an assessment. Significant data and uncertainties are “...those that define and explain the main risk conclusions”. Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.

  2. An assessment of uncertainties in using volume-area modelling for computing the twenty-first century glacier contribution to sea-level change

    Directory of Open Access Journals (Sweden)

    A. B. A. Slangen

    2011-08-01

    A large part of present-day sea-level change is formed by the melt of glaciers and ice caps (GIC). This study focuses on the uncertainties in the calculation of the GIC contribution on a century timescale. The model used is based on volume-area scaling, combined with the mass balance sensitivity of the GIC. We assess different aspects that contribute to the uncertainty in the prediction of the contribution of GIC to future sea-level rise, such as (1) the volume-area scaling method (scaling factor), (2) the glacier data, (3) the climate models, and (4) the emission scenario. Additionally, a comparison of the model results to the 20th century GIC contribution is presented.

    We find that small variations in the scaling factor cause significant variations in the initial volume of the glaciers, but only limited variations in the glacier volume change. If two existing glacier inventories are tuned such that the initial volume is the same, the GIC sea-level contribution over 100 yr differs by 0.027 m or 18 %. It appears that the mass balance sensitivity is also important: variations of 20 % in the mass balance sensitivity have an impact of 17 % on the resulting sea-level projections. Another important factor is the choice of the climate model, as the GIC contribution to sea-level change largely depends on the temperature and precipitation taken from climate models. Connected to this is the choice of emission scenario, used to drive the climate models. Combining all the uncertainties examined in this study leads to a total uncertainty of 0.052 m or 35 % in the GIC contribution to global mean sea level. Reducing the variance in the climate models and improving the glacier inventories will significantly reduce the uncertainty in calculating the GIC contributions, and are therefore crucial actions to improve future sea-level projections.
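
    The volume-area scaling at the heart of the model above is compact enough to sketch. The following is an illustrative toy, not the study's implementation: the glacier inventory is invented, and the constants c ≈ 0.2 and gamma ≈ 1.375 are commonly quoted values whose perturbation here mimics the scaling-factor sensitivity test.

    import numpy as np

    def glacier_volume(area_km2, c=0.2, gamma=1.375):
        """Glacier volume (km^3) from area (km^2) via volume-area scaling."""
        return c * area_km2 ** gamma

    areas = np.array([1.0, 10.0, 100.0])  # toy glacier inventory, km^2
    ocean_area_m2 = 3.62e14               # global ocean surface, m^2

    for c in (0.18, 0.20, 0.22):          # perturb the scaling factor
        v_km3 = glacier_volume(areas, c=c).sum()
        # Ice volume to sea-level equivalent (ice density ~0.9 that of water).
        sle_mm = v_km3 * 1e9 * 0.9 / ocean_area_m2 * 1e3
        print(f"c = {c:.2f}: volume {v_km3:7.2f} km^3, "
              f"sea-level equivalent {sle_mm:.4f} mm")

    As the abstract notes, such perturbations change the initial volume much more than the projected volume change, since the mass-balance forcing, not the scaling constant, dominates the latter.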

  3. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project....

  4. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
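
    As a rough illustration of a quantile-based uncertainty metric of this kind, the sketch below computes, for each flow quantile, the spread of simulated flows across an ensemble of model scenarios. This is one plausible reading of the QFD idea, not the authors' exact formulation, and the ensemble is synthetic.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy ensemble: 8 scenarios (model structures/inputs) x 1000 days of flow.
    ensemble = rng.lognormal(mean=1.0, sigma=0.5, size=(8, 1000))

    quantiles = np.linspace(0.05, 0.95, 19)
    q_flows = np.quantile(ensemble, quantiles, axis=1)  # (19, 8)

    # Spread across scenarios at each quantile, relative to the ensemble mean.
    qfd = q_flows.std(axis=1) / q_flows.mean(axis=1)

    for q, d in zip(quantiles[::6], qfd[::6]):
        print(f"quantile {q:.2f}: relative deviation across scenarios = {d:.3f}")

    Swapping which dimension varies across the ensemble (structure, forcing, parameters) is what lets the deviation be attributed to individual sources, as described above.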

  5. Are models, uncertainty, and dispute resolution compatible?

    Science.gov (United States)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased) as each party chooses to exaggerate either the goodness of a model, or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that "certainty" is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  6. An Assessment of Uncertainties in the NASA GISS ModelE GCM due to Variations in the Representation of Aerosol/Cloud Interactions

    Science.gov (United States)

    Persad, G. G.; Menon, S.; Sednev, I.

    2008-12-01

    Aerosol indirect effects are known to have a significant impact on the evolution of the climate system. However, their representation via cloud/aerosol microphysics remains a major source of uncertainty in climate models. This study assesses uncertainties in the NASA Goddard Institute for Space Studies (GISS) ModelE global climate model produced by different representations of the cloud/aerosol interaction scheme. By varying the complexity of the cloud microphysics scheme included in the model and analyzing the range of results against cloud properties obtained from satellite retrievals, we evaluate the effect of the different schemes on climate. We examine four sets of simulations with the GISS ModelE: (1) using a new aerosol/cloud microphysics package implemented in ModelE (based on the two-moment cloud microphysics scheme recently implemented in CCSM), (2) using a version of the microphysics scheme previously included in ModelE, (3) using prescribed aerosol concentrations and fixed cloud droplet number (the main link between aerosols and the cloud microphysics scheme), and (4) varying the environment conditions with which the new aerosol/cloud microphysics package is run. The global mean cloud properties are analyzed and compared to global mean ranges as obtained from satellite retrievals. Results show that important climate parameters, such as total cloud cover, can be underestimated by 8-15% using the new aerosol/cloud microphysics scheme. Liquid water path (LWP) is particularly affected by variations to the aerosol/cloud microphysics representation, exhibiting both global mean variations of ~20% and strong regional differences. Significant variability in LWP between the various simulations may be attributed to differences in the autoconversion scheme used in the differing representations of aerosol/cloud interactions. These LWP differences significantly affect radiative parameters, such as cloud optical depth and net cloud forcing (used to evaluate the

  7. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario

  8. Informational uncertainties of risk assessment about accidents of chemicals

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    An analysis system of informational uncertainties for accidental risk assessment of chemicals is introduced. Statistical test methods and the fuzzy sets method allow quantitative analysis of the input parameters. The uncertainties of the model can be examined by a quantitative comparison method for chemical leakage accidents. The estimation of the leaking time is important for characterizing the accidental source term. The uncertainty analyses of release accidents for pipeline gas (CO), liquid chlorine and liquid propane gas (LPG) are discussed.

  9. An integrated assessment modeling framework for uncertainty studies in global and regional climate change: the MIT IGSM-CAM (version 1.0)

    Science.gov (United States)

    Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.

    2013-12-01

    This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gas and aerosol concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (unconstrained scenario and stabilization scenario at 660 ppm CO2-equivalent) similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. Results of the simulations with the chosen

  10. Regional statistical assessment of WRF-Hydro and IFC Model stream Flow uncertainties over the State of Iowa

    Science.gov (United States)

    ElSaadani, M.; Quintero, F.; Goska, R.; Krajewski, W. F.; Lahmers, T.; Small, S.; Gochis, D. J.

    2015-12-01

    This study examines the performance of different hydrologic models in estimating peak flows over the state of Iowa. In this study I will compare the output of the Iowa Flood Center (IFC) hydrologic model and WRF-Hydro (NFIE configuration) to the observed flows at the USGS stream gauges. During the National Flood Interoperability Experiment I explored the performance of WRF-Hydro over the state of Iowa using different rainfall products, and the resulting hydrographs showed a "flashy" behavior of the model output due to lack of calibration and bad initial flows due to a short model spin-up period. I would like to expand this study by including a second well established hydrologic model and include more rain gauge vs. radar rainfall direct comparisons. The IFC model is expected to outperform WRF-Hydro's out of the box results; however, I will test different calibration options for both the Noah-MP land surface model and RAPID, which is the routing component of the NFIE-Hydro configuration, to see if this will improve the model results. This study will explore the statistical structure of model output uncertainties across scales (as a function of drainage areas and/or stream orders). I will also evaluate the performance of different radar-based Quantitative Precipitation Estimation (QPE) products (e.g. Stage IV, MRMS and IFC's NEXRAD-based radar rainfall product). Different basins will be evaluated in this study and they will be selected based on size, amount of rainfall received over the basin area and location. Basin location will be an important factor in this study due to our prior knowledge of the performance of different NEXRAD radars that cover the region; this will help observe the effect of rainfall biases on stream flows. Another possible addition to this study is to apply controlled spatial error fields to rainfall inputs and observe the propagation of these errors through the stream network.

  11. Using high-resolution soil moisture modelling to assess the uncertainty of microwave remotely sensed soil moisture products at the correct spatial and temporal support

    Science.gov (United States)

    Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.

    2012-04-01

    Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement of forecasting skills depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture content throughout the river basin. Space-borne remote sensing may provide this information with a high temporal and spatial resolution and with a global coverage. Currently three microwave soil moisture products are available: AMSR-E, ASCAT and SMOS. The quality of these satellite-based products is often assessed by comparing them with in-situ observations of soil moisture. This comparison is however hampered by the difference in spatial and temporal support (i.e., resolution, scale), because the spatial resolution of microwave satellites is rather low compared to in-situ field measurements. Thus, the aim of this study is to derive a method to assess the uncertainty of microwave satellite soil moisture products at the correct spatial support. To overcome the difference in support size between in-situ soil moisture observations and remote sensed soil moisture, we used a stochastic, distributed unsaturated zone model (SWAP, van Dam (2000)) that is upscaled to the support of different satellite products. A detailed assessment of the SWAP model uncertainty is included to ensure that the uncertainty in satellite soil moisture is not overestimated due to an underestimation of the model uncertainty. We simulated unsaturated water flow up to a depth of 1.5m with a vertical resolution of 1 to 10 cm and on a horizontal grid of 1 km2 for the period Jan 2010 - Jun 2011. The SWAP model was first calibrated and validated on in-situ data of the REMEDHUS soil moisture network (Spain). Next, to evaluate the satellite products, the model was run for areas in the proximity of 79 meteorological stations in Spain, where model results were aggregated to the correct support of the satellite

  12. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....

  13. Uncertainty of Energy Consumption Assessment of Domestic Buildings

    DEFF Research Database (Denmark)

    Brohus, Henrik; Heiselberg, Per; Simonsen, A.

    2009-01-01

    In order to assess the influence of energy reduction initiatives, to determine the expected annual cost, to calculate life cycle cost, emission impact, etc. it is crucial to be able to assess the energy consumption reasonably accurately. The present work undertakes a theoretical and empirical study...... of the uncertainty of energy consumption assessment of domestic buildings. The calculated energy consumption of a number of almost identical domestic buildings in Denmark is compared with the measured energy consumption. Furthermore, the uncertainty is determined by means of stochastic modelling based on input...

  14. New challenges on uncertainty propagation assessment of flood risk analysis

    Science.gov (United States)

    Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés

    2016-04-01

    Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry an associated set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with a lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on the estimation of uncertainties in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with the propagation of uncertainties along the whole flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate throughout the process, from inundability studies to risk analysis, and how much a proper flood risk analysis can vary as a result. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), and so to evaluate the uncertainty propagation (UP) in design flood risk estimation, both in numerical and in cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used the Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of a traditional analysis
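
    To make the contrast between Monte Carlo and Polynomial Chaos concrete, the sketch below expands a toy one-dimensional response in probabilists' Hermite polynomials and recovers its mean and variance from the chaos coefficients. The response function g is an invented stand-in for a flood-risk quantity; the multi-variable application in the study above is considerably more involved.

    import numpy as np
    from numpy.polynomial import hermite_e as He
    from math import factorial, sqrt, pi

    def g(x):  # invented response, standing in for a flood-risk quantity
        return np.exp(0.3 * x) + 0.1 * x ** 2

    order = 6
    nodes, weights = He.hermegauss(30)  # Gauss quadrature, weight exp(-x^2/2)
    norm = sqrt(2 * pi)                 # total mass of the weight function

    # Chaos coefficients: c_k = E[g(X) He_k(X)] / k!, since E[He_k^2] = k!.
    coeffs = []
    for k in range(order + 1):
        Hk = He.hermeval(nodes, [0.0] * k + [1.0])
        coeffs.append(np.sum(weights * g(nodes) * Hk) / norm / factorial(k))

    pce_mean = coeffs[0]
    pce_var = sum(factorial(k) * coeffs[k] ** 2 for k in range(1, order + 1))

    x = np.random.default_rng(1).standard_normal(200_000)  # Monte Carlo check
    print(f"PCE: mean = {pce_mean:.4f}, var = {pce_var:.4f}")
    print(f"MC : mean = {g(x).mean():.4f}, var = {g(x).var():.4f}")

    The appeal for risk studies is that the handful of deterministic evaluations at quadrature nodes replaces thousands of Monte Carlo model runs.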

  15. Uncertainty in Regional Air Quality Modeling

    Science.gov (United States)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process used to inform pollution control strategy remains uncertain. Traditionally, deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, they convey a false sense of precision, as if pollutant response to emission controls were perfectly known, and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis goes beyond the conventional practice of deterministic attainment demonstration and presents novel approaches that yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally-efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor

  16. Assessing positive matrix factorization model fit: a new method to estimate uncertainty and bias in factor contributions at the measurement time scale

    Directory of Open Access Journals (Sweden)

    J. G. Hemann

    2009-01-01

    A Positive Matrix Factorization receptor model for aerosol pollution source apportionment was fit to a synthetic dataset simulating one year of daily measurements of ambient PM2.5 concentrations, comprised of 39 chemical species from nine pollutant sources. A novel method was developed to estimate model fit uncertainty and bias at the daily time scale, as related to factor contributions. A circular block bootstrap is used to create replicate datasets, with the same receptor model then fit to the data. Neural networks are trained to classify factors based upon chemical profiles, as opposed to correlating contribution time series, and this classification is used to align factor orderings across the model results associated with the replicate datasets. Factor contribution uncertainty is assessed from the distribution of results associated with each factor. Comparing modeled factors with input factors used to create the synthetic data assesses bias. The results indicate that variability in factor contribution estimates does not necessarily encompass model error: contribution estimates can have small associated variability across results yet also be very biased. These findings are likely dependent on characteristics of the data.
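
    The circular block bootstrap used to build the replicate datasets can be sketched compactly. The block length and the synthetic concentration matrix below are illustrative assumptions; only the resampling mechanics mirror the method described above.

    import numpy as np

    def circular_block_bootstrap(data, block_len, rng):
        """Resample rows of `data` (days x species) in circular blocks."""
        n = len(data)
        n_blocks = int(np.ceil(n / block_len))
        starts = rng.integers(0, n, size=n_blocks)
        idx = (starts[:, None] + np.arange(block_len)) % n  # wrap around the year
        return data[idx.ravel()[:n]]

    rng = np.random.default_rng(7)
    pm25 = rng.lognormal(2.0, 0.4, size=(365, 39))  # toy 365-day x 39-species data

    replicate = circular_block_bootstrap(pm25, block_len=14, rng=rng)
    print(replicate.shape)  # (365, 39): same shape, days resampled in blocks
    # Each replicate would then be refit with the receptor model, with factors
    # aligned across replicates (in the paper, via neural-network classification
    # of chemical profiles).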

  17. Quantifying uncertainty in LCA-modelling of waste management systems.

    Science.gov (United States)

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
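
    Steps 2 and 3 of the suggested sequence (uncertainty propagation and contribution analysis) can be illustrated with a toy model. Everything below is a hedged sketch: the three-parameter waste-LCA impact function and its distributions are invented, and the squared-correlation importance measure is a simple first-order stand-in for a full contribution analysis.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 50_000

    # Invented waste-LCA parameters (per tonne of waste treated).
    params = {
        "ch4_yield":     rng.normal(100.0, 15.0, n),   # m3 CH4 / t
        "gas_capture":   rng.uniform(0.4, 0.8, n),     # fraction captured
        "energy_credit": rng.normal(-30.0, 5.0, n),    # kg CO2-eq / t
    }

    def impact(p):
        """Toy global-warming impact, kg CO2-eq per tonne of waste."""
        # Fugitive CH4: yield x uncaptured fraction x density x GWP100.
        fugitive = p["ch4_yield"] * (1 - p["gas_capture"]) * 0.714 * 28
        return fugitive + p["energy_credit"]

    y = impact(params)  # Step 2: propagated output distribution
    print(f"impact: mean = {y.mean():.0f} kg CO2-eq/t, 95% interval = "
          f"[{np.percentile(y, 2.5):.0f}, {np.percentile(y, 97.5):.0f}]")

    # Step 3: first-order contribution of each input to the output variance.
    for name, x in params.items():
        r2 = np.corrcoef(x, y)[0, 1] ** 2
        print(f"  {name:>13}: ~{100 * r2:.0f}% of output variance")

    Ranking the inputs this way tells the practitioner where better data would shrink the final uncertainty most, which is the point of the tiered approach.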

  18. Representing Turbulence Model Uncertainty with Stochastic PDEs

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2012-11-01

    Validation of and uncertainty quantification for extrapolative predictions of RANS turbulence models are necessary to ensure that the models are not used outside of their domain of applicability and to properly inform decisions based on such predictions. In previous work, we have developed and calibrated statistical models for these purposes, but it has been found that incorporating all the knowledge of a domain expert--e.g., realizability, spatial smoothness, and known scalings--in such models is difficult. Here, we explore the use of stochastic PDEs for this purpose. The goal of this formulation is to pose the uncertainty model in a setting where it is easier for physical modelers to express what is known. To explore the approach, multiple stochastic models describing the error in the Reynolds stress are coupled with multiple deterministic turbulence models to make uncertain predictions of channel flow. These predictions are compared with DNS data to assess their credibility. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  19. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
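
    For the normal-linear special case, a Bayesian Processor of Output reduces to a conjugate update that fits in a few lines. The numbers below are invented, and the published BPO is far more general (handling non-Gaussian variates), so this is only a minimal sketch of the idea.

    from math import sqrt

    # Prior (climatology) of the predictand W, e.g. river stage in metres.
    m, s = 3.0, 1.0                # prior mean and standard deviation
    # Calibration of the deterministic model output Y against observed W:
    # Y = a*W + b + noise, noise ~ N(0, sigma^2). All numbers invented.
    a, b, sigma = 0.9, 0.2, 0.5

    y = 4.1                        # today's deterministic model output

    # Conjugate normal update of W given Y = y.
    precision = 1 / s**2 + a**2 / sigma**2
    post_var = 1 / precision
    post_mean = (m / s**2 + a * (y - b) / sigma**2) * post_var

    print(f"prior:     N({m:.2f}, {s:.2f}^2)")
    print(f"posterior: N({post_mean:.2f}, {sqrt(post_var):.2f}^2)")
    # The posterior spread quantifies the total uncertainty about W that
    # remains after seeing the single model output.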

  20. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.;

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...... of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...

  1. Errors and uncertainties introduced by a regional climate model in climate impact assessments: example of crop yield simulations in West Africa

    Science.gov (United States)

    Ramarohetra, Johanna; Pohl, Benjamin; Sultan, Benjamin

    2015-12-01

    The challenge of estimating the potential impacts of climate change has led to an increasing use of dynamical downscaling to produce fine spatial-scale climate projections for impact assessments. In this work, we analyze whether and to what extent the bias in the simulated crop yield can be reduced by using the Weather Research and Forecasting (WRF) regional climate model to downscale ERA-Interim (European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis) rainfall and radiation data. Then, we evaluate the uncertainties resulting from both the choice of the physical parameterizations of the WRF model and its internal variability. Impact assessments were performed at two sites in Sub-Saharan Africa and by using two crop models to simulate Niger pearl millet and Benin maize yields. We find that the use of the WRF model to downscale ERA-Interim climate data generally reduces the bias in the simulated crop yield, yet this reduction in bias strongly depends on the choices in the model setup. Among the physical parameterizations considered, we show that the choice of the land surface model (LSM) is of primary importance. When there is no coupling with a LSM the simulated precipitation, and hence the simulated yield, is null; when the LSM is too simplistic both are very low. Coupling with a LSM is therefore necessary. The convective scheme is the second most influential scheme for yield simulation, followed by the shortwave radiation scheme. The uncertainties related to the internal variability of the WRF model are also significant and reach up to 30% of the simulated yields. These results suggest that regional models need to be used more carefully in order to improve the reliability of impact assessments.

  2. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-03-01

    This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow dominated catchment in the Chilean Andes is used as case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the model representations. The likelihood of the simulated glacier mass balance and snow cover are used for further assessing model credibility. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.

  3. International symposium on engineering under uncertainty : safety assessment and management

    CERN Document Server

    Bhattacharya, Gautam; ISEUSAM - 2012

    2013-01-01

    The International Symposium on Engineering under Uncertainty: Safety Assessment and Management (ISEUSAM - 2012) was organized by Bengal Engineering and Science University, India, during the first week of January 2012 in Kolkata. The primary aim of ISEUSAM 2012 was to provide a platform to facilitate discussion for a better understanding and management of uncertainty and risk, encompassing various aspects of safety and reliability of engineering systems. The conference received an overwhelming response from national as well as international scholars, experts and delegates from different parts of the world. Papers were received from authors of several countries including Australia, Canada, China, Germany, Italy, UAE, UK and USA, besides India. More than two hundred authors showed their interest in the symposium. The proceedings present ninety-two high-quality papers which address issues of uncertainty encompassing various fields of engineering, i.e. uncertainty analysis and modelling, structural reliability...

  4. Uncertainty propagation within the UNEDF models

    CERN Document Server

    Haverinen, T

    2016-01-01

    The parameters of the nuclear energy density functional have to be adjusted to experimental data. As a result they carry a certain uncertainty which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties on binding energies for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.

  5. Uncertainty propagation within the UNEDF models

    Science.gov (United States)

    Haverinen, T.; Kortelainen, M.

    2017-04-01

    The parameters of the nuclear energy density functional have to be adjusted to experimental data. As a result they carry a certain uncertainty which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radius for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
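
    The statistical propagation behind such error budgets rests on the standard linear formula: with parameter covariance C and sensitivity vector G = dO/dp, the propagated variance of an observable O is G^T C G. The sketch below applies this to an invented two-parameter model; neither the covariance nor the observable comes from the UNEDF fits.

    import numpy as np

    # Invented covariance of two model parameters (e.g. from a least-squares fit).
    C = np.array([[0.04, 0.01],
                  [0.01, 0.09]])

    def observable(p, x=10.0):
        """Toy observable, standing in for e.g. a binding energy."""
        return p[0] * x + p[1] * x ** 2

    p_best = np.array([1.2, 0.3])
    eps = 1e-6

    # Numerical sensitivities G_i = dO/dp_i at the best-fit point.
    G = np.array([(observable(p_best + eps * np.eye(2)[i]) - observable(p_best)) / eps
                  for i in range(2)])

    var = G @ C @ G
    print(f"O = {observable(p_best):.2f} +/- {np.sqrt(var):.2f} (propagated 1 s.d.)")
    # Diagonal (per-parameter) pieces of the error budget:
    for i in range(2):
        print(f"  parameter {i}: variance contribution {G[i] ** 2 * C[i, i]:.1f}")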

  6. Uncertainty Assessment in Life Cycle Cost Analysis.

    Science.gov (United States)

    1985-05-01

  7. Parameter and Uncertainty Estimation in Groundwater Modelling

    DEFF Research Database (Denmark)

    Jensen, Jacob Birk

    The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly decisions and if these are to be made on solid grounds, the uncertainty attached to model results must...... be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration...... and uncertainty estimation. Essential issues relating to calibration are discussed. The classical regression methods are described; however, the main focus is on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The next two chapters describe case studies in which the GLUE methodology...

  8. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...

  9. Uncertainty in spatially explicit animal dispersal models

    NARCIS (Netherlands)

    Mooij, W.M.; DeAngelis, D.L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three level

  10. Methodology for qualitative uncertainty assessment of climate impact indicators

    Science.gov (United States)

    Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela

    2016-04-01

    The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat uncertainties in two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users climate impact indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' of that indicator. To meet users' requirements for effective communication of uncertainties, their feedback was incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference: Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections

  11. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, much effort has been devoted to research on uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are presented: (a) from determinedness-based to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  12. Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE.

    Science.gov (United States)

    Simon-Cornu, M; Beaugelin-Seiller, K; Boyer, P; Calmon, P; Garcia-Sanchez, L; Mourlon, C; Nicoulaud, V; Sy, M; Gonze, M A

    2015-01-01

    SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments, when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize uncertainty on the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimum and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case-study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability.
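
    A second-order Monte Carlo of the kind mentioned above separates an outer loop over epistemic parameter uncertainty from an inner loop over inter-individual variability. The sketch below is illustrative only: the dose equation, distributions and coefficients are invented placeholders, not SYMBIOSE defaults or IAEA values.

    import numpy as np

    rng = np.random.default_rng(11)
    n_outer, n_inner = 200, 1000

    doses_p95 = []
    for _ in range(n_outer):  # outer loop: epistemic parameter uncertainty
        transfer = rng.lognormal(np.log(0.02), 0.7)        # soil-to-plant factor
        # Inner loop: inter-individual variability in intake (kg/day).
        intake = rng.lognormal(np.log(0.3), 0.4, n_inner)
        # Toy dose: soil activity x transfer x intake x dose coeff. x days.
        dose_usv = 500.0 * transfer * intake * 1.3e-8 * 365 * 1e6
        doses_p95.append(np.percentile(dose_usv, 95))      # variability inside

    doses_p95 = np.array(doses_p95)
    print("95th-percentile individual dose (uSv/yr), epistemic 90% interval:")
    print(f"  [{np.percentile(doses_p95, 5):.1f}, {np.percentile(doses_p95, 95):.1f}]")

    Summarising variability inside the inner loop and uncertainty across the outer loop is what keeps the two notions separate in the result.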

  13. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    Science.gov (United States)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate

  14. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-08-01

    This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow dominated catchment in the Chilean Andes is used as case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the parameter vector. The plausibility of the simulated glacier mass balance and snow cover are used for further constraining the model representations. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.
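
    A minimal sketch of the possibilistic machinery reads as follows; the performance scores, the min-combination rule and the alpha-cut level are illustrative assumptions rather than the paper's exact choices.

    import numpy as np

    rng = np.random.default_rng(9)
    n_sets = 2000

    # Invented normalised performance scores in [0, 1] for three criteria:
    # discharge fit, glacier mass balance plausibility, snow cover plausibility.
    scores = rng.beta(2, 2, size=(n_sets, 3))

    # Possibility of each parameter set: conjunctive (min) combination,
    # rescaled so the most plausible set has possibility 1.
    poss = scores.min(axis=1)
    poss /= poss.max()

    peak_q = rng.normal(50.0, 10.0, n_sets)  # invented predicted peak discharge

    alpha = 0.5                              # alpha-cut: sets with poss >= alpha
    cut = peak_q[poss >= alpha]
    print(f"alpha = {alpha}: {cut.size} sets, discharge bounds "
          f"[{cut.min():.1f}, {cut.max():.1f}] m3/s")
    # Adding criteria tightens the min, shrinking alpha-cuts and hence the
    # predictive uncertainty bounds, as the study reports.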

  15. Utility of population models to reduce uncertainty and increase value relevance in ecological risk assessments of pesticides: an example based on acute mortality data for daphnids.

    Science.gov (United States)

    Hanson, Niklas; Stark, John D

    2012-04-01

    Traditionally, ecological risk assessments (ERA) of pesticides have been based on risk ratios, where the predicted concentration of the chemical is compared to the concentration that causes biological effects. The concentration that causes biological effect is mostly determined from laboratory experiments using endpoints on the level of the individual (e.g., mortality and reproduction). However, the protection goals are mostly defined at the population level. To deal with the uncertainty in the necessary extrapolations, safety factors are used. Major disadvantages of this simplified approach are that it is difficult to relate a risk ratio to the environmental protection goals, and that the use of fixed safety factors can result in over- as well as underprotective assessments. To reduce uncertainty and increase value relevance in ERA, it has been argued that population models should be used more frequently. In the present study, we have used matrix population models for 3 daphnid species (Ceriodaphnia dubia, Daphnia magna, and D. pulex) to reduce uncertainty and increase value relevance in the ERA of a pesticide (spinosad). The survival rates in the models were reduced in accordance with data from traditional acute mortality tests. As no data on reproductive effects were available, the conservative assumption that no reproduction occurred during the exposure period was made. The models were used to calculate the minimum population size and the time to recovery. These endpoints can be related to the European Union (EU) protection goals for aquatic ecosystems in the vicinity of agricultural fields, which state that reversible population level effects are acceptable if there is recovery within an acceptable (undefined) time frame. The results of the population models were compared to the acceptable (according to EU documents) toxicity exposure ratio (TER) that was based on the same data. At the acceptable TER, which was based on the most sensitive species (C. dubia
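
    A minimal stage-structured projection of this kind, including the conservative assumption of no reproduction during exposure, can be sketched as follows. The Lefkovitch matrix entries, exposure duration and mortality reduction are invented for illustration and are not the fitted daphnid vital rates.

    import numpy as np

    # Toy 3-stage (juvenile-1, juvenile-2, adult) weekly projection matrix.
    A = np.array([[0.0, 0.0, 6.0],    # adult fecundity
                  [0.7, 0.0, 0.0],    # juvenile survival/transition
                  [0.0, 0.7, 0.8]])   # maturation and adult survival

    def project(n0, weeks, exposure_weeks, survival_reduction):
        sizes = [n0.sum()]
        n = n0.astype(float)
        for t in range(weeks):
            M = A.copy()
            if t < exposure_weeks:                    # pulse exposure
                M[1:, :] *= (1 - survival_reduction)  # extra mortality
                M[0, :] = 0.0                         # no reproduction while exposed
            n = M @ n
            sizes.append(n.sum())
        return np.array(sizes)

    n0 = np.array([50.0, 30.0, 20.0])
    sizes = project(n0, weeks=20, exposure_weeks=2, survival_reduction=0.5)

    t_min = int(sizes.argmin())
    print(f"minimum population size: {sizes.min():.1f} (initial {sizes[0]:.0f})")
    # Time to recovery: first week the population regains its initial size
    # (assumes recovery occurs within the projection horizon).
    recovery = t_min + int(np.argmax(sizes[t_min:] >= sizes[0]))
    print(f"time to recovery: {recovery} weeks")

    The two printed endpoints are exactly the population-level quantities the study relates to the EU protection goals.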

  16. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...

  17. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Indian Academy of Sciences (India)

    Diego Rivera; Yessica Rivas; Alex Godoy

    2015-02-01

    Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variation of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE (Generalized Likelihood Uncertainty Estimation) methodology were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits equifinality at a first stage. However, it was possible to narrow the range for the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m³ s⁻¹ after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
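
    A minimal GLUE sketch in the spirit of the abstract, assuming a generic model(theta) function and a user-supplied prior sampler; Nash-Sutcliffe efficiency above a fixed threshold serves as the informal likelihood, which is one common choice among several.

        import numpy as np

        def nse(sim, obs):
            # Nash-Sutcliffe efficiency, a common informal likelihood in GLUE.
            return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

        def glue(model, prior_sampler, obs, n_sets=5000, threshold=0.5):
            # Monte Carlo over the prior: keep 'behavioural' parameter sets,
            # then report likelihood-weighted 5-95% bounds per time step.
            sims, weights = [], []
            for _ in range(n_sets):
                sim = model(prior_sampler())
                score = nse(sim, obs)
                if score > threshold:
                    sims.append(sim)
                    weights.append(score)
            sims, w = np.array(sims), np.array(weights)
            w = w / w.sum()
            bounds = np.empty((2, obs.size))
            for t in range(obs.size):
                order = np.argsort(sims[:, t])
                cdf = np.cumsum(w[order])
                bounds[0, t] = sims[order[np.searchsorted(cdf, 0.05)], t]
                bounds[1, t] = sims[order[np.searchsorted(cdf, 0.95)], t]
            return bounds

        # Toy usage with a one-parameter 'watershed'; obs is generated at k = 1.
        rng = np.random.default_rng(0)
        x = np.linspace(0.0, 6.0, 50)
        obs = np.sin(x) + 1.5
        bounds = glue(lambda k: k * np.sin(x) + 1.5,
                      lambda: rng.uniform(0.5, 1.5), obs)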

  18. Uncertainty and the Conceptual Site Model

    Science.gov (United States)

    Price, V.; Nicholson, T. J.

    2007-12-01

    Our focus is on uncertainties in the underlying conceptual framework upon which all subsequent steps in numerical and/or analytical modeling efforts depend. Experienced environmental modelers recognize the value of selecting an optimal conceptual model from several competing site models, but usually do not formally explore possible alternative models, in part due to incomplete or missing site data, as well as relevant regional data for establishing boundary conditions. The value in and approach for developing alternative conceptual site models (CSM) is demonstrated by analysis of case histories. These studies are based on reported flow or transport modeling in which alternative site models are formulated using data that were not available to, or not used by, the original modelers. An important concept inherent to model abstraction of these alternative conceptual models is that it is "Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise." (Tukey, 1962) The case histories discussed here illustrate the value of formulating alternative models and evaluating them using site-specific data: (1) Charleston Naval Site where seismic characterization data allowed significant revision of the CSM and subsequent contaminant transport modeling; (2) Hanford 300-Area where surface- and ground-water interactions affecting the unsaturated zone suggested an alternative component to the site model; (3) Savannah River C-Area where a characterization report for a waste site within the modeled area was not available to the modelers, but provided significant new information requiring changes to the underlying geologic and hydrogeologic CSM's used; (4) Amargosa Desert Research Site (ADRS) where re-interpretation of resistivity sounding data and water-level data suggested an alternative geologic model. Simple 2-D spreadsheet modeling of the ADRS with the revised CSM provided an improved

  19. Uncertainty analysis for a field-scale P loss model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  20. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and it allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension, Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide range of applications since the late 1990s. The novel feature of our algorithm is that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval of GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
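
    The reversible-jump sampler itself is involved, but the model averaging it enables is easy to illustrate. A hedged sketch with invented numbers: posterior model probabilities of the kind AARJ delivers are combined into an averaged estimate whose total variance adds a between-model term, which is the model-selection uncertainty the abstract refers to.

        import numpy as np

        # Hypothetical posterior probabilities of four aerosol cross-section
        # models, with each model's retrieved quantity and within-model std.
        p_model = np.array([0.55, 0.30, 0.10, 0.05])
        means = np.array([1.02, 0.97, 1.10, 0.88])
        stds = np.array([0.04, 0.05, 0.08, 0.10])

        # Bayesian model averaging: mixture mean; total variance equals the
        # within-model variance plus the between-model variance.
        bma_mean = np.sum(p_model * means)
        within = np.sum(p_model * stds ** 2)
        between = np.sum(p_model * (means - bma_mean) ** 2)
        bma_std = np.sqrt(within + between)
        print(bma_mean, bma_std)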

  1. A Bayesian Hierarchical Model for Spatio-Temporal Prediction and Uncertainty Assessment Using Repeat LiDAR Acquisitions for the Kenai Peninsula, AK, USA

    Science.gov (United States)

    Babcock, C. R.; Andersen, H. E.; Finley, A. O.; Cook, B.; Morton, D. C.

    2015-12-01

    Models using repeat LiDAR and field campaigns may be one mechanism to monitor carbon storage and flux in forested regions. Considering the ability of multi-temporal LiDAR to estimate growth, it is not surprising that there is great interest in developing forest carbon monitoring strategies that rely on repeated LiDAR acquisitions. Allowing for sparser field campaigns, LiDAR stands to make monitoring forest carbon cheaper and more efficient than field-only sampling procedures. Here, we look to the spatio-temporally data-rich Kenai Peninsula in Alaska to examine the potential for Bayesian spatio-temporal mapping of forest carbon storage and uncertainty. The framework explored here can predict forest carbon through space and time, while formally propagating uncertainty through to prediction. Bayesian spatio-temporal models are flexible frameworks allowing for forest growth processes to be formally integrated into the model. By incorporating a mechanism for growth, using temporally repeated field and LiDAR data, we can more fully exploit the information-rich inventory network to improve prediction accuracy. LiDAR data for the Kenai Peninsula have been collected on four different occasions: spatially coincident LiDAR strip samples in 2004, 2009 and 2014, along with a wall-to-wall collection in 2008. There were 436 plots measured twice between 2002 and 2014. LiDAR was acquired at least once over most inventory plots, with many having LiDAR collected during 2, 3 or 4 different campaigns. Results from this research will impact how forests are inventoried. It is too expensive to monitor terrestrial carbon using field-only sampling strategies, and currently proposed LiDAR model-based techniques lack the ability to properly utilize temporally repeated and misaligned data. Bayesian hierarchical spatio-temporal models offer a solution to these shortcomings and allow for formal predictive error assessment, which is useful for policy development and decision making.

  2. Uncertainty analysis of fluvial outcrop data for stochastic reservoir modelling

    Energy Technology Data Exchange (ETDEWEB)

    Martinius, A.W. [Statoil Research Centre, Trondheim (Norway); Naess, A. [Statoil Exploration and Production, Stjoerdal (Norway)

    2005-07-01

    Uncertainty analysis and reduction is a crucial part of stochastic reservoir modelling and fluid flow simulation studies. Outcrop analogue studies are often employed to define reservoir model parameters but the analysis of uncertainties associated with sedimentological information is often neglected. In order to define uncertainty inherent in outcrop data more accurately, this paper presents geometrical and dimensional data from individual point bars and braid bars, from part of the low net:gross outcropping Tortola fluvial system (Spain) that has been subjected to a quantitative and qualitative assessment. Four types of primary outcrop uncertainties are discussed: (1) the definition of the conceptual depositional model; (2) the number of observations on sandstone body dimensions; (3) the accuracy and representativeness of observed three-dimensional (3D) sandstone body size data; and (4) sandstone body orientation. Uncertainties related to the depositional model are the most difficult to quantify but can be appreciated qualitatively if processes of deposition related to scales of time and the general lack of information are considered. Application of the N0 measure is suggested to assess quantitatively whether a statistically sufficient number of dimensional observations is obtained to reduce uncertainty to an acceptable level. The third type of uncertainty is evaluated in a qualitative sense and determined by accurate facies analysis. The orientation of sandstone bodies is shown to influence spatial connectivity. As a result, an insufficient number or quality of observations may have important consequences for estimated connected volumes. This study will give improved estimations for reservoir modelling. (author)

  3. Simulation model analysis of the most promising geological sequestration formation candidates in the Rocky Mountain region, USA, with focus on uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Si-Yong [Univ. of Utah, Salt Lake City, UT (United States); Zaluski, Wade [Schlumberger Carbon Services, Houston, TX (United States); Will, Robert [Schlumberger Carbon Services, Houston, TX (United States); Eisinger, Chris [Colorado Geological Survey, Golden, CO (United States); Matthews, Vince [Colorado Geological Survey, Golden, CO (United States); McPherson, Brian [Univ. of Utah, Salt Lake City, UT (United States)

    2013-12-31

    The purpose of this report is to present results of reservoir model simulation analyses for forecasting subsurface CO2 storage capacity for the most promising formations in the Rocky Mountain region of the USA. A particular emphasis of this project was to assess uncertainty of the simulation-based forecasts. Results illustrate how local-scale data, including well information, number of wells, and location of wells, affect storage capacity estimates and what degree of well density (number of wells over a fixed area) may be required to estimate capacity within a specified degree of confidence. A major outcome of this work was development of a new workflow of simulation analysis, accommodating the addition of “random pseudo wells” to represent virtual characterization wells.

  4. Monitoring, chemical fate modelling and uncertainty assessment in combination: a tool for evaluating emission control scenarios for micropollutants in stormwater systems

    DEFF Research Database (Denmark)

    Mikkelsen, Peter Steen; Vezzaro, Luca; Birch, Heidi

    2012-01-01

    Stormwater discharges can represent significant sources of micropollutants (MP), including heavy metals and xenobiotic organic compounds that may pose a toxicity risk to aquatic ecosystems. Control of stormwater quality and reduction of MP loads is therefore necessary for a sustainable stormwater management in urban areas, but it is strongly hampered by the general lack of field data on these substances. A framework for combining field monitoring campaigns with dynamic MP modelling tools and statistical methods for uncertainty analysis was hence developed to estimate MP fluxes and fate in stormwater systems... on land usage allowed characterizing the catchment and identifying the major potential sources of stormwater MP. Monitoring of the pond inlet and outlet, as well as sediment analyses, allowed assessing the current situation and highlighted potential risks for the downstream surface water environment...

  6. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana;

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...... probability distributions (often used for sensitivity analyses) and prediction intervals. To demonstrate the new method, it is applied to a conceptual rainfall-runoff model using a dataset collected from Melbourne, Australia....

  7. Uncertainty of Energy Consumption Assessment of Domestic Buildings

    DEFF Research Database (Denmark)

    Brohus, Henrik; Heiselberg, Per; Simonsen, A.;

    2009-01-01

    In order to assess the influence of energy reduction initiatives, to determine the expected annual cost, to calculate life cycle cost, emission impact, etc. it is crucial to be able to assess the energy consumption reasonably accurately. The present work undertakes a theoretical and empirical study of the uncertainty of energy consumption assessment of domestic buildings. The calculated energy consumption of a number of almost identical domestic buildings in Denmark is compared with the measured energy consumption. Furthermore, the uncertainty is determined by means of stochastic modelling based on input... to correspond reasonably well; however, it is also found that significant differences may occur between calculated and measured energy consumption due to the spread and due to the fact that the result can only be determined with a certain probability. It is found that occupants' behaviour is the major...

  8. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced.

  9. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the ‘most likely’ dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble...

  10. Predicting the Term Structure of Interest Rates: Incorporating parameter uncertainty, model uncertainty and macroeconomic information

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); D.J.C. van Dijk (Dick)

    2007-01-01

    We forecast the term structure of U.S. Treasury zero-coupon bond yields by analyzing a range of models that have been used in the literature. We assess the relevance of parameter uncertainty by examining the added value of using Bayesian inference compared to frequentist estimation.

  11. Accuracy assessments and uncertainty analysis of spatially explicit modeling for land use/cover change and urbanization: A case in Beijing metropolitan area

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Spatially explicit modeling plays a vital role in land use/cover change and urbanization research as well as resources management; however, current models lack proper validation and fail to incorporate uncertainty into the formulation of model predictions. Consequently, policy makers and the general public may develop opinions based on potentially misleading research, which fails to allow for truly informed decisions. Here we use an uncertainty strategy of spatially explicit modeling combined with the Kappa index statistics for location and quantity to estimate the uncertainty of future predictions and to determine model accuracy. We take the Beijing metropolitan area as an example to demonstrate the uncertainty in extrapolations of predicted land use change and urban sprawl with spatially explicit modeling at multiple resolutions. The sensitivity of scale effects is also discussed. The results show that an improvement in the specification of location is more helpful in increasing accuracy than an improvement in the specification of quantity at fine spatial resolutions. However, the spatial scale has great effects on modeling accuracy, and the proportion correct due to chance tends to increase as resolution becomes coarser. The results allow us to understand the uncertainty when using spatially explicit models for land-use change or urbanization estimates.
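
    As a hedged illustration of the agreement statistics involved, the sketch below computes the standard Kappa from a map-comparison confusion matrix; the Kappa-for-location and Kappa-for-quantity variants used in the study decompose agreement further but share this chance-corrected core. The matrix values are invented.

        import numpy as np

        def kappa(confusion):
            # Chance-corrected agreement between a simulated and an observed
            # map (rows: simulated class, columns: observed class).
            c = np.asarray(confusion, dtype=float)
            n = c.sum()
            p_observed = np.trace(c) / n
            row, col = c.sum(axis=1) / n, c.sum(axis=0) / n
            p_chance = np.sum(row * col)      # agreement expected by chance
            return (p_observed - p_chance) / (1.0 - p_chance)

        # Toy 3-class land-use comparison at one resolution.
        conf = np.array([[50, 5, 2],
                         [4, 30, 6],
                         [1, 7, 20]])
        print(kappa(conf))

    Recomputing the matrix at coarser resolutions changes p_chance, which is consistent with the "correct due to chance" effect the abstract notes.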

  12. Uncertainties in landscape analysis and ecosystem service assessment.

    Science.gov (United States)

    Hou, Y; Burkhard, B; Müller, F

    2013-09-01

    Landscape analysis and ecosystem service assessment have drawn increasing concern from research and application at the landscape scale. Thanks to the continuously emerging assessments as well as studies aiming at evaluation method improvement, policy makers and landscape managers have an increasing interest in integrating ecosystem services into their decisions. However, the plausible assessments carry numerous sources of uncertainties, which regrettably tend to be ignored or disregarded by the actors or researchers. In order to cope with uncertainties and make them more transparent for landscape managers, we demonstrate them by reviewing literature, describing an example and proposing approaches for uncertainty analysis. Additionally, we conclude with potential actions to reduce the insecurities accompanying landscape analysis and ecosystem service assessments. As for landscape analysis, the fundamental uncertainty origins are landscape complexity and methodological uncertainties. Concerning the uncertainty sources of ecosystem service assessments, the complexity of the natural system, respondents' preferences and technical problems play essential roles. By analyzing the assessment process, we find that initial data uncertainty pervades the whole assessment and argue that the limited knowledge about the complexity of ecosystems is the focal origin of uncertainties. For analyzing uncertainties in assessments, we propose systems analysis, scenario simulation and the comparison method as promising strategies. To reduce uncertainties, we assume that actions should integrate continuous learning, expanding respondent numbers and sources, considering representativeness, improving and standardizing assessment methods and optimizing spatial and geobiophysical data.

  13. Quantification for complex assessment: uncertainty estimation in final year project thesis assessment

    Science.gov (United States)

    Kim, Ho Sung

    2013-12-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final year project thesis assessment including peer assessment. A guide map can be generated by the method for finding expected uncertainties prior to the assessment implementation with a given set of variables. It employs a scale for visualisation of expertise levels, derivation of which is based on quantified clarities of mental images for levels of the examiner's expertise and the examinee's expertise achieved. To identify the relevant expertise areas that depend on the complexity in assessment format, a graphical continuum model was developed. The continuum model consists of assessment task, assessment standards and criterion for the transition towards the complex assessment owing to the relativity between implicitness and explicitness and is capable of identifying areas of expertise required for scale development.

  14. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
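
    A hedged sketch of the propagation scheme described: the input is pushed through the chain of models, and at each stage a bootstrap draw from that stage's empirical residuals is added, so no parametric error distribution is assumed. The three stage models and residual samples below are stand-ins, not the report's fitted models.

        import numpy as np

        rng = np.random.default_rng(0)

        def propagate(x0, stages, residuals, n_samples=5000):
            # Empirical uncertainty propagation by residual resampling.
            out = np.full(n_samples, x0, dtype=float)
            for stage, res in zip(stages, residuals):
                out = stage(out) + rng.choice(res, size=n_samples, replace=True)
            return out

        # Stand-in chain: GHI -> plane-of-array irradiance -> DC -> AC power.
        stages = [lambda ghi: 1.10 * ghi,    # POA transposition (placeholder)
                  lambda poa: 0.18 * poa,    # DC power (placeholder efficiency)
                  lambda dc: 0.96 * dc]      # inverter conversion (placeholder)
        residuals = [rng.normal(0.0, 5.0, 200),
                     rng.normal(0.0, 2.0, 200),
                     rng.normal(0.0, 1.0, 200)]
        ac = propagate(800.0, stages, residuals)
        print(ac.mean(), np.percentile(ac, [5, 95]))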

  15. A Robust Profitability Assessment Tool for Targeting Agricultural Investments in Developing Countries: Modeling Spatial Heterogeneity and Uncertainty

    Science.gov (United States)

    Quinn, J. D.; Zeng, Z.; Shoemaker, C. A.; Woodard, J.

    2014-12-01

    In sub-Saharan Africa, where the majority of the population earns their living from agriculture, government expenditures in many countries are being re-directed to the sector to increase productivity and decrease poverty. However, many of these investments are seeing low returns because they are poorly targeted. A geographic tool that accounts for spatial heterogeneity and temporal variability in the factors of production would allow governments and donors to optimize their investments by directing them to farmers for whom they are most profitable. One application for which this is particularly relevant is fertilizer recommendations. It is well-known that soil fertility in much of sub-Saharan Africa is declining due to insufficient nutrient inputs to replenish those lost through harvest. Since fertilizer application rates in sub-Saharan Africa are several times smaller than in other developing countries, it is often assumed that African farmers are under-applying fertilizer. However, this assumption ignores the risk farmers face in choosing whether or how much fertilizer to apply. Simply calculating the benefit/cost ratio of applying a given level of fertilizer in a particular year over a large, aggregated region (as is often done) overlooks the variability in yield response seen at different sites within the region, and at the same site from year to year. Using Ethiopia as an example, we are developing a 1 km resolution fertilizer distribution tool that provides pre-season fertilizer recommendations throughout the agricultural regions of the country, conditional on seasonal climate forecasts. By accounting for spatial heterogeneity in soil, climate, market and travel conditions, as well as uncertainty in climate and output prices at the time a farmer must purchase fertilizer, this stochastic optimization tool gives better recommendations to governments, fertilizer companies, and aid organizations looking to optimize the welfare benefits achieved by their

  16. UNCERTAINTIES IN TRICHLOROETHYLENE PHARMACOKINETIC MODELS

    Science.gov (United States)

    Understanding the pharmacokinetics of a chemical (its absorption, distribution, metabolism, and excretion in humans and laboratory animals) is critical to the assessment of its human health risks. For trichloroethylene (TCE), numerous physiologically-based pharmacokinetic (PBPK)...

  17. Selective Maintenance Model Considering Time Uncertainty

    OpenAIRE

    Le Chen; Zhengping Shu; Yuan Li; Xuezhi Lv

    2012-01-01

    This study proposes a selective maintenance model for weapon systems during mission intervals. First, it gives relevant definitions and the operational process of the material support system. Then, it introduces current research on selective maintenance modeling. Finally, it establishes a numerical model for selecting corrective and preventive maintenance tasks, considering the time uncertainty brought by the unpredictability of maintenance procedures, indetermination of downtime for spares and difference of skil...

  18. Uncertainty calculation in transport models and forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Prato, Carlo Giacomo

    in a four-stage transport model related to different variable distributions (to be used in a Monte Carlo simulation procedure), assignment procedures and levels of congestion, at both the link and the network level. The analysis used as case study the Næstved model, referring to the Danish town of Næstved2...... the uncertainty propagation pattern over time specific for key model outputs becomes strategically important. 1 Manzo, S., Nielsen, O. A. & Prato, C. G. (2014). The Effects of uncertainty in speed-flow curve parameters on a large-scale model. Transportation Research Record, 1, 30-37. 2 Manzo, S., Nielsen, O. A...

  19. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  20. Parametric uncertainty modeling for robust control

    DEFF Research Database (Denmark)

    Rasmussen, K.H.; Jørgensen, Sten Bay

    1999-01-01

    The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics is modeled non-conservatively as parametric uncertainty in linear time invariant models. The obtained uncertainty description makes it possible to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed... method can be utilized in identification of a nominal model with uncertainty description. The method is demonstrated on a binary distillation column operating in the LV configuration. The dynamics of the column is approximated by a second order linear model, wherein the parameters vary as the operating...

  1. A 3D geological model for the Ruiz-Tolima Volcanic Massif (Colombia): Assessment of geological uncertainty using a stochastic approach based on Bézier curve design

    Science.gov (United States)

    González-Garcia, Javier; Jessell, Mark

    2016-09-01

    The Ruiz-Tolima Volcanic Massif (RTVM) is an active volcanic complex in the Northern Andes, and understanding its geological structure is critical for hazard mitigation and guiding future geothermal exploration. However, the sparsity of data available to constrain the interpretation of this volcanic system hinders the application of standard 3D modelling techniques. Furthermore, some features related to the volcanic system are not entirely understood, such as the connectivity between the plutons present in its basement (i.e. Manizales Stock, El Bosque Batholith). We have developed a methodology where two independent working hypotheses were formulated and modelled independently (i.e. a case where both plutons constitute distinct bodies, and an alternative case where they form one single batholith). A Monte Carlo approach was used to characterise the geological uncertainty in each case. Bézier curve design was used to represent geological contacts on input cross sections. Systematic variations in the control points of these curves allows us to generate multiple realisations of geological interfaces, resulting in stochastic models that were grouped into suites used to apply quantitative estimators of uncertainty. This process results in a geological representation based on fuzzy logic and in maps of model uncertainty distribution. The results are consistent with expected regions of high uncertainty near under-constrained geological contacts, while the non-unique nature of the conceptual model indicates that the dominant source of uncertainty in the area is the nature of the batholith structure.
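
    A hedged sketch of the stochastic-contact idea: a geological contact drawn as a Bezier curve is perturbed by jittering its control points, and the spread of the resulting realisations gives a per-location uncertainty estimate. The control points, perturbation size and pinned endpoints are illustrative assumptions, not values from the study.

        import numpy as np
        from math import comb

        def bezier(control_points, n=200):
            # Evaluate a Bezier curve in Bernstein form.
            cp = np.asarray(control_points, dtype=float)
            k = len(cp) - 1
            t = np.linspace(0.0, 1.0, n)[:, None]
            return sum(comb(k, i) * t ** i * (1.0 - t) ** (k - i) * cp[i]
                       for i in range(k + 1))

        # Interpreted contact on a cross section: (distance, depth) in km.
        contact = np.array([[0.0, 1.0], [2.0, 1.4], [4.0, 0.8], [6.0, 1.2]])

        rng = np.random.default_rng(1)
        realisations = []
        for _ in range(500):
            jitter = rng.normal(0.0, 0.15, size=contact.shape)
            jitter[0] = jitter[-1] = 0.0   # pin the well-constrained endpoints
            realisations.append(bezier(contact + jitter)[:, 1])
        depth_std = np.array(realisations).std(axis=0)   # uncertainty map input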

  2. Reduction of uncertainty associated with future changes in Indian summer monsoon projected by climate models and assessment of monsoon teleconnections

    Science.gov (United States)

    Rajendran, Kavirajan; Surendran, Sajani; Kitoh, Akio; Varghese, Stella Jes

    2016-05-01

    Coupled Model Intercomparison Project phase 5 (CMIP5) coupled global climate model (CGCM) Representative Concentration Pathway (RCP) simulations project clear future temperature increase but diverse changes in Indian summer monsoon rainfall (ISMR) with substantial inter-model spread. Robust signals of projected changes are derived based on objective criteria and the physically consistent simulations with the highest reliability suggest future reduction in the frequency of light rainfall but increase in high to extreme rainfall. The role of equatorial Indian and Pacific Oceans on the projected changes in monsoon rainfall is investigated. The results of coupled model projections are also compared with the corresponding projections from high resolution AGCM time-slice, multi-physics and multi-forcing ensemble experiments.

  3. Evaluating sub-national building-energy efficiency policy options under uncertainty: Efficient sensitivity testing of alternative climate, technological, and socioeconomic futures in a regional integrated-assessment model.

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Michael J.; Daly, Don S.; Zhou, Yuyu; Rice, Jennie S.; Patel, Pralit L.; McJeon, Haewon C.; Kyle, G. Page; Kim, Son H.; Eom, Jiyong; Clarke, Leon E.

    2014-05-01

    Improving the energy efficiency of the building stock, commercial equipment and household appliances can have a major impact on energy use, carbon emissions, and building services. Subnational regions such as U.S. states wish to increase their energy efficiency, reduce carbon emissions or adapt to climate change. Evaluating subnational policies to reduce energy use and emissions is difficult because of the uncertainties in socioeconomic factors, technology performance and cost, and energy and climate policies. Climate change may undercut such policies. Assessing these uncertainties can be a significant modeling and computation burden. As part of this uncertainty assessment, this paper demonstrates how a decision-focused sensitivity analysis strategy using fractional factorial methods can be applied to reveal the important drivers for detailed uncertainty analysis.
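
    A hedged sketch of the screening step named above: a two-level fractional factorial design halves the number of model runs relative to the full factorial while still estimating main effects. The five uncertain drivers and the stand-in response function are hypothetical.

        import itertools
        import numpy as np

        def fractional_factorial(k):
            # 2^(k-1) design: full factorial in the first k-1 factors, with
            # the k-th column generated as their product.
            base = np.array(list(itertools.product([-1, 1], repeat=k - 1)))
            return np.hstack([base, base.prod(axis=1, keepdims=True)])

        def main_effects(design, response):
            # Mean response at +1 minus mean response at -1, per factor.
            return np.array([response[design[:, j] == 1].mean()
                             - response[design[:, j] == -1].mean()
                             for j in range(design.shape[1])])

        rng = np.random.default_rng(7)
        design = fractional_factorial(5)               # 16 runs instead of 32
        response = (3.0 * design[:, 0] - 1.5 * design[:, 2]
                    + rng.normal(0.0, 0.2, len(design)))
        print(main_effects(design, response))          # flags drivers 1 and 3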

  4. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as for computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined into one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  5. The multi temporal/multi-model approach to predictive uncertainty assessment in real-time flood forecasting

    Science.gov (United States)

    Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio

    2017-08-01

    This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating the improvement on the single-temporal one. The study is framed in the context of probabilistic Bayesian decision-making that is appropriate to take rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, the probabilistic forecast identifies a predictive probability density function that represents a fundamental knowledge on future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead-time and when, more likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operative deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected on a time series of six years. MCP-MT improves over the original models' forecasts: the peak overestimation and the rising limb delayed forecast, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with a reduced mean error on peak stage from 45 to 5 cm and an increased coefficient of persistence from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.
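
    A hedged sketch of the core processor step in its simplest single-model, single-distribution form (MCP-MT adds multiple lead times, multiple models and the truncated-normal components): observations and forecasts are mapped to normal scores, and the conditional normal in that space yields the exceedance probability of a hydrometric threshold. All data below are synthetic.

        import numpy as np
        from scipy.stats import norm

        def normal_scores(x):
            # Normal quantile transform via Weibull plotting positions.
            ranks = np.argsort(np.argsort(x)) + 1.0
            return norm.ppf(ranks / (len(x) + 1.0))

        rng = np.random.default_rng(3)
        y = rng.gamma(4.0, 0.5, 1000)           # observed stages (synthetic)
        x = y + rng.normal(0.0, 0.3, 1000)      # biased, noisy model forecast

        eta, xi = normal_scores(y), normal_scores(x)
        rho = np.corrcoef(eta, xi)[0, 1]

        def exceedance_probability(x_new, threshold):
            # In normal space, eta | xi is N(rho * xi, 1 - rho^2).
            xi_new = np.interp(x_new, np.sort(x), np.sort(xi))
            eta_thr = np.interp(threshold, np.sort(y), np.sort(eta))
            return 1.0 - norm.cdf((eta_thr - rho * xi_new)
                                  / np.sqrt(1.0 - rho ** 2))

        print(exceedance_probability(x_new=3.5, threshold=3.0))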

  6. Calculating Impacts of Energy Standards on Energy Demand in U.S. Buildings under Uncertainty with an Integrated Assessment Model: Technical Background Data

    Energy Technology Data Exchange (ETDEWEB)

    Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2014-12-06

    This report presents data and assumptions employed in an application of PNNL’s Global Change Assessment Model with a newly-developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state-level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.

  7. Downscaled climate change projections with uncertainty assessment over India using a high resolution multi-model approach.

    Science.gov (United States)

    Kumar, Pankaj; Wiltshire, Andrew; Mathison, Camilla; Asharaf, Shakeel; Ahrens, Bodo; Lucas-Picher, Philippe; Christensen, Jens H; Gobiet, Andreas; Saeed, Fahad; Hagemann, Stefan; Jacob, Daniela

    2013-12-01

    This study presents the possible regional climate change over South Asia, with a focus over India, as simulated by three very high resolution regional climate models (RCMs). One of the most striking results is a robust increase in monsoon precipitation by the end of the 21st century, but with regional differences in strength. First the ability of the RCMs to simulate the monsoon climate is analyzed. For this purpose all three RCMs are forced with ECMWF reanalysis data for the period 1989-2008 at a horizontal resolution of ~25 km. The results are compared against independent observations. In order to simulate future climate the models are driven by lateral boundary conditions from two global climate models (GCMs: ECHAM5-MPIOM and HadCM3) using the SRES A1B scenario, except for one RCM, which only used data from one GCM. The results are presented for the full transient simulation period 1970-2099 and also for several time slices. The analysis concentrates on precipitation and temperature over land. All models show a clear signal of gradual, widespread warming throughout the 21st century. The ensemble-mean warming over India is 1.5°C at the end of 2050, whereas it is 3.9°C at the end of the century with respect to 1970-1999. The pattern of projected precipitation changes shows considerable spatial variability, with an increase in precipitation over the peninsula of India and coastal areas and either no change or a decrease further inland. From the analysis of a larger ensemble of global climate models using the A1B scenario, widespread warming (~3.2°C) and an overall increase (~8.5%) in mean monsoon precipitation by the end of the 21st century are very likely. The influence of the driving GCM on the projected precipitation change simulated with each RCM is as strong as the variability among the RCMs driven with a single GCM.

  8. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However the uncertainty of the param... by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...
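
    A hedged sketch of the identification route the paper studies: an ARMA model is fitted to a simulated, lightly damped single-degree-of-freedom response, and the AR poles are mapped back to an eigenfrequency and damping ratio. The statsmodels package is assumed available; the AR(2) recursion is a convenience stand-in for a measured response.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        dt = 0.01                                  # sampling interval [s]
        fn_true, zeta_true = 5.0, 0.01             # 5 Hz, 1% damping
        wn = 2.0 * np.pi * fn_true

        # Discrete pole of the damped oscillator; the AR(2) recursion below
        # shares these poles, mimicking a white-noise-excited response.
        pole = np.exp((-zeta_true * wn + 1j * wn * np.sqrt(1 - zeta_true ** 2)) * dt)
        a1, a2 = 2.0 * pole.real, -abs(pole) ** 2

        rng = np.random.default_rng(0)
        e = rng.normal(0.0, 1.0, 6000)
        y = np.zeros(e.size)
        for i in range(2, e.size):
            y[i] = a1 * y[i - 1] + a2 * y[i - 2] + e[i]

        # Identify ARMA(2,1) and recover modal parameters from the AR poles.
        res = ARIMA(y, order=(2, 0, 1)).fit()
        p = np.roots([1.0, -res.arparams[0], -res.arparams[1]])[0]
        s = np.log(p) / dt                         # continuous-time pole
        print(abs(s) / (2.0 * np.pi))              # eigenfrequency [Hz]
        print(-s.real / abs(s))                    # damping ratio

    Re-running this with different record lengths and sampling intervals reproduces the kind of statistical-error study the abstract describes.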

  9. Uncertainty in surface water flood risk modelling

    Science.gov (United States)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    Two thirds of the flooding that occurred in the UK during summer 2007 was as a result of surface water (otherwise known as ‘pluvial') rather than river or coastal flooding. In response, the Environment Agency and Interim Pitt Reviews have highlighted the need for surface water risk mapping and warning tools to identify, and prepare for, flooding induced by heavy rainfall events. This need is compounded by the likely increase in rainfall intensities due to climate change. The Association of British Insurers has called for the Environment Agency to commission nationwide flood risk maps showing the relative risk of flooding from all sources. At the wider European scale, the recently-published EC Directive on the assessment and management of flood risks will require Member States to evaluate, map and model flood risk from a variety of sources. As such, there is now a clear and immediate requirement for the development of techniques for assessing and managing surface water flood risk across large areas. This paper describes an approach for integrating rainfall, drainage network and high-resolution topographic data using Flowroute™, a high-resolution flood mapping and modelling platform, to produce deterministic surface water flood risk maps. Information is provided from UK case studies to enable assessment and validation of modelled results using historical flood information and insurance claims data. Flowroute was co-developed with flood scientists at Cambridge University specifically to simulate river dynamics and floodplain inundation in complex, congested urban areas in a highly computationally efficient manner. It utilises high-resolution topographic information to route flows around individual buildings so as to enable the prediction of flood depths, extents, durations and velocities. As such, the model forms an ideal platform for the development of surface water flood risk modelling and mapping capabilities. The 2-dimensional component of Flowroute employs

  10. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    1984-01-01

    is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  11. Coping with Uncertainty Modeling and Policy Issues

    CERN Document Server

    Marti, Kurt; Makowski, Marek

    2006-01-01

    Ongoing global changes bring fundamentally new scientific problems requiring new concepts and tools. The complexity of these new problems does not allow enough certainty to be achieved by increasing the resolution of models or by bringing in more links. This book discusses new tools for modeling and management of uncertainty.

  12. Uncertainties of Assessing Projected Changes in Precipitation Extremes

    Science.gov (United States)

    Brekke, L. D.; Barsugli, J. J.

    2011-12-01

    Water resource managers share a common challenge in understanding what climate change could mean for future hydroclimate extremes. Understanding the uncertainty of projected changes in extremes is critical to making decisions about whether to invest in adaptation measures today or delay until more credible information becomes available. Uncertainties arise from several methodological choices, including criteria that drive selection of global climate projection information to frame the assessment, whether and how to bias-correct global projection information, and how to represent local controls when spatially downscaling translations of these projections. This presentation highlights such uncertainties, focusing on projected changes in precipitation indicated by two metrics: annual total and annual maximum daily amount. Attention is given to metric conditions varying from typical (i.e. metrics having 0.50 cumulative probability) to extreme (i.e. annual totals having 0.01 and 0.05 cumulative probabilities, which are relevant to drought, and annual maximum daily amounts having 0.95 and 0.99 cumulative probabilities, which are relevant to floods). The assessment is informed by an ensemble of 53 daily CMIP3 precipitation projections from the "Bias Corrected and Downscaled WCRP CMIP3 Climate Projections" web-archive (see URL), regridded over the contiguous United States from native climate model resolution to a common 2° grid and reported during 1961-2000, 2046-2065 and 2081-2100. Focusing on changes between 20-year periods, evaluations include (a) assessing changes in typical metric conditions and determining whether changes in metric distributions are statistically significant, (b) characterizing metric extremes using parametric techniques and assessing for changes in metric extremes, (c) assessing how uncertainties in projected typical and extreme metrics associate with three sources of global climate projection uncertainty (emissions scenario, global

  13. Uncertainty in spatially explicit animal dispersal models

    Science.gov (United States)

    Mooij, Wolf M.; DeAngelis, Donald L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
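
    A hedged sketch of the simplest of the three levels, the event-based binomial model: the survival probability is estimated by maximum likelihood, and a profile-likelihood interval makes the uncertainty explicit. The counts are invented.

        import numpy as np
        from scipy.stats import chi2

        n, k = 120, 87                    # dispersers released / arrived

        def loglik(p):
            # Binomial log-likelihood (constant term omitted).
            return k * np.log(p) + (n - k) * np.log(1.0 - p)

        p_hat = k / n                     # maximum-likelihood survival estimate
        # Profile-likelihood 95% CI: p whose log-likelihood lies within
        # chi2(1)/2 of the maximum.
        grid = np.linspace(1e-4, 1.0 - 1e-4, 10000)
        inside = loglik(grid) >= loglik(p_hat) - chi2.ppf(0.95, df=1) / 2.0
        ci = grid[inside][[0, -1]]
        print(p_hat, ci)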

  14. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: modeling establishment under uncertainty; model selection and parameter fitting; sensitivity analysis and model adaptation; and model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  15. Model and parameter uncertainty in IDF relationships under climate change

    Science.gov (United States)

    Chandra, Rupa; Saha, Ujjwal; Mujumdar, P. P.

    2015-05-01

    Quantifying distributional behavior of extreme events is crucial in hydrologic designs. Intensity Duration Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. Major sources of uncertainty in IDF relationships are insufficient quantity and quality of data, leading to parameter uncertainty in the distribution fitted to the data, and uncertainty as a result of using multiple GCMs. It is important to study these uncertainties and propagate them to the future for accurate assessment of future return levels. The objective of this study is to quantify the uncertainties arising from the parameters of the distribution fitted to data and from the multiple GCMs using a Bayesian approach. The posterior distribution of parameters is obtained from Bayes' rule and the parameters are transformed to obtain return levels for a specified return period. The Markov Chain Monte Carlo (MCMC) method using the Metropolis-Hastings algorithm is used to obtain the posterior distribution of parameters. Twenty-six CMIP5 GCMs along with four RCP scenarios are considered for studying the effects of climate change and to obtain projected IDF relationships for the case study of Bangalore city in India. GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique along with the parameter uncertainty. Scale invariance theory is employed for obtaining short duration return levels from daily data. It is observed that the uncertainty in short duration rainfall return levels is high when compared to longer durations. Further, it is observed that parameter uncertainty is large compared to the model uncertainty.
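
    A hedged sketch of the parameter-uncertainty half of the approach: random-walk Metropolis-Hastings sampling of extreme-value parameters for a synthetic annual-maximum series, with the posterior of the 100-year return level read off the chain. The study also treats GCM uncertainty with REA and uses scale invariance for sub-daily durations, neither of which is shown; the Gumbel here stands in for whatever distribution is fitted.

        import numpy as np

        rng = np.random.default_rng(5)
        annual_max = rng.gumbel(60.0, 15.0, size=40)   # stand-in record [mm]

        def log_post(mu, beta):
            # Gumbel log-likelihood with an (improper) flat prior.
            if beta <= 0:
                return -np.inf
            z = (annual_max - mu) / beta
            return np.sum(-np.log(beta) - z - np.exp(-z))

        theta = np.array([50.0, 10.0])
        lp = log_post(*theta)
        samples = []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, [1.0, 0.5])  # random-walk proposal
            lp_prop = log_post(*prop)
            if np.log(rng.uniform()) < lp_prop - lp:    # accept/reject
                theta, lp = prop, lp_prop
            samples.append(theta)
        samples = np.array(samples[5000:])              # discard burn-in

        T = 100
        z_T = samples[:, 0] - samples[:, 1] * np.log(-np.log(1.0 - 1.0 / T))
        print(np.percentile(z_T, [5, 50, 95]))          # return-level spread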

  16. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    Science.gov (United States)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithms for fertilization and P leaching, contributed the largest output uncertainties. In comparison, the initialization of inorganic P in the soil layer and the transformation algorithms between P pools were less influential on the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.

  17. Uncertainties in risk assessment of CO2 pipelines

    NARCIS (Netherlands)

    Koornneef, J.M.; Spruijt, M.; Molag, M.; Ramirez, C.A.; Faaij, A.P.C.; Turkenburg, W.C.

    2009-01-01

    The main goal of this study is to identify knowledge gaps and uncertainties in Quantitative Risk Assessments (QRA) for CO2 pipelines and to assess to what extent those gaps and uncertainties affect the final outcome of the QRA. The impact of methodological choices and uncertain values for input para

  18. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    methodology for basin discharge and groundwater heads. The ensemble of 11 climate models varied in strength, significance, and sometimes in direction of the climate change signal. The more complex daily DBS correction methods were more accurate at transferring precipitation changes in mean as well...... as the variance, and improving the characterisation of day to day variation as well as heavy events. However, the most highly parameterised of the DBS methods were less robust under climate change conditions. The spatial characteristics of groundwater head and stream discharge were best represented by DBS methods...... applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so on extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  19. Importance of hydrological uncertainty assessment methods in climate change impact studies

    Science.gov (United States)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-01-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology has emerged, mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the following decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty, but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time-series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The source, structure and composition of uncertainty depended strongly on the uncertainty assessment method. This demonstrated that one could arrive at rather different conclusions about predictive uncertainty for the same

  20. The importance of hydrological uncertainty assessment methods in climate change impact studies

    Science.gov (United States)

    Honti, M.; Scheidegger, A.; Stamm, C.

    2014-08-01

    Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with a recent boost after the publication of the IPCC AR4 report. From hundreds of impact studies a quasi-standard methodology has emerged, to a large extent shaped by the growing public demand for predicting how water resources management or flood protection should change in the coming decades. The "standard" workflow relies on a model cascade from global circulation model (GCM) predictions for selected IPCC scenarios to future catchment hydrology. Uncertainty is present at each level and propagates through the model cascade. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Our hypothesis was that the relative importance of climatic and hydrologic uncertainty is (among other factors) heavily influenced by the uncertainty assessment method. To test this we carried out a climate change impact assessment and estimated the relative importance of the uncertainty sources. The study was performed on two small catchments in the Swiss Plateau with a lumped conceptual rainfall-runoff model. In the climatic part we applied the standard ensemble approach to quantify uncertainty but in hydrology we used formal Bayesian uncertainty assessment with two different likelihood functions. One was a time series error model that was able to deal with the complicated statistical properties of hydrological model residuals. The second was an approximate likelihood function for the flow quantiles. The results showed that the expected climatic impact on flow quantiles was small compared to prediction uncertainty. The choice of uncertainty assessment method actually determined what sources of uncertainty could be identified at all. This demonstrated that one could arrive at rather different conclusions about the causes behind

  1. County-Level Climate Uncertainty for Risk Assessments: Volume 1.

    Energy Technology Data Exchange (ETDEWEB)

    Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.

    2017-06-01

    This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas for temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.

  2. An assessment of uncertainties in using volume-area modelling for computing the twenty-first century glacier contribution to sea-level change

    NARCIS (Netherlands)

    Slangen, A.B.A.; van de Wal, R.S.W.

    2011-01-01

    A large part of present-day sea-level change is formed by the melt of glaciers and ice caps (GIC). This study focuses on the uncertainties in the calculation of the GIC contribution on a century timescale. The model used is based on volume-area scaling, combined with the mass balance sensitivity o

  3. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but the effect that these new components have over existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
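
    A minimal illustration of the GSA machinery the abstract advocates: the Saltelli estimator of first-order Sobol indices applied to a toy two-parameter model. The model, bounds and sample sizes below are placeholders, not the authors' integrated system.

```python
# Sketch: first-order Sobol indices via the Saltelli (2010) estimator
# for a toy model y = x1 + 5*x2**2 with x1, x2 ~ U(-1, 1).
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

def model(x):
    # Toy "integrated" model: linear in x1, strongly nonlinear in x2
    return x[:, 0] + 5.0 * x[:, 1] ** 2

A = rng.uniform(-1.0, 1.0, (n, 2))
B = rng.uniform(-1.0, 1.0, (n, 2))
yA, yB = model(A), model(B)
var_y = yA.var()
for i in range(2):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                               # swap in column i from B
    Si = np.mean(yB * (model(ABi) - yA)) / var_y      # first-order index
    print(f"first-order index S{i + 1} ~ {Si:.2f}")
```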

  4. Uncertainties in Surface Layer Modeling

    Science.gov (United States)

    Pendergrass, W.

    2015-12-01

    A central problem for micrometeorologists has been the relationship of air-surface exchange rates of momentum and heat to quantities that can be predicted with confidence. The flux-gradient profile developed through Monin-Obukhov Similarity Theory (MOST) provides an integration of the dimensionless wind shear expression ϕ(z/L), where ϕ is an empirically derived function for stable and unstable atmospheric conditions. Empirically derived expressions are far from universally accepted (Garratt, 1992, Table A5). Regardless of what form of these relationships might be used, their significance over any short period of time is questionable, since all of these relationships between fluxes and gradients apply to averages that might rarely occur. It is well accepted that the assumptions of stationarity and homogeneity do not reflect the true chaotic nature of the processes that control the variables considered in these relationships, with the net consequence that the levels of predictability theoretically attainable might never be realized in practice. This matter is of direct relevance to modern prognostic models, which construct forecasts by assuming the universal applicability of relationships among averages for the lower atmosphere, which rarely maintains an average state. Under a Cooperative Research and Development Agreement between NOAA and Duke Energy Generation, NOAA/ATDD conducted atmospheric boundary layer (ABL) research using Duke renewable energy sites as research testbeds. One aspect of this research has been the evaluation of legacy flux-gradient formulations (the ϕ functions, see Monin and Obukhov, 1954) for the exchange of heat and momentum. At the Duke Energy Ocotillo site, NOAA/ATDD installed sonic anemometers reporting wind and temperature fluctuations at 10 Hz at eight elevations. From these observations, ϕM and ϕH were derived from a two-year database of mean and turbulent wind and temperature observations. From this extensive measurement database, using a
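
    For concreteness, here is a sketch of one common (Businger-Dyer) choice for the dimensionless shear function ϕM discussed above; the abstract does not specify which empirical forms were evaluated, so these coefficients are illustrative rather than the study's.

```python
# Sketch: Businger-Dyer forms of the dimensionless wind shear phi_M
# as a function of zeta = z/L (coefficients are one common choice).
import numpy as np

def phi_m(zeta):
    """Dimensionless wind shear phi_M(zeta), zeta = z / L."""
    zeta = np.asarray(zeta, dtype=float)
    # Evaluate each branch on its own valid range to avoid NaNs.
    unstable = (1.0 - 16.0 * np.minimum(zeta, 0.0)) ** -0.25  # zeta < 0
    stable = 1.0 + 5.0 * np.maximum(zeta, 0.0)                # zeta >= 0
    return np.where(zeta < 0, unstable, stable)

print(phi_m([-1.0, -0.1, 0.0, 0.1, 1.0]))
```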

  5. Crop model improvement reduces the uncertainty of the response to temperature of multi-model ensembles

    DEFF Research Database (Denmark)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold

    2017-01-01

    To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of mo...

  6. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  7. Optical Model and Cross Section Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Herman, M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.

    2009-10-05

    Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV, and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.

  8. Uncertainty quantification for Markov chain models.

    Science.gov (United States)

    Meidani, Hadi; Ghanem, Roger

    2012-12-01

    Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
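
    The snippet below sketches the propagation step for a toy three-state disease chain; it draws transition-matrix rows from Dirichlet distributions centred on nominal values as a simple stand-in for the paper's maximum-entropy probability measure, and all rates are hypothetical.

```python
# Sketch: propagating transition-matrix uncertainty through a toy
# 3-state (S, I, R) disease chain. Dirichlet rows are a stand-in for
# the paper's maximum-entropy measure; all numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
nominal = np.array([[0.90, 0.10, 0.00],    # S -> S, I, R
                    [0.00, 0.70, 0.30],    # I -> S, I, R
                    [0.00, 0.00, 1.00]])   # R is absorbing
conc = 50.0                                # higher concentration = less spread

x0 = np.array([0.99, 0.01, 0.0])           # initial state distribution
infected_at_30 = []
for _ in range(2000):
    P = np.array([rng.dirichlet(conc * row + 1e-9) for row in nominal])
    x = x0.copy()
    for _ in range(30):                    # evolve the chain 30 steps
        x = x @ P
    infected_at_30.append(x[1])
print(f"P(infected) at t=30: mean {np.mean(infected_at_30):.4f}, "
      f"sd {np.std(infected_at_30):.4f}")
```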

  9. Controls on inorganic nitrogen leaching from Finnish catchments assessed using a sensitivity and uncertainty analysis of the INCA-N model

    Energy Technology Data Exchange (ETDEWEB)

    Rankinen, K.; Granlund, K. [Finnish Environmental Inst., Helsinki (Finland); Futter, M. N. [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden)

    2013-11-01

    The semi-distributed, dynamic INCA-N model was used to simulate the behaviour of dissolved inorganic nitrogen (DIN) in two Finnish research catchments. Parameter sensitivity and model structural uncertainty were analysed using generalized sensitivity analysis. The Mustajoki catchment is a forested upstream catchment, while the Savijoki catchment represents intensively cultivated lowlands. In general, there were more influential parameters in Savijoki than in Mustajoki. Model results were sensitive to N-transformation rates, vegetation dynamics, and soil and river hydrology. Values of the sensitive parameters were based on long-term measurements covering both warm and cold years. The highest measured DIN concentrations fell between minimum and maximum values estimated during the uncertainty analysis. The lowest measured concentrations fell outside these bounds, suggesting that some retention processes may be missing from the current model structure. The lowest concentrations occurred mainly during low-flow periods, so the effects on total loads were small. (orig.)

  10. Multiphysics modeling and uncertainty quantification for an active composite reflector

    Science.gov (United States)

    Peterson, Lee D.; Bradford, S. C.; Schiermeier, John E.; Agnes, Gregory S.; Basinger, Scott A.

    2013-09-01

    A multiphysics, high-resolution simulation of an actively controlled, composite reflector panel is developed to extrapolate from ground test results to flight performance. The subject test article has previously demonstrated sub-micron corrected shape under a controlled laboratory thermal load. This paper develops a model of the on-orbit performance of the panel under realistic thermal loads, with an active heater control system, and performs an uncertainty quantification of the predicted response. The primary contribution of this paper is the first reported application of the Sandia-developed Sierra mechanics simulation tools to a spacecraft multiphysics simulation of a closed-loop system, including uncertainty quantification. The simulation was developed so as to have sufficient resolution to capture the residual panel shape error that remains after the thermal and mechanical control loops are closed. An uncertainty quantification analysis was performed to assess the predicted tolerance in the closed-loop wavefront error. Key tools used for the uncertainty quantification are also described.

  11. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
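
    HBV-Ensemble itself is a MATLAB toolbox; the sketch below reproduces only the underlying idea, an ensemble of runs of a one-bucket conceptual model with perturbed parameters, using synthetic forcing and hypothetical parameter ranges.

```python
# Sketch: parameter-perturbation ensemble for a one-bucket toy model
# (synthetic forcing; fc and k ranges are hypothetical).
import numpy as np

rng = np.random.default_rng(42)
precip = rng.gamma(0.4, 8.0, size=365)        # synthetic daily rain (mm)
pet = np.full(365, 2.0)                       # constant daily PET (mm)

def simulate(fc, k):
    """fc: storage capacity (mm); k: recession coefficient (1/day)."""
    s, q = 50.0, np.empty(365)
    for t in range(365):
        s = min(s + precip[t], fc)            # infiltration up to capacity
        s = max(s - pet[t] * s / fc, 0.0)     # soil-moisture-limited ET
        q[t] = k * s                          # linear-reservoir outflow
        s -= q[t]
    return q

ens = np.array([simulate(rng.uniform(100, 300), rng.uniform(0.05, 0.3))
                for _ in range(100)])
lo, hi = np.percentile(ens, [5, 95], axis=0)  # 90% simulation band
print(f"day 200 runoff band: {lo[200]:.2f}-{hi[200]:.2f} mm/day")
```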

  12. Uncertainties in life cycle assessment of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Christensen, Thomas Højlund

    2011-01-01

    Life cycle assessment has been used to assess environmental performances of waste management systems in many studies. The uncertainties inherent to its results are often pointed out but not always quantified, which should be the case to ensure a good decision-making process. This paper proposes...... a method to assess all parameter uncertainties and quantify the overall uncertainty of the assessment. The method is exemplified in a case study, where the goal is to determine if anaerobic digestion of organic waste is more beneficial than incineration in Denmark, considering only the impact on global...

  13. Representing uncertainty on model analysis plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-12-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  14. Uncertainty Quantification for Optical Model Parameters

    CERN Document Server

    Lovell, A E; Sarich, J; Wild, S M

    2016-01-01

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of this work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. We study a number of reactions involving neutron and deuteron p...

  15. Assessing uncertainty in radar measurements on simplified meteorological scenarios

    Directory of Open Access Journals (Sweden)

    L. Molini

    2006-01-01

    A three-dimensional radar simulator model (RSM) developed by Haase (1998) is coupled with the nonhydrostatic mesoscale weather forecast model Lokal-Modell (LM). The radar simulator is able to model reflectivity measurements by using the following meteorological fields, generated by Lokal-Modell, as inputs: temperature, pressure, water vapour content, cloud water content, cloud ice content, rain sedimentation flux and snow sedimentation flux. This work focuses on the assessment of some uncertainty sources associated with radar measurements: absorption by the atmospheric gases, e.g., molecular oxygen, water vapour, and nitrogen; attenuation due to the presence of a highly reflecting structure between the radar and a "target structure". RSM results for a simplified meteorological scenario, consisting of a humid updraft on a flat surface and four cells placed around it, are presented.

  16. Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results

    Energy Technology Data Exchange (ETDEWEB)

    Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory

    2009-01-01

    The security risk associated with malevolent acts such as those of terrorism is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty, which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
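
    As a rough sketch of such information-uncertainty measures, the snippet below computes generalized Hartley non-specificity and a Shannon-entropy proxy for conflict from a hypothetical basic belief assignment over risk levels; the paper's exact measures may differ from these textbook forms.

```python
# Sketch: non-specificity and a conflict proxy for a hypothetical
# basic belief assignment (BBA) over risk levels {low, med, high}.
import numpy as np

m = {("low",): 0.2,
     ("med",): 0.3,
     ("med", "high"): 0.4,
     ("low", "med", "high"): 0.1}            # hypothetical BBA, sums to 1

# Generalized Hartley non-specificity: sum m(A) * log2|A|
nonspec = sum(v * np.log2(len(A)) for A, v in m.items())

# Conflict proxy: Shannon entropy of the pignistic probabilities
betp = {}
for A, v in m.items():
    for x in A:
        betp[x] = betp.get(x, 0.0) + v / len(A)
conflict = -sum(p * np.log2(p) for p in betp.values() if p > 0)

print(f"non-specificity = {nonspec:.3f} bits, conflict proxy = {conflict:.3f} bits")
```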

  17. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  18. A Simplified Model of Choice Behavior under Uncertainty

    OpenAIRE

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that m...

  19. A simplified model of choice behavior under uncertainty

    OpenAIRE

    Ching-Hung Lin; Yu-Kai Lin; Tzu-Jiun Song; Jong-Tsun Huang; Yao-Chu Chiu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the pr...

  20. Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)

    Science.gov (United States)

    Sparks, R. S.

    2009-12-01

    many sources of uncertainty in forecasting the areas that volcanic activity will affect and the severity of the effects. Uncertainties arise from: natural variability, inadequate data, biased data, incomplete data, lack of understanding of the processes, limitations of predictive models, ambiguity, and unknown unknowns. The description of volcanic hazards is thus necessarily probabilistic and requires assessment of the attendant uncertainties. Several issues arise from the probabilistic nature of volcanic hazards and the intrinsic uncertainties. Although zonation maps require well-defined boundaries for administrative pragmatism, such boundaries cannot divide areas that are completely safe from those that are unsafe. Levels of danger or safety need to be defined to decide on and justify boundaries through the concepts of vulnerability and risk. More data, better observations and improved models may reduce uncertainties, but they can also increase uncertainties and may lead to re-appraisal of zone boundaries. Probabilities inferred by statistical techniques are hard to communicate. Expert elicitation is an emerging methodology for risk assessment and uncertainty evaluation. The method has been applied at one major volcanic crisis (Soufrière Hills Volcano, Montserrat), and is being applied in planning for volcanic crises at Vesuvius.

  1. Uncertainty and Sensitivity in Surface Dynamics Modeling

    Science.gov (United States)

    Kettner, Albert J.; Syvitski, James P. M.

    2016-05-01

    This special issue on 'Uncertainty and Sensitivity in Surface Dynamics Modeling' grew out of papers submitted after the 2014 annual meeting of the Community Surface Dynamics Modeling System (CSDMS). CSDMS facilitates a diverse community of experts (now in 68 countries) that collectively investigate the Earth's surface, the dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere, by promoting, developing, supporting and disseminating integrated open-source software modules. By organizing more than 1500 researchers, CSDMS has the privilege of identifying community strengths and weaknesses in the practice of software development. We recognize, for example, that progress has been slow on identifying and quantifying uncertainty and sensitivity in numerical modeling of the Earth's surface dynamics. This special issue is meant to raise awareness of these important subjects and highlight state-of-the-art progress.

  2. Dealing with uncertainties in environmental burden of disease assessment

    Directory of Open Access Journals (Sweden)

    van der Sluijs Jeroen P

    2009-04-01

    Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often involve various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters – which is commonly addressed – a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making.

  3. Assessment of dose measurement uncertainty using RisøScan

    DEFF Research Database (Denmark)

    Helt-Hansen, J.; Miller, A.

    2006-01-01

    The dose measurement uncertainty of the dosimeter system RisøScan, office scanner and Risø B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectiv...

  4. New techniques for landslide hazard assessments: opportunities, methodology, and uncertainty

    Science.gov (United States)

    Kirschbaum, D. B.; Peters-Lidard, C. D.; Adler, R. F.; Hong, Y.

    2009-12-01

    An emerging global rainfall-triggered landslide hazard algorithm employs an empirical framework to identify potentially susceptible areas to rainfall-triggered landslides in near real-time. This methodology couples a satellite-derived estimate of cumulative rainfall with a static surface susceptibility map to highlight regions of anticipated landslide activity. While this algorithm represents an important first step in developing a larger-scale landslide prediction framework, there are many uncertainties and assumptions surrounding such a methodology that decreases the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of upscaling a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and more effectively communicate the model skill.

  5. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008. In this period the influence on the location of sugar cane expansion of the driver sugar cane in
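
    A compact implementation of the Wald-Wolfowitz runs test (normal approximation) applied to a synthetic, drifting parameter series, in the spirit of the stationarity check described above:

```python
# Sketch: Wald-Wolfowitz runs test about the median, normal
# approximation; the drifting series is synthetic, for illustration.
import numpy as np
from scipy.stats import norm

def runs_test(x):
    signs = x > np.median(x)                     # dichotomize about the median
    n1, n2 = signs.sum(), (~signs).sum()
    runs = 1 + np.count_nonzero(signs[1:] != signs[:-1])
    mu = 2.0 * n1 * n2 / (n1 + n2) + 1.0
    var = (2.0 * n1 * n2 * (2.0 * n1 * n2 - n1 - n2)
           / ((n1 + n2) ** 2 * (n1 + n2 - 1)))
    z = (runs - mu) / np.sqrt(var)
    return z, 2.0 * norm.sf(abs(z))              # two-sided p-value

rng = np.random.default_rng(3)
series = np.linspace(0, 1, 40) + rng.normal(0, 0.2, 40)  # drifting parameter
z, p = runs_test(series)
print(f"z = {z:.2f}, p = {p:.4f}")  # small p: variation not random -> systemic change
```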

  6. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    DEFF Research Database (Denmark)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.

    2015-01-01

    uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties....... This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions...
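
    A minimal sketch of the EnKF analysis step underlying such head assimilation, using the perturbed-observation formulation; the dimensions, noise levels and observation operator below are illustrative, not the MIKE SHE configuration.

```python
# Sketch: one perturbed-observation EnKF analysis step for a
# hydraulic-head state ensemble (all numbers illustrative).
import numpy as np

rng = np.random.default_rng(7)
n_state, n_obs, n_ens = 50, 5, 40
X = rng.normal(10.0, 1.0, (n_state, n_ens))      # forecast head ensemble (m)
H = np.zeros((n_obs, n_state))
H[np.arange(n_obs), np.arange(0, n_state, n_state // n_obs)] = 1.0  # observe every 10th cell
R = 0.05 ** 2 * np.eye(n_obs)                    # observation error covariance
y = rng.normal(10.5, 0.05, n_obs)                # observed heads (m)

A = X - X.mean(axis=1, keepdims=True)            # ensemble anomalies
Pf = A @ A.T / (n_ens - 1)                       # sample forecast covariance
K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + R)   # Kalman gain
Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T  # perturbed obs
Xa = X + K @ (Y - H @ X)                         # analysis ensemble
print(f"mean head shift: {float((Xa - X).mean()):+.3f} m")
```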

  7. Physical and Model Uncertainty for Fatigue Design of Composite Material

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...

  8. Radioecological assessment of marine environment: complexity, sensitivity and uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Iosjpe, Mikhail [Norwegian Radiation Protection Authority, P.O. Box 55, N-1332 Oesteraas (Norway)

    2014-07-01

    A compartment modelling approach is widely used to evaluate the consequences after the release of radionuclides into the marine environment, by taking into account: (i) dispersion of radionuclides in water and sediment phases, (ii) bioaccumulation of radionuclides in biota and (iii) dose assessments for marine organisms and human populations. The NRPA box model includes site-specific information for the compartments, advection of radioactivity between compartments, sedimentation, diffusion of radioactivity through pore water in sediment, resuspension, mixing due to bioturbation, particle mixing, a burial process for radionuclides in deep sediment layers and radioactive decay. The contamination of biota is calculated from the known radionuclide concentrations in filtered seawater in the different water regions. Doses to man are calculated on the basis of seafood consumption, in accordance with available data for seafood catches and assumptions about human diet in the respective areas. Dose to biota is calculated on the basis of radionuclide concentrations in marine organisms, water and sediment, using dose conversion factors. This modelling approach requires the use of a large set of parameters (up to several thousand), some of which have high uncertainties linked to them. This work consists of two parts: a radioecological assessment as described above, and a sensitivity and uncertainty analysis applied to two release scenarios: (i) a potential accident with a nuclear submarine and (ii) unit uniform atmospheric deposition over selected marine areas. The sensitivity and uncertainty analysis is based on the calculation of local and global sensitivity indices, and this approach is then compared with Monte Carlo methods. The simulations clearly demonstrate the complexities encountered when using the compartment modelling approach. It is shown that the results can strongly depend on the time being analyzed. For example, the change of a given parameter may either

  9. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  10. System Level Uncertainty Assessment for Collaborative RLV Design

    Science.gov (United States)

    Charania, A. C.; Bradford, John E.; Olds, John R.; Graham, Matthew

    2002-01-01

    A collaborative design process utilizing Probabilistic Data Assessment (PDA) is showcased. Given the limitation of financial resources in both government and industry, strategic decision makers need more than just traditional point designs; they need to be aware of the likelihood that these future designs will meet their objectives. This uncertainty, an ever-present character in the design process, can be embraced through a probabilistic design environment. A conceptual design process is presented that encapsulates the major engineering disciplines for a Third Generation Reusable Launch Vehicle (RLV). Toolsets consist of aerospace industry standard tools in disciplines such as trajectory, propulsion, mass properties, cost, operations, safety, and economics. Variations of the design process are presented that use different fidelities of tools. The disciplinary engineering models are used in a collaborative engineering framework utilizing Phoenix Integration's ModelCenter and AnalysisServer environment. These tools allow the designer to join disparate models and simulations together in a unified environment wherein each discipline can interact with any other discipline. The design process also uses probabilistic methods to generate the system-level output metrics of interest for an RLV conceptual design. The specific system being examined is the Advanced Concept Rocket Engine 92 (ACRE-92) RLV. Previous experience and knowledge (in terms of input uncertainty distributions from experts and modeling and simulation codes) can be coupled with Monte Carlo processes to best predict the chances of program success.

  11. Quantifying uncertainty in stable isotope mixing models

    Science.gov (United States)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O), but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, eliminating some sources based on assumed site knowledge and assumed nitrate concentrations substantially reduced mixing-fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated
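
    The snippet below sketches the pure Monte Carlo (PMC) idea for a three-source nitrate mixing problem: draw Dirichlet mixing fractions and perturbed source signatures, and keep draws that reproduce the sample within a tolerance. All isotope values and tolerances are hypothetical.

```python
# Sketch: pure Monte Carlo mixing for a 3-source nitrate problem
# (hypothetical d15N / d18O values, per mil).
import numpy as np

rng = np.random.default_rng(5)
src_mu = np.array([[0.0, 2.0], [10.0, 5.0], [20.0, 15.0]])  # source means
src_sd = np.array([[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])     # source spreads
sample = np.array([9.0, 6.0])                               # measured sample
tol = 0.5                                                   # acceptance tolerance

kept = []
for _ in range(100_000):
    f = rng.dirichlet(np.ones(3))                 # uninformative mixing prior
    src = rng.normal(src_mu, src_sd)              # perturbed source signatures
    if np.all(np.abs(f @ src - sample) < tol):    # mixture matches sample?
        kept.append(f)
kept = np.array(kept)
print("accepted draws:", len(kept))
print("median mixing fractions:", np.round(np.median(kept, axis=0), 2))
```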

  12. Uncertainty analysis in dissolved oxygen modeling in streams.

    Science.gov (United States)

    Hamed, Maged M; El-Beshry, Manar Z

    2004-08-01

    Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the exceedance probability of a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors, and shows how they change with changing simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selection of different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
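
    As a companion to the FORM analysis described above, the sketch below estimates an exceedance probability by plain Monte Carlo (the comparison method used in the paper) with the Streeter-Phelps deficit equation; the parameter distributions and the 5 mg/L target are illustrative assumptions.

```python
# Sketch: Monte Carlo exceedance probability of a dissolved-oxygen
# target using the Streeter-Phelps deficit equation (illustrative
# parameter distributions, not the paper's case study).
import numpy as np

rng = np.random.default_rng(11)
n = 100_000
kd = rng.lognormal(np.log(0.35), 0.2, n)     # deoxygenation rate (1/day)
ka = rng.lognormal(np.log(0.70), 0.2, n)     # reaeration rate (1/day)
ka = np.maximum(ka, kd + 1e-3)               # guard against ka ~ kd
L0 = rng.normal(20.0, 3.0, n)                # initial BOD (mg/L)
D0, DOsat, t = 1.0, 9.0, 2.0                 # initial deficit, saturation, days

D = (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) \
    + D0 * np.exp(-ka * t)                   # Streeter-Phelps deficit
do = DOsat - D
p_exceed = np.mean(do < 5.0)                 # P(DO below 5 mg/L target)
print(f"P(DO < 5 mg/L at t = {t} d) ~ {p_exceed:.3f}")
```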

  13. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    Directory of Open Access Journals (Sweden)

    Finlay Scott

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model

  14. An Applied Framework for Incorporating Multiple Sources of Uncertainty in Fisheries Stock Assessments.

    Science.gov (United States)

    Scott, Finlay; Jardim, Ernesto; Millar, Colin P; Cerviño, Santiago

    2016-01-01

    Estimating fish stock status is very challenging given the many sources and high levels of uncertainty surrounding the biological processes (e.g. natural variability in the demographic rates), model selection (e.g. choosing growth or stock assessment models) and parameter estimation. Incorporating multiple sources of uncertainty in a stock assessment allows advice to better account for the risks associated with proposed management options, promoting decisions that are more robust to such uncertainty. However, a typical assessment only reports the model fit and variance of estimated parameters, thereby underreporting the overall uncertainty. Additionally, although multiple candidate models may be considered, only one is selected as the 'best' result, effectively rejecting the plausible assumptions behind the other models. We present an applied framework to integrate multiple sources of uncertainty in the stock assessment process. The first step is the generation and conditioning of a suite of stock assessment models that contain different assumptions about the stock and the fishery. The second step is the estimation of parameters, including fitting of the stock assessment models. The final step integrates across all of the results to reconcile the multi-model outcome. The framework is flexible enough to be tailored to particular stocks and fisheries and can draw on information from multiple sources to implement a broad variety of assumptions, making it applicable to stocks with varying levels of data availability. The Iberian hake stock in International Council for the Exploration of the Sea (ICES) Divisions VIIIc and IXa is used to demonstrate the framework, starting from length-based stock and indices data. Process and model uncertainty are considered through the growth, natural mortality, fishing mortality, survey catchability and stock-recruitment relationship. Estimation uncertainty is included as part of the fitting process. Simple model averaging is used to

  15. Uncertainty Visualization in Forward and Inverse Cardiac Models.

    Science.gov (United States)

    Burton, Brett M; Erem, Burak; Potter, Kristin; Rosen, Paul; Johnson, Chris R; Brooks, Dana H; Macleod, Rob S

    2013-01-01

    Quantification and visualization of uncertainty in cardiac forward and inverse problems with complex geometries is subject to various challenges. Specific to visualization is the observation that occlusion and clutter obscure important regions of interest, making visual assessment difficult. In order to overcome these limitations in uncertainty visualization, we have developed and implemented a collection of novel approaches. To highlight the utility of these techniques, we evaluated the uncertainty associated with two examples of modeling myocardial activity. In one case we studied cardiac potentials during the repolarization phase as a function of variability in tissue conductivities of the ischemic heart (forward case). In a second case, we evaluated uncertainty in reconstructed activation times on the epicardium resulting from variation in the control parameter of Tikhonov regularization (inverse case). To overcome difficulties associated with uncertainty visualization, we implemented linked-view windows and interactive animation to the two respective cases. Through dimensionality reduction and superimposed mean and standard deviation measures over time, we were able to display key features in large ensembles of data and highlight regions of interest where larger uncertainties exist.

  16. Modeling and inverse problems in the presence of uncertainty

    CERN Document Server

    Banks, H T; Thompson, W Clayton

    2014-01-01

    Modeling and Inverse Problems in the Presence of Uncertainty collects recent research-including the authors' own substantial projects-on uncertainty propagation and quantification. It covers two sources of uncertainty: where uncertainty is present primarily due to measurement errors and where uncertainty is present due to the modeling formulation itself. After a useful review of relevant probability and statistical concepts, the book summarizes mathematical and statistical aspects of inverse problem methodology, including ordinary, weighted, and generalized least-squares formulations. It then

  17. Managing geological uncertainty in CO2-EOR reservoir assessments

    Science.gov (United States)

    Welkenhuysen, Kris; Piessens, Kris

    2014-05-01

    therefore not suited for cost-benefit analysis. They likely produce overly optimistic results because onshore configurations are cheaper and differ from offshore ones. We propose to translate the detailed US data to the North Sea, retaining their uncertainty ranges. In a first step, a general cost correction can be applied to account for costs specific to the EU and the offshore setting. In a second step, site-specific data, including laboratory tests and reservoir modelling, are used to further adapt the EOR ratio values, taking into account all available geological reservoir-specific knowledge. Lastly, an evaluation of the field configuration will influence both the cost and local-geology dimensions, because e.g. horizontal drilling is needed (cost) to improve injectivity (geology). In this way, a dataset of the EOR field is obtained that contains all aspects and their uncertainty ranges. With these, a geologically realistic basis is obtained for further cost-benefit analysis of a specific field, where the uncertainties are accounted for using a stochastic evaluation. Such ad-hoc evaluation of geological parameters will provide a better assessment of the CO2-EOR potential of the North Sea oil fields.

  18. Characterizing uncertainty when evaluating risk management metrics: risk assessment modeling of Listeria monocytogenes contamination in ready-to-eat deli meats.

    Science.gov (United States)

    Gallagher, Daniel; Ebel, Eric D; Gallagher, Owen; Labarre, David; Williams, Michael S; Golden, Neal J; Pouillot, Régis; Dearfield, Kerry L; Kause, Janell

    2013-04-01

    This report illustrates how uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model concerning Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second-order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home, and consumption. The model accounted for growth inhibitor use and retail cross contamination, and applied an FAO/WHO dose-response model to evaluate the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all consumed servings per annum, and the model was used to solve for the corresponding performance objective (PO) risk metric as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that - if the industry complies with a particular PO - the resulting risk-per-serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution, and (3) no dose-response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence that the target ALOP is met, respectively. The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criterion of absence
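
    The following toy sketch conveys how uncertainty turns the PO into a distribution whose percentiles encode confidence that the ALOP is met. It assumes a linear low-dose model with a lognormally uncertain slope, which is far simpler than the second-order Monte Carlo model of the paper; all numbers except the ALOP are invented:

        import numpy as np

        rng = np.random.default_rng(2)
        target_alop = 10 ** -6.41   # risk of illness per serving

        # Uncertainty about a dose-response slope r (risk per cfu/g at the
        # establishment, with growth and attenuation folded in) is represented
        # by a lognormal; the parameters are purely illustrative.
        r = rng.lognormal(mean=np.log(1e-2), sigma=1.0, size=50_000)

        # Under a linear low-dose model, risk = r * 10**PO, so the PO that
        # just meets the ALOP for each uncertainty draw is:
        po = np.log10(target_alop / r)   # log10 cfu/g

        # Percentiles of the PO distribution give the concentration needed
        # for a chosen confidence that the ALOP is met.
        print("PO for 75% confidence:", np.percentile(po, 25))
        print("PO for 90% confidence:", np.percentile(po, 10))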

  19. Incorporating Uncertainty into Backward Erosion Piping Risk Assessments

    Directory of Open Access Journals (Sweden)

    Robbins Bryant A.

    2016-01-01

    Full Text Available Backward erosion piping (BEP) is a type of internal erosion that typically involves the erosion of foundation materials beneath an embankment. BEP has been shown, historically, to be the cause of approximately one third of all internal erosion related failures. As such, the probability of BEP is commonly evaluated as part of routine risk assessments for dams and levees in the United States. Currently, average gradient methods are predominantly used to perform these assessments, supported by mean trends of critical gradient observed in laboratory flume tests. Significant uncertainty surrounds the mean trends of critical gradient used in practice. To quantify this uncertainty, over 100 laboratory piping tests were compiled and analysed to assess the variability of laboratory measurements of the horizontal critical gradient. Results of these analyses indicate a large amount of uncertainty surrounding critical gradient measurements for all soils, with increasing uncertainty as soils become less uniform.

  20. Fault Detection under Fuzzy Model Uncertainty

    Institute of Scientific and Technical Information of China (English)

    Marek Kowal; Józef Korbicz

    2007-01-01

    The paper tackles the problem of robust fault detection using Takagi-Sugeno fuzzy models. A model-based strategy is employed to generate residuals in order to make a decision about the state of the process. Unfortunately, such a method is affected by model uncertainty, since in real applications there is always a model-reality mismatch. To ensure reliable fault detection, an adaptive threshold technique is used to deal with this problem. The paper also addresses the fuzzy model design procedure. The bounded-error approach is applied to generate the rules for the model from available measurements. The proposed approach is applied to fault detection in a DC laboratory engine.
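
    A minimal sketch of residual-based fault detection with an adaptive threshold. A placeholder analytic signal stands in for the Takagi-Sugeno model output, and the threshold law and constants are assumptions, not the authors' bounded-error bounds:

        import numpy as np

        def detect_faults(y_meas, y_model, delta=0.05, beta=0.1):
            """Flag samples whose residual exceeds an adaptive threshold.

            The threshold widens with the model output magnitude as a crude
            stand-in for output-dependent model uncertainty (a bounded-error
            fuzzy model would supply tighter, rule-based bounds).
            """
            residual = np.abs(y_meas - y_model)
            threshold = delta + beta * np.abs(y_model)
            return residual > threshold

        # Toy data: the model tracks the process until a bias fault at t >= 150.
        t = np.arange(300)
        y_model = np.sin(0.05 * t)
        y_meas = y_model + np.random.default_rng(3).normal(0, 0.02, t.size)
        y_meas[150:] += 0.4
        print("first flagged sample:", np.argmax(detect_faults(y_meas, y_model)))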

  1. Facets of Uncertainty in Digital Elevation and Slope Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jingxiong; LI Deren

    2005-01-01

    This paper investigates the differences that result from applying different approaches to uncertainty modeling and reports an experiment examining error estimation and propagation in elevation and slope, with the latter derived from the former. It is confirmed that significant differences exist between uncertainty descriptors, and that the propagation of uncertainty to end products is strongly affected by the specification of source uncertainty.

  2. Quantification of uncertainty in flood risk assessment for flood protection planning: a Bayesian approach

    Science.gov (United States)

    Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel

    2017-04-01

    Flood risk estimates are subject to significant uncertainties, e.g., due to limited records of historic flood events, uncertainty in flood modeling, the uncertain impact of climate change, or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria that includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
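
    As a minimal illustration of component 1 above (inherent plus statistical uncertainty due to a limited flood record), the following Python sketch updates an annual exceedance probability with a conjugate Beta prior; the record length, prior, and design-life numbers are hypothetical, and this is not the AdaptRisk implementation:

        import numpy as np
        from scipy import stats

        # Limited record: k exceedances of a design discharge in n years.
        n_years, k_exceed = 40, 2

        # Beta(1, 1) prior on the annual exceedance probability p,
        # updated with the observed record (conjugate update).
        posterior = stats.beta(1 + k_exceed, 1 + n_years - k_exceed)

        # Predictive probability of at least one exceedance in the next
        # 50 years, integrating out the statistical uncertainty in p.
        p = posterior.rvs(100_000, random_state=4)
        print("P(>=1 exceedance in 50 yr):", (1 - (1 - p) ** 50).mean())
        print("90% credible interval for p:", posterior.ppf([0.05, 0.95]))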

  3. Assessment of uncertainties in radiation-induced cancer risk predictions at clinically relevant doses

    Energy Technology Data Exchange (ETDEWEB)

    Nguyen, J. [Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts 02114 and Department of Physics, Ruprecht-Karls-Universität Heidelberg, Heidelberg 69117 (Germany); Moteabbed, M.; Paganetti, H., E-mail: hpaganetti@mgh.harvard.edu [Department of Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts 02114 and Harvard Medical School, Boston, Massachusetts 02114 (United States)

    2015-01-15

    Purpose: Theoretical dose–response models offer the possibility to assess second cancer induction risks after external beam therapy. The parameters used in these models are determined with limited data from epidemiological studies. Risk estimations are thus associated with considerable uncertainties. This study aims at illustrating these uncertainties when predicting the risk for organ-specific second cancers in the primary radiation field, using selected treatment plans for brain cancer patients. Methods: A widely used risk model was considered in this study. The uncertainties of the model parameters were estimated from reported data on second cancer incidences for various organs. Standard error propagation was then applied to assess the uncertainty in the risk model. Next, second cancer risks of five pediatric patients treated for cancer in the head and neck regions were calculated. For each case, treatment plans for proton and photon therapy were designed to estimate the uncertainties (a) in the lifetime attributable risk (LAR) for a given treatment modality and (b) when comparing risks of two different treatment modalities. Results: Uncertainties in excess of 100% of the risk were found for almost all organs considered. When applied to treatment plans, the calculated LAR values have uncertainties of the same magnitude. A comparison between cancer risks of different treatment modalities, however, does allow statistically significant conclusions. In the studied cases, the patient-averaged LAR ratio of proton and photon treatments was 0.35, 0.56, and 0.59 for brain carcinoma, brain sarcoma, and bone sarcoma, respectively. Their corresponding uncertainties were estimated to be potentially below 5%, depending on uncertainties in dosimetry. Conclusions: The uncertainty in the dose–response curve in cancer risk models makes it currently impractical to predict the risk for an individual external beam treatment. On the other hand, the ratio

  4. Comparative evaluation of 1D and quasi-2D hydraulic models based on benchmark and real-world applications for uncertainty assessment in flood mapping

    Science.gov (United States)

    Dimitriadis, Panayiotis; Tegos, Aristoteles; Oikonomou, Athanasios; Pagana, Vassiliki; Koukouvinos, Antonios; Mamassis, Nikos; Koutsoyiannis, Demetris; Efstratiadis, Andreas

    2016-03-01

    One-dimensional and quasi-two-dimensional hydraulic freeware models (HEC-RAS, LISFLOOD-FP and FLO-2d) are widely used for flood inundation mapping. These models are tested on a benchmark test with a mixed rectangular-triangular channel cross section. Using a Monte Carlo approach, we employ extended sensitivity analysis by simultaneously varying the input discharge, longitudinal and lateral gradients and roughness coefficients, as well as the grid cell size. Based on statistical analysis of three output variables of interest, i.e. water depths at the inflow and outflow locations and total flood volume, we investigate the uncertainty contained in different model configurations and flow conditions, without the influence of errors and other assumptions on topography, channel geometry and boundary conditions. Moreover, we estimate the uncertainty associated with each input variable and compare it to the overall one. The outcomes of the benchmark analysis are further highlighted by applying the three models to real-world flood propagation problems, in the context of two challenging case studies in Greece.
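
    A rough sketch of the Monte Carlo sensitivity idea, with a wide-rectangular Manning normal-depth formula standing in for the hydraulic models; the input ranges and the binning-based variance attribution are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 50_000

        # Simultaneously varied inputs (uniform ranges are illustrative):
        q = rng.uniform(5.0, 15.0, n)        # unit discharge, m^2/s
        slope = rng.uniform(1e-4, 1e-3, n)   # longitudinal gradient
        n_man = rng.uniform(0.02, 0.06, n)   # Manning roughness

        # Normal-depth stand-in for a 1D hydraulic solve (wide rectangle):
        depth = (n_man * q / np.sqrt(slope)) ** 0.6

        # Crude variance attribution: variance of the conditional mean,
        # estimated by binning each input (a first-order Sobol proxy).
        def first_order(x, y, bins=50):
            edges = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
            idx = np.digitize(x, edges)
            cond_means = np.array([y[idx == b].mean() for b in range(bins)])
            return cond_means.var() / y.var()

        for name, x in [("discharge", q), ("slope", slope), ("roughness", n_man)]:
            print(name, round(first_order(x, depth), 3))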

  5. A protocol for assessment of uncertainty and strength of emissions data

    NARCIS (Netherlands)

    Risbey, James S.; Sluijs, J.P. van der; Ravetz, Jerome R.

    2006-01-01

    This method is intended to assist in characterizing uncertainties in emissions data for the Milieubalans and to identify critical issues related to uncertainty. The method assesses both quantitative and qualitative dimensions of uncertainty. Quantitative uncertainties are expressed by assigning proba

  6. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    2007-01-01

    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regard to the choice of hydrological parameters when combined overflow volumes are compared - especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system analysis, further research in improved parameter assessment for surface runoff models is needed.

  7. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  8. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    Predicting the performance of large scale plants can be difficult due to model uncertainties etc., meaning that one can be almost certain that the prediction will diverge from the plant performance with time. In this paper output multiplicative uncertainty models are used as dynamical models of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical models, is applied to two different sets of measured plant data. The computed uncertainty bounds cover the measured plant output, while the nominal prediction is outside these uncertainty bounds for some samples in these examples.

  9. Sensitivity analysis of a two-dimensional quantitative microbiological risk assessment: keeping variability and uncertainty separated.

    Science.gov (United States)

    Busschaert, Pieter; Geeraerd, Annemie H; Uyttendaele, Mieke; Van Impe, Jan F

    2011-08-01

    The aim of quantitative microbiological risk assessment is to estimate the risk of illness caused by the presence of a pathogen in a food type, and to study the impact of interventions. Because of inherent variability and uncertainty, risk assessments are generally conducted stochastically, and if possible it is advised to characterize variability separately from uncertainty. Sensitivity analysis makes it possible to indicate to which of the input variables the outcome of a quantitative microbiological risk assessment is most sensitive. Although a number of methods exist for applying sensitivity analysis to a risk assessment with probabilistic input variables (such as contamination, storage temperature, storage duration, etc.), it is challenging to perform sensitivity analysis when a risk assessment includes a separate characterization of the variability and uncertainty of input variables. A procedure is proposed that focuses on the relation between risk estimates obtained by Monte Carlo simulation and the location of pseudo-randomly sampled input variables within the uncertainty and variability distributions. Within this procedure, two methods are used, namely an ANOVA-like model and Sobol sensitivity indices, to obtain and compare the impact of variability and of uncertainty of all input variables, and of model uncertainty and scenario uncertainty. As a case study, this methodology is applied to a risk assessment estimating the risk of contracting listeriosis from the consumption of deli meats.
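
    A minimal two-dimensional Monte Carlo sketch of the separation described above: the outer loop samples uncertain parameters, the inner loop samples serving-to-serving variability; the exposure model and all distributions are invented for illustration:

        import numpy as np

        rng = np.random.default_rng(6)
        n_unc, n_var = 200, 5_000   # outer (uncertainty) x inner (variability)

        risks = np.empty(n_unc)
        for i in range(n_unc):
            # Outer loop: uncertain parameters (illustrative priors).
            mu_dose = rng.normal(2.0, 0.3)              # mean log10 dose
            r_slope = rng.lognormal(np.log(1e-4), 0.5)  # dose-response slope

            # Inner loop: serving-to-serving variability in the dose.
            log_dose = rng.normal(mu_dose, 0.8, n_var)
            risk_per_serving = 1.0 - np.exp(-r_slope * 10.0 ** log_dose)
            risks[i] = risk_per_serving.mean()   # variability integrated out

        # Spread across outer draws is uncertainty about the mean risk.
        print("median risk:", np.median(risks))
        print("95% uncertainty interval:", np.percentile(risks, [2.5, 97.5]))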

  10. Model uncertainty and Bayesian model averaging in vector autoregressive processes

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2006-01-01

    Economic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, inference based upon a single model, when several viable models exist, limits the usefulness of such analysis. Taking account of model uncertainty, a Bayesian model averaging procedure i
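
    The snippet breaks off at the averaging procedure; as one common approximation, a sketch of forecast combination with BIC-based posterior model weights follows. The function and all numbers are hypothetical, and not necessarily the authors' procedure:

        import numpy as np

        def bma_forecast(log_likelihoods, n_params, n_obs, forecasts, prior=None):
            """Combine point forecasts with approximate posterior model weights.

            Weights come from BIC, a common large-sample approximation to the
            marginal likelihood; `forecasts` holds one point forecast per model.
            """
            ll = np.asarray(log_likelihoods, dtype=float)
            k = np.asarray(n_params, dtype=float)
            bic = -2.0 * ll + k * np.log(n_obs)
            w = np.exp(-0.5 * (bic - bic.min()))
            if prior is not None:
                w *= np.asarray(prior, dtype=float)
            w /= w.sum()
            return w, float(np.dot(w, forecasts))

        # Three candidate VAR specifications (numbers are made up):
        weights, combined = bma_forecast(
            log_likelihoods=[-410.2, -407.9, -409.5],
            n_params=[12, 20, 16], n_obs=160,
            forecasts=[1.8, 2.4, 2.1])
        print(weights, combined)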

  11. Vector network analyzer (VNA) measurements and uncertainty assessment

    CERN Document Server

    Shoaib, Nosherwan

    2017-01-01

    This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.

  12. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  13. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2012-06-01

    Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this model, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity.

  14. Review of strategies for handling geological uncertainty in groundwater flow and transport modeling

    DEFF Research Database (Denmark)

    Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.;

    2012-01-01

    The geologically related uncertainty in groundwater modeling originates from two main sources: geological structures and hydraulic parameter values within these structures. Within a geological structural element the parameter values will always exhibit local scale heterogeneity, which can be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and the moment equation approach, are briefly described with emphasis

  15. Influence of model reduction on uncertainty of flood inundation predictions

    Science.gov (United States)

    Romanowicz, R. J.; Kiczko, A.; Osuch, M.

    2012-04-01

    Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g., a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions and the estimates of model parameters, which are usually identified by solving the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that the roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure. However, this comes at the price of model robustness. Apart from the relatively simple question of reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that the uncertainties have a substantial influence on flood risk assessment. In the paper we present a simplified methodology allowing the influence of

  16. Probabilistic assessment of seawater intrusion under multiple sources of uncertainty

    Science.gov (United States)

    Riva, M.; Guadagnini, A.; Dell'Oca, A.

    2015-01-01

    Coastal aquifers are affected by seawater intrusion (SWI) on a worldwide scale. Henry's problem has often been used as a benchmark to analyze this phenomenon. Here, we investigate the way incomplete knowledge of the system properties impacts the assessment of global quantities (GQs) describing key characteristics of the saltwater wedge in the dispersive Henry's problem. We recast the problem in dimensionless form and consider four dimensionless quantities characterizing the SWI process, i.e., the gravity number, the permeability anisotropy ratio, and the transverse and longitudinal Péclet numbers. These quantities are affected by uncertainty due to the lack of exhaustive characterization of the subsurface. We rely on the Sobol indices to quantify the relative contribution of each of these uncertain terms to the total variance of each of the global descriptors considered. These indices are evaluated upon representing the target GQs through a generalized Polynomial Chaos Expansion (gPCE) approximation. The latter also serves as a surrogate model of the global system behavior. It allows (a) computing and analyzing the joint and marginal probability density function (pdf) of each GQ in a Monte Carlo framework at an affordable computational cost, and (b) exploring the way the uncertainty associated with the prediction of these global descriptors can be reduced by conditioning the joint pdf on available information. Corresponding analytical expressions of the marginal pdfs of the variables of interest are derived and analyzed.

  17. Uncertainty Assessment: What Good Does it Do? (Invited)

    Science.gov (United States)

    Oreskes, N.; Lewandowsky, S.

    2013-12-01

    the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty, and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by consideration of the evolution of the debate over the past ten years over the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.

  18. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    NARCIS (Netherlands)

    Machguth, H.; Purves, R.S.; Oerlemans, J.; Hoelzle, M.; Paul, F.

    2008-01-01

    By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was tun

  20. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
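
    A hedged sketch of the model-averaging step described above: posterior model probabilities derived from information-criterion differences, combined into a prediction whose variance splits into within-model and between-model parts. The per-model values are invented, and the specific criterion used in the report may differ:

        import numpy as np

        # Per-model kriging predictions at one location (illustrative):
        means = np.array([-13.2, -12.8, -13.6, -13.0])   # log permeability
        variances = np.array([0.30, 0.25, 0.40, 0.35])   # within-model
        ic = np.array([102.4, 101.1, 104.0, 101.9])      # information criteria

        # Posterior model probabilities from IC differences (uniform prior):
        w = np.exp(-0.5 * (ic - ic.min()))
        w /= w.sum()

        avg_mean = np.dot(w, means)
        within = np.dot(w, variances)
        between = np.dot(w, (means - avg_mean) ** 2)
        print("model-averaged mean:", round(avg_mean, 3))
        print("total variance:", round(within + between, 3),
              "(within", round(within, 3), "+ between", round(between, 3), ")")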

  1. Integration of expert knowledge and uncertainty in natural risk assessment

    Science.gov (United States)

    Baruffini, Mirko; Jaboyedoff, Michel

    2010-05-01

    Natural hazards occurring in alpine regions during the last decades have clearly shown the potential impacts of failures on the performance of infrastructure systems: interruptions of the Swiss railway power supply and closures of the Gotthard highway due to such events have increased the awareness of infrastructure vulnerability in Switzerland as well. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting complex system behaviour, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome the lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appraisals, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in the analysis of complex systems and decisions. Uncertainty in predicting the risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on the triangular probability density function (T-PDF), which can follow the same flow chart as FR. We implemented the Swiss natural hazard recommendations with FR and with T-PDF-based probability in order to obtain hazard zoning and

  2. Management of California Oak Woodlands: Uncertainties and Modeling

    Science.gov (United States)

    Jay E. Noel; Richard P. Thompson

    1995-01-01

    A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...

  3. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  4. Elevation uncertainty in coastal inundation hazard assessments

    Science.gov (United States)

    Gesch, Dean B.; Cheval, Sorin

    2012-01-01

    Coastal inundation has been identified as an important natural hazard that affects densely populated and built-up areas (Subcommittee on Disaster Reduction, 2008). Inundation, or coastal flooding, can result from various physical processes, including storm surges, tsunamis, intense precipitation events, and extreme high tides. Such events cause quickly rising water levels. When rapidly rising water levels overwhelm flood defenses, especially in heavily populated areas, the potential of the hazard is realized and a natural disaster results. Two noteworthy recent examples of such natural disasters resulting from coastal inundation are the Hurricane Katrina storm surge in 2005 along the Gulf of Mexico coast in the United States, and the tsunami in northern Japan in 2011. Longer term, slowly varying processes such as land subsidence (Committee on Floodplain Mapping Technologies, 2007) and sea-level rise also can result in coastal inundation, although such conditions do not have the rapid water level rise associated with other flooding events. Geospatial data are a critical resource for conducting assessments of the potential impacts of coastal inundation, and geospatial representations of the topography in the form of elevation measurements are a primary source of information for identifying the natural and human components of the landscape that are at risk. Recently, the quantity and quality of elevation data available for the coastal zone have increased markedly, and this availability facilitates more detailed and comprehensive hazard impact assessments.

  5. Gaze categorization under uncertainty: psychophysics and modeling.

    Science.gov (United States)

    Mareschal, Isabelle; Calder, Andrew J; Dadds, Mark R; Clifford, Colin W G

    2013-04-22

    The accurate perception of another person's gaze direction underlies most social interactions and provides important information about his or her future intentions. As a first step to measuring gaze perception, most experiments determine the range of gaze directions that observers judge as being direct: the cone of direct gaze. This measurement has revealed the flexibility of observers' perception of gaze and provides a useful benchmark against which to test clinical populations with abnormal gaze behavior. Here, we manipulated effective signal strength by adding noise to the eyes of synthetic face stimuli or removing face information. We sought to move beyond a descriptive account of gaze categorization by fitting a model to the data that relies on changing the uncertainty associated with an estimate of gaze direction as a function of the signal strength. This model accounts for all the data and provides useful insight into the visual processes underlying normal gaze perception.

  6. A simplified model of choice behavior under uncertainty

    Directory of Open Access Journals (Sweden)

    Ching-Hung Lin

    2016-08-01

    Full Text Available The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models for the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal due to some incompatible results between our behavioral and modeling data. This study aims to modify the Ahn et al. (2008) PU model into a simplified model, and we collected 145 subjects' IGT performances as benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was mostly found as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A has a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted a gain-stay/loss-shift strategy rather than foreseeing long-term outcomes. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not have been found yet. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.
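
    A hedged reconstruction of the ingredients the abstract names (α, λ, A), in the spirit of prospect-utility models with a delta-rule expectancy update; this illustrates the model family, not the authors' code:

        import numpy as np

        def pu_utility(x, alpha=0.5, lam=2.0):
            """Prospect utility: concave in gains, loss-averse in losses."""
            return np.where(x >= 0, np.abs(x) ** alpha,
                            -lam * np.abs(x) ** alpha)

        def update_expectancy(E, deck, payoff, A=0.2, alpha=0.5, lam=2.0):
            """Delta-rule update of the chosen deck's expectancy (rate A)."""
            E = E.copy()
            E[deck] += A * (pu_utility(payoff, alpha, lam) - E[deck])
            return E

        # Two illustrative draws from a "bad" deck: a +100 gain, then a trial
        # with a large net loss (-1150).
        E = np.zeros(4)
        E = update_expectancy(E, deck=1, payoff=100.0)
        E = update_expectancy(E, deck=1, payoff=-1150.0)
        print(E)   # as alpha nears 0, utilities flatten toward +1 and -lam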

  7. Incorporating Fuzzy Systems Modeling and Possibility Theory in Hydrogeological Uncertainty Analysis

    Science.gov (United States)

    Faybishenko, B.

    2008-12-01

    Hydrogeological predictions are subject to numerous uncertainties, including those in the development of conceptual, mathematical, and numerical models, as well as in the determination of their parameters. Stochastic simulations of hydrogeological systems and the associated uncertainty analysis are usually based on the assumption that the data characterizing spatial and temporal variations of hydrogeological processes are random, and the output uncertainty is quantified using a probability distribution. However, hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete or subjective information. One of the modern approaches to modeling and uncertainty quantification of such systems is based on using a combination of statistical and fuzzy-logic uncertainty analyses. The aims of this presentation are to: (1) present evidence of fuzziness in the development of conceptual hydrogeological models, and (2) give examples of the integration of statistical and fuzzy-logic analyses in modeling and assessing both aleatoric uncertainties (e.g., caused by vagueness in assessing the subsurface system heterogeneities of fractured-porous media) and epistemic uncertainties (e.g., caused by the selection of different simulation models) involved in hydrogeological modeling. The author will discuss several case studies illustrating the application of fuzzy modeling for assessing the water balance and water travel time in unsaturated-saturated media. These examples include the evaluation of the associated uncertainties using the main concepts of possibility theory, a comparison between uncertainty evaluation using probability and possibility theories, and the transformation of probabilities into possibility distributions (and vice versa) for modeling hydrogeological processes.
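
    One ingredient of such a fuzzy analysis, sketched below: a triangular fuzzy number for hydraulic conductivity propagated through a monotone travel-time relation by alpha-cuts; all parameter values are hypothetical:

        import numpy as np

        def alpha_cut(tri, alpha):
            """Interval of a triangular fuzzy number (lo, mode, hi) at level alpha."""
            lo, mode, hi = tri
            return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

        # Fuzzy hydraulic conductivity K [m/d] from vague expert judgment:
        K = (0.1, 0.5, 2.0)

        length, gradient, porosity = 100.0, 0.01, 0.3   # assumed crisp

        def travel_time(k):
            return length * porosity / (k * gradient)

        # Travel time is decreasing in K, so the interval endpoints swap:
        for a in (0.0, 0.5, 1.0):
            k_lo, k_hi = alpha_cut(K, a)
            print(f"alpha={a}: travel time in "
                  f"[{travel_time(k_hi):.0f}, {travel_time(k_lo):.0f}] days")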

  8. Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models.

    Science.gov (United States)

    Proppe, Jonny; Reiher, Markus

    2017-07-11

    One of the major challenges in computational science is to determine the uncertainty of a virtual measurement, that is, the prediction of an observable based on calculations. As highly accurate first-principles calculations are in general unfeasible for most physical systems, one usually resorts to parametric property models of observables, which require calibration by incorporating reference data. The resulting predictions and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the (57)Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with 12 density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm s(-1) and 0.04-0.05 mm s(-1), respectively, the latter being close to the average experimental uncertainty of 0.02 mm s(-1). Furthermore, we show that both the model parameters and the prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the squared coefficient of correlation, r(2), or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical M
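
    A minimal sketch of the bootstrap idea applied to a linear calibration model: resample reference pairs, refit, and read the prediction uncertainty from the spread of refits. Synthetic data stand in for the 44 iron compounds:

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic calibration set standing in for (contact density, shift):
        x = rng.uniform(0.0, 1.0, 44)
        y = 1.2 - 0.9 * x + rng.normal(0.0, 0.04, 44)

        x_new = 0.35
        preds = np.empty(2000)
        for b in range(preds.size):
            idx = rng.integers(0, x.size, x.size)   # resample with replacement
            slope, intercept = np.polyfit(x[idx], y[idx], 1)
            preds[b] = slope * x_new + intercept

        # Spread of bootstrap predictions = calibration-induced uncertainty.
        print("prediction:", preds.mean().round(3),
              "+/-", preds.std(ddof=1).round(3), "mm/s")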

  9. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  10. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  11. Workshop on Model Uncertainty and its Statistical Implications

    CERN Document Server

    1988-01-01

    In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.

  12. Operationalising uncertainty in data and models for integrated water resources management.

    Science.gov (United States)

    Blind, M W; Refsgaard, J C

    2007-01-01

    Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  13. The uncertainty cascade in flood risk assessment under changing climatic conditions - the Biala Tarnowska case study

    Science.gov (United States)

    Doroszkiewicz, Joanna; Romanowicz, Renata

    2016-04-01

    Uncertainty in the results of a hydraulic model is not associated only with the limitations of that model and the shortcomings of data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections from the EURO-CORDEX project. The study describes the cascade of uncertainty through the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections and of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of the different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability can be very computer time consuming. As a way forward we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each model cross-section. The study shows that the application of the simulator substantially reduces the computational requirements of deriving flood risk maps under future climatic conditions. Acknowledgements: This work was supported by the

  14. Uncertainty of GIA models across the Greenland

    Science.gov (United States)

    Ruggieri, Gabriella

    2013-04-01

    In recent years various remote sensing techniques have been employed to estimate the current mass balance of the Greenland ice sheet (GIS). In this regard, GRACE and laser and radar altimetry observations employed to constrain the mass balance treat glacial isostatic adjustment (GIA) as a source of noise. Several GIA models have been elaborated for Greenland, but they differ from each other in mantle viscosity profile and in the time history of ice melting. In this work we use the well-known ICE-5G (VM2) ice model by Peltier (2004) and two alternative scenarios of ice melting, ANU05 by Lambeck et al. (1998) and the new regional ice model HUY2 by Simpson et al. (2009), in order to assess the amplitude of the uncertainty in the GIA predictions. In particular we focus on rates of vertical displacement, sea surface variation and sea-level change at the regional scale. The GIA predictions are estimated using an improved version of the SELEN code, which solves the sea-level equation for a spherical, self-gravitating, incompressible and viscoelastic Earth structure. The GIA uncertainty shows a highly variable geographic distribution across Greenland. Considering the spatial pattern of the GIA predictions for the three ice models, the western sector of the Greenland Ice Sheet (GrIS) between Thule and Upernavik, and the area around Paamiut, show good agreement, while the northeastern portion of Greenland is characterized by a large discrepancy between the GIA predictions inferred from the ice models tested in this work. These differences are ultimately the consequence of the different sets of global relative sea level data and modern geodetic observations used by the authors to constrain the model parameters. Finally, the GPS network GNET, recently installed around the periphery of the GrIS, is used as a tool to discuss the discrepancies among the GIA models. Comparing the recently available geodetic analyses, it appears that among the GPS sites the

  15. Assessing spatial uncertainties of land allocation using a scenario approach and sensitivity analysis: a study for land use in Europe.

    Science.gov (United States)

    Verburg, Peter H; Tabeau, Andrzej; Hatna, Erez

    2013-09-01

    Land change model outcomes are vulnerable to multiple types of uncertainty, including uncertainty in input data, structural uncertainties in the model and uncertainties in model parameters. In coupled model systems the uncertainties propagate between the models. This paper assesses uncertainty of changes in future spatial allocation of agricultural land in Europe as they arise from a general equilibrium model coupled to a spatial land use allocation model. Two contrasting scenarios are used to capture some of the uncertainty in the development of typical combinations of economic, demographic and policy variables. The scenario storylines include different measurable assumptions concerning scenario specific drivers (variables) and parameters. Many of these assumptions are estimations and thus include a certain level of uncertainty regarding their true values. This leads to uncertainty within the scenario outcomes. In this study we have explored how uncertainty in national-level assumptions within the contrasting scenario assumptions translates into uncertainty in the location of changes in agricultural land use in Europe. The results indicate that uncertainty in coarse-scale assumptions does not translate into a homogeneous spread of the uncertainty within Europe. Some regions are more certain than others in facing specific land change trajectories irrespective of the uncertainty in the macro-level assumptions. The spatial spread of certain and more uncertain locations of land change is dependent on location conditions as well as on the overall scenario conditions. Translating macro-level uncertainties to uncertainties in spatial patterns of land change makes it possible to better understand and visualize the land change consequences of uncertainties in model input variables.

  16. Assessing Uncertainty in LULC Classification Accuracy by Using Bootstrap Resampling

    Directory of Open Access Journals (Sweden)

    Lin-Hsuan Hsiao

    2016-08-01

    Full Text Available Supervised land-use/land-cover (LULC) classifications are typically conducted using class assignment rules derived from a set of multiclass training samples. Consequently, classification accuracy varies with the training data set and is thus associated with uncertainty. In this study, we propose a bootstrap resampling and reclassification approach that can be applied for assessing not only the uncertainty in classification results of the bootstrap-training data sets, but also the classification uncertainty of individual pixels in the study area. Two measures of pixel-specific classification uncertainty, namely the maximum class probability and Shannon entropy, were derived from the class probability vector of individual pixels and used for the identification of unclassified pixels. Unclassified pixels that are identified using the traditional chi-square threshold technique represent outliers of individual LULC classes, but they are not necessarily associated with higher classification uncertainty. By contrast, unclassified pixels identified using the equal-likelihood technique are associated with higher classification uncertainty, and they mostly occur on or near the borders of different land-cover classes.
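
    The two pixel-level measures named above can be computed directly from a class-probability vector per pixel (e.g., relative class frequencies over bootstrap reclassifications), as in this sketch with invented probabilities:

        import numpy as np

        def pixel_uncertainty(class_probs):
            """Max class probability and Shannon entropy per pixel.

            `class_probs` is (n_pixels, n_classes), rows summing to 1,
            e.g. relative class frequencies over bootstrap reclassifications.
            """
            p = np.clip(class_probs, 1e-12, 1.0)
            max_prob = p.max(axis=1)
            entropy = -(p * np.log2(p)).sum(axis=1)
            return max_prob, entropy

        # Three pixels: confident, split two ways, near-uniform.
        probs = np.array([[0.94, 0.03, 0.03],
                          [0.50, 0.45, 0.05],
                          [0.36, 0.33, 0.31]])
        mp, H = pixel_uncertainty(probs)
        print(mp.round(2), H.round(2))   # flag pixels with low mp / high H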

  17. A market model: uncertainty and reachable sets

    Directory of Open Access Journals (Sweden)

    Raczynski Stanislaw

    2015-01-01

    Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different: the dynamic market with uncertain parameters is treated using differential inclusions, which makes it possible to determine the corresponding reachable sets. This is not a statistical analysis; we are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists of defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As a result we obtain images of the reachable sets, where the main control parameter is the share of revenue that is invested. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.

  18. A framework for modeling uncertainty in regional climate change

    Science.gov (United States)

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  19. Uncertainty propagation in urban hydrology water quality modelling

    NARCIS (Netherlands)

    Torres Matallana, Arturo; Leopold, U.; Heuvelink, G.B.M.

    2016-01-01

    Uncertainty is often ignored in urban hydrology modelling. Engineering practice typically ignores uncertainties and uncertainty propagation. This can have large impacts, such as the wrong dimensioning of urban drainage systems and the inaccurate estimation of pollution in the environment caused by c

  20. Uncertainties in climate assessment for the case of aviation NOx

    Science.gov (United States)

    Holmes, Christopher D.; Tang, Qi; Prather, Michael J.

    2011-01-01

    Nitrogen oxides emitted from aircraft engines alter the chemistry of the atmosphere, perturbing the greenhouse gases methane (CH4) and ozone (O3). We quantify uncertainties in radiative forcing (RF) due to short-lived increases in O3, long-lived decreases in CH4 and O3, and their net effect, using the ensemble of published models and a factor decomposition of each forcing. The decomposition captures major features of the ensemble, and also shows which processes drive the total uncertainty in several climate metrics. Aviation-specific factors drive most of the uncertainty for the short-lived O3 and long-lived CH4 RFs, but a nonaviation factor dominates for long-lived O3. The model ensemble shows strong anticorrelation between the short-lived and long-lived RF perturbations (R2 = 0.87). Uncertainty in the net RF is highly sensitive to this correlation. We reproduce the correlation and ensemble spread in one model, showing that processes controlling the background tropospheric abundance of nitrogen oxides are likely responsible for the modeling uncertainty in climate impacts from aviation. PMID:21690364
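
    The sensitivity of the net RF uncertainty to this anticorrelation follows from the variance formula for a sum of two correlated terms; the sketch below (Python; the sigma values are illustrative, not the paper's) shows how a strongly negative correlation shrinks the net uncertainty:

        import numpy as np

        def net_rf_sigma(sigma_short, sigma_long, rho):
            """1-sigma uncertainty of the sum of two correlated RF terms."""
            var = sigma_short**2 + sigma_long**2 + 2.0 * rho * sigma_short * sigma_long
            return np.sqrt(var)

        # illustrative values only, in mW m-2 per unit emission
        s_short, s_long = 10.0, 8.0
        for rho in (0.0, -0.5, -np.sqrt(0.87)):   # R2 = 0.87 -> |rho| ~ 0.93
            print(f"rho = {rho:+.2f}  sigma_net = {net_rf_sigma(s_short, s_long, rho):.1f}")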

  1. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The models of the Amchitka underground nuclear tests developed in 2002 are verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. The newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the salinity and porosity structure of the subsurface, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adopted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
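
    For readers unfamiliar with the mechanics, a minimal Metropolis-style MCMC conditioning step for a generic one-parameter model with Gaussian errors looks as follows (Python; the data, prior and likelihood are synthetic stand-ins, not the Amchitka groundwater model):

        import numpy as np

        rng = np.random.default_rng(42)

        # synthetic "observations" of a quantity depending on one parameter theta
        data = 2.0 + rng.normal(0.0, 0.5, size=20)

        def log_prior(theta):              # prior knowledge: theta ~ N(0, 10^2)
            return -0.5 * (theta / 10.0) ** 2

        def log_likelihood(theta):         # Gaussian measurement errors, sigma = 0.5
            return -0.5 * np.sum(((data - theta) / 0.5) ** 2)

        def metropolis(n_steps, step=0.3):
            theta, chain = 0.0, np.empty(n_steps)
            logp = log_prior(theta) + log_likelihood(theta)
            for i in range(n_steps):
                prop = theta + rng.normal(0.0, step)
                logp_prop = log_prior(prop) + log_likelihood(prop)
                if np.log(rng.uniform()) < logp_prop - logp:   # accept/reject
                    theta, logp = prop, logp_prop
                chain[i] = theta
            return chain

        chain = metropolis(5000)[1000:]    # discard burn-in
        print(chain.mean(), chain.std())   # posterior mean and uncertainty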

  2. Uncertainty "escalation" and use of machine learning to forecast residual and data model uncertainties

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    When speaking about model uncertainty, many authors implicitly assume data uncertainty (mainly in parameters or inputs), which is probabilistically described by distributions. Often, however, it is worth looking into the residual uncertainty as well. It is hence reasonable to classify the main approaches to uncertainty analysis with respect to the two main types of model uncertainty that can be distinguished: A. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there are enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on these data. The following methods can be mentioned: (a) the quantile regression (QR) method of Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) a more recent approach that takes into account the input variables influencing such uncertainty and uses more advanced (non-linear) machine learning methods such as neural networks and model trees - the UNEEC method [2,3,7]; (c) the even more recent DUBRAUE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals (it corrects the model residual first and then carries out the uncertainty prediction with an autoregressive statistical model) [5]. B. The data uncertainty (parametric and/or input): in this case we study the propagation of uncertainty (typically represented probabilistically) from parameters or inputs to the model outputs. In the case of simple functions representing models, analytical approaches can be used, or approximation methods (e.g., the first-order second moment method). However, for real complex non-linear models implemented in software there is no other choice except using
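
    In the spirit of approach A, the sketch below trains quantile models on past residuals of a calibrated model as a function of an input variable (Python/scikit-learn; the data are synthetic, and gradient boosting with a quantile loss is one possible choice, not the specific QR or UNEEC implementation):

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(1)

        # X: input variables thought to influence the error of a calibrated model;
        # e: model residuals observed on past data (heteroscedastic, synthetic)
        X = rng.uniform(0, 10, size=(500, 1))
        e = rng.normal(0.0, 0.2 + 0.1 * X[:, 0])

        bounds = {}
        for alpha in (0.05, 0.95):                 # 90% predictive interval
            qr = GradientBoostingRegressor(loss="quantile", alpha=alpha)
            bounds[alpha] = qr.fit(X, e).predict(X)

        inside = np.mean((e >= bounds[0.05]) & (e <= bounds[0.95]))
        print(f"empirical coverage of the 90% band: {inside:.2f}")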

  3. Methods for uncertainty propagation in life cycle assessment

    NARCIS (Netherlands)

    Groen, E.A.; Heijungs, R.; Bokkers, E.A.M.; Boer, de I.J.M.

    2014-01-01

    Life cycle assessment (LCA) calculates the environmental impact of a product over its entire life cycle. Uncertainty analysis is an important aspect in LCA, and is usually performed using Monte Carlo sampling. In this study, Monte Carlo sampling, Latin hypercube sampling, quasi Monte Carlo sampling,
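
    The first two sampling schemes plug into the same propagation step and differ only in how the input sample is drawn; a minimal sketch (Python/SciPy; the two-input "impact" function and its distributions are hypothetical):

        import numpy as np
        from scipy.stats import norm, qmc

        def impact(x):
            # toy life cycle impact as a function of two uncertain inventory inputs
            return 3.0 * x[:, 0] + x[:, 0] * x[:, 1]

        n = 1000
        rng = np.random.default_rng(7)

        # plain Monte Carlo: independent normal inputs
        mc = rng.normal(loc=1.0, scale=0.1, size=(n, 2))

        # Latin hypercube: stratified uniforms mapped through the normal ppf
        lhs_u = qmc.LatinHypercube(d=2, seed=7).random(n)
        lhs = norm.ppf(lhs_u, loc=1.0, scale=0.1)

        for name, sample in (("MC ", mc), ("LHS", lhs)):
            y = impact(sample)
            print(f"{name}: mean = {y.mean():.4f}, sd = {y.std(ddof=1):.4f}")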

  4. Uncertainty in wave energy resource assessment. Part 1: Historic data

    Energy Technology Data Exchange (ETDEWEB)

    Mackay, Edward B.L.; Bahaj, AbuBakr S. [Sustainable Energy Research Group, School of Civil Engineering and the Environment, University of Southampton, Highfield, Southampton SO17 1BJ (United Kingdom); Challenor, Peter G. [Ocean Observing and Climate Group, National Oceanography Centre, Southampton SO14 3ZH (United Kingdom)

    2010-08-15

    The uncertainty in estimates of the energy yield from a wave energy converter (WEC) is considered. The study is presented in two articles. This first article deals with the accuracy of the historic data and the second article considers the uncertainty which arises from variability in the wave climate. Estimates of the historic resource for a specific site are usually calculated from wave model data calibrated against in-situ measurements. Both the calibration of model data and estimation of confidence bounds are made difficult by the complex structure of errors in model data. Errors in parameters from wave models exhibit non-linear dependence on multiple factors, seasonal and interannual changes in bias and short-term temporal correlation. An example is given using two hindcasts for the European Marine Energy Centre in Orkney. Before calibration, estimates of the long-term mean WEC power from the two hindcasts differ by around 20%. The difference is reduced to 5% after calibration. The short-term temporal evolution of errors in WEC power is represented using ARMA models. It is shown that this is sufficient to model the long-term uncertainty in estimated WEC yield from one hindcast. However, seasonal and interannual changes in model biases in the other hindcast cause the uncertainty in estimated long-term WEC yield to exceed that predicted by the ARMA model. (author)
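
    A sketch of the ARMA representation of short-term error correlation, using a synthetic stand-in for a hindcast error series (Python/statsmodels; the model order and all numbers are illustrative):

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        rng = np.random.default_rng(3)

        # synthetic AR(1)-like series standing in for hindcast errors in WEC power
        n = 500
        e = np.zeros(n)
        for t in range(1, n):
            e[t] = 0.8 * e[t - 1] + rng.normal(0.0, 1.0)

        res = ARIMA(e, order=(1, 0, 1)).fit()   # ARMA(1,1) = ARIMA with d = 0
        print(res.params)                       # const, ar.L1, ma.L1, sigma2

        # simulate synthetic error series to estimate the uncertainty this error
        # structure induces in a long-term mean
        sims = np.array([res.simulate(n) for _ in range(200)])
        print("sd of the long-term mean error:", sims.mean(axis=1).std())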

  5. Uncertainty analysis of statistical downscaling models using general circulation model over an international wetland

    Science.gov (United States)

    Etemadi, H.; Samadi, S.; Sharifikia, M.

    2014-06-01

    The regression-based statistical downscaling model (SDSM) is an appropriate method that is broadly used to resolve the coarse spatial resolution of general circulation models (GCMs). Nevertheless, the assessment of uncertainty propagation linked with climatic variables is essential to any climate change impact study. This study presents a procedure to characterize the uncertainty of two GCMs linked with the Long Ashton Research Station Weather Generator (LARS-WG) and SDSM in one of the most vulnerable international wetlands, namely "Shadegan", in an arid region of southwest Iran. In the case of daily temperature, uncertainty is estimated by comparing the monthly mean and variance of downscaled and observed daily data at a 95 % confidence level. Uncertainties were then evaluated by comparing monthly mean dry and wet spell lengths and their 95 % CI in daily precipitation downscaling over the 1987-2005 interval. The uncertainty results indicated that LARS-WG is the most proficient model at reproducing various statistical characteristics of the observed data within the 95 % uncertainty bounds, while the SDSM model is the least capable in this respect. The uncertainty analyses at three different climate stations produced significantly different climate change responses at the 95 % CI. Finally, the range of plausible climate change projections suggests a need for decision makers to augment their long-term wetland management plans to reduce the wetland's vulnerability to climate change impacts.

  6. Inspection Uncertainty and Model Uncertainty Updating for Ship Structures Subjected to Corrosion Deterioration

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; ZHANG Sheng-kun

    2004-01-01

    The classical probability theory cannot effectively quantify the parameter uncertainty in probability of detection. Furthermore, the conventional data analytic method and expert judgment method fail to handle the problem of model uncertainty updating with the information from nondestructive inspection. To overcome these disadvantages, a Bayesian approach was proposed to quantify the parameter uncertainty in probability of detection. Furthermore, the formulae of the multiplication factors to measure the statistical uncertainties in the probability of detection following the Weibull distribution were derived. A Bayesian updating method was applied to compute the posterior probabilities of model weights and the posterior probability density functions of distribution parameters of probability of detection. A total probability model method was proposed to analyze the problem of multi-layered model uncertainty updating. This method was then applied to the problem of multi-layered corrosion model uncertainty updating for ship structures. The results indicate that the proposed method is very effective in analyzing the problem of multi-layered model uncertainty updating.
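
    The model-weight part of such Bayesian updating reduces to renormalizing prior weights by the likelihood of the inspection data under each candidate model; a generic sketch (Python; the weights and log-likelihoods are hypothetical, not taken from the paper):

        import numpy as np

        def update_model_weights(prior_weights, log_likelihoods):
            """Posterior probabilities of competing models given inspection
            data d: P(M_i | d) proportional to P(d | M_i) * P(M_i)."""
            log_post = np.log(prior_weights) + np.asarray(log_likelihoods)
            log_post -= log_post.max()          # numerical stability
            post = np.exp(log_post)
            return post / post.sum()

        # three candidate corrosion models with equal prior weight; the
        # log-likelihoods of the inspection data are assumed to be given
        priors = np.array([1 / 3, 1 / 3, 1 / 3])
        loglik = [-12.1, -9.4, -15.0]
        print(update_model_weights(priors, loglik))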

  7. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    Science.gov (United States)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.; Refsgaard, Jens C.; Jensen, Karsten H.

    2015-12-01

    The ensemble Kalman filter (EnKF) is a popular data assimilation (DA) technique that has been extensively used in environmental sciences for combining complementary information from model predictions and observations. One of the major challenges in EnKF applications is the description of model uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties. This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions on the feasibility and efficiency of the assimilation. The synthetic data used in the assimilation study makes it possible to diagnose model uncertainty assumptions statistically. Besides the model uncertainty, other factors such as observation error, observation locations, and ensemble size are also analysed with respect to performance and sensitivity. Results show that inappropriate definition of model uncertainty can greatly degrade the assimilation performance, and an appropriate combination of different model uncertainty sources is advised.
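
    For reference, a minimal stochastic-EnKF analysis step of the kind used in such experiments (Python/NumPy; this is the generic textbook form, not the MIKE SHE implementation, and all sizes and values are illustrative):

        import numpy as np

        def enkf_update(ens, obs, H, obs_err_sd, rng):
            """Stochastic EnKF analysis step. ens: (n_state, n_ens) forecast
            ensemble; obs: (n_obs,); H: (n_obs, n_state) observation operator."""
            n_obs, n_ens = len(obs), ens.shape[1]
            A = ens - ens.mean(axis=1, keepdims=True)        # state anomalies
            HA = H @ A                                       # obs-space anomalies
            R = (obs_err_sd ** 2) * np.eye(n_obs)
            P_HT = A @ HA.T / (n_ens - 1)                    # cross covariance
            S = HA @ HA.T / (n_ens - 1) + R                  # innovation covariance
            K = P_HT @ np.linalg.inv(S)                      # Kalman gain
            pert_obs = obs[:, None] + rng.normal(0, obs_err_sd, (n_obs, n_ens))
            return ens + K @ (pert_obs - H @ ens)

        rng = np.random.default_rng(0)
        ens = rng.normal(10.0, 2.0, size=(3, 50))            # 3 heads, 50 members
        H = np.array([[1.0, 0.0, 0.0]])                      # head 1 is observed
        analysis = enkf_update(ens, np.array([12.0]), H, 0.5, rng)
        print(analysis.mean(axis=1))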

  8. Uncertainty in a spatial evacuation model

    Science.gov (United States)

    Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De

    2017-08-01

    Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk-seeking, risk-averse and risk-neutral behaviors of such agents via certain game theory notions. We found that risk-averse agents tend to achieve faster evacuation times whenever the time delay in conflicts is longer. The results of our simulations also comply with previous work and conform to the fact that the evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study scientifically shows that the more impatient the agents are, the slower the egress time.

  9. Quantifying uncertainty in LCA-modelling of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  10. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    Science.gov (United States)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  11. Identification and communication of uncertainties of phenomenological models in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U.; Simola, K. [VTT Automation (Finland)

    2001-11-01

    This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues, whatever their nature or type, in order to obtain a maximal coverage of unresolved issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, an enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  12. Epistemic uncertainties and natural hazard risk assessment – Part 1: A review of the issues

    Directory of Open Access Journals (Sweden)

    K. J. Beven

    2015-12-01

    Full Text Available Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.

  13. Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations

    Science.gov (United States)

    Niemeier, Wolfgang; Tengen, Dieter

    2017-06-01

    In this article first ideas are presented to extend the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (Guidelines to the Expression of Uncertainty in Measurements). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte-Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix. It allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modelling and propagation. As a practical example the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
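
    A minimal sketch of the two-step idea: assign GUM-style standard uncertainties to the raw observations, then run a Monte Carlo simulation of the processing chain (Python; the chain is reduced here to a one-line reduction of a slope distance to the horizontal, and all numbers are illustrative):

        import numpy as np

        rng = np.random.default_rng(11)
        n = 100_000

        # step 1 (GUM-style budgets): a distance and a zenith angle with
        # assumed standard uncertainties
        dist = rng.normal(100.000, 0.002, n)                 # m
        zen = rng.normal(np.deg2rad(88.0), np.deg2rad(0.001), n)

        # step 2: push every realization through the processing chain
        horiz = dist * np.sin(zen)

        print(f"mean = {horiz.mean():.4f} m, std = {horiz.std(ddof=1):.4f} m")
        print("95% interval:", np.percentile(horiz, [2.5, 97.5]))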

  14. Uncertainty and sensitivity analyses in seismic risk assessments on the example of Cologne, Germany

    Directory of Open Access Journals (Sweden)

    S. Tyagunov

    2013-12-01

    Full Text Available Both aleatory and epistemic uncertainties associated with different sources and components of risk (hazard, exposure, vulnerability) are present at each step of seismic risk assessments. All individual sources of uncertainty contribute to the total uncertainty, which might be very high and, within the decision-making context, may therefore lead to either very conservative and expensive decisions or the perception of considerable risk. When anatomizing the structure of the total uncertainty, it is therefore important to propagate the different individual uncertainties through the computational chain and to quantify their contribution to the total value of risk. The present study analyzes different uncertainties associated with the hazard, vulnerability and loss components by the use of logic trees. The emphasis is on the analysis of epistemic uncertainties, which represent the reducible part of the total uncertainty, including a sensitivity analysis of the resulting seismic risk assessments with regards to the different uncertainty sources. This investigation, being a part of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe), is carried out for the example of, and with reference to, the conditions of the city of Cologne, Germany, which is one of the MATRIX test cases. At the same time, this particular study does not aim to revise nor to refine the hazard and risk level for Cologne; it is rather to show how large are the existing uncertainties and how they can influence seismic risk estimates, especially in less well-studied areas, if hazard and risk models adapted from other regions are used.

  15. Uncertainty and sensitivity analyses in seismic risk assessments on the example of Cologne, Germany

    Science.gov (United States)

    Tyagunov, S.; Pittore, M.; Wieland, M.; Parolai, S.; Bindi, D.; Fleming, K.; Zschau, J.

    2014-06-01

    Both aleatory and epistemic uncertainties associated with different sources and components of risk (hazard, exposure, vulnerability) are present at each step of seismic risk assessments. All individual sources of uncertainty contribute to the total uncertainty, which might be very high and, within the decision-making context, may therefore lead to either very conservative and expensive decisions or the perception of considerable risk. When anatomizing the structure of the total uncertainty, it is therefore important to propagate the different individual uncertainties through the computational chain and to quantify their contribution to the total value of risk. The present study analyses different uncertainties associated with the hazard, vulnerability and loss components by the use of logic trees. The emphasis is on the analysis of epistemic uncertainties, which represent the reducible part of the total uncertainty, including a sensitivity analysis of the resulting seismic risk assessments with regard to the different uncertainty sources. This investigation, being a part of the EU FP7 project MATRIX (New Multi-Hazard and Multi-Risk Assessment Methods for Europe), is carried out for the example of, and with reference to, the conditions of the city of Cologne, Germany, which is one of the MATRIX test cases. At the same time, this particular study does not aim to revise nor to refine the hazard and risk level for Cologne; it is rather to show how large are the existing uncertainties and how they can influence seismic risk estimates, especially in less well-studied areas, if hazard and risk models adapted from other regions are used.

  16. Assessment of boundary uncertainty in a coal deposit by means of probability kriging

    Energy Technology Data Exchange (ETDEWEB)

    Tercan, A.E. [Hacettepe University, Ankara (Turkey). Dept. of Mining Engineering

    1998-01-01

    Uncertainty over the boundary of a coal deposit must be quantified to allow evaluation of the risk involved in mine-planning decisions. Quantification of uncertainty calls for modelling of the conditional cumulative distribution function (ccdf) about an unknown value. Probability kriging is here used for approximating the ccdf. Thickness is introduced as a covariable in assessing the boundary uncertainty of the Kalburcayiri coal deposit in Kangal, Turkey, at regular intervals over the deposit. Comparison of the probability map provided by probability kriging with that of indicator kriging showed no difference between them, possibly because of the undersampled covariable. 10 refs., 5 figs.

  17. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  18. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...
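
    The BMA side of such a hybrid boils down to the predictive mixture formulas (law of total variance); a generic sketch with hypothetical model weights, not the paper's GA/BMA code:

        import numpy as np

        def bma_mixture(means, variances, weights):
            """Mean and variance of a BMA predictive mixture of K models.
            means, variances: (K, n_times); weights: (K,) model weights."""
            w = np.asarray(weights)[:, None]
            mean = (w * means).sum(axis=0)
            # within-model spread plus between-model spread
            var = (w * variances).sum(axis=0) + (w * (means - mean) ** 2).sum(axis=0)
            return mean, var

        # three SWAT-like model variants (synthetic streamflow predictions)
        rng = np.random.default_rng(5)
        means = rng.normal(50.0, 5.0, size=(3, 10))
        variances = np.full((3, 10), 4.0)
        weights = [0.5, 0.3, 0.2]          # e.g. from a GA/BMA calibration
        mu, var = bma_mixture(means, variances, weights)
        print(mu.round(1), np.sqrt(var).round(2))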

  19. Modelling sensitivity and uncertainty in a LCA model for waste management systems - EASETECH

    DEFF Research Database (Denmark)

    Damgaard, Anders; Clavreul, Julie; Baumeister, Hubert

    2013-01-01

    In the new model, EASETECH, developed for LCA modelling of waste management systems, a general approach for sensitivity and uncertainty assessment for waste management studies has been implemented. First general contribution analysis is done through a regular interpretation of inventory and impact...

  20. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.

  1. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  2. A global approach for sparse representation of uncertainty in Life Cycle Assessments of waste management systems

    DEFF Research Database (Denmark)

    Bisinella, Valentina; Conradsen, Knut; Christensen, Thomas Højlund;

    2016-01-01

    Purpose: Identification of key inputs and their effect on results from Life Cycle Assessment (LCA) models is fundamental. Because parameter importance varies greatly between cases due to the interaction of sensitivity and uncertainty, these features should never be defined a priori. However...... and uncertainty in a Global Sensitivity Analysis (GSA) framework. Methods: The proposed analytical method based on the calculation of sensitivity coefficients (SC) is evaluated against Monte Carlo sampling on traditional uncertainty assessment procedures, both for individual parameters and for full parameter sets...... of additivity of variances and GSA is tested on results from both uncertainty propagation methods. Then, we examine the differences in discernibility analyses results carried out with varying numbers of sampling points and parameters. Results and discussion: The proposed analytical method complies...

  3. Assessment and visualization of uncertainty for countrywide soil organic matter map of Hungary using local entropy

    Science.gov (United States)

    Szatmári, Gábor; Pásztor, László

    2016-04-01

    Uncertainty is a general term expressing our imperfect knowledge in describing an environmental process, of which we are aware (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be explicitly targeted by uncertainty assessment because their inferences are further used in modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environment-related, spatially exhaustive secondary information (i.e. digital elevation model, climatic maps, MODIS satellite images and a geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is sufficient to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied e.g. to contour the probability of any event; such maps can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well. A
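
    A sketch of how a stack of equally probable realizations yields such goal-oriented measures, e.g. the per-location probability of exceeding a threshold and a binned local entropy (Python/NumPy; the realizations and the threshold are synthetic):

        import numpy as np

        def local_uncertainty(reals, threshold, bins=10):
            """reals: (n_real, n_locations) equally probable realizations.
            Returns per-location exceedance probability and local entropy."""
            p_exceed = (reals > threshold).mean(axis=0)
            lo, hi = reals.min(), reals.max()
            H = np.empty(reals.shape[1])
            for j in range(reals.shape[1]):
                p, _ = np.histogram(reals[:, j], bins=bins, range=(lo, hi))
                p = p / p.sum()
                p = p[p > 0]
                H[j] = -(p * np.log(p)).sum()
            return p_exceed, H

        rng = np.random.default_rng(9)
        reals = rng.normal(3.0, 1.0, size=(500, 4))   # e.g. 500 simulated SOM maps
        print(local_uncertainty(reals, threshold=4.0))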

  4. Quantification of uncertainties in global grazing systems assessment

    Science.gov (United States)

    Fetzel, T.; Havlik, P.; Herrero, M.; Kaplan, J. O.; Kastner, T.; Kroisleitner, C.; Rolinski, S.; Searchinger, T.; Van Bodegom, P. M.; Wirsenius, S.; Erb, K.-H.

    2017-07-01

    Livestock systems play a key role in global sustainability challenges like food security and climate change, yet many unknowns and large uncertainties prevail. We present a systematic, spatially explicit assessment of uncertainties related to grazing intensity (GI), a key metric for assessing ecological impacts of grazing, by combining existing data sets on (a) grazing feed intake, (b) the spatial distribution of livestock, (c) the extent of grazing land, and (d) its net primary productivity (NPP). An analysis of the resulting 96 maps implies that on average 15% of the grazing land NPP is consumed by livestock. GI is low in most of the world's grazing lands, but hotspots of very high GI prevail in 1% of the total grazing area. The agreement between GI maps is good on one fifth of the world's grazing area, while on the remainder, it is low to very low. Largest uncertainties are found in global drylands and where grazing land bears trees (e.g., the Amazon basin or the Taiga belt). In some regions like India or Western Europe, massive uncertainties even result in GI > 100% estimates. Our sensitivity analysis indicates that the input data for NPP, animal distribution, and grazing area contribute about equally to the total variability in GI maps, while grazing feed intake is a less critical variable. We argue that a general improvement in quality of the available global level data sets is a precondition for improving the understanding of the role of livestock systems in the context of global environmental change or food security.

  5. Imprecision and Uncertainty in the UFO Database Model.

    Science.gov (United States)

    Van Gyseghem, Nancy; De Caluwe, Rita

    1998-01-01

    Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects, and thus…

  8. Estimating the magnitude of prediction uncertainties for the APLE model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  9. Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study

    Science.gov (United States)

    O'Neill, B. C.

    2015-12-01

    Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics

  10. Application of the emission inventory model TEAM: Uncertainties in dioxin emission estimates for central Europe

    NARCIS (Netherlands)

    Pulles, M.P.J.; Kok, H.; Quass, U.

    2006-01-01

    This study uses an improved emission inventory model to assess the uncertainties in emissions of dioxins and furans associated with both knowledge on the exact technologies and processes used, and with the uncertainties of both activity data and emission factors. The annual total emissions for the y

  11. Numerical Modelling of Structures with Uncertainties

    Directory of Open Access Journals (Sweden)

    Kahsin Maciej

    2017-04-01

    Full Text Available The nature of environmental interactions, as well as the large dimensions and complex structure of marine offshore objects, make designing, building and operating these objects a great challenge. This is the reason why a vast majority of investment cases of this type include structural analysis, performed using scaled laboratory models and complemented by extended computer simulations. The present paper focuses on FEM modelling of an offshore wind turbine supporting structure. The problem is studied using modal analysis and sensitivity analysis, as well as the design of experiments (DOE) and response surface model (RSM) methods. The results of modal-analysis-based simulations were used for assessing the quality of the FEM model against the data measured during the experimental modal analysis of the scaled laboratory model for different support conditions. The sensitivity analysis, in turn, provided opportunities for assessing the effect of individual FEM model parameters on the dynamic response of the examined supporting structure. The DOE and RSM methods made it possible to determine the effect of model parameter changes on the supporting structure response.

  12. Comparing the effects of climate and impact model uncertainty on climate impacts estimates for grain maize

    Science.gov (United States)

    Holzkämper, Annelie; Honti, Mark; Fuhrer, Jürg

    2015-04-01

    Crop models are commonly applied to estimate impacts of projected climate change and to anticipate suitable adaptation measures. Thereby, uncertainties from global climate models, regional climate models, and impact models cascade down to impact estimates. It is essential to quantify and understand uncertainties in impact assessments in order to provide informed guidance for decision making in adaptation planning. A question that has hardly been investigated in this context is how sensitive climate impact estimates are to the choice of the impact model approach. In a case study for Switzerland we compare results of three different crop modelling approaches to assess the relevance of impact model choice relative to other uncertainty sources. The three approaches include an expert-based, a statistical and a process-based model. With each approach, impact model parameter uncertainty and climate model uncertainty (originating from the climate model chain and the downscaling approach) are accounted for. ANOVA-based uncertainty partitioning is performed to quantify the relative importance of the different uncertainty sources. Results suggest that the uncertainty in estimated yield changes originating from the choice of the crop modelling approach can be greater than the uncertainty from climate model chains. The uncertainty originating from crop model parameterization is small in comparison. While estimates of yield changes are highly uncertain, the directions of estimated changes in climatic limitations are largely consistent. This leads us to the conclusion that by focusing on estimated changes in climate limitations, more meaningful information can be provided to support decision making in adaptation planning - especially in cases where yield changes are highly uncertain.
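
    The ANOVA-based partitioning amounts to decomposing the total variance of the estimates over a factorial design into main effects and an interaction term; a minimal two-factor sketch (Python/NumPy; the synthetic "yield changes" are illustrative, not the study's results):

        import numpy as np

        def anova_partition(y):
            """Variance shares for a full two-factor design, where
            y[i, j] is the estimate for climate chain i and impact model j."""
            grand = y.mean()
            ss_tot = ((y - grand) ** 2).sum()
            ss_a = y.shape[1] * ((y.mean(axis=1) - grand) ** 2).sum()  # climate
            ss_b = y.shape[0] * ((y.mean(axis=0) - grand) ** 2).sum()  # model
            ss_int = ss_tot - ss_a - ss_b                              # interaction
            return {"climate": ss_a / ss_tot, "impact model": ss_b / ss_tot,
                    "interaction": ss_int / ss_tot}

        # synthetic yield changes (%) for 5 climate chains x 3 impact models
        rng = np.random.default_rng(2)
        y = (rng.normal(0, 2, size=(5, 1))      # climate effect
             + rng.normal(0, 4, size=(1, 3))    # larger impact-model effect
             + rng.normal(0, 1, size=(5, 3)))   # interaction/noise
        print(anova_partition(y))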

  13. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  14. Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City

    Science.gov (United States)

    Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo

    2014-05-01

    Socio-economic changes as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in flood risk assessment in particular. The level of future uncertainty that researchers face when dealing with problems in a future perspective with a focus on climate change is known as deep uncertainty (also known as Knightian uncertainty), since nobody has experienced and undergone those changes before and our knowledge is limited to the extent that we have no notion of probabilities; consolidated risk management approaches therefore have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to decision making cannot agree on: (i) the appropriate models describing the interaction among system variables, (ii) the probability distributions to represent uncertainty about key parameters in the model, and (iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single and optimal solution to the problem at hand, such as crisp estimates of the costs of damages of the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to substitute optimality as a decision criterion with robustness. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes out of all future unknown scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that usually can be derived from observed historical data, and we therefore turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, with each scenario being a full description of what may happen

  15. Joint analysis of epistemic and aleatory uncertainty in stability analysis for geo-hazard assessments

    Science.gov (United States)

    Rohmer, Jeremy; Verdel, Thierry

    2017-04-01

    Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF: if SF is below some specified threshold, failure is possible. The objective of the stability analysis is then to estimate the probability P that SF falls below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatory uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatory uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g. increasing the number of tests (lab or in situ surveys), improving the measurement methods or evaluating the calculation procedure with model tests, and confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework for representing both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations of a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e

  16. Impact Assessment of Uncertainty Propagation of Ensemble NWP Rainfall to Flood Forecasting with Catchment Scale

    Directory of Open Access Journals (Sweden)

    Wansik Yu

    2016-01-01

    Full Text Available The common approach to quantifying precipitation forecast uncertainty is ensemble simulation, where a numerical weather prediction (NWP) model is run for a number of cases with slightly different initial conditions. In practice, the spread of ensemble members in terms of flood discharge is used as a measure of forecast uncertainty due to uncertain precipitation forecasts. This study presents the propagation of rainfall forecast uncertainty into the hydrological response at catchment scale through distributed rainfall-runoff modeling based on the forecasted ensemble rainfall of an NWP model. First, the forecast rainfall error based on the BIAS is compared with the flood forecast error to assess the error propagation. Second, the variability of flood forecast uncertainty with catchment scale is discussed using the ensemble spread. We then assess the flood forecast uncertainty across catchment scales using a regression equation estimated between ensemble rainfall BIAS and discharge BIAS. Finally, the flood forecast uncertainty in terms of the RMSE of specific discharge at catchment scale is discussed. Our study is carried out and verified using the largest flood event, caused by typhoon "Talas" in 2011, over the 33 subcatchments of the Shingu river basin (2,360 km2), which is located in the Kii Peninsula, Japan.

  17. Uncertainty in runoff based on Global Climate Model precipitation and temperature data – Part 2: Estimation and uncertainty of annual runoff and reservoir yield

    Directory of Open Access Journals (Sweden)

    M. C. Peel

    2014-05-01

    Full Text Available Two key sources of uncertainty in projections of future runoff for climate change impact assessments are uncertainty between Global Climate Models (GCMs) and within a GCM. Within-GCM uncertainty is the variability in GCM output that occurs when running a scenario multiple times but each run has slightly different, but equally plausible, initial conditions. The limited number of runs available for each GCM and scenario combination within the Coupled Model Intercomparison Project phase 3 (CMIP3) and phase 5 (CMIP5) datasets limits the assessment of within-GCM uncertainty. In this second of two companion papers, the primary aim is to approximate within-GCM uncertainty of monthly precipitation and temperature projections and assess its impact on modelled runoff for climate change impact assessments. A secondary aim is to assess the impact of between-GCM uncertainty on modelled runoff. Here we approximate within-GCM uncertainty by developing non-stationary stochastic replicates of GCM monthly precipitation and temperature data. These replicates are input to an off-line hydrologic model to assess the impact of within-GCM uncertainty on projected annual runoff and reservoir yield. To date, within-GCM uncertainty has received little attention in the hydrologic climate change impact literature, and this analysis provides an approximation of the uncertainty in projected runoff, and reservoir yield, due to within- and between-GCM uncertainty of precipitation and temperature projections. In the companion paper, McMahon et al. (2014) sought to reduce between-GCM uncertainty by removing poorly performing GCMs, resulting in a selection of five better performing GCMs from CMIP3 for use in this paper. Here we present within- and between-GCM uncertainty results in mean annual precipitation (MAP), temperature (MAT) and runoff (MAR), the standard deviation of annual precipitation (SDP) and runoff (SDR), and reservoir yield for five CMIP3 GCMs at 17 world-wide catchments

  18. Assessing damping uncertainty in space structures with fuzzy sets

    Science.gov (United States)

    Ross, Timothy J.; Hasselman, Timothy K.

    1991-01-01

    NASA has been interested in the development of methods for evaluating the predictive accuracy of structural dynamic models. This interest stems from the use of mathematical models in evaluating the structural integrity of all spacecraft prior to flight. Space structures are often too large and too weak to be tested fully assembled in a ground test lab. The predictive accuracy of a model depends on the nature and extent of its experimental verification. The further the test conditions depart from in-service conditions, the less accurate the model will be. Structural damping is known to be one source of uncertainty in models. The uncertainty in damping is explored in order to evaluate the accuracy of dynamic models. A simple mass-spring-dashpot system is used to illustrate a comparison among three methods for propagating uncertainty in structural dynamics models: the First Order Method, the Numerical Simulation Method, and the Fuzzy Set Method. The Fuzzy Set Method is shown to bound the range of possible responses and thus to provide a valuable limiting check on the First Order Method near resonant conditions. Fuzzy methods are a relatively inexpensive alternative to numerical simulation.
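
    For the mass-spring-dashpot example, fuzzy-set propagation can be sketched with alpha-cuts and the vertex method. The numbers below are invented, and the vertex method is exact here only because the steady-state amplitude is monotone in the damping; none of this is taken from the paper itself.

```python
import numpy as np

def amplitude(c, m=1.0, k=100.0, w=10.0):
    """Steady-state amplitude of m*x'' + c*x' + k*x = cos(w*t)."""
    return 1.0 / np.sqrt((k - m * w**2) ** 2 + (c * w) ** 2)

# Triangular fuzzy damping: support [0.5, 2.0], most plausible value 1.0.
lo, peak, hi = 0.5, 1.0, 2.0
for alpha in (0.0, 0.5, 1.0):
    c_lo = lo + alpha * (peak - lo)        # alpha-cut interval of the fuzzy number
    c_hi = hi - alpha * (hi - peak)
    # Amplitude decreases monotonically with c, so interval endpoints suffice.
    a_hi, a_lo = amplitude(c_lo), amplitude(c_hi)
    print(f"alpha={alpha:.1f}: amplitude in [{a_lo:.4f}, {a_hi:.4f}]")
```

    With w chosen at resonance (k = m*w^2), the bounds reproduce the record's point that fuzzy bounds provide a useful limiting check near resonant conditions.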

  19. Uncertainty treatment and sensitivity analysis of the European Probabilistic Seismic Hazard Assessment

    Science.gov (United States)

    Woessner, J.; Danciu, L.; Giardini, D.

    2013-12-01

    Probabilistic seismic hazard assessment (PSHA) aims to characterize the best available knowledge on the seismic hazard of a study area, ideally taking into account all sources of uncertainty. The EC-FP7-funded project Seismic Hazard Harmonization in Europe (SHARE) generated a time-independent, community-based hazard model for the European region for ground motion parameters spanning spectral ordinates from PGA to 10 s and annual exceedance probabilities from one-in-ten to one-in-ten-thousand years. The results will serve as a reference to define engineering applications within Eurocode 8 and provide homogeneous input for state-of-the-art seismic safety assessment of critical infrastructure. The SHARE model accounts for uncertainties, whether aleatory or epistemic, via a logic tree. Epistemic uncertainties within the seismic source model are represented by three source models: a traditional area-source model, a model that characterizes fault sources, and an approach that uses kernel smoothing for seismicity and fault-source moment release. Activity rates and maximum magnitudes in the source models are treated as aleatory uncertainties. For practical implementation and computational purposes, some of the epistemic uncertainties in the source model (i.e. dip and strike angles) are treated as aleatory, and a mean seismicity model is considered. Epistemic uncertainties in ground motion are considered via multiple Ground Motion Prediction Equations as a function of tectonic setting and are treated as correlated. The final results contain the full distribution of ground motion variability. We show how we used the logic-tree approach to consider the alternative models and how, based on the degree of belief in the models, we defined the weights of the individual branches. This contribution features results and sensitivity analysis of the entire European hazard model and selected sites.
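
    A toy numerical illustration of the logic-tree mechanics follows; the branch hazard curves and weights are invented, and a real PSHA tree has far more branches, but the weighted-mean and fractile operations are the same.

```python
import numpy as np

# Hypothetical hazard curves: annual probability of exceedance at common PGA
# levels for three alternative source-model branches of a logic tree.
pga = np.array([0.05, 0.10, 0.20, 0.40])      # ground-motion levels [g]
curves = np.array([
    [2.0e-2, 8.0e-3, 2.0e-3, 3.0e-4],         # branch 1: area-source model
    [3.0e-2, 1.0e-2, 3.0e-3, 5.0e-4],         # branch 2: fault-source model
    [2.5e-2, 9.0e-3, 2.5e-3, 4.0e-4],         # branch 3: kernel-smoothed model
])
weights = np.array([0.5, 0.3, 0.2])           # degree-of-belief weights (sum to 1)

mean_curve = weights @ curves                 # weighted-mean hazard curve

def weighted_quantile(values, q, w):
    """Quantile of the discrete weighted distribution of branch values."""
    idx = np.argsort(values)
    cdf = np.cumsum(w[idx])
    return values[idx][np.searchsorted(cdf, q)]

p84 = [weighted_quantile(curves[:, j], 0.84, weights) for j in range(len(pga))]
for g, m, hi in zip(pga, mean_curve, p84):
    print(f"PGA {g:.2f} g: mean AEP {m:.2e}, 84th fractile {hi:.2e}")
```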

  20. Uncertainty modelling of atmospheric dispersion by stochastic response surface method under aleatory and epistemic uncertainties

    Indian Academy of Sciences (India)

    Rituparna Chutia; Supahi Mahanta; D Datta

    2014-04-01

    The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often than not, the available information is interpreted in a probabilistic sense. Probability theory is a well-established theory for measuring such variability. However, not all available information, data or model parameters affected by variability, imprecision and uncertainty can be handled by traditional probability theory. Uncertainty or imprecision may occur due to incomplete information or data, measurement error, or data obtained from expert judgement or subjective interpretation of available data or information. Thus model parameter data may be affected by subjective uncertainty. Traditional probability theory is inappropriate for representing subjective uncertainty. Possibility theory is used as a tool to describe parameters with insufficient knowledge. Based on the polynomial chaos expansion, the stochastic response surface method is utilized in this article for the uncertainty propagation of an atmospheric dispersion model under both probabilistic and possibilistic information. The proposed method is demonstrated through a hypothetical case study of atmospheric dispersion.
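
    The core of the stochastic response surface method is a polynomial chaos expansion fitted to a limited number of model runs. The sketch below uses an invented one-parameter plume response and an assumed lognormal wind speed; it fits a Hermite chaos by least squares and reads the output mean and variance directly off the coefficients.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander
from math import factorial

rng = np.random.default_rng(1)

def centerline_conc(u, Q=1.0, sy=20.0, sz=10.0):
    """Toy ground-level centerline concentration of a Gaussian plume."""
    return Q / (np.pi * u * sy * sz)

# Uncertain wind speed: u = exp(mu + sigma*xi), xi ~ N(0,1) (assumed lognormal).
mu, sigma, deg, n = np.log(3.0), 0.25, 4, 200
xi = rng.standard_normal(n)
y = centerline_conc(np.exp(mu + sigma * xi))

# Stochastic response surface: least-squares fit of a degree-4 Hermite chaos.
Psi = hermevander(xi, deg)                       # probabilists' Hermite basis
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Orthogonality of He_k w.r.t. N(0,1) gives the output moments analytically.
mean = coef[0]
var = sum(coef[k] ** 2 * factorial(k) for k in range(1, deg + 1))
print(f"PCE mean={mean:.3e}, std={np.sqrt(var):.3e}")
print(f"MC check: mean={y.mean():.3e}, std={y.std():.3e}")
```

    Possibilistic inputs would be handled analogously, with alpha-cut intervals propagated through the fitted response surface instead of Gaussian germs.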

  1. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  2. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  4. Environmental impact and risk assessments and key factors contributing to the overall uncertainties.

    Science.gov (United States)

    Salbu, Brit

    2016-01-01

    There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or before and after the implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contributes to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Combined with problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, these factors can produce large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines the activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of the released radionuclides, such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple-stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms...

  5. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    Science.gov (United States)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

    A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction-rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1-sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2-sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
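
    The multiplicative "factor" statistics quoted above are natural for lognormally distributed rate constants. Below is a stand-in sketch with three invented rate constants and a toy power-law response in place of the photochemical model.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1-sigma uncertainty factors for three rate constants, and a toy
# response that scales like k1*k2/k3 (a stand-in for the stratospheric model).
factors = np.array([1.3, 1.5, 1.2])             # multiplicative 1-sigma factors
sigmas = np.log(factors)                        # lognormal sigmas in log space
k = np.exp(sigmas * rng.standard_normal((10_000, 3)))   # normalized rate constants

response = k[:, 0] * k[:, 1] / k[:, 2]          # e.g. an ozone perturbation

med = np.median(response)
for p, label in (((15.87, 84.13), "1-sigma"), ((2.28, 97.72), "2-sigma")):
    lo, hi = np.percentile(response, p)
    print(f"{label}: factor {hi / med:.2f} high, {med / lo:.2f} low")
```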

  6. Multi-model ensemble hydrologic prediction and uncertainties analysis

    Directory of Open Access Journals (Sweden)

    S. Jiang

    2014-09-01

    Full Text Available Modelling uncertainties (i.e. input errors, parameter uncertainties and model structural errors) inevitably exist in hydrological prediction. A lot of recent attention has focused on these, of which input-error modelling, parameter optimization and multi-model ensemble strategies are the three most popular approaches to demonstrating the impacts of modelling uncertainties. In this paper the Xinanjiang model, the Hybrid rainfall–runoff model and the HYMOD model were applied to the Mishui Basin, south China, for daily streamflow ensemble simulation and uncertainty analysis. The three models were first calibrated by two parameter optimization algorithms, namely the Shuffled Complex Evolution method (SCE-UA) and the Shuffled Complex Evolution Metropolis method (SCEM-UA); next, the input uncertainty was accounted for by introducing a normally distributed error multiplier; then, the simulation sets calculated from the three models were combined by Bayesian model averaging (BMA). The results show that both parameter optimization algorithms generate good streamflow simulations; in addition, SCEM-UA can characterize parameter uncertainty and give the posterior distribution of the parameters. When the precipitation input uncertainty is considered, the streamflow simulation precision does not improve very much. The BMA combination, however, not only improves the streamflow prediction precision but also gives quantitative uncertainty bounds for the simulation sets. The SCEM-UA-based prediction interval is better than the SCE-UA-based one. These results suggest that considering the uncertainties of model parameters and performing multi-model ensemble simulations are very practical for streamflow prediction and flood forecasting, yielding more precise predictions and more reliable uncertainty bounds.
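
    The BMA combination step can be sketched with the standard EM algorithm for Gaussian predictive kernels (after Raftery et al., 2005). The data are synthetic and the bias-correction stage of the full method is omitted; this illustrates the mechanics, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical calibration data: observed flows and three model simulations
# of differing skill (error standard deviations 15, 25 and 40 m3/s).
n = 300
q_obs = rng.gamma(2.0, 50.0, n)
sims = np.stack([q_obs + rng.normal(0, s, n) for s in (15.0, 25.0, 40.0)], axis=1)

K = sims.shape[1]
w = np.full(K, 1.0 / K)                             # initial BMA weights
var = np.full(K, q_obs.var())                       # initial kernel variances
for _ in range(200):
    dens = w * np.exp(-0.5 * (q_obs[:, None] - sims) ** 2 / var) \
           / np.sqrt(2 * np.pi * var)
    z = dens / dens.sum(axis=1, keepdims=True)      # E-step: memberships
    w = z.mean(axis=0)                              # M-step: weights
    var = (z * (q_obs[:, None] - sims) ** 2).sum(axis=0) / z.sum(axis=0)

print("BMA weights:", w.round(3))                   # best model gets most weight
mean_fc = sims @ w                                  # BMA mean prediction
print("BMA-mean RMSE:", round(float(np.sqrt(((mean_fc - q_obs) ** 2).mean())), 1))
```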

  7. Assessing measurement uncertainty in meteorology in urban environments

    Science.gov (United States)

    Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.

    2017-10-01

    Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects, which are more difficult to deal with than at synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis over an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is described and preliminary results of its application to air temperature are discussed; these allowed an upper limit of 1 °C to be set for the added measurement uncertainty at the top of the urban canopy layer.

  8. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in...
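
    The propagation engine the record describes (sample uncertain inputs, run the model on each realization, summarize the outputs) is easy to illustrate in a few lines. The sketch below is generic Python with an invented two-input model; it deliberately does not reproduce the 'spup' R API.

```python
import numpy as np
from scipy.stats import qmc, norm

def soil_loss(rain_erosivity, erodibility):
    """Toy environmental model standing in for the user's spatial model."""
    return rain_erosivity * erodibility

# Latin hypercube sampling of two uncertain, normally distributed inputs.
sampler = qmc.LatinHypercube(d=2, seed=4)
u = sampler.random(n=500)                          # uniform samples in [0, 1)^2
rain = norm(loc=120.0, scale=15.0).ppf(u[:, 0])    # assumed input distribution
erod = norm(loc=0.3, scale=0.05).ppf(u[:, 1])      # assumed input distribution

out = soil_loss(rain, erod)                        # Monte Carlo realizations
print(f"mean={out.mean():.1f}, std={out.std():.1f}, "
      f"90% interval=({np.percentile(out, 5):.1f}, {np.percentile(out, 95):.1f})")
```

    Spatially cross-correlated inputs would replace the independent draws above with samples from a joint (e.g. geostatistical) model, which is the part 'spup' automates.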

  9. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as input to environmental models called from R, or externally. Selected static and interactive visualization methods that are understandable by non-experts with limited background in...

  10. A Random Matrix Approach for Quantifying Model-Form Uncertainties in Turbulence Modeling

    CERN Document Server

    Xiao, Heng; Ghanem, Roger G

    2016-01-01

    With the ever-increasing use of Reynolds-Averaged Navier-Stokes (RANS) simulations in mission-critical applications, the quantification of model-form uncertainty in RANS models has attracted attention in the turbulence modeling community. Recently, a physics-based, nonparametric approach for quantifying model-form uncertainty in RANS simulations has been proposed, where Reynolds stresses are projected to physically meaningful dimensions and perturbations are introduced only in the physically realizable limits. However, a challenge associated with this approach is to assess the amount of information introduced in the prior distribution and to avoid imposing unwarranted constraints. In this work we propose a random matrix approach for quantifying model-form uncertainties in RANS simulations with the realizability of the Reynolds stress guaranteed. Furthermore, the maximum entropy principle is used to identify the probability distribution that satisfies the constraints from available information but without int...

  11. Uncertainty models applied to the substation planning

    Energy Technology Data Exchange (ETDEWEB)

    Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)

    1994-12-31

    The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as endogenous and exogenous. The first is associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainty is associated with external aspects, such as the financial resources, the time needed to build the installations, equipment prices and the load level. The load uncertainty is extremely sensitive to the behaviour of economic conditions. Although uncertainty cannot be removed completely, the endogenous part can be conveniently treated and the exogenous part can be compensated for. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with load growth is considered by using technical analysis of scenarios and choice criteria based on decision theory. In this paper the Savage method and the fuzzy set method were used in order to select the best medium-term reinforcement plan. (author) 7 refs., 4 figs., 6 tabs.
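
    The Savage (minimax regret) criterion mentioned at the end is compact enough to show directly; the cost matrix below is invented, with rows as candidate reinforcement plans and columns as load-growth scenarios.

```python
import numpy as np

# Hypothetical present-value costs (rows: plans, columns: load scenarios).
cost = np.array([
    [10.0, 14.0, 22.0],    # plan A: minimal reinforcement
    [13.0, 13.0, 16.0],    # plan B: staged reinforcement
    [18.0, 18.0, 18.0],    # plan C: full reinforcement now
])

# Savage criterion: regret = excess cost over the best plan in each scenario;
# choose the plan whose worst-case regret is smallest.
regret = cost - cost.min(axis=0)
worst = regret.max(axis=1)
print("regret matrix:\n", regret)
print("chosen plan:", "ABC"[int(np.argmin(worst))])   # -> plan B here
```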

  12. Estimated Frequency Domain Model Uncertainties used in Robust Controller Design

    DEFF Research Database (Denmark)

    Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob;

    1994-01-01

    This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are...

  13. Understanding uncertainties in model-based predictions of Aedes aegypti population dynamics.

    Directory of Open Access Journals (Sweden)

    Chonggang Xu

    2010-09-01

    Full Text Available Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model. This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: 1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and 2) uncertainty due to environmental and demographic stochasticity. Our results show that for pupal density and for female adult density at the community level, respectively, the 95% prediction confidence interval ranges from 1,000 to 3,000 and from 700 to 5,000 individuals. The two parameters contributing most to the uncertainties in predicted population densities at both individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e. metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (larger than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level. This is the first systematic uncertainty analysis of a detailed Ae. aegypti population dynamics model and provides an approach for...

  14. Understanding uncertainties in model-based predictions of Aedes aegypti population dynamics.

    Science.gov (United States)

    Xu, Chonggang; Legros, Mathieu; Gould, Fred; Lloyd, Alun L

    2010-09-28

    Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model. This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: 1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and 2) uncertainty due to environmental and demographic stochasticity. Our results show that for pupal density and for female adult density at the community level, respectively, the 95% prediction confidence interval ranges from 1,000 to 3,000 and from 700 to 5,000 individuals. The two parameters contributing most to the uncertainties in predicted population densities at both individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e. metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (larger than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level. This is the first systematic uncertainty analysis of a detailed Ae. aegypti population dynamics model and provides an approach for identifying those...

  15. Committee of machine learning predictors of hydrological models uncertainty

    Science.gov (United States)

    Kayastha, Nagendra; Solomatine, Dimitri

    2014-05-01

    In prediction of uncertainty based on machine learning methods, the results of various sampling schemes, namely Monte Carlo sampling (MCS), generalized likelihood uncertainty estimation (GLUE), Markov chain Monte Carlo (MCMC), the shuffled complex evolution metropolis algorithm (SCEMUA), differential evolution adaptive metropolis (DREAM), particle swarm optimization (PSO) and adaptive cluster covering (ACCO) [1], are used to build predictive models. These models predict the uncertainty (quantiles of the pdf) of a deterministic output from a hydrological model [2]. Inputs to these models are specially identified representative variables (past precipitation events and flows). The trained machine learning models are then employed to predict the model output uncertainty specific to the new input data. For each sampling scheme, three machine learning methods, namely artificial neural networks, model trees and locally weighted regression, are applied to predict output uncertainties. The problem here is that different sampling algorithms result in different data sets used to train different machine learning models, which leads to several models (21 predictive uncertainty models in total). There is no clear evidence which model is the best, since there is no basis for comparison. A solution could be to form a committee of all models and to use a dynamic averaging scheme to generate the final output [3]. This approach is applied to estimate the uncertainty of streamflow simulations from the conceptual hydrological model HBV in the Nzoia catchment in Kenya. [1] N. Kayastha, D. L. Shrestha and D. P. Solomatine. Experiments with several methods of parameter uncertainty estimation in hydrological modeling. Proc. 9th Intern. Conf. on Hydroinformatics, Tianjin, China, September 2010. [2] D. L. Shrestha, N. Kayastha, D. P. Solomatine and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method. Journal of Hydroinformatics, in press.
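
    The MLUE idea (train a machine learning model to map input conditions to uncertainty quantiles of the hydrological model output) can be sketched with quantile gradient boosting on synthetic heteroscedastic residuals. The cited papers use neural networks, model trees and locally weighted regression; gradient boosting here is a stand-in, and all data are invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(5)

# Hypothetical training data: representative input variables (e.g. recent
# precipitation and flow) and the model residual whose quantiles we predict.
X = rng.uniform(0, 1, (1000, 2))
resid = (X[:, 0] + 0.5) * rng.normal(0, 1, 1000)    # heteroscedastic residuals

models = {}
for q in (0.05, 0.95):
    m = GradientBoostingRegressor(loss="quantile", alpha=q, n_estimators=200)
    models[q] = m.fit(X, resid)

x_new = np.array([[0.8, 0.3]])                      # new input conditions
lo, hi = models[0.05].predict(x_new), models[0.95].predict(x_new)
print(f"predicted 90% residual interval: [{lo[0]:.2f}, {hi[0]:.2f}]")
```

    A committee output, as proposed above, could then be a dynamically weighted average of the interval bounds predicted by the 21 individual models.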

  16. Model development and data uncertainty integration

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross-section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity; both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated; the most significant parameters are the basic emission rates of the spontaneous fission and (α,n) processes, and the uncertainties and important data depend on the analysis technique chosen.
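
    The closing point, that the effect of a nuclear-data item is the product of its initial uncertainty and the model sensitivity to it, is first-order error propagation; a sketch with invented numbers:

```python
import numpy as np

# First-order propagation: each parameter contributes (sensitivity x input
# uncertainty); contributions add in quadrature if parameters are uncorrelated.
names = ["SF emission rate", "(alpha,n) intensity", "P(nu) moments"]
sens = np.array([1.00, 0.60, 0.25])        # relative sensitivities (assumed)
unc = np.array([0.02, 0.05, 0.03])         # relative 1-sigma input uncertainties

contrib = sens * unc
total = np.sqrt((contrib ** 2).sum())
for name, c in zip(names, contrib):
    print(f"{name}: {c:.3f}")
print(f"total relative uncertainty: {total:.3f}")
```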

  17. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
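
    The propagation described is the first-order rule S_xyz = J S_uvd J^T applied to the disparity-to-Cartesian mapping. Below is a sketch under a pinhole-with-baseline model, with assumed intrinsics and measurement noise (the paper estimates the disparity-space covariance from collected feature data instead).

```python
import numpy as np

# Assumed mapping: Z = f*b/d, X = (u - cx)*Z/f, Y = (v - cy)*Z/f.
f, b, cx, cy = 580.0, 0.075, 320.0, 240.0     # assumed intrinsics [px, m, px, px]
u, v, d = 400.0, 260.0, 30.0                  # one measured feature [px]

Z = f * b / d
X, Y = (u - cx) * Z / f, (v - cy) * Z / f

J = np.array([                                # Jacobian of (X, Y, Z) w.r.t. (u, v, d)
    [Z / f, 0.0,   -X / d],
    [0.0,   Z / f, -Y / d],
    [0.0,   0.0,   -Z / d],
])
S_uvd = np.diag([0.5**2, 0.5**2, 0.2**2])     # assumed disparity-space covariance
S_xyz = J @ S_uvd @ J.T                       # propagated 3-D covariance
print("3-D covariance [m^2]:\n", S_xyz.round(8))
print("depth std [mm]:", round(1e3 * float(np.sqrt(S_xyz[2, 2])), 2))
```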

  18. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.

  19. Convex Set Theory for Reliability Assessment of Steel Beam with Bounded Uncertainty

    Institute of Scientific and Technical Information of China (English)

    Li-Zhe Jia; Yi-Ming Duan

    2014-01-01

    A probabilistic reliability model established from insufficient data is not credible. The convex model was therefore applied to describe the uncertainties of the variables, and a new non-probabilistic reliability model was proposed based on the robustness of the system to uncertainty. The non-probabilistic reliability model, the infinite-norm model and the probabilistic model were each used to assess the reliability of a steel beam. The results show that the resistance is allowed to couple with the action effect in the non-probabilistic reliability model. Additionally, the non-probabilistic reliability model becomes as accurate as the probabilistic model as the bounded uncertain information increases. The choice of model is decided by the available data and information.
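
    A minimal numeric sketch of the non-probabilistic idea, with resistance and action effect known only as intervals (bounds invented; the index definition below is the common convex-model form, margin centre over margin radius, which may differ in detail from the paper's model):

```python
import numpy as np

R = np.array([240.0, 300.0])     # assumed bounds on beam resistance [kN*m]
S = np.array([150.0, 210.0])     # assumed bounds on load effect [kN*m]

# Margin M = R - S; eta > 1 means M > 0 for every realization within the bounds.
M_lo, M_hi = R[0] - S[1], R[1] - S[0]
eta = (M_hi + M_lo) / (M_hi - M_lo)           # = margin centre / margin radius
print(f"margin in [{M_lo:.0f}, {M_hi:.0f}] kN*m, eta = {eta:.2f}")
```

    As the record notes, tightening the bounds (more information) shrinks the radius and drives the interval result toward the probabilistic one.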

  20. Confronting Uncertainty in Life Cycle Assessment Used for Decision Support

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg; Hauschild, Michael Zwicky; Sohn, Michael D.

    2014-01-01

    The aim of this article is to help confront uncertainty in life cycle assessments (LCAs) used for decision support. LCAs offer a quantitative approach to assess environmental effects of products, technologies, and services and are conducted by an LCA practitioner or analyst (AN) to support... be described as a variance simulation based on individual data points used in an LCA. This article develops and proposes a taxonomy for LCAs based on extensive research in the LCA, management, and economic literature. This taxonomy can be used ex ante to support planning and communication between an AN and a decision maker (DM)...

  1. Using CV-GLUE procedure in analysis of wetland model predictive uncertainty.

    Science.gov (United States)

    Huang, Chun-Wei; Lin, Yu-Pin; Chiang, Li-Chi; Wang, Yung-Chieh

    2014-07-01

    This study develops a procedure related to Generalized Likelihood Uncertainty Estimation (GLUE), called the CV-GLUE procedure, for assessing the predictive uncertainty associated with different model structures of varying degrees of complexity. The proposed procedure comprises model calibration, validation, and predictive uncertainty estimation in terms of a characteristic coefficient of variation (characteristic CV). The procedure first performs two-stage Monte Carlo simulations to ensure predictive accuracy by obtaining behavior parameter sets, and then estimates the CV values of the model outcomes, which represent the predictive uncertainties for a model structure of interest with its associated behavior parameter sets. Three commonly used wetland models (the first-order K-C model, the plug-flow-with-dispersion model, and the Wetland Water Quality Model, WWQM) were compared based on data collected from a free-water-surface constructed wetland with paddy cultivation in Taipei, Taiwan. The results show that the first-order K-C model, which is simpler than the other two models, has greater predictive uncertainty. This finding shows that predictive uncertainty does not necessarily increase with the complexity of the model structure, because in this case the more simplistic representation (the first-order K-C model) of reality results in a higher uncertainty in the model's predictions. The CV-GLUE procedure is suggested to be a useful tool not only for designing constructed wetlands but also for other aspects of environmental management.
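
    A compressed sketch of the GLUE-with-CV mechanics: sample one parameter, keep behavioral sets by an informal likelihood threshold, and report the CV of their predictions. The model, threshold and data are invented, and the real procedure is two-stage and multi-parameter.

```python
import numpy as np

rng = np.random.default_rng(6)

def kc_model(k, c_in=10.0, t=3.0, c_star=0.5):
    """First-order k-C* wetland decay model: outlet concentration [mg/L]."""
    return c_star + (c_in - c_star) * np.exp(-k * t)

obs = 3.2                                       # observed outlet concentration

# GLUE: sample the parameter, keep 'behavioral' sets by a likelihood threshold.
k = rng.uniform(0.1, 2.0, 5000)
pred = kc_model(k)
likelihood = np.exp(-0.5 * ((pred - obs) / 0.5) ** 2)   # informal likelihood
behavioral = pred[likelihood > 0.5]

# Characteristic CV of behavioral predictions as the predictive-uncertainty measure.
cv = behavioral.std() / behavioral.mean()
print(f"{behavioral.size} behavioral sets, CV = {cv:.3f}")
```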

  2. Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling

    Science.gov (United States)

    Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.

    2016-12-01

    The North-East Corridor project aims to use a top-down inversion method to quantify sources of greenhouse gas (GHG) emissions in the urban areas of Washington DC and Baltimore at approximately 1 km² resolution. The aim of this project is to help establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used 4 planetary boundary layer parameterizations (YSU, MYNN2, BOULAC, QNSE), 2 sources of initial and boundary conditions (NARR and HRRR) and 1 configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler and radiosondes for a month (February) in 2016 to quantify the uncertainties and the ensemble spread in wind speed, direction and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.

  3. Comparing Two Strategies to Model Uncertainties in Structural Dynamics

    Directory of Open Access Journals (Sweden)

    Rubens Sampaio

    2010-01-01

    Full Text Available In the modeling of dynamical systems, uncertainties are present and they must be taken into account to improve the predictions of the models. Some strategies have been used to model uncertainties, and the aim of this work is to discuss two of those strategies and to compare them. This will be done using the simplest model possible: a two-d.o.f. (degrees of freedom) dynamical system. A simple system is used because it is very helpful for ensuring a better understanding and, consequently, a comparison of the strategies. The first strategy (called the parametric strategy) consists in taking each spring stiffness as uncertain, with a random variable associated to each one of them. The second strategy (called the nonparametric strategy) is more general: it considers the whole stiffness matrix as uncertain and associates a random matrix to it. In both cases, the probability density functions, either of the random parameters or of the random matrix, are deduced from the Maximum Entropy Principle using only the available information. With this example, some important results can be discussed which cannot be assessed when complex structures are used, as has been done so far in the literature. One important element in the comparison of the two strategies is the analysis of the sample spaces and of how to compare them.

  4. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  5. Quantification and assessment of fault uncertainty and risk using stochastic conditional simulations

    Institute of Scientific and Technical Information of China (English)

    LI Shuxing; Roussos Dimitrakopoulos

    2002-01-01

    The effect of geological uncertainty on the development and mining of underground coal deposits is a key issue for longwall mining, as the presence of faults generates substantial monetary losses. This paper develops a method for the conditional simulation of fault systems and uses the method to quantify and assess fault uncertainty. The method is based on the statistical modelling of fault attributes and the simulation of the locations of the centres of the fault traces. Fault locations are generated from the thinning of a Poisson process using a spatially correlated probability field. The proposed algorithm for simulating fault traces takes into account soft data such as geological interpretations and geomechanical data. The simulations generate realisations of fault populations that reproduce observed faults, honour the statistics of the fault attributes, and respect the constraints of soft data, thereby providing the means to model and assess the related fault uncertainty.
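
    The simulation core, a Poisson process thinned by a spatially varying keep-probability, is compact. Domain size, intensity and the probability field below are invented, and the published method additionally conditions on observed faults and soft geological data.

```python
import numpy as np

rng = np.random.default_rng(7)

# Candidate fault-trace centres from a homogeneous Poisson process.
lx, ly, lam = 10.0, 10.0, 5.0                   # domain [km] and intensity [1/km^2]
n = rng.poisson(lam * lx * ly)
xy = rng.uniform(0.0, [lx, ly], (n, 2))         # candidate centres

def keep_prob(x, y):
    """Assumed probability field: faulting more likely toward the NE corner."""
    return np.clip(0.2 + 0.6 * (x / lx) * (y / ly), 0.0, 1.0)

# Thinning: keep each candidate with the local probability.
kept = xy[rng.uniform(size=n) < keep_prob(xy[:, 0], xy[:, 1])]
print(f"{n} candidates -> {kept.shape[0]} simulated fault centres")

# Fault attributes (e.g. trace length) are then drawn from fitted distributions.
lengths = rng.lognormal(mean=-0.5, sigma=0.6, size=kept.shape[0])   # [km]
```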

  6. Uncertainty propagation in a 3-D thermal code for performance assessment of a nuclear waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Dutfoy, A. [Electricite de France (EDF), Research and Development Div., Safety and Reliability Branch, ESF, 92 - Clamart (France); Ritz, J.B. [Electricite de France (EDF), Research and Development Div., Fluid Mechanics and Heat Transfer, MFTT, 78 - Chatou (France)

    2001-07-01

    Given the very large time scales involved, the performance assessment of a nuclear waste repository requires numerical modelling. Because the exact values of the input parameters are uncertain, we have to analyse the impact of these uncertainties on the outcome of the physical models. The EDF Research and Development Division has developed a reliability method to propagate these uncertainties, or variability, through models, which requires far fewer physical simulations than the usual simulation methods. We apply the reliability method MEFISTO to a base case modelling the heat transfers in a virtual disposal at the future site of the French underground research laboratory in the east of France. This study is conducted in collaboration with ANDRA, the French nuclear waste management agency. With this exercise, we want to evaluate the thermal behaviour of a concept in relation to the variation of physical parameters and their uncertainty. (author)

  7. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support the elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  8. [Application of an uncertainty model for fibromyalgia].

    Science.gov (United States)

    Triviño Martínez, Ángeles; Solano Ruiz, M Carmen; Siles González, José

    2016-04-01

    To explore the experiences of women diagnosed with fibromyalgia, applying the Theory of Uncertainty in Illness proposed by M. Mishel. A qualitative study was conducted using a phenomenological approach, in a patients' association in the province of Alicante between June 2012 and November 2013. A total of 14 women diagnosed with fibromyalgia, aged between 45 and 65 years, participated in the study as volunteers. Information was generated through structured interviews, with recording and transcription after a confidentiality pledge and informed consent. Content analysis was carried out by extracting different categories according to the proposed theory. The patients studied perceive a high level of uncertainty related to the difficulty of dealing with symptoms, uncertainty about the diagnosis, and treatment complexity. Moreover, the ability to cope with the disease is influenced by social support, relationships with health professionals, and the help and information obtained by attending patient associations. Health professionals must provide fibromyalgia sufferers with clear information on the pathology: the greater the patients' knowledge of their disease and the better the quality of the information provided, the less anxiety and uncertainty is reported in the experience of the disease. Likewise, patient associations should involve health professionals in order to avoid bias in the information and to provide advice based on scientific evidence. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  9. Uncertainty assessment via Bayesian revision of ensemble streamflow predictions in the operational river Rhine forecasting system

    NARCIS (Netherlands)

    Reggiani, P.; Renner, M.; Weerts, A.H.; Van Gelder, P.A.H.J.M.

    2009-01-01

    Ensemble streamflow forecasts obtained by using hydrological models with ensemble weather products are becoming more frequent in operational flow forecasting. The uncertainty of the ensemble forecast needs to be assessed for these products to become useful in forecasting operations. A comprehensive...

  10. Reservoir management under geological uncertainty using fast model update

    NARCIS (Netherlands)

    Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.

    2015-01-01

    Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management, taking into account relevant reservoir uncertainty. FMU del...

  11. Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling

    DEFF Research Database (Denmark)

    Dotto, C. B.; Mannina, G.; Kleidorfer, M.

    2012-01-01

    The aim of this paper is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. The paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on a multi-objective auto-calibration (a multialgorithm, genetically adaptive multiobjective method, AMALGAM), and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria have been set for the likelihood formulation, for defining the number of simulations, and for the measure of uncertainty bounds. Moreover, all the uncertainty techniques were implemented for the same case study, in which the same stormwater quantity and quality model was used alongside...

  12. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    Science.gov (United States)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can effectively represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a black-box model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component...

  13. Assessment of uncertainties in risk analysis of chemical establishments. The ASSURANCE project. Final summary report

    DEFF Research Database (Denmark)

    Lauridsen, K.; Kozine, Igor; Markert, Frank;

    2002-01-01

    This report summarises the results obtained in the ASSURANCE project (EU contract number ENV4-CT97-0627). Seven teams performed risk analyses for the same chemical facility, an ammonia storage. The EC's Joint Research Centre at Ispra and Risø National Laboratory co-ordinated the exercise... on the ranking among the adherents of the probabilistic approach. Breaking down the modelling of both frequency and consequence assessments into suitably small elements and conducting case studies allowed the root causes of uncertainty in the final risk assessments to be identified. Large differences were found in both the frequency assessments and in the assessment of consequences. The report gives a qualitative assessment of the importance to the final calculated risk of uncertainties in the assumptions made, in the data and in the calculation methods used. This assessment can serve as a guide to areas where, in particular...

  14. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  15. A Stochastic Nonlinear Water Wave Model for Efficient Uncertainty Quantification

    CERN Document Server

    Bigoni, Daniele; Eskilsson, Claes

    2014-01-01

    A major challenge in next-generation industrial applications is to improve numerical analysis by quantifying uncertainties in predictions. In this work we present a stochastic formulation of a fully nonlinear and dispersive potential flow water wave model for the probabilistic description of the evolution of waves. This model is discretized using the Stochastic Collocation Method (SCM), which provides an approximate surrogate of the model. This can be used to accurately and efficiently estimate the probability distribution of the unknown time-dependent stochastic solution after the forward propagation of uncertainties. We revisit experimental benchmarks often used for the validation of deterministic water wave models. We do this using a fully nonlinear and dispersive model and show how uncertainty in the model input can influence the model output. Based on numerical experiments and assumed uncertainties in boundary data, our analysis reveals that some of the known discrepancies from deterministic simulation in compa...
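
    The collocation idea can be illustrated with a one-dimensional toy: evaluate a deterministic solver at the Gauss-Hermite nodes of a Gaussian uncertain input, then reconstruct output statistics from the quadrature weights. The model function below is a hypothetical stand-in for one run of a wave solver, not the authors' code.

        import numpy as np

        # Gauss-Hermite quadrature for a Gaussian input  a ~ N(mu, sigma^2)
        nodes, weights = np.polynomial.hermite.hermgauss(8)
        mu, sigma = 1.0, 0.1                 # hypothetical uncertain wave amplitude

        def model(a):
            # Stand-in for one deterministic run of the water wave solver.
            return np.tanh(3.0 * a)          # any nonlinear response

        # Change of variables a = mu + sqrt(2) * sigma * node (physicists' Hermite)
        vals = np.array([model(mu + sigma * np.sqrt(2.0) * z) for z in nodes])
        mean = np.sum(weights * vals) / np.sqrt(np.pi)
        var = np.sum(weights * vals**2) / np.sqrt(np.pi) - mean**2
        print(f"output mean={mean:.4f}, std={np.sqrt(var):.4f}")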

  18. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    the results of uncertainty analysis to predict the uncertainties in process design. For parameter estimation, large data-sets of experimentally measured property values for a wide range of pure compounds are taken from the CAPEC database. A classical frequentist approach, i.e., the least-squares method, is adopted...... parameter, octanol/water partition coefficient, aqueous solubility, acentric factor, and liquid molar volume at 298 K. The performance of the property models for these properties with the revised set of model parameters is highlighted through a set of compounds not considered in the regression step...... sensitive properties for each unit operation are also identified. This analysis can be used to reduce the uncertainties in property estimates for the properties of critical importance (by performing additional experiments to get better experimental data and better model parameter values). Thus...
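
    A minimal sketch of the frequentist estimation step described above, using a hypothetical polynomial property model and made-up data: the least-squares fit returns a parameter covariance, which first-order propagation turns into an uncertainty estimate for a new property prediction.

        import numpy as np
        from scipy.optimize import curve_fit

        # Hypothetical group-contribution-style property model: y = a + b*n + c*n^2
        def prop_model(n, a, b, c):
            return a + b * n + c * n**2

        n_data = np.array([1, 2, 3, 4, 5, 6, 8, 10], dtype=float)  # e.g. carbon number
        y_data = np.array([0.9, 2.1, 3.8, 6.2, 9.1, 12.8, 21.5, 32.0])  # invented data

        popt, pcov = curve_fit(prop_model, n_data, y_data)

        # First-order propagation of parameter covariance to a new prediction
        n_new = 7.0
        g = np.array([1.0, n_new, n_new**2])   # gradient of the model w.r.t. (a, b, c)
        y_pred = prop_model(n_new, *popt)
        y_std = np.sqrt(g @ pcov @ g)
        print(f"predicted property: {y_pred:.2f} +/- {y_std:.2f} (1 sigma)")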

  19. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a m...

  20. Uncertainty and error in complex plasma chemistry models

    Science.gov (United States)

    Turner, Miles M.

    2015-06-01

    Chemistry models that include dozens of species and hundreds to thousands of reactions are common in low-temperature plasma physics. The rate constants used in such models are uncertain, because they are obtained from some combination of experiments and approximate theories. Since the predictions of these models are a function of the rate constants, these predictions must also be uncertain. However, systematic investigations of the influence of uncertain rate constants on model predictions are rare to non-existent. In this work we examine a particular chemistry model, for helium-oxygen plasmas. This chemistry is of topical interest because of its relevance to biomedical applications of atmospheric pressure plasmas. We trace the primary sources for every rate constant in the model, and hence associate an error bar (or equivalently, an uncertainty) with each. We then use a Monte Carlo procedure to quantify the uncertainty in predicted plasma species densities caused by the uncertainty in the rate constants. Under the conditions investigated, the range of uncertainty in most species densities is a factor of two to five. However, the uncertainty can vary strongly for different species, over time, and with other plasma conditions. There are extreme (pathological) cases where the uncertainty is more than a factor of ten. One should therefore be cautious in drawing any conclusion from plasma chemistry modelling, without first ensuring that the conclusion in question survives an examination of the related uncertainty.
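
    The propagation idea described above can be sketched in a few lines: sample each rate constant within its error bar (here as a lognormal factor), rerun the model, and read off the spread in the predicted densities. The two-reaction chemistry, rate constants, and uncertainty factors below are invented stand-ins, not the paper's helium-oxygen scheme.

        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(1)

        k_nominal = np.array([1e-2, 5e-4])   # hypothetical rate constants
        k_factor = np.array([2.0, 3.0])      # assumed uncertainty factors (lognormal)

        def rhs(t, y, k):
            n1, n2 = y
            return [-k[0] * n1, k[0] * n1 - k[1] * n2]  # toy A -> B -> loss chemistry

        finals = []
        for _ in range(500):
            k = k_nominal * k_factor ** rng.normal(size=2)  # sample within error bars
            sol = solve_ivp(rhs, (0, 100), [1.0, 0.0], args=(k,), rtol=1e-8)
            finals.append(sol.y[1, -1])

        lo, hi = np.percentile(finals, [2.5, 97.5])
        print(f"species B density 95% range spans a factor of {hi / lo:.1f}")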

  1. Uncertainties in stellar evolution models: convective overshoot

    CERN Document Server

    Bressan, Alessandro; Marigo, Paola; Rosenfield, Philip; Tang, Jing

    2014-01-01

    In spite of the great effort made in the last decades to improve our understanding of stellar evolution, significant uncertainties remain due to our poor knowledge of some complex physical processes that require an empirical calibration, such as the efficiency of the interior mixing related to convective overshoot. Here we review the impact of convective overshoot on the evolution of stars during the main Hydrogen and Helium burning phases.

  2. Uncertainties in Stellar Evolution Models: Convective Overshoot

    Science.gov (United States)

    Bressan, Alessandro; Girardi, Léo; Marigo, Paola; Rosenfield, Philip; Tang, Jing

    In spite of the great effort made in the last decades to improve our understanding of stellar evolution, significant uncertainties remain due to our poor knowledge of some complex physical processes that require an empirical calibration, such as the efficiency of the interior mixing related to convective overshoot. Here we review the impact of convective overshoot on the evolution of stars during the main Hydrogen and Helium burning phases.

  3. Modeling Uncertainty when Estimating IT Projects Costs

    OpenAIRE

    Winter, Michel; Mirbel, Isabelle; Crescenzo, Pierre

    2014-01-01

    In the current economic context, optimizing projects' cost is an obligation for a company to remain competitive in its market. Introducing statistical uncertainty in cost estimation is a good way to tackle the risk of going too far while minimizing the project budget: it allows the company to determine the best possible trade-off between estimated cost and acceptable risk. In this paper, we present new statistical estimators derived from the way IT companies estimate the projects' costs. In t...

  4. A Bayesian Chance-Constrained Method for Hydraulic Barrier Design Under Model Structure Uncertainty

    Science.gov (United States)

    Chitsazan, N.; Pham, H. V.; Tsai, F. T. C.

    2014-12-01

    The groundwater community has widely recognized model structure uncertainty as the major source of model uncertainty in groundwater modeling. Previous studies of aquifer remediation design, however, rarely discuss the impact of model structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) in a BMA-CC framework to assess the effect of model structure uncertainty on remediation design. To investigate the impact of model structure uncertainty on the remediation design, we compare the BMA-CC method with traditional CC programming that considers only model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect the public supply wells of the Government St. pump station from saltwater intrusion in the "1,500-foot" sand and the "1,700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address the model structure uncertainty, we develop three conceptual groundwater models based on three different hydrostratigraphic structures. The results show that using traditional CC programming overestimates design reliability. The results also show that at least five additional connector wells are needed to achieve a design reliability level of more than 90%. The total amount of injected water from the connector wells is higher than the total pumpage of the protected public supply wells. While a reduced injection rate can be achieved by lowering the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station is not economically attractive.
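
    The chance-constraint evaluation under BMA can be sketched as the model-probability-weighted fraction of realizations that satisfy the design constraint. The three "conceptual models", their posterior probabilities, and the head function below are invented placeholders, not the study's groundwater models.

        import numpy as np

        rng = np.random.default_rng(2)

        # Posterior probabilities of three hypothetical conceptual models
        p_model = np.array([0.5, 0.3, 0.2])

        def head_at_barrier(model_id, q_inject, n=2000):
            # Stand-in for a groundwater model run: simulated head (m) at the
            # barrier under injection rate q_inject, with parameter uncertainty.
            bias = [1.2, 0.8, 0.4][model_id]        # structural differences
            return bias * np.log1p(q_inject) + rng.normal(0.0, 0.3, size=n)

        h_required = 2.0                            # head needed to block intrusion
        q = 150.0                                   # candidate injection rate

        # BMA reliability: weighted chance that the constraint is satisfied
        rel = sum(p * np.mean(head_at_barrier(m, q) >= h_required)
                  for m, p in enumerate(p_model))
        print(f"BMA design reliability at q={q}: {rel:.2%}")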

  5. Propagation of Uncertainty in System Parameters of a LWR Model by Sampling MCNPX Calculations - Burnup Analysis

    Science.gov (United States)

    Campolina, Daniel de A. M.; Lima, Claubia P. B.; Veloso, Maria Auxiliadora F.

    2014-06-01

    For every physical component that comprises a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for best-estimate calculations, which have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work, a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for a 95th percentile and a two-sided statistical tolerance interval of 95%. Uncertainties in the input parameters of the reactor considered included geometry dimensions and densities. The results demonstrated the capability of the sampling-based method for burnup calculations when the sample size is optimized and many parameter uncertainties are investigated together in the same input.
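
    The Wilks criterion used to size the sample can be computed directly; for a two-sided 95%/95% tolerance interval it yields 93 code runs. A small sketch (the closed-form criterion is standard; only the wrapper below is ours):

        def wilks_two_sided(beta=0.95, gamma=0.95):
            # Smallest N such that the sample min/max bound a fraction gamma of
            # the population with confidence beta (Wilks, two-sided, first order).
            n = 2
            while 1.0 - gamma**n - n * (1.0 - gamma) * gamma**(n - 1) < beta:
                n += 1
            return n

        print(wilks_two_sided())  # -> 93 runs for a 95%/95% two-sided interval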

  6. Integrating uncertainty in time series population forecasts: An illustration using a simple projection model

    Directory of Open Access Journals (Sweden)

    Guy J. Abel

    2013-12-01

    Background: Population forecasts are widely used for public policy purposes. Methods to quantify the uncertainty in forecasts tend to ignore model uncertainty and to be based on a single model. Objective: In this paper, we use Bayesian time series models to obtain future population estimates with associated measures of uncertainty. The models are compared based on Bayesian posterior model probabilities, which are then used to provide model-averaged forecasts. Methods: The focus is on a simple projection model with the historical data representing population change in England and Wales from 1841 to 2007. Bayesian forecasts to the year 2032 are obtained based on a range of models, including autoregression models, stochastic volatility models and random variance shift models. The computational steps to fit each of these models using the OpenBUGS software via R are illustrated. Results: We show that the Bayesian approach is adept in capturing multiple sources of uncertainty in population projections, including model uncertainty. The inclusion of non-constant variance improves the fit of the models and provides more realistic predictive uncertainty levels. The forecasting methodology is assessed through fitting the models to various truncated data series.
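
    A sketch of the model-averaging step, assuming posterior model probabilities and posterior predictive draws are already available; the draws below are made-up normal samples rather than OpenBUGS output, and the probabilities are illustrative.

        import numpy as np

        rng = np.random.default_rng(3)

        # Posterior model probabilities for three hypothetical time series models
        p_model = np.array([0.55, 0.30, 0.15])

        # Draws from each model's posterior predictive distribution of the 2032
        # population (millions); stand-ins for MCMC output
        draws = [rng.normal(63.0, 1.5, 5000),   # autoregression model
                 rng.normal(64.2, 2.5, 5000),   # stochastic volatility model
                 rng.normal(62.5, 3.5, 5000)]   # random variance shift model

        # Model-averaged predictive sample: pick a model per draw with prob p_model
        choice = rng.choice(3, size=5000, p=p_model)
        bma = np.array([draws[m][i] for i, m in enumerate(choice)])
        lo, hi = np.percentile(bma, [2.5, 97.5])
        print(f"model-averaged forecast: {bma.mean():.1f} (95% CI {lo:.1f} to {hi:.1f})")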

  7. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    H. Machguth

    2008-12-01

    By means of Monte Carlo simulations we calculated the uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before the uncertainty assessment, the model was tuned to the observed mass balance for the investigated time period and its robustness was tested by comparing observed and modelled mass balance over 11 years, yielding very small deviations. Both systematic and random uncertainties are assigned to twelve input parameters and their respective values estimated from the literature or from available meteorological data sets. The calculated overall uncertainty in the model output is dominated by systematic errors and amounts to 0.7 m w.e. or approximately 10% of total melt over the investigated time span. In order to provide a first-order estimate of the variability in uncertainty depending on the quality of input data, we conducted a further experiment, calculating the overall uncertainty for different levels of uncertainty in measured global radiation and air temperature. Our results show that the output of a well calibrated model is subject to considerable uncertainties, in particular when applied for extrapolation in time and space, where systematic errors are likely to be an important issue.
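
    A minimal sketch of the sampling scheme this record describes, with a toy degree-day melt model standing in for the energy balance model; the error magnitudes and the temperature series are invented for illustration. Systematic errors are drawn once per realization, random errors once per day.

        import numpy as np

        rng = np.random.default_rng(4)

        days = 400
        t_obs = 5.0 + 8.0 * np.sin(np.linspace(0, 2 * np.pi, days))  # synthetic degC

        def melt(temp, ddf=6.0):
            # Toy degree-day melt (mm w.e./day); the paper uses an energy balance model.
            return ddf * np.clip(temp, 0.0, None)

        totals = []
        for _ in range(2000):
            bias = rng.normal(0.0, 1.0)               # systematic error, one draw/run
            noise = rng.normal(0.0, 0.5, size=days)   # random error, one draw/day
            totals.append(melt(t_obs + bias + noise).sum() / 1000.0)  # m w.e.

        totals = np.array(totals)
        print(f"cumulative melt: {totals.mean():.2f} +/- {totals.std():.2f} m w.e.")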

  8. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    Directory of Open Access Journals (Sweden)

    H. Machguth

    2008-06-01

    By means of Monte Carlo simulations we calculated the uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before the uncertainty assessment, the model was tuned to the observed mass balance for the investigated time period and its robustness was tested by comparing observed and modelled mass balance over 11 years, yielding very small deviations. Both systematic and random uncertainties are assigned to twelve input parameters and their respective values estimated from the literature or from available meteorological data sets. The calculated overall uncertainty in the model output is dominated by systematic errors and amounts to 0.7 m w.e. or approximately 10% of total melt over the investigated time span. In order to provide a first-order estimate of the variability in uncertainty depending on the quality of input data, we conducted a further experiment, calculating the overall uncertainty for different levels of uncertainty in measured global radiation and air temperature. Our results show that the output of a well calibrated model is subject to considerable uncertainties, in particular when applied for extrapolation in time and space, where systematic errors are likely to be an important issue.

  9. Precipitation interpolation and corresponding uncertainty assessment using copulas

    Science.gov (United States)

    Bardossy, A.; Pegram, G. G.

    2012-12-01

    Spatial interpolation of rainfall over different time and spatial scales is necessary in many applications of hydrometeorology. The specific problems encountered in rainfall interpolation include: the large number of calculations which need to be performed automatically; the quantification of the influence of topography, usually the most influential of the exogenous variables; how to use observed zero (dry) values in interpolation, because their proportion increases the shorter the time interval; the need to estimate a reasonable uncertainty of the modelled point/pixel distributions; the need to separate (i) temporally highly correlated bias from (ii) random interpolation errors at different spatial and temporal scales; and the difficulty of estimating the uncertainty of accumulations over a range of spatial scales. The approaches used and described in the presentation employ the variables rainfall and altitude. The methods of interpolation include (i) Ordinary Kriging of the rainfall without altitude, (ii) External Drift Kriging with altitude as an exogenous variable, and, less conventionally, (iii) truncated Gaussian copulas and truncated v-copulas, both omitting and including the altitude of the control stations as well as that of the target, and (iv) truncated Gaussian copulas and truncated v-copulas for a two-step interpolation of precipitation combining temporal and spatial quantiles for bias quantification. It was found that truncated Gaussian copulas, with the altitudes of the target and all control stations included as exogenous variables, produce the lowest mean square error in cross-validation and, as a bonus, the least bias. In contrast, the uncertainty of interpolation is better described by the v-copulas, but the Gaussian copulas have a computational advantage (by three orders of magnitude) which justifies their use in practice. It turns out that the uncertainty estimates of the OK and EDK interpolants are not competitive at any time scale, from daily

  10. The Role and Meaning of Uncertainty and Probability in Natural Hazard Assessment (Invited)

    Science.gov (United States)

    Marzocchi, W.; Jordan, T. H.

    2013-12-01

    About one decade ago, Donald Rumsfeld provided a comprehensive view of the uncertainties in hazard assessment: "There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know". This apparently tongue-in-cheek definition describes well the variety of different types of uncertainty that scientists face in describing Nature. The ubiquitous and deep uncertainties in the formal representation of natural systems imply that the forecasting of emergent phenomena such as natural hazards must be based on probabilities. Notwithstanding the very wide use of the terms 'uncertainty' and 'probability' in natural hazards, the way in which they are linked, how they are estimated, and their scientific meaning are far from clear, as testified by the last Intergovernmental Panel on Climate Change (IPCC) report and by its subsequent review. The lack of a formal framework to interpret uncertainty and probability in a coherent way has paved the way for some of the strongest criticisms of hazard analysis; in fact, it has been argued that most natural hazard analyses are intrinsically 'unscientific'. The purpose of this talk is to confront and clarify the conceptual issues associated with the role of uncertainty and probability in hazard analysis. Of particular concern is the taxonomy of uncertainty and the logical framework in which a probability can be tested and then considered 'scientific'. Specifically, we discuss the different types of uncertainties (aleatory variability, epistemic uncertainty, and ontological error), their differences, their link with probability, and their estimation using data, models, and subjective expert opinion, in the framework of two particular problems: Probabilistic Seismic Hazard Analysis (PSHA) and Probabilistic Volcanic Hazard Analysis (PVHA).

  11. Estimation of Uncertainty in Risk Assessment of Hydrogen Applications

    DEFF Research Database (Denmark)

    Markert, Frank; Krymsky, V.; Kozine, Igor

    2011-01-01

    Hydrogen technologies such as hydrogen fuelled vehicles and refuelling stations are being tested in practice in a number of projects (e.g. HyFleet-Cute and the Whistler project), giving valuable information on the reliability and maintenance requirements. In order to establish refuelling stations...... the permitting authorities request qualitative and quantitative risk assessments (QRA) to show the safety and acceptability in terms of failure frequencies and respective consequences. For new technologies not all statistical data might be established or available in good quality, causing assumptions...... probability and the NUSAP concept to quantify uncertainties of new, not fully qualified hydrogen technologies and the implications for risk management....

  12. Stochastic uncertainties and sensitivities of a regional-scale transport model of nitrate in groundwater

    NARCIS (Netherlands)

    Brink, C.v.d.; Zaadnoordijk, W.J.; Burgers, S.; Griffioen, J.

    2008-01-01

    In recent years, groundwater quality management has come to rely more and more on models. These models are used to predict the risk of groundwater contamination for various land uses. This paper presents an assessment of uncertainties and sensitivities to input parameters for a regional model. The model had

  13. The role of observational uncertainties in testing model hypotheses

    Science.gov (United States)

    Westerberg, I. K.; Birkel, C.

    2012-12-01

    Knowledge about hydrological processes and the spatial and temporal distribution of water resources is needed as a basis for managing water for hydropower, agriculture and flood protection. Conceptual hydrological models may be used to infer knowledge on catchment functioning but are affected by uncertainties in the model representation of reality as well as in the observational data used to drive the model and to evaluate model performance. Therefore, meaningful hypothesis testing of the hydrological functioning of a catchment requires such uncertainties to be carefully estimated and accounted for in model calibration and evaluation. The aim of this study was to investigate the role of observational uncertainties in hypothesis testing, in particular whether it was possible to detect model-structural representations that were wrong in an important way given the uncertainties in the observational data. We studied the relatively data-scarce tropical Sarapiqui catchment in Costa Rica, Central America, where water resources play a vital role in hydropower production and livelihoods. We tested several model structures of varying complexity as hypotheses about catchment functioning, but also hypotheses about the nature of the modelling errors. The tests were made within a learning framework for uncertainty estimation which enabled insights into data uncertainties, suitable model-structural representations and appropriate likelihoods. The observational uncertainty in discharge data was estimated from a rating-curve analysis, and precipitation measurement errors through scenarios relating the error to, for example, canopy interception, wind-driven rain and the elevation gradient. The hypotheses were evaluated in a posterior analysis of the simulations, where the performance of each simulation was analysed relative to the observational uncertainties for the entire hydrograph as well as for different aspects of the hydrograph (e.g. peak flows, recession periods, and base flow

  14. Assessment and interpretation of internal doses: uncertainty and variability.

    Science.gov (United States)

    Paquet, F; Bailey, M R; Leggett, R W; Harrison, J D

    2016-06-01

    Internal doses are calculated on the basis of knowledge of intakes and/or measurements of activity in bioassay samples, typically using reference biokinetic and dosimetric models recommended by the International Commission on Radiological Protection (ICRP). These models describe the behaviour of the radionuclides after ingestion, inhalation, and absorption to the blood, and the absorption of the energy resulting from their nuclear transformations. They are intended to be used mainly for the purpose of radiological protection: that is, optimisation and demonstration of compliance with dose limits. These models and parameter values are fixed by convention and are not subject to uncertainty. Over the past few years, ICRP has devoted a considerable amount of effort to the revision and improvement of models to make them more physiologically realistic. ICRP models are now sufficiently sophisticated for calculating organ and tissue absorbed doses for scientific purposes, and in many other areas, including toxicology, pharmacology and medicine. In these specific cases, uncertainties in parameters and variability between individuals need to be taken into account.

  15. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, by modelling complex systems transport models have an inherent uncertainty which increases over time. As a consequence, the longer the period forecasted, the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature, only few studies analyze uncertainty propagation patterns over...... time, especially with respect to large-scale transport models. The study described in this paper contributes to filling this gap by investigating the effects of uncertainty in socio-economic variable growth rate projections on large-scale transport model forecasts, using the Danish National Transport......

  16. The cascade of uncertainty in modeling the impacts of climate change on Europe's forests

    Science.gov (United States)

    Reyer, Christopher; Lasch-Born, Petra; Suckow, Felicitas; Gutsch, Martin

    2015-04-01

    Projecting the impacts of global change on forest ecosystems is a cornerstone for designing sustainable forest management strategies and paramount for assessing the potential of Europe's forests to contribute to the EU bioeconomy. Research on climate change impacts on forests relies to a large extent on model applications along a model chain, from Integrated Assessment Models to General and Regional Circulation Models that provide important driving variables for forest models, or to decision support systems that synthesize the findings of more detailed forest models to inform forest managers. At each step in the model chain, model-specific uncertainties about, amongst others, parameter values, input data or model structure accumulate, leading to a cascade of uncertainty. For example, climate change impacts on forests strongly depend on the inclusion or exclusion of CO2 effects, or on the use of an ensemble of climate models rather than relying on one particular climate model. In the past, these uncertainties have not, or have only partly, been considered in studies of climate change impacts on forests. This has left managers and decision-makers in doubt about how robust the projected impacts on forest ecosystems are. We deal with this cascade of uncertainty in a structured way, and the objective of this presentation is to assess how different types of uncertainties affect projections of the effects of climate change on forest ecosystems. To address this objective we synthesized a large body of scientific literature on modelled productivity changes and the effects of extreme events on plant processes. Furthermore, we apply the process-based forest growth model 4C to forest stands all over Europe and assess how different climate models, emission scenarios and assumptions about the parameters and structure of 4C affect the uncertainty of the model projections. We show that there are consistent regional changes in forest productivity, such as an increase in NPP in cold and wet regions, while

  17. Assessment of drought damages and their uncertainties in Europe

    Science.gov (United States)

    Naumann, Gustavo; Spinoni, Jonathan; Vogt, Jürgen V.; Barbosa, Paulo

    2015-12-01

    Drought is a natural hazard triggered by a lack of precipitation that can last for several months or years. Droughts can affect a wide range of socio-economic sectors, while the related direct and indirect impacts are often difficult to quantify. In this context, drought damage refers to the total or partial destruction of physical assets in the affected area. The main constraint in constructing a robust relationship between the severity of drought events and related damages is the lack of sufficient quantitative impact data. In this paper we propose the use of power-law damage functions to assess the relationship between drought severity and related damages in two economic sectors, namely cereal crop production and hydropower generation, across 21 European countries. The different shapes of the resulting damage functions can be explained by the specific drought vulnerability or adaptive capacity of each sector and country. Due to the scarcity of impact data linked to extreme climate events, a bootstrap resampling was performed to assess the potential uncertainties associated with the sample size. This approach helps communicate potential drought impacts and related uncertainties to end users and policy makers, in support of the development of drought management plans and long-term adaptation measures.
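
    A hedged sketch of fitting a power-law damage function with a bootstrap uncertainty estimate; the severity-damage pairs are invented, and the paper's actual estimation details may differ.

        import numpy as np

        rng = np.random.default_rng(5)

        # Hypothetical paired records: drought severity index vs. damage (MEUR)
        severity = np.array([1.1, 1.4, 1.8, 2.0, 2.5, 3.1, 3.6, 4.2])
        damage = np.array([12.0, 20.0, 35.0, 48.0, 90.0, 160.0, 240.0, 410.0])

        def fit_powerlaw(x, y):
            # Fit damage = a * severity**b by linear regression in log-log space.
            b, log_a = np.polyfit(np.log(x), np.log(y), 1)
            return np.exp(log_a), b

        a, b = fit_powerlaw(severity, damage)

        # Bootstrap resampling to express sample-size uncertainty in the exponent
        boots = []
        for _ in range(2000):
            i = rng.integers(0, len(severity), len(severity))
            boots.append(fit_powerlaw(severity[i], damage[i])[1])
        print(f"exponent b = {b:.2f}, 95% CI ({np.percentile(boots, 2.5):.2f}, "
              f"{np.percentile(boots, 97.5):.2f})")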

  18. Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations

    CERN Document Server

    Krauss, L M; White, M; Krauss, Lawrence M.; Gates, Evalyn; White, Martin

    1993-01-01

    We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.

  19. Modelling theoretical uncertainties in phenomenological analyses for particle physics

    CERN Document Server

    Charles, Jérôme; Niess, Valentin; Silva, Luiz Vale

    2016-01-01

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding $p$-values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive $p$-value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavour p...

  20. Modeling theoretical uncertainties in phenomenological analyses for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)

    2017-04-15

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)

  1. An educational model for ensemble streamflow simulation and uncertainty analysis

    National Research Council Canada - National Science Library

    AghaKouchak, A; Nakhjiri, N; Habib, E

    2013-01-01

    ...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...

  2. Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations

    OpenAIRE

    1992-01-01

    We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.

  3. Differences between fully Bayesian and pragmatic methods to assess predictive uncertainty and optimal monitoring designs

    Science.gov (United States)

    Wöhling, Thomas; Geiges, Andreas; Gosses, Moritz; Nowak, Wolfgang

    2015-04-01

    Data acquisition for monitoring the state in different compartments of complex, coupled environmental systems is often time consuming and expensive. Therefore, experimental monitoring strategies are ideally designed such that most can be learned about the system at minimal costs. Bayesian methods for uncertainty quantification and optimal design (OD) of monitoring strategies are well suited to handle the non-linearity exhibited by most coupled environmental systems. However, their high computational demand restricts their applicability to models with comparatively low run-times. Therefore, pragmatic approaches have been used predominantly in the past where data worth and OD analyses have been restricted to linear or linearised problems and methods. Bayesian (nonlinear) and pragmatic (linear) OD approaches are founded on different assumptions and typically follow different steps in the modelling chain of 1) model calibration, 2) uncertainty quantification, and 3) optimal design analysis. The goal of this study is to follow through these steps for a Bayesian and a pragmatic approach and to discuss the impact of different assumptions (prior uncertainty), calibration strategies, and OD analysis methods on the proposed monitoring designs and their reliability to reduce predictive uncertainty. The OD framework PreDIA (Leube et al. 2012) is used for the nonlinear assessment with a conditional model ensemble obtained with Markov-chain Monte Carlo simulation representing the initial predictive uncertainty. PreDIA can consider any kind of uncertainties and non-linear (statistical) dependencies in data, models, parameters and system drivers during the OD process. In the pragmatic OD approach, the parameter calibration was performed with a non-linear global search and the initial predictive uncertainty was estimated using the PREDUNC utility (Moore and Doherty 2005) of PEST. PREDUNC was also used for the linear OD analysis. We applied PreDIA and PREDUNC for uncertainty

  4. Area 2: Inexpensive Monitoring and Uncertainty Assessment of CO2 Plume Migration using Injection Data

    Energy Technology Data Exchange (ETDEWEB)

    Srinivasan, Sanjay [Univ. of Texas, Austin, TX (United States)

    2014-09-30

    In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data is required, the method provides a very inexpensive way to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models whose dynamic response is closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects – the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were
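
    A minimal sketch of the selection idea: rank a prior ensemble by misfit to the observed injection response, keep the closest subset, and compare the prior versus posterior spread of a plume statistic. All series and the "plume radius" proxy below are synthetic placeholders, not the project's reservoir models or proxies.

        import numpy as np

        rng = np.random.default_rng(6)

        # Prior ensemble: 200 hypothetical subsurface models, each with a simulated
        # injection bottom-hole pressure series (stand-ins for reservoir runs)
        t = np.linspace(0, 5, 60)                             # years
        prior = [(1.0 + rng.uniform(-0.4, 0.4)) * np.sqrt(t + 0.1) for _ in range(200)]
        observed = 1.05 * np.sqrt(t + 0.1) + rng.normal(0, 0.02, t.size)

        # Rank models by misfit to the observed injection response, keep the best
        misfit = np.array([np.mean((p - observed) ** 2) for p in prior])
        posterior_ids = np.argsort(misfit)[:20]               # selected subset

        # Plume statistics are then computed over the selected subset only; here a
        # stand-in "plume radius" correlated with the response amplitude
        radius = np.array([p[-1] * 120.0 for p in prior])     # hypothetical proxy (m)
        print(f"prior radius: {radius.mean():.0f} +/- {radius.std():.0f} m; "
              f"posterior: {radius[posterior_ids].mean():.0f} +/- "
              f"{radius[posterior_ids].std():.0f} m")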

  5. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprised of a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of up to 5% of daily energy, which translates directly into a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to the uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to the residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
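
    The residual-resampling propagation can be sketched as follows, with made-up residual spreads for the four model steps (the paper derives these empirically from measurements; all numbers below are illustrative assumptions).

        import numpy as np

        rng = np.random.default_rng(7)

        # Stand-ins for the empirical residual distributions of each model step
        # (units: fraction of the step's output)
        resid_poa = rng.normal(0.00, 0.03, 1000)    # plane-of-array irradiance model
        resid_eff = rng.normal(0.00, 0.02, 1000)    # effective irradiance model
        resid_temp = rng.normal(0.00, 0.01, 1000)   # cell temperature model
        resid_dc = rng.normal(0.00, 0.01, 1000)     # DC power model

        base_energy = 5.0                           # hypothetical daily DC energy (kWh)

        # Propagate by resampling one residual per step per Monte Carlo realization
        n = 5000
        energy = base_energy * np.prod(
            1.0 + np.column_stack([rng.choice(r, n) for r in
                                   (resid_poa, resid_eff, resid_temp, resid_dc)]),
            axis=1)
        print(f"daily energy: {energy.mean():.3f} kWh, "
              f"relative std {100 * energy.std() / energy.mean():.1f}%")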

  6. On the uncertainty of phenological responses to climate change and its implication for terrestrial biosphere models

    Directory of Open Access Journals (Sweden)

    M. Migliavacca

    2012-01-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst.

    The evaluation of different phenological models indicated support for spring warming models with photoperiod limitations and, though to a lesser extent, to chilling models based on the alternating model structure.

    We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. a high CO2 emissions vs. a low CO2 emissions scenario). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century−1 for scenario B1 and 4.5 day century−1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century−1) in the simulated trends. The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century−1 for A1fi, ±3.6 day century−1 for B1). The forecast sensitivity of bud-burst to

  7. Flood risk assessment at the regional scale: Computational challenges and the monster of uncertainty

    Science.gov (United States)

    Efstratiadis, Andreas; Papalexiou, Simon-Michael; Markonis, Yiannis; Koukouvinos, Antonis; Vasiliades, Lampros; Papaioannou, George; Loukas, Athanasios

    2016-04-01

    We present a methodological framework for flood risk assessment at the regional scale, developed within the implementation of the EU Directive 2007/60 in Greece. This comprises three phases: (a) statistical analysis of extreme rainfall data, resulting in spatially distributed parameters of intensity-duration-frequency (IDF) relationships and their confidence intervals, (b) hydrological simulations, using event-based semi-distributed rainfall-runoff approaches, and (c) hydraulic simulations, employing the propagation of flood hydrographs across the river network and the mapping of inundated areas. The flood risk assessment procedure is employed over the River Basin District of Thessaly, Greece, which requires schematization and modelling of hundreds of sub-catchments, each one examined for several risk scenarios. This is a challenging task, involving multiple computational issues to handle, such as the organization, control and processing of huge amounts of hydrometeorological and geographical data, the configuration of model inputs and outputs, and the co-operation of several software tools. In this context, we have developed supporting applications allowing massive data processing and effective model coupling, thus drastically reducing the need for manual interventions and, consequently, the time of the study. Within the flood risk computations we also account for three major sources of uncertainty, in an attempt to provide upper and lower confidence bounds of flood maps, i.e. (a) statistical uncertainty of IDF curves, (b) structural uncertainty of hydrological models, due to varying antecedent soil moisture conditions, and (c) parameter uncertainty of hydraulic models, with emphasis on roughness coefficients. Our investigations indicate that the combined effect of the above uncertainties (which are certainly not the only ones) results in extremely large bounds of potential inundation, thus raising many questions about the interpretation and usefulness of current flood

  8. Uncertainty assessment of climate change adaptation using an economic pluvial flood risk framework

    DEFF Research Database (Denmark)

    Zhou, Qianqian; Arnbjerg-Nielsen, Karsten

    2012-01-01

    It is anticipated that climate change will lead to an increasing risk of flooding in cities in northern Europe. One challenging question is how best to address the increasing flood risk and assess the costs and benefits of adapting to such changes. We established an integrated...... approach for the identification and assessment of climate change adaptation options by incorporating climate change impacts, flood inundation modelling, economic tools, and risk assessment and management. The framework is further extended and adapted by embedding a Monte Carlo simulation to estimate the total...... uncertainty bounds propagated through the evaluation and to identify the relative contribution of the inherent uncertainties in the assessment. The case study is a small urban catchment located in Skibhus, Odense, where no significant city development is anticipated. Two adaptation scenarios, namely pipe enlargement...

  9. Do Metacognitions and Intolerance of Uncertainty Predict Worry in Everyday Life? An Ecological Momentary Assessment Study.

    Science.gov (United States)

    Thielsch, Carolin; Andor, Tanja; Ehring, Thomas

    2015-07-01

    Cognitive models of generalized anxiety disorder (GAD) suggest that excessive worry is due to positive and negative metacognitive beliefs and/or intolerance of uncertainty. Empirical support mainly derives from cross-sectional studies with limited conclusiveness, using self-report measures and thereby possibly causing recall biases. The aim of the present study therefore was to examine the power of these cognitive variables to predict levels of worry in everyday life using Ecological Momentary Assessment (EMA). Metacognitions and intolerance of uncertainty were assessed using well-established self-report questionnaires in 41 nonclinical participants, who subsequently completed ratings on worry intensity and burden on a portable device for 1 week, seven times a day, once every 2 hours. Results showed significant associations of negative metacognitive beliefs and intolerance of uncertainty, but not positive metacognitive beliefs, with worry in everyday life. In multilevel regression analyses, a substantial proportion of variance of everyday worry could be accounted for by negative metacognitions over and above trait worry and daily hassles. Intolerance of uncertainty likewise emerged as a valid predictor when tested in isolation, but did not explain additional variance once negative metacognitions were controlled. The findings support current cognitive models of excessive worry and highlight the role of negative metacognitions. By using EMA to assess levels of worry in everyday life, they extend earlier findings focusing exclusively on retrospective questionnaire measures.

  10. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
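
    The core quantile computation, and the notion of an add-on that reconciles a model's quantile with a benchmark, can be sketched with synthetic data; this illustrates the idea only, not the article's formal framework, and all numbers and distributions are invented.

        import numpy as np

        rng = np.random.default_rng(8)

        # Daily P&L (stand-in for the bank's time series) and a 99% Value-at-Risk
        pnl = rng.standard_t(df=4, size=1500) * 1e6   # heavy-tailed P&L (EUR)
        alpha = 0.01

        var_model = -np.quantile(rng.normal(0, 1e6, 1500), alpha)  # mis-specified model
        var_bench = -np.quantile(pnl, alpha)          # benchmark: empirical quantile

        # Model-risk add-on: adjustment aligning the model with the benchmark
        add_on = max(0.0, var_bench - var_model)
        print(f"model VaR {var_model:,.0f}, benchmark VaR {var_bench:,.0f}, "
              f"add-on {add_on:,.0f} EUR")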

  11. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v2-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  12. Probabilistic uncertainty analysis of epidemiological modeling to guide public health intervention policy

    Directory of Open Access Journals (Sweden)

    Jennifer A. Gilbert

    2014-03-01

    Mathematical modeling of disease transmission has provided quantitative predictions for health policy, facilitating the evaluation of epidemiological outcomes and the cost-effectiveness of interventions. However, typical sensitivity analyses of deterministic dynamic infectious disease models focus on model architecture and the relative importance of parameters but neglect parameter uncertainty when reporting model predictions. Consequently, model results that identify point estimates of intervention levels necessary to terminate transmission yield limited insight into the probability of success. We apply probabilistic uncertainty analysis to a dynamic model of influenza transmission and assess global uncertainty in outcome. We illustrate that when parameter uncertainty is not incorporated into outcome estimates, levels of vaccination and treatment predicted to prevent an influenza epidemic will only have an approximately 50% chance of terminating transmission and that sensitivity analysis alone is not sufficient to obtain this information. We demonstrate that accounting for parameter uncertainty yields probabilities of epidemiological outcomes based on the degree to which data support the range of model predictions. Unlike typical sensitivity analyses of dynamic models that only address variation in parameters, the probabilistic uncertainty analysis described here enables modelers to convey the robustness of their predictions to policy makers, extending the power of epidemiological modeling to improve public health.
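
    A toy version of the message above: propagate assumed parameter ranges through a simple epidemiological criterion and report a probability of success instead of a single yes/no point estimate. The ranges and the reproduction-number threshold below are illustrative and far simpler than the paper's dynamic influenza model.

        import numpy as np

        rng = np.random.default_rng(9)

        # Parameter uncertainty: R0 and vaccine efficacy sampled from assumed ranges
        n = 10000
        r0 = rng.uniform(1.2, 2.4, n)          # reproduction number
        efficacy = rng.uniform(0.5, 0.8, n)    # vaccine efficacy
        coverage = 0.60                        # candidate vaccination coverage

        # Transmission is interrupted when the effective reproduction number < 1
        r_eff = r0 * (1.0 - efficacy * coverage)
        print(f"probability the policy terminates transmission: "
              f"{np.mean(r_eff < 1.0):.2%}")

        # A point estimate at mean parameters, in contrast, gives only yes/no
        print("point estimate says success:", 1.8 * (1 - 0.65 * coverage) < 1.0)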

  13. Climate data induced uncertainty in model-based estimations of terrestrial primary productivity

    Science.gov (United States)

    Wu, Zhendong; Ahlström, Anders; Smith, Benjamin; Ardö, Jonas; Eklundh, Lars; Fensholt, Rasmus; Lehsten, Veiko

    2017-06-01

    Model-based estimations of historical fluxes and pools of the terrestrial biosphere differ substantially. These differences arise not only from differences between models but also from differences in the environmental and climatic data used as input to the models. Here we investigate the role of uncertainties in historical climate data by performing simulations of terrestrial gross primary productivity (GPP) using a process-based dynamic vegetation model (LPJ-GUESS) forced by six different climate datasets. We find that the climate-induced uncertainty, defined as the range among historical simulations in GPP when forcing the model with the different climate datasets, can be as high as 11 Pg C yr-1 globally (9% of mean GPP). We also assessed a hypothetical maximum climate data induced uncertainty by combining climate variables from different datasets, which resulted in significantly larger uncertainties of 41 Pg C yr-1 globally, or 32% of mean GPP. The uncertainty is partitioned into components associated with the three main climatic drivers: temperature, precipitation, and shortwave radiation. Additionally, we illustrate how the uncertainty due to a given climate driver depends both on the magnitude of the forcing data uncertainty (climate data range) and on the apparent sensitivity of the modeled GPP to the driver (apparent model sensitivity). We find that LPJ-GUESS overestimates GPP compared to an empirically based GPP data product in all land cover classes except tropical forests. Tropical forests emerge as a disproportionate source of uncertainty in GPP estimation, both in the simulations and in the empirical data products. The tropical forest uncertainty is most strongly associated with shortwave radiation and precipitation forcing, of which the climate data range contributes more to the overall uncertainty than the apparent model sensitivity to forcing. Globally, precipitation dominates the climate-induced uncertainty over nearly half of the vegetated land area, which is mainly due
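
    The headline metric, the range of simulated GPP across climate forcings, together with a driver contribution estimated as forcing-data range times apparent sensitivity, can be sketched with invented numbers (the dataset names, GPP values, and sensitivity below are hypothetical).

        import numpy as np

        # Simulated global GPP (Pg C/yr) from one vegetation model forced by six
        # hypothetical climate datasets (stand-ins for the LPJ-GUESS runs)
        gpp = {"ds1": 128.0, "ds2": 122.5, "ds3": 131.0,
               "ds4": 125.4, "ds5": 119.8, "ds6": 127.1}

        vals = np.array(list(gpp.values()))
        spread = vals.max() - vals.min()
        print(f"climate-induced uncertainty: {spread:.1f} Pg C/yr "
              f"({100 * spread / vals.mean():.0f}% of mean GPP)")

        # Uncertainty attributable to one driver ~ forcing-data range x apparent
        # model sensitivity to that driver (both numbers are made up)
        precip_range = 120.0      # mm/yr spread among datasets
        sensitivity = 0.05        # Pg C/yr per mm/yr, from regression of GPP on P
        print(f"precipitation contribution: {precip_range * sensitivity:.1f} Pg C/yr")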

  14. [Application of uncertainty assessment in NIR quantitative analysis of traditional Chinese medicine].

    Science.gov (United States)

    Xue, Zhong; Xu, Bing; Liu, Qian; Shi, Xin-Yuan; Li, Jian-Yu; Wu, Zhi-Sheng; Qiao, Yan-Jiang

    2014-10-01

    The near infrared (NIR) spectra of Liuyi San samples were collected during the mixing process, and quantitative models for the concentration of glycyrrhizin were generated by the PLS (partial least squares) method. The PLS quantitative model had good calibration and prediction performance (r(cal) = 0.9985, RMSEC = 0.044 mg·g(-1); r(val) = 0.9474, RMSEP = 0.124 mg·g(-1)), indicating that NIR spectroscopy can be used as a rapid method for determining the concentration of glycyrrhizin in Liuyi San powder. After the validation tests were designed, the Liao-Lin-Iyer approach based on Monte Carlo simulation was used to estimate β-content-γ-confidence tolerance intervals. The uncertainty was then calculated, and the uncertainty profile was drawn. The NIR analytical method was considered valid when the concentration of glycyrrhizin is above 1.56 mg·g(-1), since the uncertainty falls within the acceptable limits (λ = ±20%). The results showed that uncertainty assessment can be used with NIR quantitative models of glycyrrhizin at different concentrations and provides a reference for other traditional Chinese medicines to complete uncertainty assessment using NIR quantitative analysis.
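
    A minimal PLS calibration sketch in the spirit of this record, using synthetic spectra in place of the measured NIR data; RMSEC and RMSEP are computed on the calibration and validation splits exactly as defined above, while the data, split, and component count are assumptions.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(10)

        # Synthetic stand-ins for NIR spectra (200 wavelengths) and glycyrrhizin
        # concentration (mg/g); real work would use the measured calibration set
        X = rng.normal(size=(60, 200))
        y = X[:, :10].sum(axis=1) * 0.05 + 2.0 + rng.normal(0, 0.05, 60)

        pls = PLSRegression(n_components=5).fit(X[:40], y[:40])
        rmsec = np.sqrt(mean_squared_error(y[:40], pls.predict(X[:40])))
        rmsep = np.sqrt(mean_squared_error(y[40:], pls.predict(X[40:])))
        print(f"RMSEC={rmsec:.3f} mg/g, RMSEP={rmsep:.3f} mg/g")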

  15. On the uncertainty of phenological responses to climate change, and implications for a terrestrial biosphere model

    Directory of Open Access Journals (Sweden)

    M. Migliavacca

    2012-06-01

    Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere.

    Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 leaf bud-burst models that varied in complexity.

    Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements.

    We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. a high CO2 emissions vs. a low CO2 emissions scenario). Parameter uncertainty was the smallest (average 95% Confidence Interval – CI: 2.4 days century−1 for scenario B1 and 4.5 days century−1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century−1) in the simulated trends. The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 days century−1 for A1fi, ±3.6 days century−1 for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per

  16. Understanding quantitative structure-property relationships uncertainty in environmental fate modeling.

    Science.gov (United States)

    Sarfraz Iqbal, M; Golsteijn, Laura; Öberg, Tomas; Sahlin, Ullrika; Papa, Ester; Kovarich, Simona; Huijbregts, Mark A J

    2013-04-01

    In cases in which experimental data on chemical-specific input parameters are lacking, chemical regulations allow the use of alternatives to testing, such as in silico predictions based on quantitative structure-property relationships (QSPRs). Such predictions are often given as point estimates; however, little is known about the extent to which uncertainties associated with QSPR predictions contribute to uncertainty in fate assessments. In the present study, QSPR-induced uncertainty in overall persistence (POV) and long-range transport potential (LRTP) was studied by integrating QSPRs into probabilistic assessments of five polybrominated diphenyl ethers (PBDEs), using the multimedia fate model SimpleBox. The uncertainty analysis considered QSPR predictions of the following fate input parameters: melting point, water solubility, vapor pressure, organic carbon-water partition coefficient, hydroxyl radical degradation, biodegradation, and photolytic degradation. Uncertainty in POV and LRTP was dominated by the uncertainty in direct photolysis and the biodegradation half-life in water. However, the QSPRs developed specifically for PBDEs had a relatively low contribution to uncertainty. These findings suggest that the reliability of the ranking of PBDEs on the basis of POV and LRTP can be substantially improved by developing better QSPRs to estimate degradation properties. The present study demonstrates the use of uncertainty and sensitivity analyses in nontesting strategies and highlights the need for guidance when compounds fall outside the applicability domain of a QSPR.
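
    The general recipe (represent each QSPR-predicted input as a distribution, push it through the fate model by Monte Carlo, and rank inputs by correlation with the output) can be sketched as follows; the toy fate function and all numbers below are invented stand-ins, not SimpleBox or the study's data.

```python
# Monte Carlo propagation of QSPR prediction uncertainty through a toy
# fate metric, with a crude rank-correlation sensitivity ranking.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
N = 10_000

# Assumed QSPR point predictions (log10 scale) with prediction errors.
log_kow   = rng.normal(6.8, 0.4, N)   # octanol-water partition coefficient
log_t_air = rng.normal(1.2, 0.3, N)   # half-life in air, log10(days)
log_t_h2o = rng.normal(2.5, 0.6, N)   # half-life in water, log10(days)

def toy_fate_model(log_kow, log_t_air, log_t_h2o):
    # Stand-in for a multimedia model: overall persistence as a crude
    # partition-weighted mean of compartment half-lives.
    f_water = 1.0 / (1.0 + 10 ** (log_kow - 7.0))   # toy partitioning
    return f_water * 10 ** log_t_h2o + (1 - f_water) * 10 ** log_t_air

pov = toy_fate_model(log_kow, log_t_air, log_t_h2o)
print("P_ov median and 90% interval (days):", np.percentile(pov, [5, 50, 95]))

for name, x in [("log_kow", log_kow), ("log_t_air", log_t_air),
                ("log_t_h2o", log_t_h2o)]:
    rho, _ = spearmanr(x, pov)
    print(name, round(rho, 2))
```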

  17. Multi-model seasonal forecast of Arctic sea-ice: forecast uncertainty at pan-Arctic and regional scales

    Science.gov (United States)

    Blanchard-Wrigglesworth, E.; Barthélemy, A.; Chevallier, M.; Cullather, R.; Fučkar, N.; Massonnet, F.; Posey, P.; Wang, W.; Zhang, J.; Ardilouze, C.; Bitz, C. M.; Vernieres, G.; Wallcraft, A.; Wang, M.

    2016-10-01

    Dynamical model forecasts in the Sea Ice Outlook (SIO) of September Arctic sea-ice extent over the last decade have shown lower skill than that found in both idealized model experiments and hindcasts of previous decades. Additionally, it is unclear how different model physics, initial conditions or forecast post-processing (bias correction) techniques contribute to SIO forecast uncertainty. In this work, we have produced a seasonal forecast of 2015 Arctic summer sea ice using SIO dynamical models initialized with identical sea-ice thickness in the central Arctic. Our goals are to calculate the relative contribution of model uncertainty and irreducible error growth to forecast uncertainty and assess the importance of post-processing, and to contrast pan-Arctic forecast uncertainty with regional forecast uncertainty. We find that prior to forecast post-processing, model uncertainty is the main contributor to forecast uncertainty, whereas after forecast post-processing forecast uncertainty is reduced overall, model uncertainty is reduced by an order of magnitude, and irreducible error growth becomes the main contributor to forecast uncertainty. While all models generally agree in their post-processed forecasts of September sea-ice volume and extent, this is not the case for sea-ice concentration. Additionally, forecast uncertainty of sea-ice thickness grows at a much higher rate along Arctic coastlines relative to the central Arctic ocean. Potential ways of offering spatial forecast information based on the timescale over which the forecast signal beats the noise are also explored.
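
    A minimal sketch of the uncertainty partition used here, on synthetic numbers: across-model spread of ensemble means stands in for model uncertainty, within-model member spread for irreducible error growth, and a simple re-centering stands in for bias-correcting post-processing.

```python
# Partitioning multi-model forecast uncertainty (synthetic data).
import numpy as np

rng = np.random.default_rng(1)
n_models, n_members = 6, 10
# Synthetic September extent forecasts (million km^2): model bias + noise.
model_means = rng.normal(4.5, 0.5, n_models)             # differing physics/bias
forecasts = model_means[:, None] + rng.normal(0, 0.2, (n_models, n_members))

model_unc = forecasts.mean(axis=1).std(ddof=1)                 # across-model
irreducible = np.sqrt(forecasts.var(axis=1, ddof=1).mean())    # within-model

# "Post-processing" as bias correction: here we simply re-center each
# model's members on the grand mean (a real system would use hindcasts).
corrected = forecasts - forecasts.mean(axis=1, keepdims=True) + forecasts.mean()
print(f"model uncertainty: {model_unc:.2f}, irreducible: {irreducible:.2f}, "
      f"model uncertainty after correction: "
      f"{corrected.mean(axis=1).std(ddof=1):.2f}")
```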

  19. Uncertainty Assessment: Reservoir Inflow Forecasting with Ensemble Precipitation Forecasts and HEC-HMS

    Directory of Open Access Journals (Sweden)

    Sheng-Chi Yang

    2014-01-01

    During an extreme event, accurate inflow forecasting with enough lead time helps reservoir operators decrease the impact of floods downstream. Furthermore, being able to operate reservoirs efficiently could help maximize flood protection while saving water for drier times of the year. This study combines ensemble quantitative precipitation forecasts and a hydrological model to provide 3-day inflow forecasts for the Shihmen Reservoir, Taiwan. A total of six historical typhoons were used for model calibration, validation, and application. An understanding of how uncertainties cascade from the numerical weather model through the hydrological model is necessary for better use of the forecasts. This study therefore assessed forecast uncertainty in the magnitude and timing of peak and cumulative inflows. It found that using the ensemble mean carried less uncertainty than randomly selecting an individual member, and that inflow forecasts accumulated over shorter periods had higher uncertainty. The results showed that using ensemble precipitation forecasts with the hydrological model offers the advantage of extra lead time and serves as a valuable reference for reservoir operations.
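
    The kind of assessment described can be sketched as follows: a synthetic precipitation ensemble is run through a toy linear-reservoir model standing in for HEC-HMS, and the spread of peak magnitude and timing across members is compared with the ensemble-mean forecast. All numbers and the runoff function are assumptions.

```python
# Ensemble inflow-forecast uncertainty sketch (synthetic 3-day event).
import numpy as np

rng = np.random.default_rng(7)
hours = np.arange(72)                                    # 3-day horizon
base = 20 * np.exp(-0.5 * ((hours - 30) / 8.0) ** 2)     # toy storm (mm/h)
members = base * rng.lognormal(0, 0.3, (20, 1))          # 20 ensemble members

def toy_runoff(rain, k=0.9, c=5.0):
    """Linear-reservoir stand-in for a hydrological model:
    q[t] = k*q[t-1] + c*rain[t]."""
    q = np.zeros_like(rain)
    for t in range(1, rain.size):
        q[t] = k * q[t - 1] + c * rain[t]
    return q

flows = np.array([toy_runoff(m) for m in members])
peaks, peak_times = flows.max(axis=1), flows.argmax(axis=1)
mean_flow = toy_runoff(members.mean(axis=0))             # ensemble-mean forecast
print("peak spread (member 5-95%):", np.percentile(peaks, [5, 95]))
print("peak of ensemble-mean forecast:", mean_flow.max())
print("peak timing spread (h):", peak_times.min(), "-", peak_times.max())
```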

  20. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    Model evaluation is often performed at only a few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently of ground truth measurements, these analyses are suitable for testing the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for all combinations of these factors, which allows quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly with input factors such as topography and ground type. The analysis shows that model evaluation performed at single locations may not be representative of the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably between ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several
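
    The factorial design described, evaluating one input's sensitivity at every combination of topographic factors, can be sketched with a toy response function; every coefficient below is invented, not taken from the permafrost model.

```python
# One-factor sensitivity (ground albedo) evaluated across a small factorial
# grid of elevation and aspect, using a hypothetical stand-in model.
import itertools
import numpy as np

def toy_magt(elev_m, south_facing, albedo):
    # Stand-in for the permafrost model: lapse rate plus a radiation term
    # that is stronger on south slopes and at low elevation (shorter snow
    # cover), modulated by albedo.
    radiation = (1.0 - albedo) * (3.0 if south_facing else 1.0)
    snow_free = max(0.0, 1.5 - elev_m / 3000.0)
    return 8.0 - 6.5 * elev_m / 1000.0 + radiation * snow_free

for elev, south in itertools.product([1500, 2500, 3500], [True, False]):
    lo, hi = toy_magt(elev, south, 0.3), toy_magt(elev, south, 0.1)
    print(f"elev={elev} m, south={south}: "
          f"dMAGT for albedo 0.3 -> 0.1 = {hi - lo:.2f} C")
```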

  1. Uncertainties in model predictions of nitrogen fluxes from agro-ecosystems in Europe

    Directory of Open Access Journals (Sweden)

    J. Kros

    2012-05-01

    To assess the responses of nitrogen and greenhouse gas emissions to pan-European changes in land cover, land management and climate, an integrated dynamic model, INTEGRATOR, has been developed. This model includes both simple process-based descriptions and empirical relationships, and uses detailed GIS-based environmental and farming data in combination with various downscaling methods. This paper analyses the propagation of uncertainties in model inputs and parameters to outputs of INTEGRATOR, using a Monte Carlo analysis. Uncertain model inputs and parameters were represented by probability distributions, while spatial correlation in these uncertainties was taken into account by assigning correlation coefficients at various spatial scales. The uncertainty propagation was analysed for the emissions of NH3, N2O and NOx, for N leaching to groundwater, and for N surface runoff to surface water, for the entire EU27 and for individual countries. Results show large uncertainties for N leaching and N runoff (relative errors of ~19% for Europe as a whole) and smaller uncertainties for the emissions of N2O, NH3 and NOx (relative errors of ~12%). Uncertainties for Europe as a whole were much smaller than uncertainties at country level, because errors partly cancelled out due to spatial aggregation.
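
    One ingredient of such an analysis, drawing spatially correlated uncertain inputs and watching errors partly cancel under aggregation, can be sketched as follows; the correlation length, lognormal spread, and grid are assumptions, not INTEGRATOR's settings.

```python
# Spatially correlated Monte Carlo inputs via a Cholesky factor of an
# assumed exponential correlation model between grid cells.
import numpy as np

rng = np.random.default_rng(3)
xy = rng.uniform(0, 100, (50, 2))                  # 50 toy cell centers (km)
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
corr = np.exp(-d / 30.0)                           # assumed 30 km corr. length
L = np.linalg.cholesky(corr + 1e-10 * np.eye(50))  # jitter for stability

# Lognormal input (e.g., an emission factor): median 1.0, geometric sd 1.5,
# correlated across cells; one row per Monte Carlo iteration.
n_mc = 1000
z = rng.standard_normal((n_mc, 50)) @ L.T
emission_factor = np.exp(np.log(1.0) + np.log(1.5) * z)

# Aggregation effect: cell-level spread vs. relative spread of the total.
totals = emission_factor.sum(axis=1)
print(f"typical cell sd: {emission_factor.std(axis=0).mean():.2f}, "
      f"relative sd of the spatial total: {totals.std() / totals.mean():.3f}")
```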

  2. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibration to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of the AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implementing the independent validation efforts, followed by recalibration of empirical parameters.

  3. Climate change impact assessment and adaptation under uncertainty

    NARCIS (Netherlands)

    Wardekker, J.A.

    2011-01-01

    Expected impacts of climate change are associated with large uncertainties, particularly at the local level. Adaptation scientists, practitioners, and decision-makers will need to find ways to cope with these uncertainties. Several approaches have been suggested as ‘uncertainty-proof’ to some

  4. UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS

    Directory of Open Access Journals (Sweden)

    Fabiana Lucena Oliveira

    2014-05-01

    This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching transportation modes to the chains they best serve. From a detailed analysis of the uncertainty matrix, transportation modes best suited to the management of each chain are suggested, so that transport most appropriately realizes the gains proposed by the original model, particularly when supply chains are distant from their suppliers of raw materials and/or inputs. Agile supply chains, one outcome of the Uncertainty Supply Chain Model, are analyzed in detail, with special attention to the Manaus Industrial Center. The research was conducted at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which comprises different supply chains and strategies sharing the same transport, handling, storage, and customs-clearance infrastructure, and which relies on inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the Uncertainty Supply Chain Model, agile supply chains, the Manaus Industrial Center (MIC), and Brazilian legislation as a business case, presenting the concepts and features of each. The main goal is to present and discuss how transport can support the Uncertainty Supply Chain Model and so complete the management model. The results confirm the hypothesis that integrated logistics processes can guarantee the attractiveness of industrial agglomerations, and they open a discussion of logistics management when suppliers are far from the manufacturing center.

  5. Uncertainty quantification of squeal instability via surrogate modelling

    Science.gov (United States)

    Nobari, Amir; Ouyang, Huajiang; Bannister, Paul

    2015-08-01

    One of the major issues car manufacturers face is the noise and vibration of brake systems. Of the different sorts of noise and vibration a brake system may generate, squeal, an irritating high-frequency noise, costs manufacturers significantly. Despite considerable research on brake squeal, its root cause is still not fully understood; the most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of its major drawbacks, nevertheless, is that the effects of variability and uncertainty are not included in the results. Uncertainty and variability are two inseparable parts of any brake system: uncertainty is mainly caused by friction, contact, wear and thermal effects, while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time. Most uncertainty analysis techniques rely on the results of many deterministic analyses, while a full finite element model of a brake system typically consists of millions of degrees of freedom and many load cases; running times of such models are so long that the automotive industry is reluctant to do many deterministic analyses. This paper, instead, proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
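
    The surrogate idea can be sketched as follows: fit a cheap emulator (here a Gaussian process) to a handful of expensive complex-eigenvalue runs, then perform the Monte Carlo on the emulator. The stand-in "FE model", the input distributions, and the kernel choice are all assumptions, not the paper's setup.

```python
# Surrogate-based uncertainty propagation for an unstable mode's real part.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_fe_model(mu, stiffness):
    # Hypothetical stand-in for a complex-eigenvalue FE run: real part of
    # an unstable eigenvalue as a smooth function of friction coefficient
    # and a contact stiffness parameter.
    return 50 * (mu - 0.35) + 8 * np.sin(3 * stiffness) + 5 * mu * stiffness

rng = np.random.default_rng(11)
X_train = rng.uniform([0.2, 0.0], [0.6, 2.0], (30, 2))   # 30 "FE runs"
y_train = expensive_fe_model(X_train[:, 0], X_train[:, 1])

gp = GaussianProcessRegressor(ConstantKernel() * RBF([0.1, 0.5]),
                              normalize_y=True).fit(X_train, y_train)

# Monte Carlo on the surrogate: friction and stiffness as uncertain inputs.
X_mc = np.column_stack([rng.normal(0.4, 0.05, 20_000),
                        rng.normal(1.0, 0.2, 20_000)])
real_part = gp.predict(X_mc)
print("P(instability) ~", (real_part > 0).mean())
```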

  6. Considering the ranges of uncertainties in the New Probabilistic Seismic Hazard Assessment of Germany - Version 2016

    Science.gov (United States)

    Grunthal, Gottfried; Stromeyer, Dietrich; Bosse, Christian; Cotton, Fabrice; Bindi, Dino

    2017-04-01

    The seismic load parameters for the upcoming National Annex to Eurocode 8 result from the reassessment of the seismic hazard supported by the German Institution for Civil Engineering. This 2016 version of the hazard assessment for Germany as target area was based on a comprehensive incorporation of all accessible uncertainties in models and parameters and on the provision of a rational framework for treating these uncertainties in a transparent way. The developed seismic hazard model represents significant improvements: it is based on updated and extended databases, comprehensive ranges of models, robust methods, and a selection of ground motion prediction equations of the latest generation. The output specifications were designed according to user-oriented needs, as suggested by the two review teams supervising the entire project. In particular, seismic load parameters were calculated for rock conditions with a vS30 of 800 m s−1 for three hazard levels (10%, 5% and 2% probability of occurrence or exceedance within 50 years), in the form of, e.g., uniform hazard spectra (UHS) based on 19 spectral periods in the range of 0.01-3 s, and seismic hazard maps of spectral response accelerations for different spectral periods or of macroseismic intensities. The developed hazard model consists of a logic tree with 4040 end branches and essential innovations employed to capture epistemic uncertainties and aleatory variabilities. The computation scheme enables the sound calculation of the mean and any quantile of the required seismic load parameters. Mean, median and 84th percentiles of the load parameters were provided together with the full calculation model to clearly illustrate the uncertainties of such a probabilistic assessment for a region of low-to-moderate seismicity. The regional variations of these uncertainties (e.g. ratios between the mean and median hazard estimates) were analyzed and discussed.
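
    How a logic tree yields mean and quantile hazard estimates can be sketched on a tiny invented tree: each end branch carries a weight and a hazard value, the mean is the weight-averaged value, and quantiles come from the weighted distribution of branch values. The weights and rates below are made up for illustration.

```python
# Mean and quantiles over logic-tree end branches at one ground-motion level.
import numpy as np

# Branch weights (summing to 1) and annual exceedance rates for a
# hypothetical 5-branch tree.
w = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
rates = np.array([2.1e-3, 1.4e-3, 3.3e-3, 0.9e-3, 5.0e-3])

mean_rate = np.sum(w * rates)

def weighted_quantile(values, weights, q):
    order = np.argsort(values)
    cum = np.cumsum(weights[order])       # weighted CDF over sorted branches
    return values[order][np.searchsorted(cum, q)]

median = weighted_quantile(rates, w, 0.50)
p84 = weighted_quantile(rates, w, 0.84)
print(mean_rate, median, p84, "mean/median ratio:", round(mean_rate / median, 2))
```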

  7. Impact of inherent meteorology uncertainty on air quality model predictions

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...

  8. Quantification of Modelling Uncertainties in Turbulent Flow Simulations

    NARCIS (Netherlands)

    Edeling, W.N.

    2015-01-01

    The goal of this thesis is to make predictive simulations with Reynolds-Averaged Navier-Stokes (RANS) turbulence models, i.e. simulations with a systematic treatment of model and data uncertainties and their propagation through a computational model to produce predictions of quantities of interest.

  10. Uncertainty quantification in Rothermel's Model using an efficient sampling method

    Science.gov (United States)

    Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick

    2007-01-01

    The purpose of the present work is to quantify parametric uncertainty in Rothermel’s wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
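
    An efficient-sampling version of such a parametric study can be sketched as follows; the spread function below is a toy stand-in for Rothermel's equation set, and the input ranges are invented, not fuel-model values.

```python
# Parametric uncertainty in a fire-spread relation via Latin hypercube
# sampling (stratified, hence more efficient than plain Monte Carlo).
import numpy as np
from scipy.stats import qmc

def toy_ros(wind_ms, moisture_frac, fuel_kg_m2):
    # Toy rate-of-spread: increasing in wind and fuel load, decreasing in
    # fuel moisture (not the real Rothermel equations).
    return 0.3 * fuel_kg_m2 * (1 + 0.8 * wind_ms ** 1.4) * np.exp(-8 * moisture_frac)

sampler = qmc.LatinHypercube(d=3, seed=0)
u = sampler.random(512)                    # 512 stratified samples in [0,1)^3
lo = np.array([0.0, 0.02, 0.2])            # wind, moisture, fuel: lower bounds
hi = np.array([15.0, 0.30, 2.0])           # assumed upper bounds
x = qmc.scale(u, lo, hi)

ros = toy_ros(x[:, 0], x[:, 1], x[:, 2])
print("ROS mean and 5-95% interval:", ros.mean(), np.percentile(ros, [5, 95]))
```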

  11. Assessing uncertainty in high-resolution spatial climate data across the US Northeast.

    Directory of Open Access Journals (Sweden)

    Daniel A Bishop

    Local and regional-scale knowledge of climate change is needed to model ecosystem responses, assess vulnerabilities and devise effective adaptation strategies. High-resolution gridded historical climate (GHC) products address this need, but come with multiple sources of uncertainty that are typically not well understood by data users. To better understand this uncertainty in a region with a complex climatology, we conducted a ground-truthing analysis of two 4 km GHC temperature products (PRISM and NRCC) for the US Northeast, using 51 Cooperative Network (COOP) weather stations utilized by both GHC products. We estimated GHC prediction error for monthly temperature means and trends (1980-2009) across the US Northeast and evaluated landscape effects (e.g., elevation, distance from coast) on those prediction errors. Results indicated that station-based prediction errors for the two GHC products were similar in magnitude, but on average, the NRCC product predicted cooler than observed temperature means and trends, while PRISM was cooler for means and warmer for trends. We found no evidence for systematic sources of uncertainty across the US Northeast, although errors were largest at high elevations. Errors in the coarse-scale (4 km) digital elevation models used by each product were correlated with temperature prediction errors, more so for NRCC than PRISM. In summary, uncertainty
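
    The ground-truthing step itself is simple to sketch: compare gridded-product values at station locations against station observations, then regress the errors on a landscape covariate such as elevation to look for systematic effects. The station records below are made up for illustration.

```python
# Station-based prediction-error metrics for a gridded climate product.
import numpy as np

rng = np.random.default_rng(5)
n = 51
elev_km = rng.uniform(0, 1.5, n)                      # station elevations
obs = 8.0 - 5.0 * elev_km + rng.normal(0, 0.3, n)     # observed mean T (deg C)
# Synthetic gridded product: small cold bias plus an elevation-dependent error.
grid = obs + rng.normal(-0.2, 0.4, n) - 0.4 * (elev_km - elev_km.mean())

err = grid - obs                                      # prediction error
bias, rmse = err.mean(), np.sqrt((err ** 2).mean())
slope, intercept = np.polyfit(elev_km, err, 1)        # elevation dependence
print(f"bias={bias:.2f} C  rmse={rmse:.2f} C  error-vs-elev slope={slope:.2f} C/km")
```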