Model uncertainty in safety assessment
International Nuclear Information System (INIS)
Pulkkinen, U.; Huovinen, T. (VTT Automation, Espoo (Finland). Industrial Automation)
1996-01-01
Uncertainty analyses are an essential part of any risk assessment. Usually the uncertainties of reliability model parameter values are described by probability distributions, and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. Model uncertainty is characterized, and some approaches to modeling and quantifying it are discussed. The emphasis is on so-called mixture models, which have been applied in probabilistic safety assessments (PSAs). Some possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples the analysis is based on simple mixture models, which are found to be applicable in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
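As a rough sketch of the mixture-model idea, candidate reliability models can be weighted by how well they explain observed failure data, with the predictive failure intensity given by the weighted mixture. The two candidate rates, the Poisson observation model, and the prior weights below are invented for illustration; they are not the report's actual case studies.

```python
import math

# Two candidate models for a component's failure intensity (illustrative
# values, not taken from the report): a "nominal" and a "degraded" rate.
LAMBDAS = {"nominal": 0.5, "degraded": 2.0}   # failures per year
PRIOR = {"nominal": 0.7, "degraded": 0.3}     # prior model weights

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def posterior_weights(k_obs, t):
    """Update the mixture weights after observing k_obs failures in t years."""
    like = {m: poisson_pmf(k_obs, lam * t) for m, lam in LAMBDAS.items()}
    z = sum(PRIOR[m] * like[m] for m in LAMBDAS)
    return {m: PRIOR[m] * like[m] / z for m in LAMBDAS}

def mixture_mean_intensity(weights):
    """Predictive (mixture) failure intensity, weighted over the models."""
    return sum(weights[m] * LAMBDAS[m] for m in LAMBDAS)

w = posterior_weights(k_obs=6, t=4.0)   # six failures in four years
print(w, mixture_mean_intensity(w))
```

Observing six failures in four years shifts most of the weight to the "degraded" model, and the mixture intensity moves accordingly; the spread between the component models is one simple expression of model uncertainty.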
Uncertainties in radioecological assessment models
International Nuclear Information System (INIS)
Hoffman, F.O.; Miller, C.W.; Ng, Y.C.
1983-01-01
Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because models are inexact representations of real systems. The major sources of this uncertainty are related to bias in model formulation and imprecision in parameter estimation. The magnitude of uncertainty is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, health risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible. 41 references, 4 figures, 4 tables
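A minimal sketch of such a stochastic procedure, using a hypothetical three-parameter dose model with invented lognormal parameter distributions: parameters are sampled, propagated through the model, and then ranked by their Spearman rank correlation with the predicted dose.

```python
import math
import random

random.seed(1)

def model(transfer, intake_rate, dose_factor):
    """Toy dose model (hypothetical): transfer x intake x dose conversion."""
    return transfer * intake_rate * dose_factor

N = 5000
# Lognormal parameter uncertainties (illustrative medians / log-sigmas).
samples = {
    "transfer":    [random.lognormvariate(math.log(0.1), 0.8) for _ in range(N)],
    "intake_rate": [random.lognormvariate(math.log(2.0), 0.3) for _ in range(N)],
    "dose_factor": [random.lognormvariate(math.log(5e-8), 0.5) for _ in range(N)],
}
doses = [model(samples["transfer"][i], samples["intake_rate"][i],
               samples["dose_factor"][i]) for i in range(N)]

def rank(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(xs, ys):
    """Spearman rank correlation (Pearson correlation of the ranks)."""
    rx, ry = rank(xs), rank(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den

# Rank parameters by their contribution to the predicted dose uncertainty.
ranking = sorted(((spearman(v, doses), k) for k, v in samples.items()),
                 reverse=True)
print(ranking)
```

The parameter with the widest distribution dominates the ranking, mirroring the point that importance ranking follows from the relative contribution to the overall predicted uncertainty.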
Assessing uncertainty in mechanistic models
Edwin J. Green; David W. MacFarlane; Harry T. Valentine
2000-01-01
Concern over potential global change has led to increased interest in the use of mechanistic models for predicting forest growth. The rationale for this interest is that empirical models may be of limited usefulness if environmental conditions change. Intuitively, we expect that mechanistic models, grounded as far as possible in an understanding of the biology of tree...
Probabilistic Radiological Performance Assessment Modeling and Uncertainty
Tauxe, J.
2004-12-01
A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making under uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of the potential future risk to human receptors from disposal of low-level radioactive waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature and hence reflects the current state of knowledge about the site: probability distributions capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and these are propagated through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model that supports their decision making. While this information may make some regulators uncomfortable, uncertainties that were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is known rather than hoped for. The model includes many typical features and processes that would be part of a PA but is entirely fictitious; it does not represent any particular site and is meant as a generic example.
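The Monte Carlo workflow described above can be outlined in a few lines. The release/transport/dose chain, the distributions, and the dose limit below are all invented stand-ins, not the GoldSim model itself:

```python
import random
import statistics

random.seed(7)
DOSE_LIMIT = 0.25  # mSv/yr, an illustrative performance objective

def one_realization():
    """One Monte Carlo realization of a hypothetical release/transport/dose chain."""
    release = random.uniform(0.5, 1.5)          # relative source term
    dilution = random.lognormvariate(0.0, 0.6)  # transport dilution factor
    dcf = random.uniform(0.05, 0.15)            # dose per unit concentration
    return release / dilution * dcf

doses = [one_realization() for _ in range(10000)]

# Statistical summaries for comparison against the performance objective.
mean_dose = statistics.fmean(doses)
p95 = sorted(doses)[int(0.95 * len(doses))]
p_exceed = sum(d > DOSE_LIMIT for d in doses) / len(doses)
print(mean_dose, p95, p_exceed)
```

The exceedance probability makes explicit the uncertainty that a single deterministic run would hide.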
Uncertainty Assessment in Urban Storm Water Drainage Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren
The objective of this paper is to give an overall description of the author's PhD study concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...
Assessing scenario and parametric uncertainties in risk analysis: a model uncertainty audit
International Nuclear Information System (INIS)
Tarantola, S.; Saltelli, A.; Draper, D.
1999-01-01
In the present study, a model audit is carried out on a computational model used for predicting maximum radiological doses to humans in the field of nuclear waste disposal. Global uncertainty and sensitivity analyses are employed to assess output uncertainty and to quantify the contribution of parametric and scenario uncertainties to the model output. These tools are of fundamental importance for risk analysis and decision-making purposes.
Uncertainties in environmental radiological assessment models and their implications
International Nuclear Information System (INIS)
Hoffman, F.O.; Miller, C.W.
1983-01-01
Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible
Sensitivity and uncertainty analyses for performance assessment modeling
International Nuclear Information System (INIS)
Doctor, P.G.
1988-08-01
Sensitivity and uncertainty analyses methods for computer models are being applied in performance assessment modeling in the geologic high level radioactive waste repository program. The models used in performance assessment tend to be complex physical/chemical models with large numbers of input variables. There are two basic approaches to sensitivity and uncertainty analyses: deterministic and statistical. The deterministic approach to sensitivity analysis involves numerical calculation or employs the adjoint form of a partial differential equation to compute partial derivatives; the uncertainty analysis is based on Taylor series expansions of the input variables propagated through the model to compute means and variances of the output variable. The statistical approach to sensitivity analysis involves a response surface approximation to the model with the sensitivity coefficients calculated from the response surface parameters; the uncertainty analysis is based on simulation. The methods each have strengths and weaknesses. 44 refs
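The two approaches can be contrasted on a toy response function (the function and the input moments are invented): the deterministic route propagates means and variances through a first-order Taylor expansion built from numerical partial derivatives, while the statistical route simulates.

```python
import math
import random

random.seed(3)

def f(x, y):
    """Stand-in performance model (not from the report): a nonlinear response."""
    return x * math.exp(0.5 * y)

mx, my = 2.0, 0.4       # input means
sx, sy = 0.2, 0.1       # input standard deviations (inputs assumed independent)

# Deterministic approach: first-order Taylor series about the means.
h = 1e-6
dfdx = (f(mx + h, my) - f(mx - h, my)) / (2 * h)
dfdy = (f(mx, my + h) - f(mx, my - h)) / (2 * h)
taylor_mean = f(mx, my)
taylor_var = (dfdx * sx) ** 2 + (dfdy * sy) ** 2

# Statistical approach: plain Monte Carlo simulation.
zs = [f(random.gauss(mx, sx), random.gauss(my, sy)) for _ in range(200000)]
mc_mean = sum(zs) / len(zs)
mc_var = sum((z - mc_mean) ** 2 for z in zs) / (len(zs) - 1)

print(taylor_mean, taylor_var, mc_mean, mc_var)
```

For this mildly nonlinear function the two estimates agree closely; the Taylor approximation degrades as nonlinearity or input variance grows, which is one of the strengths-and-weaknesses trade-offs the report weighs.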
Implications of model uncertainty for the practice of risk assessment
International Nuclear Information System (INIS)
Laskey, K.B.
1994-01-01
A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making
Spatial variability and parametric uncertainty in performance assessment models
International Nuclear Information System (INIS)
Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo
2011-01-01
The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways, resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks', reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)
Assessing Groundwater Model Uncertainty for the Central Nevada Test Area
International Nuclear Information System (INIS)
Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd
2002-01-01
The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified-head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, the matrix diffusion coefficient, and the geochemical release function, which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions to each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis was performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities, such that large changes in the uncertain input parameters cause small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation.
Assessing uncertainty in SRTM elevations for global flood modelling
Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.
2017-12-01
The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial in constraining uncertainty about elevations and assessing the impact of these errors upon flood prediction. An assessment of SRTM error was carried out by Rodriguez et al. (2006), but this did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors of the SRTM in the landscape that matters most to flood models: the floodplain. Therefore, this study attempts this task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LiDAR elevations for three deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LiDAR, perturbations of the 90 m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above the native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic model, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product is an inundation map giving the probability of each pixel being flooded, based on the catalogue of DEMs. In a world of increasing computer power but a lack of detailed datasets, this approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM can impact the hazard assessment.
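The perturbation step can be sketched as follows: an assumed exponential error covariance is factorized by Cholesky decomposition and used to draw spatially correlated vertical errors, yielding a catalogue of plausible DEMs. The covariance parameters and the 1-D transect below are illustrative stand-ins for the paper's empirically fitted 2-D error fields.

```python
import math
import random

random.seed(11)

def exp_cov(d, sill=1.0, corr_len=3.0):
    """Exponential covariance of vertical DEM error (illustrative parameters)."""
    return sill * math.exp(-d / corr_len)

# Small 1-D transect of DEM cells; real use would be a 2-D grid.
n = 40
cov = [[exp_cov(abs(i - j)) for j in range(n)] for i in range(n)]

def cholesky(a):
    """Plain Cholesky factorization of a symmetric positive-definite matrix."""
    m = len(a)
    L = [[0.0] * m for _ in range(m)]
    for i in range(m):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(a[i][i] - s)
            else:
                L[i][j] = (a[i][j] - s) / L[j][j]
    return L

L = cholesky(cov)

def perturbed_dem(dem):
    """One plausible DEM: add spatially correlated error to the elevations."""
    z = [random.gauss(0.0, 1.0) for _ in range(n)]
    err = [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]
    return [e + de for e, de in zip(dem, err)]

base = [10.0] * n                               # flat floodplain transect
catalogue = [perturbed_dem(base) for _ in range(100)]
```

Each member of the catalogue can then be fed to the flood model; the fraction of members flooding a given cell approximates the per-pixel flood probability.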
Energy Technology Data Exchange (ETDEWEB)
Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)
2016-09-01
Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked-uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas-cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle fuel design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross-section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (supercell), in addition to the first results of an investigation into cross-section generation effects for the supercell problem. The two-dimensional deterministic code known as New ESC-based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross-section evaluation, and the results obtained were compared to those of the three-dimensional stochastic SCALE module KENO-VI. The NEWT cross-section libraries were generated for several permutations of the current benchmark supercell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise
Integration of inaccurate data into model building and uncertainty assessment
Energy Technology Data Exchange (ETDEWEB)
Coleou, Thierry
1998-12-31
Model building can be seen as integrating numerous measurements and mapping through data points considered exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed, and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid both for 'base case' model building and for stochastic simulations for uncertainty analysis. 5 refs., 3 figs.
Geostatistical modeling of groundwater properties and assessment of their uncertainties
International Nuclear Information System (INIS)
Honda, Makoto; Yamamoto, Shinya; Sakurai, Hideyuki; Suzuki, Makoto; Sanada, Hiroyuki; Matsui, Hiroya; Sugita, Yutaka
2010-01-01
The distribution of groundwater properties is important for understanding deep underground hydrogeological environments. This paper proposes a geostatistical system for modeling groundwater properties that correlate with ground resistivity data obtained from widespread and exhaustive surveys. That is, a methodology for integrating resistivity data measured by various methods, and a methodology for modeling the groundwater properties using the integrated resistivity data, have been developed. The proposed system has been validated using data obtained in the Horonobe Underground Research Laboratory project. Additionally, quantification of the uncertainties in the estimated model has been attempted by numerical simulations based on the data. As a result, the uncertainties of the proposed model are estimated to be lower than those of traditional models. (author)
Assessment of errors and uncertainty patterns in GIA modeling
DEFF Research Database (Denmark)
Barletta, Valentina Roberta; Spada, G.
2012-01-01
During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one......, such as time-evolving shorelines and paleo-coastlines. In this study we quantify these uncertainties and their propagation in GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimates in ice mass balance in Antarctica and Greenland...
Li, L.; Xu, C.-Y.; Engeland, K.
2012-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, different approaches have been used in hydrological models. The Bayesian method, which incorporates different sources of information into a single analysis through Bayes' theorem, is one of the most widely used methods for uncertainty assessment of hydrological models. However, none of these applications treats well the uncertainty in the extreme flows of hydrological model simulations. This study proposes a Bayesian modularization approach for uncertainty assessment of conceptual hydrological models that considers the extreme flows. It includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization approach and traditional Bayesian models using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions are used in combination with the traditional Bayesian models: the AR(1) plus Normal, time-period-independent model (Model 1); the AR(1) plus Normal, time-period-dependent model (Model 2); and the AR(1) plus multi-normal model (Model 3). The results reveal that (1) the simulations derived from the Bayesian modularization method are more accurate, with the highest Nash-Sutcliffe efficiency value, and (2) the Bayesian modularization method performs best in uncertainty estimates of entire flows and in terms of application and computational efficiency. The study thus introduces a new approach for reducing the effect of extreme flows on the discharge uncertainty assessment of hydrological models via Bayesian methods. Keywords: extreme flow, uncertainty assessment, Bayesian modularization, hydrological model, WASMOD
DEFF Research Database (Denmark)
Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.
different conceptual models may describe the same contaminated site equally well. In many cases, conceptual model uncertainty has been shown to be one of the dominant sources for uncertainty and is therefore essential to account for when quantifying uncertainties in risk assessments. We present here......A key component in risk assessment of contaminated sites is the formulation of a conceptual site model. The conceptual model is a simplified representation of reality and forms the basis for the mathematical modelling of contaminant fate and transport at the site. A conceptual model should...... a Bayesian Belief Network (BBN) approach for evaluating the uncertainty in risk assessment of groundwater contamination from contaminated sites. The approach accounts for conceptual model uncertainty by considering multiple conceptual models, each of which represents an alternative interpretation of the site...
DEFF Research Database (Denmark)
Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard
2015-01-01
Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation...... data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified...... uncertainties can be implemented in probabilistic reliability assessments....
DEFF Research Database (Denmark)
Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard
2014-01-01
Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Considered are four different wave models and validation...... data is collected from published scientific research. The bias, the root-mean-square error as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can...... be implemented in probabilistic reliability assessments....
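The three validation statistics named above are straightforward to compute; the hindcast and buoy series below are invented for illustration:

```python
import math

def validation_stats(model_hs, measured_hs):
    """Bias, RMSE and scatter index for e.g. a significant wave height series."""
    n = len(measured_hs)
    errors = [m - o for m, o in zip(model_hs, measured_hs)]
    bias = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    scatter_index = rmse / (sum(measured_hs) / n)  # RMSE normalised by the mean
    return bias, rmse, scatter_index

# Hypothetical wave-model hindcast vs. buoy measurements (metres).
model = [1.2, 2.1, 0.9, 3.0, 1.8]
buoy = [1.0, 2.3, 1.1, 2.7, 1.9]
print(validation_stats(model, buoy))
```

Quantified this way, the statistics can enter a reliability assessment as a bias correction plus a model-uncertainty random variable.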
Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...
Li, Lu; Xu, Chong-Yu; Engeland, Kolbjørn
2013-04-01
With respect to model calibration, parameter estimation and analysis of uncertainty sources, various regression and probabilistic approaches are used in hydrological modeling. A family of Bayesian methods, which incorporates different sources of information into a single analysis through Bayes' theorem, is widely used for uncertainty assessment. However, none of these approaches can well treat the impact of high flows in hydrological modeling. This study proposes a Bayesian modularization uncertainty assessment approach in which the highest streamflow observations are treated as suspect information that should not influence the inference of the main bulk of the model parameters. The study includes a comprehensive comparison and evaluation of uncertainty assessments by the new Bayesian modularization method and standard Bayesian methods using the Metropolis-Hastings (MH) algorithm with the daily hydrological model WASMOD. Three likelihood functions were used in combination with the standard Bayesian method: the AR(1) plus Normal model independent of time (Model 1), the AR(1) plus Normal model dependent on time (Model 2) and the AR(1) plus Multi-normal model (Model 3). The results reveal that the Bayesian modularization method provides the most accurate streamflow estimates, as measured by the Nash-Sutcliffe efficiency, and the best uncertainty estimates for low, medium and entire flows compared to the standard Bayesian methods. The study thus provides a new approach for reducing the impact of high flows on the discharge uncertainty assessment of hydrological models via Bayesian methods.
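A stripped-down sketch of the standard Bayesian route (Model 1 style: AR(1) plus Normal errors, independent of time), using a one-parameter stand-in for WASMOD and random-walk Metropolis-Hastings; the rainfall-runoff relation and all numbers are invented:

```python
import math
import random

random.seed(5)

# Toy rainfall-runoff relation (a stand-in, not WASMOD): q = a * rain
rain = [3.0, 5.0, 2.0, 8.0, 4.0, 6.0, 1.0, 7.0] * 10
true_a, rho, sigma = 0.6, 0.5, 0.3

# Synthetic observations with AR(1)-correlated Normal errors.
obs, e_prev = [], 0.0
for r in rain:
    e_prev = rho * e_prev + random.gauss(0.0, sigma)
    obs.append(true_a * r + e_prev)

def log_like(a):
    """AR(1) plus Normal log-likelihood (rho, sigma assumed known here).

    The first residual is treated approximately as Normal(0, sigma).
    """
    res = [o - a * r for o, r in zip(obs, rain)]
    ll = -0.5 * (res[0] / sigma) ** 2
    for t in range(1, len(res)):
        innov = res[t] - rho * res[t - 1]
        ll += -0.5 * (innov / sigma) ** 2
    return ll

# Random-walk Metropolis-Hastings over the single parameter a (flat prior).
a, ll_cur, chain = 0.1, None, []
ll_cur = log_like(a)
for _ in range(20000):
    prop = a + random.gauss(0.0, 0.02)
    ll_prop = log_like(prop)
    if math.log(random.random()) < ll_prop - ll_cur:
        a, ll_cur = prop, ll_prop
    chain.append(a)

post = chain[5000:]                     # discard burn-in
post_mean = sum(post) / len(post)
print(post_mean)                        # should land near true_a = 0.6
```

The modularization variant would additionally downweight or exclude the highest flows from this likelihood so they cannot dominate the inference.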
Uncertainty Assessment in Long Term Urban Drainage Modelling
DEFF Research Database (Denmark)
Thorndahl, Søren
the probability of system failures (defined as either flooding or surcharge of manholes or combined sewer overflow); (2) an application of the Generalized Likelihood Uncertainty Estimation methodology in which an event based stochastic calibration is performed; and (3) long term Monte Carlo simulations...
Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment
Taner, M. U.; Wi, S.; Brown, C.
2017-12-01
The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks, such as scenario-based and vulnerability-based approaches, exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border of the United States and Canada.
International Nuclear Information System (INIS)
Miller, C.; Little, C.A.
1982-08-01
The purpose is to summarize estimates, based on currently available data, of the uncertainty associated with radiological assessment models. The models examined herein are those recommended previously for use in breeder reactor assessments. Uncertainty estimates are presented for models of atmospheric and hydrologic transport, terrestrial and aquatic food-chain bioaccumulation, and internal and external dosimetry. Both long-term and short-term release conditions are discussed. The uncertainty estimates presented in this report indicate that, for many sites, generic models and representative parameter values may be used to calculate doses from annual average radionuclide releases when these calculated doses are on the order of one-tenth or less of a relevant dose limit. For short-term, accidental releases, especially those from breeder reactors located in sites dominated by complex terrain and/or coastal meteorology, the uncertainty in the dose calculations may be much larger than an order of magnitude. As a result, it may be necessary to incorporate site-specific information into the dose calculation under these circumstances to reduce this uncertainty. However, even using site-specific information, natural variability and the uncertainties in the dose conversion factor will likely result in an overall uncertainty of greater than an order of magnitude for predictions of dose or concentration in environmental media following short-term releases.
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Binning, Philip John; McKnight, Ursula S.
2016-01-01
A conceptual site model (CSM) should identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels.
Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.
2017-12-01
The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance through its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by an adaptive surrogate-based multi-objective optimization procedure, using a MARS model to approximate the parameter-response relationship and the SCE-UA algorithm to search for the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about a 65-90% reduction in 1-NSE and a 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance with about 40
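The first-order Sobol' index quantifying a parameter's contribution to the response variance can be illustrated with a minimal double-loop Monte Carlo estimate. This is a generic sketch, not the MARS-based estimator of the study; the toy model and sample sizes are illustrative assumptions:

```python
import random

rng = random.Random(1)

def model(x1, x2):
    # Toy response standing in for a single hydrological model run.
    return 4.0 * x1 + x2

def variance(values):
    m = sum(values) / len(values)
    return sum((v - m) ** 2 for v in values) / len(values)

def first_order_index(n_outer=1000, n_inner=100):
    """Double-loop Monte Carlo estimate of S1 = Var(E[Y|x1]) / Var(Y),
    with x1, x2 ~ U(0, 1)."""
    cond_means, all_y = [], []
    for _ in range(n_outer):
        x1 = rng.random()
        ys = [model(x1, rng.random()) for _ in range(n_inner)]
        cond_means.append(sum(ys) / n_inner)
        all_y.extend(ys)
    return variance(cond_means) / variance(all_y)

s1 = first_order_index()
# Analytically S1 = (16/12) / (17/12), roughly 0.94, for this toy model.
```

A large S1 marks x1 as influential on its own; the study's MARS surrogate serves the same purpose at a fraction of the computational cost.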
Parameter estimation and uncertainty assessment in hydrological modelling
DEFF Research Database (Denmark)
Blasone, Roberta-Serena
Rational and efficient water resources management requires insight into and an understanding of the hydrological processes, as well as accurate estimates of the available water volumes in both surface water and groundwater reservoirs. For this purpose, hydrological models are an indispensable tool. Over the most recent 1...
Some concepts of model uncertainty for performance assessments of nuclear waste repositories
International Nuclear Information System (INIS)
Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.
1994-01-01
Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models that are consistent with the observations, and these conceptual models may or may not have the same mathematical representation. Experience in modeling the performance of a waste repository (which is, in part, a geologic system) shows that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters, and it is usually not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided.
International Nuclear Information System (INIS)
Yim, Man-Sung
1995-01-01
Performance assessment is an essential step either in design or in licensing processes to ensure the safety of any proposed radioactive waste disposal facilities. Since performance assessment requires the use of computer codes, understanding the characteristics of the computer models used and the uncertainties of the estimated results is important. The PRESTO-EPA code, which was the basis of the Environmental Protection Agency's analysis for low-level-waste rulemaking, is widely used for various performance assessment activities in the country, yet no adequate information is available on the uncertainty characteristics of its results. In this study, the groundwater transport model of PRESTO-EPA was examined based on the analysis of C-14 transport, along with an investigation of the uncertainty characteristics.
DEFF Research Database (Denmark)
Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan
2008-01-01
In recent years, there has been an increase in the application of distributed, physically-based and integrated hydrological models. Many questions remain regarding how to properly calibrate and validate distributed models and assess the uncertainty of the estimated parameters and the spatially distributed outputs; multi-site validation must complement the usual time validation. In this study, we develop, through an application, a comprehensive framework for multi-criteria calibration and uncertainty assessment of distributed, physically-based, integrated hydrological models. A revised version of the generalized likelihood uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining...
Tyler Jon Smith
2008-01-01
In Montana and much of the Rocky Mountain West, the single most important parameter in forecasting the controls on regional water resources is snowpack. Despite the heightened importance of snowpack, few studies have considered the representation of uncertainty in coupled snowmelt/hydrologic conceptual models. Uncertainty estimation provides a direct interpretation of...
Ex-plant consequence assessment for NUREG-1150: models, typical results, uncertainties
International Nuclear Information System (INIS)
Sprung, J.L.
1988-01-01
The assessment of ex-plant consequences for NUREG-1150 source terms was performed using the MELCOR Accident Consequence Code System (MACCS). This paper briefly discusses the following elements of MACCS consequence calculations: input data, phenomena modeled, computational framework, typical results, controlling phenomena, and uncertainties. Wherever possible, NUREG-1150 results will be used to illustrate the discussion. 28 references
Measures of Model Uncertainty in the Assessment of Primary Stresses in Ship Structures
DEFF Research Database (Denmark)
Östergaard, Carsten; Dogliani, Mario; Guedes Soares, Carlos
1996-01-01
The paper considers various models and methods commonly used for linear elastic stress analysis and assesses the uncertainty involved in their application to the analysis of the distribution of primary stresses in the hull of a containership example, through statistical evaluations of the results...
Uncertainty analysis in safety assessment
International Nuclear Information System (INIS)
Lemos, Francisco Luiz de; Sullivan, Terry
1997-01-01
Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, like hydrogeology, meteorology, geochemistry, etc. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions make safety assessment projections filled with uncertainty. This paper addresses approaches for the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author)
Uncertainty and Sensitivity of Alternative Rn-222 Flux Density Models Used in Performance Assessment
Energy Technology Data Exchange (ETDEWEB)
Greg J. Shott, Vefa Yucel, Lloyd Desotell
2007-06-01
Performance assessments for the Area 5 Radioactive Waste Management Site on the Nevada Test Site have used three different mathematical models to estimate Rn-222 flux density. This study describes the performance, uncertainty, and sensitivity of the three models, which include the U.S. Nuclear Regulatory Commission Regulatory Guide 3.64 analytical method and two numerical methods. The uncertainty of each model was determined by Monte Carlo simulation using Latin hypercube sampling. The global sensitivity was investigated using the Morris one-at-a-time screening method, sample-based correlation and regression methods, the variance-based extended Fourier amplitude sensitivity test, and Sobol's sensitivity indices. The models were found to produce similar estimates of the mean and median flux density, but to have different uncertainties and sensitivities. When the Rn-222 effective diffusion coefficient was estimated using five different published predictive models, the radon flux density models were found to be most sensitive to the effective diffusion coefficient model selected, the emanation coefficient, and the radionuclide inventory. Using a site-specific measured effective diffusion coefficient significantly reduced the output uncertainty. When a site-specific effective-diffusion coefficient was used, the models were most sensitive to the emanation coefficient and the radionuclide inventory.
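The Latin hypercube sampling used to drive the Monte Carlo uncertainty runs can be sketched in a few lines of stdlib Python. This is a generic illustration of the technique, not the codes used in the performance assessment:

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random.Random(42)):
    """Stratified sample: each dimension is cut into n_samples equal strata,
    one point is drawn per stratum, and the strata are randomly paired
    across dimensions."""
    cols = []
    for _ in range(n_dims):
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)
        cols.append(col)
    # Transpose the per-dimension columns into a list of sample points.
    return [list(point) for point in zip(*cols)]

pts = latin_hypercube(10, 2)
# Every decile of each input dimension is covered exactly once, which is
# why LHS needs far fewer runs than simple random sampling for stable
# percentile estimates of the output.
```

In practice each unit-interval coordinate would be mapped through the inverse CDF of the corresponding parameter distribution before running the flux-density model.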
Uncertainty and Sensitivity of Alternative Rn-222 Flux Density Models Used in Performance Assessment
International Nuclear Information System (INIS)
Greg J. Shott, Vefa Yucel, Lloyd Desotell; non-NSTec authors: G. Pyles and Jon Carilli
2007-01-01
Performance assessments for the Area 5 Radioactive Waste Management Site on the Nevada Test Site have used three different mathematical models to estimate Rn-222 flux density. This study describes the performance, uncertainty, and sensitivity of the three models, which include the U.S. Nuclear Regulatory Commission Regulatory Guide 3.64 analytical method and two numerical methods. The uncertainty of each model was determined by Monte Carlo simulation using Latin hypercube sampling. The global sensitivity was investigated using the Morris one-at-a-time screening method, sample-based correlation and regression methods, the variance-based extended Fourier amplitude sensitivity test, and Sobol's sensitivity indices. The models were found to produce similar estimates of the mean and median flux density, but to have different uncertainties and sensitivities. When the Rn-222 effective diffusion coefficient was estimated using five different published predictive models, the radon flux density models were found to be most sensitive to the effective diffusion coefficient model selected, the emanation coefficient, and the radionuclide inventory. Using a site-specific measured effective diffusion coefficient significantly reduced the output uncertainty. When a site-specific effective-diffusion coefficient was used, the models were most sensitive to the emanation coefficient and the radionuclide inventory.
International Nuclear Information System (INIS)
Datta, D.; Ranade, A.K.; Pandey, M.; Sathyabama, N.; Kumar, Brij
2012-01-01
The basic objective of an environmental impact assessment (EIA) is to provide guidelines to reduce the associated risk or mitigate the consequences of a reactor accident at its source: to prevent deterministic health effects, and to reduce the risk of stochastic health effects (e.g., cancer and severe hereditary effects) as far as reasonably achievable by implementing protective actions in accordance with IAEA guidance (IAEA Safety Series No. 115, 1996). Since the measure of exposure is the basic tool for any decision related to risk reduction, EIA is traditionally expressed in terms of radiation exposure to members of the public. However, the models used to estimate this exposure are governed by parameters, some of which are deterministic with relative uncertainty, and some of which are stochastic as well as imprecise (insufficient knowledge). In such a mixed environment, it is essential to assess the uncertainty of a model in order to estimate bounds on the exposure to the public and to support decisions during a nuclear or radiological emergency. To this end, a soft computing technique, evidence theory based assessment of model parameters, is applied to compute the risk or exposure to members of the public. A possible exposure pathway to the public in the aquatic food stream is the drinking of water; accordingly, this paper presents the uncertainty analysis of exposure via uncertainty analysis of the contaminated water. Evidence theory expresses the uncertainty as a lower bound on exposure (the belief measure) and an upper bound (the plausibility measure). In this work, EIA is presented using evidence theory, and a data fusion technique is used to aggregate knowledge on the uncertain information. The uncertainty of concentration and exposure is expressed as a belief-plausibility interval.
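The belief/plausibility bounds of evidence theory can be illustrated with a minimal Dempster-Shafer calculation. The mass assignment over exposure categories below is an invented example, not data from the paper:

```python
def belief_plausibility(masses, hypothesis):
    """Dempster-Shafer belief (mass fully inside the hypothesis) and
    plausibility (mass merely consistent with it) for a set-valued
    hypothesis, given a basic mass assignment over frozensets."""
    bel = sum(m for s, m in masses.items() if s <= hypothesis)   # subset
    pl = sum(m for s, m in masses.items() if s & hypothesis)     # overlap
    return bel, pl

# Hypothetical mass assignment over exposure categories.
masses = {frozenset({"low"}): 0.5,
          frozenset({"low", "medium"}): 0.3,
          frozenset({"low", "medium", "high"}): 0.2}  # ignorance share
bel, pl = belief_plausibility(masses, frozenset({"low", "medium"}))
# The statement "exposure is low or medium" gets the interval [bel, pl].
```

The gap between belief and plausibility reflects the imprecise (insufficient-knowledge) component that a single probability could not represent.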
International Nuclear Information System (INIS)
Hofer, E.; Hoffman, F.O.
1987-02-01
The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these distinctions have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions, and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model.
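The separation of Type 1 (stochastic variability) and Type 2 (lack of knowledge) uncertainty is commonly implemented as a nested, two-loop Monte Carlo. A minimal sketch under assumed distributions, not the authors' example model:

```python
import random

rng = random.Random(0)

def nested_monte_carlo(n_outer=500, n_inner=1000):
    """Outer loop: Type 2 (state-of-knowledge) uncertainty about a demand
    failure probability p; inner loop: Type 1 stochastic variability of
    demands given p. Returns the sorted outer distribution of frequencies."""
    freqs = []
    for _ in range(n_outer):
        p = rng.uniform(0.01, 0.10)                            # epistemic draw
        fails = sum(rng.random() < p for _ in range(n_inner))  # aleatory trials
        freqs.append(fails / n_inner)
    return sorted(freqs)

freqs = nested_monte_carlo()
p05, p95 = freqs[int(0.05 * len(freqs))], freqs[int(0.95 * len(freqs))]
# The p05-p95 band summarizes Type 2 uncertainty about the Type 1 frequency.
```

Collapsing the two loops into one would mix the two probability interpretations the abstract warns about; keeping them nested preserves the distinction.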
A commentary on model uncertainty
International Nuclear Information System (INIS)
Apostolakis, G.
1994-01-01
A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed
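For the first case, a mutually exclusive and exhaustive set of models, the posterior probability of each model follows directly from Bayes' theorem. A sketch with an assumed Poisson failure model and made-up numbers, purely to illustrate the formulation:

```python
import math

def posterior_model_probs(priors, rates, k, t):
    """Posterior probability of each mutually exclusive model M_i, where
    M_i predicts a Poisson failure rate rates[i], after observing k
    failures in operating time t (Bayes' theorem, Poisson likelihoods)."""
    likes = [(r * t) ** k * math.exp(-r * t) / math.factorial(k)
             for r in rates]
    joint = [p * L for p, L in zip(priors, likes)]
    z = sum(joint)  # normalizing constant: total probability of the data
    return [j / z for j in joint]

# Two hypothetical models: rate 0.1/yr vs 1.0/yr, equal priors,
# after observing 2 failures in 10 years of operation.
post = posterior_model_probs([0.5, 0.5], [0.1, 1.0], k=2, t=10.0)
```

The posterior weights can then be used either to select a model or to average predictions, which is how model uncertainty feeds into decision making.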
Directory of Open Access Journals (Sweden)
A. E. Sikorska
2012-04-01
Urbanization and the resulting land-use change strongly affect the water cycle and runoff processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute most to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
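At fixed time steps, a continuous-time autoregressive error model reduces to a discrete AR(1) process on the residuals. A sketch of estimating the autocorrelation and whitening the residuals, using synthetic data rather than the Warsaw catchment series:

```python
import random
import statistics

rng = random.Random(7)

# Synthetic model residuals with AR(1) structure: e[t] = phi*e[t-1] + w[t].
phi_true = 0.8
errs = [0.0]
for _ in range(5000):
    errs.append(phi_true * errs[-1] + rng.gauss(0.0, 1.0))

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation, the moment estimator of phi."""
    m = statistics.fmean(x)
    num = sum((x[i] - m) * (x[i - 1] - m) for i in range(1, len(x)))
    den = sum((v - m) ** 2 for v in x)
    return num / den

phi_hat = lag1_autocorr(errs)
# Whitened innovations w[t] = e[t] - phi_hat*e[t-1] should be nearly
# uncorrelated, which is the assumption standard likelihoods require.
innov = [errs[i] - phi_hat * errs[i - 1] for i in range(1, len(errs))]
```

Ignoring the autocorrelation (treating `errs` as independent) is what typically produces the overconfident prediction intervals the study warns against.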
A practical method to assess model sensitivity and parameter uncertainty in C cycle models
Delahaies, Sylvain; Roulstone, Ian; Nichols, Nancy
2015-04-01
The carbon cycle combines multiple spatial and temporal scales, from minutes to hours for the chemical processes occurring in plant cells, to several hundred years for the exchange between the atmosphere and the deep ocean, and finally to millennia for the formation of fossil fuels. Together with our knowledge of the transformation processes involved in the carbon cycle, many Earth Observation systems are now available to help improve models and predictions using inverse modelling techniques. A generic inverse problem consists in finding an n-dimensional state vector x such that h(x) = y, for a given N-dimensional observation vector y, including random noise, and a given model h. The problem is well posed if the three following conditions hold: 1) a solution exists, 2) the solution is unique, and 3) the solution depends continuously on the input data. If at least one of these conditions is violated, the problem is said to be ill-posed. The inverse problem is often ill-posed, so a regularization method is required to replace the original problem with a well-posed one; a solution strategy then amounts to 1) constructing a solution x, 2) assessing the validity of the solution, and 3) characterizing its uncertainty. The data assimilation linked ecosystem carbon (DALEC) model is a simple box model simulating the carbon budget allocation for terrestrial ecosystems. Intercomparison experiments have demonstrated the relative merit of various inverse modelling strategies (MCMC, EnKF) to estimate model parameters and initial carbon stocks for DALEC using eddy covariance measurements of net ecosystem exchange of CO2 and leaf area index observations. Most results agreed on the fact that parameters and initial stocks directly related to fast processes were best estimated, with narrow confidence intervals, whereas those related to slow processes were poorly estimated, with very large uncertainties. While other studies have tried to overcome this difficulty by adding complementary
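The need for regularization can be made concrete with the smallest possible ill-posed example: one observation, two unknowns, solved by Tikhonov regularization. This is an illustrative toy, not the DALEC setup:

```python
def tikhonov_2param(y, alpha):
    """Regularized solution of the ill-posed problem x1 + x2 = y
    (one observation, two unknowns): minimize
    (x1 + x2 - y)**2 + alpha * (x1**2 + x2**2).
    Setting both partial derivatives to zero gives the symmetric
    solution x1 = x2 = y / (2 + alpha)."""
    x = y / (2.0 + alpha)
    return x, x

x1, x2 = tikhonov_2param(2.0, 0.01)
# As alpha -> 0 this recovers the minimum-norm solution (1, 1); larger
# alpha shrinks the estimate toward zero, trading bias for stability.
```

Without the `alpha` penalty, any pair summing to `y` is an equally valid solution (condition 2 fails); the penalty restores uniqueness and continuous dependence on the data.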
Thomsen, Nanna I.; Binning, Philip J.; McKnight, Ursula S.; Tuxen, Nina; Bjerg, Poul L.; Troldborg, Mads
2016-05-01
A key component in risk assessment of contaminated sites is in the formulation of a conceptual site model (CSM). A CSM is a simplified representation of reality and forms the basis for the mathematical modeling of contaminant fate and transport at the site. The CSM should therefore identify the most important site-specific features and processes that may affect the contaminant transport behavior at the site. However, the development of a CSM will always be associated with uncertainties due to limited data and lack of understanding of the site conditions. CSM uncertainty is often found to be a major source of model error and it should therefore be accounted for when evaluating uncertainties in risk assessments. We present a Bayesian belief network (BBN) approach for constructing CSMs and assessing their uncertainty at contaminated sites. BBNs are graphical probabilistic models that are effective for integrating quantitative and qualitative information, and thus can strengthen decisions when empirical data are lacking. The proposed BBN approach facilitates a systematic construction of multiple CSMs, and then determines the belief in each CSM using a variety of data types and/or expert opinion at different knowledge levels. The developed BBNs combine data from desktop studies and initial site investigations with expert opinion to assess which of the CSMs are more likely to reflect the actual site conditions. The method is demonstrated on a Danish field site, contaminated with chlorinated ethenes. Four different CSMs are developed by combining two contaminant source zone interpretations (presence or absence of a separate phase contamination) and two geological interpretations (fractured or unfractured clay till). The beliefs in each of the CSMs are assessed sequentially based on data from three investigation stages (a screening investigation, a more detailed investigation, and an expert consultation) to demonstrate that the belief can be updated as more information
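The sequential belief updating over the four CSMs can be sketched as a chain of Bayes updates. The likelihood numbers below are invented placeholders, not the study's elicited values:

```python
def update_beliefs(beliefs, likelihoods):
    """One evidence update: multiply the prior belief in each CSM by the
    likelihood of the new observation under that CSM, then renormalize."""
    joint = {m: beliefs[m] * likelihoods[m] for m in beliefs}
    z = sum(joint.values())
    return {m: v / z for m, v in joint.items()}

# Four hypothetical CSMs: (source interpretation, geological interpretation),
# initially equally plausible.
sources = ("separate-phase", "dissolved")
geologies = ("fractured", "unfractured")
beliefs = {(s, g): 0.25 for s in sources for g in geologies}

# Stage 1 evidence judged four times more probable under fractured-till CSMs.
stage1 = {(s, g): (0.8 if g == "fractured" else 0.2)
          for s in sources for g in geologies}
beliefs = update_beliefs(beliefs, stage1)
# Further stages (detailed investigation, expert consultation) would be
# applied the same way, refining the beliefs as information accumulates.
```

In a full BBN the likelihoods themselves are conditioned on intermediate nodes (data quality, knowledge level), but the renormalized product above is the core mechanism.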
Multi-model ensembles for assessment of flood losses and associated uncertainty
Figueiredo, Rui; Schröter, Kai; Weiss-Motz, Alexander; Martina, Mario L. V.; Kreibich, Heidi
2018-05-01
Flood loss modelling is a crucial part of risk assessments. However, it is subject to large uncertainty that is often neglected. Most models available in the literature are deterministic, providing only single point estimates of flood loss, and large disparities tend to exist among them. Adopting any one such model in a risk assessment context is likely to lead to inaccurate loss estimates and sub-optimal decision-making. In this paper, we propose the use of multi-model ensembles to address these issues. This approach, which has been applied successfully in other scientific fields, is based on the combination of different model outputs with the aim of improving the skill and usefulness of predictions. We first propose a model rating framework to support ensemble construction, based on a probability tree of model properties, which establishes relative degrees of belief between candidate models. Using 20 flood loss models in two test cases, we then construct numerous multi-model ensembles, based both on the rating framework and on a stochastic method, differing in terms of participating members, ensemble size and model weights. We evaluate the performance of ensemble means, as well as their probabilistic skill and reliability. Our results demonstrate that well-designed multi-model ensembles represent a pragmatic approach to consistently obtain more accurate flood loss estimates and reliable probability distributions of model uncertainty.
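A weighted multi-model ensemble mean, with the weighted spread as a simple uncertainty measure, can be sketched as follows. The loss values and weights are invented for illustration, not taken from the 20-model experiment:

```python
def ensemble_loss(estimates, weights=None):
    """Weighted ensemble mean of single-model flood-loss estimates, with
    the weighted standard deviation as a simple measure of inter-model
    uncertainty. Equal weights are used when none are given."""
    if weights is None:
        weights = [1.0 / len(estimates)] * len(estimates)
    mean = sum(w * e for w, e in zip(weights, estimates))
    var = sum(w * (e - mean) ** 2 for w, e in zip(weights, estimates))
    return mean, var ** 0.5

# Hypothetical per-model loss estimates (million EUR) and belief weights,
# e.g. from a model-rating framework.
mean, spread = ensemble_loss([3.2, 5.1, 4.4, 8.0], [0.4, 0.3, 0.2, 0.1])
```

Treating the weighted members as a discrete probability distribution, rather than reporting only the mean, is what turns the ensemble into a probabilistic loss estimate.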
International Nuclear Information System (INIS)
Fischer, F.; Ehrhardt, J.
1988-06-01
Various techniques available for the uncertainty analysis of large computer models are applied, described and selected as most appropriate for analyzing the uncertainty in the predictions of accident consequence assessments. The investigation refers to the atmospheric dispersion and deposition submodel (straight-line Gaussian plume model) of UFOMOD, whose most important input variables and parameters are linked with probability distributions derived from expert judgement. Uncertainty bands show how much variability exists; sensitivity measures determine what causes this variability in consequences. Results are presented as confidence bounds of complementary cumulative frequency distributions (CCFDs) of activity concentrations, organ doses and health effects, partially as a function of distance from the site. In addition, the ranked influence of the uncertain parameters on the different consequence types is shown. For the estimation of confidence bounds it was sufficient to choose a model parameter sample size of n = 59, equal to 1.5 times the number of uncertain model parameters. Different samples or an increase of sample size did not change the 5%-95% confidence bands. To get statistically stable results of the sensitivity analysis, larger sample sizes are needed (n = 100, 200). Random or Latin hypercube sampling schemes as tools for uncertainty and sensitivity analyses led to comparable results. (orig.) [de]
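The sample size n = 59 coincides with Wilks' classical one-sided 95%/95% tolerance-limit rule (an observation, not something the abstract states): the smallest n for which the largest of n runs bounds the 95% quantile with 95% confidence, i.e. 1 - gamma**n >= beta:

```python
def wilks_one_sided(gamma=0.95, beta=0.95):
    """Smallest number of runs n such that the maximum of n random runs
    exceeds the gamma-quantile of the output with confidence beta,
    i.e. the smallest n with 1 - gamma**n >= beta (Wilks' rule)."""
    n = 1
    while 1.0 - gamma ** n < beta:
        n += 1
    return n

n_runs = wilks_one_sided()  # one-sided 95%/95% bound
```

The rule is distribution-free, which is why such bounds need no assumption about the shape of the output distribution, only that the runs are independent.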
Assessing Uncertainties of Water Footprints Using an Ensemble of Crop Growth Models on Winter Wheat
Directory of Open Access Journals (Sweden)
Kurt Christian Kersebaum
2016-12-01
Crop productivity and water consumption form the basis to calculate the water footprint (WF) of a specific crop. Under current climate conditions, calculated evapotranspiration is related to observed crop yields to calculate WF. The assessment of WF under future climate conditions requires the simulation of crop yields, adding further uncertainty. To assess the uncertainty of model-based assessments of WF, an ensemble of crop models was applied to data from five field experiments across Europe. Only limited data were provided for a rough calibration, which corresponds to a typical situation for regional assessments, where data availability is limited. Up to eight models were applied for wheat. The coefficient of variation for the simulated actual evapotranspiration between models was in the range of 13%-19%, which was higher than the inter-annual variability. Simulated yields showed a higher variability between models, in the range of 17%-39%. Models responded differently to elevated CO2 in a FACE (Free-Air Carbon Dioxide Enrichment) experiment, especially regarding the reduction of water consumption. The variability of calculated WF between models was in the range of 15%-49%. Yield predictions contributed more to this variance than the estimation of water consumption. Transpiration accounts on average for 51%-68% of the total actual evapotranspiration.
Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.
2016-11-01
Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
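The decomposition of total variance into forcing, parameter and interaction contributions follows the standard two-way ANOVA identity. A minimal sketch on a toy table of simulations, not the Irish catchment data:

```python
def anova_two_way(table):
    """Decompose the total sum of squares of simulations table[i][j]
    (i: forcing ensemble member, j: parameter set) into forcing,
    parameter, and interaction-plus-residual contributions."""
    I, J = len(table), len(table[0])
    grand = sum(map(sum, table)) / (I * J)
    row_means = [sum(row) / J for row in table]
    col_means = [sum(table[i][j] for i in range(I)) / I for j in range(J)]
    ss_forcing = J * sum((m - grand) ** 2 for m in row_means)
    ss_param = I * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((v - grand) ** 2 for row in table for v in row)
    return ss_forcing, ss_param, ss_total - ss_forcing - ss_param, ss_total

# Purely additive toy table: forcing effects (0, 2), parameter effects
# (0, 1, 2), so the interaction term comes out as zero.
table = [[0, 1, 2],
         [2, 3, 4]]
ssf, ssp, ssi, sst = anova_two_way(table)
```

In the study each cell would hold a streamflow statistic from one (rainfall ensemble, parameter set) combination, and a nonzero interaction term signals that forcing and parameter uncertainty do not act independently.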
Uncertainties in radioecological assessment models-Their nature and approaches to reduce them
International Nuclear Information System (INIS)
Kirchner, G.; Steiner, M.
2008-01-01
Radioecological assessment models are necessary tools for estimating the radiation exposure of humans and non-human biota. This paper focuses on factors affecting their predictive accuracy, discusses the origin and nature of the different contributions to uncertainty and variability and presents approaches to separate and quantify them. The key role of the conceptual model, notably in relation to its structure and complexity, as well as the influence of the number and type of input parameters, are highlighted. Guidelines are provided to improve the degree of reliability of radioecological models
Assessing the Uncertainty of Tropical Cyclone Simulations in NCAR's Community Atmosphere Model
Directory of Open Access Journals (Sweden)
Kevin A Reed
2011-08-01
The paper explores the impact of initial-data, parameter and structural model uncertainty on the simulation of a tropical cyclone-like vortex in the National Center for Atmospheric Research's (NCAR) Community Atmosphere Model (CAM). An analytic technique is used to initialize the model with an idealized weak vortex that develops into a tropical cyclone over ten simulation days. A total of 78 ensemble simulations are performed at horizontal grid spacings of 1.0°, 0.5° and 0.25° using two recently released versions of the model, CAM 4 and CAM 5. The ensemble members represent simulations with random small-amplitude perturbations of the initial conditions, small shifts in the longitudinal position of the initial vortex, and runs with slightly altered model parameters. The main distinction between CAM 4 and CAM 5 lies within the physical parameterization suite, and the simulations with both CAM versions at the varying resolutions assess the structural model uncertainty. At all resolutions, storms are produced with many tropical cyclone-like characteristics. The CAM 5 simulations exhibit more intense storms than CAM 4 by day 10 at the 0.5° and 0.25° grid spacings, while the CAM 4 storm at 1.0° is stronger. There are also distinct differences in the shapes and vertical profiles of the storms in the two variants of CAM. The ensemble members show no distinction between the initial-data and parameter uncertainty simulations. At day 10 they produce ensemble root-mean-square deviations from an unperturbed control simulation on the order of 1-5 m/s for the maximum low-level wind speed and 2-10 hPa for the minimum surface pressure. However, there are large differences between the two CAM versions at identical horizontal resolutions. This suggests that the structural uncertainty is more dominant than the initial-data and parameter uncertainties in this study. The uncertainty among the ensemble members is assessed and quantified.
Uncertainty analysis of environmental models
International Nuclear Information System (INIS)
Monte, L.
1990-01-01
In the present paper, an evaluation of the output uncertainty of an environmental model for assessing the transfer of Cs-137 and I-131 in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition.
Uncertainty analysis in safety assessment
Energy Technology Data Exchange (ETDEWEB)
Lemos, Francisco Luiz de [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN), Belo Horizonte, MG (Brazil); Sullivan, Terry [Brookhaven National Lab., Upton, NY (United States)
1997-12-31
Nuclear waste disposal is a very complex subject which requires the study of many different fields of science, such as hydrogeology, meteorology, geochemistry, etc. In addition, waste disposal facilities are designed to last for a very long period of time. Both of these conditions leave safety assessment projections fraught with uncertainty. This paper addresses the treatment of uncertainties in safety assessment modeling due to the variability of data, and some current approaches used to deal with this problem. (author) 13 refs.; e-mail: lemos at bnl.gov; sulliva1 at bnl.gov
Model uncertainty and probability
International Nuclear Information System (INIS)
Parry, G.W.
1994-01-01
This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example
Chowdhury, S.; Sharma, A.
2005-12-01
present. SIMEX is based on the theory that the trend in alternate parameters can be extrapolated back to the notional error-free zone. We illustrate the utility of SIMEX in a synthetic rainfall-runoff modelling scenario and in an application studying the dependence of uncertainty-affected distributed sea surface temperature anomalies on an indicator of the El Nino Southern Oscillation, the Southern Oscillation Index (SOI). The errors in rainfall data and their effect are explored using the Sacramento rainfall-runoff model. The rainfall uncertainty is assumed to be multiplicative and temporally invariant. The model used to relate the sea surface temperature anomalies (SSTA) to the SOI is assumed to be of a linear form. The nature of uncertainty in the SSTA is additive and varies with time. The SIMEX framework allows assessment of the relationship between the error-free inputs and the response. Cook, J.R., Stefanski, L.A., Simulation-Extrapolation Estimation in Parametric Measurement Error Models, Journal of the American Statistical Association, 89 (428), 1314-1328, 1994.
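The simulation-extrapolation idea in the abstract above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the linear errors-in-variables model, the noise levels and the quadratic extrapolant are all assumptions chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: estimate the slope of y = beta * x when x is observed
# with additive noise of known variance sigma_u^2 (all values illustrative).
n, beta_true, sigma_u = 500, 2.0, 0.5
x_true = rng.normal(0.0, 1.0, n)
x_obs = x_true + rng.normal(0.0, sigma_u, n)      # error-contaminated input
y = beta_true * x_true + rng.normal(0.0, 0.1, n)

def naive_slope(x, y):
    # Least-squares slope through the origin; attenuated by input noise.
    return np.sum(x * y) / np.sum(x * x)

# SIMEX step 1: re-estimate with progressively inflated measurement error.
lambdas = np.array([0.5, 1.0, 1.5, 2.0])
estimates = []
for lam in lambdas:
    sims = [naive_slope(x_obs + rng.normal(0.0, np.sqrt(lam) * sigma_u, n), y)
            for _ in range(200)]          # average over simulated error draws
    estimates.append(np.mean(sims))

# SIMEX step 2: extrapolate the trend back to lambda = -1, the notional
# error-free zone mentioned in the abstract.
coeffs = np.polyfit(lambdas, estimates, deg=2)
beta_simex = np.polyval(coeffs, -1.0)

print(f"naive: {naive_slope(x_obs, y):.3f}, "
      f"SIMEX: {beta_simex:.3f}, true: {beta_true}")
```

The naive estimate is attenuated toward zero by the input noise; the extrapolated SIMEX estimate recovers most of the bias, at the cost of extra variance from the extrapolation.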
DEFF Research Database (Denmark)
Thomsen, Nanna Isbak; Troldborg, Mads; McKnight, Ursula S.
2012-01-01
site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models where each model is believed to be a realistic representation of the given site, based on the current level...... the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We...... propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same...
Ehrhardt, Fiona; Soussana, Jean-François; Bellocchi, Gianni; Grace, Peter; McAuliffe, Russel; Recous, Sylvie; Sándor, Renáta; Smith, Pete; Snow, Val; de Antoni Migliorati, Massimiliano; Basso, Bruno; Bhatia, Arti; Brilli, Lorenzo; Doltra, Jordi; Dorich, Christopher D; Doro, Luca; Fitton, Nuala; Giacomini, Sandro J; Grant, Brian; Harrison, Matthew T; Jones, Stephanie K; Kirschbaum, Miko U F; Klumpp, Katja; Laville, Patricia; Léonard, Joël; Liebig, Mark; Lieffering, Mark; Martin, Raphaël; Massad, Raia S; Meier, Elizabeth; Merbold, Lutz; Moore, Andrew D; Myrgiotis, Vasileios; Newton, Paul; Pattey, Elizabeth; Rolinski, Susanne; Sharp, Joanna; Smith, Ward N; Wu, Lianhai; Zhang, Qing
2018-02-01
Simulation models are extensively used to predict agricultural productivity and greenhouse gas emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multi-species agricultural contexts. We report an international model comparison and benchmarking exercise, showing the potential of multi-model ensembles to predict productivity and nitrous oxide (N2O) emissions for wheat, maize, rice and temperate grasslands. Using a multi-stage modelling protocol, from blind simulations (stage 1) to partial (stages 2-4) and full calibration (stage 5), 24 process-based biogeochemical models were assessed individually or as an ensemble against long-term experimental data from four temperate grassland and five arable crop rotation sites spanning four continents. Comparisons were performed by reference to the experimental uncertainties of observed yields and N2O emissions. Results showed that across sites and crop/grassland types, 23%-40% of the uncalibrated individual models were within two standard deviations (SD) of observed yields, while 42% (rice) to 96% (grasslands) of the models were within 1 SD of observed N2O emissions. At stage 1, ensembles formed by the three models with the lowest prediction errors predicted both yields and N2O emissions within experimental uncertainties for 44% and 33% of the crop and grassland growth cycles, respectively. Partial model calibration (stages 2-4) markedly reduced prediction errors of the full model ensemble E-median for crop grain yields (from 36% at stage 1 down to 4% on average) and grassland productivity (from 44% to 27%), and to a lesser and more variable extent for N2O emissions. Yield-scaled N2O emissions (N2O emissions divided by crop yields) were ranked accurately by three-model ensembles across crop species and field sites. The potential of using process-based model ensembles to predict jointly
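The stage-1 benchmarking logic described above (the share of models falling within n SD of an observation, and the relative error of the ensemble E-median) can be sketched in a few lines; all numbers are illustrative, not taken from the study.

```python
import numpy as np

# Minimal sketch of the benchmarking idea (values invented for illustration):
# compare individual model predictions and the ensemble median against one
# observation with a known experimental standard deviation.
observed, obs_sd = 7.2, 0.6                     # e.g. grain yield, t/ha
model_preds = np.array([6.1, 6.9, 7.8, 8.9, 5.4, 7.4])

# Share of models within two experimental standard deviations.
within_2sd = np.abs(model_preds - observed) <= 2 * obs_sd
share = within_2sd.mean()

# Full-ensemble E-median and its relative prediction error.
e_median = np.median(model_preds)
rel_error = abs(e_median - observed) / observed

print(f"{share:.0%} of models within 2 SD; E-median error {rel_error:.1%}")
```

In the study this tally is repeated over many sites and growth cycles; the sketch shows only the per-observation arithmetic.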
A meta model-based methodology for an energy savings uncertainty assessment of building retrofitting
Directory of Open Access Journals (Sweden)
Caucheteux Antoine
2016-01-01
Full Text Available To reduce greenhouse gas emissions, energy retrofitting of the building stock presents significant potential for energy savings. In the design stage, energy savings are usually assessed through Building Energy Simulation (BES). The main difficulty is to first assess the energy efficiency of the existing building, in other words, to calibrate the model. As calibration is an underdetermined problem, there are many solutions for representing the building in simulation tools. In this paper, a method is proposed to assess not only energy savings but also their uncertainty. Meta models, built from experimental designs, are used to identify many acceptable calibrations: the sets of parameters that provide the most accurate representation of the building are retained to calculate energy savings. The method was applied to an existing office building modeled with the TRNsys BES. The meta model, using 13 parameters, is built with no more than 105 simulations. The evaluation of the meta model on thousands of new simulations gives a normalized mean bias error between the meta model and BES of <4%. Energy savings are assessed based on six energy savings concepts, which indicate savings of 2–45% with a standard deviation ranging between 1.3% and 2.5%.
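The meta-model workflow described above can be illustrated with a toy sketch: a cheap response surface is fitted to a small experimental design of simulator runs, then used to screen thousands of candidate calibrations. The "building simulation" here is a stand-in function, not TRNsys, and the parameter names, ranges and tolerance are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for an expensive building energy simulation: annual
# heating demand (kWh/m2) as a function of two calibration parameters.
def expensive_simulation(u_wall, infiltration):
    return 120.0 + 40.0 * u_wall + 25.0 * infiltration + 10.0 * u_wall * infiltration

# Step 1: a small experimental design (random here; a factorial design works too).
lo_b, hi_b = [0.2, 0.1], [1.0, 0.8]
X = rng.uniform(lo_b, hi_b, size=(30, 2))
y = np.array([expensive_simulation(*x) for x in X])

# Step 2: fit a quadratic meta model on the design points.
def features(X):
    u, i = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(u), u, i, u * i, u**2, i**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Step 3: screen thousands of candidate calibrations with the cheap meta model,
# retaining those within a 5% tolerance of the measured consumption.
measured = 160.0
candidates = rng.uniform(lo_b, hi_b, size=(5000, 2))
pred = features(candidates) @ coef
acceptable = candidates[np.abs(pred - measured) / measured < 0.05]

print(f"{len(acceptable)} acceptable calibrations out of 5000")
```

The retained calibrations would then each be run through the full simulator for the retrofit scenarios, giving a distribution (hence an uncertainty) of the predicted savings.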
Directory of Open Access Journals (Sweden)
G. Coccia
2011-10-01
Full Text Available The work aims to discuss the role of predictive uncertainty in flood forecasting and flood emergency management, its relevance to improving the decision-making process, and the techniques to be used for its assessment.
Real time flood forecasting requires taking predictive uncertainty into account for a number of reasons. Deterministic hydrological/hydraulic forecasts give useful information about real future events, but their predictions, as usually done in practice, cannot be taken and used as real future occurrences; rather, they should be used as pseudo-measurements of future occurrences in order to reduce the uncertainty of decision makers. Predictive Uncertainty (PU) is in fact defined as the probability of occurrence of a future value of a predictand (such as water level, discharge or water volume) conditional upon prior observations and knowledge as well as on all the information we can obtain on that specific future value from model forecasts. When dealing with commensurable quantities, as in the case of floods, PU must be quantified in terms of a probability distribution function which will be used by the emergency managers in their decision process in order to improve the quality and reliability of their decisions.
After introducing the concept of PU, the presently available processors are introduced and discussed in terms of their benefits and limitations. In this work the Model Conditional Processor (MCP) has been extended to allow the use of two joint Truncated Normal Distributions (TNDs), in order to improve adaptation to low and high flows.
The paper concludes by showing the results of the application of the MCP on two case studies, the Po river in Italy and the Baron Fork river, OK, USA. In the Po river case the data provided by the Civil Protection of the Emilia Romagna region have been used to implement an operational example, where the predicted variable is the observed water level. In the Baron Fork River
Uncertainties in soil-plant interactions in advanced models for long-timescale dose assessment
Energy Technology Data Exchange (ETDEWEB)
Klos, R. [Aleksandria Sciences Ltd. (United Kingdom); Limer, L. [Limer Scientific Ltd. (United Kingdom); Perez-Sanchez, D. [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Xu, S.; Andersson, P. [Swedish Radiation Safety Authority (Sweden)]
2014-07-01
Traditional models for long-timescale dose assessment are generally conceptually straightforward, featuring one, two or three spatial compartments in the soil column and employing data based on annually averaged parameters for climate characteristics. The soil-plant system is usually modelled using concentration ratios. The justification for this approach is that the timescales relevant to the geologic disposal of radioactive waste are so long that simple conceptual models are necessary to account for the inherent uncertainties over the timescale of the dose assessment. In the past few years, attention has been given to more detailed 'advanced' models for use in dose assessment that have a high degree of site-specific detail. These recognise more features, events and processes since they have higher spatial and temporal resolution. This modelling approach has been developed to account for redox-sensitive radionuclides, variability of the water table position and accumulation in non-agricultural ecosystems prior to conversion to an agricultural ecosystem. The models feature higher spatial and temporal resolution in the soil column (up to ten layers with spatially varying k{sub d}s dependent on soil conditions) and monthly rather than annually averaged parameters. Soil-plant interaction is treated as a dynamic process, allowing for root uptake as a function of time and depth, according to the root profile. Uncertainty in dose assessment models associated with the treatment of prior accumulations in agricultural soils has demonstrated the importance of the model's representation of the soil-plant interaction. The treatment of root uptake as a dynamic process, as opposed to a simple concentration ratio, implies a potentially important difference, despite the dynamic soil-plant transfer rate being based on established concentration ratio values. These discrepancies have also appeared in the results from the higher spatio-temporal resolution models. This paper
Mannina, Giorgio; Cosenza, Alida; Viviani, Gaspare
In the last few years, the use of mathematical models in WasteWater Treatment Plant (WWTP) processes has become a common way to predict WWTP behaviour. However, mathematical models generally demand advanced input for their implementation that must be evaluated by an extensive data-gathering campaign, which cannot always be carried out. This fact, together with the intrinsic complexity of the model structure, leads to model results that may be very uncertain. Quantification of the uncertainty is therefore imperative. However, despite the importance of uncertainty quantification, only a few studies have been carried out in the wastewater treatment field, and those studies only included a few of the sources of model uncertainty. To help develop the area, this paper presents the uncertainty assessment of a mathematical model simulating biological nitrogen and phosphorus removal. The uncertainty assessment was conducted according to the Generalised Likelihood Uncertainty Estimation (GLUE) methodology, which has so far been scarcely applied in the wastewater field. The model was based on Activated Sludge Models 1 (ASM1) and 2 (ASM2). Different approaches can be used for uncertainty analysis. The GLUE methodology requires a large number of Monte Carlo simulations in which a random sampling of individual parameters drawn from probability distributions is used to determine a set of parameter values. Using this approach, model reliability was evaluated based on its capacity to globally limit the uncertainty. The method was applied to a large full-scale WWTP for which quantity and quality data were gathered. The analysis provided useful insights for WWTP modelling, identifying the crucial aspects where the greatest uncertainty lies and where, therefore, more effort should be invested in both data gathering and modelling practice.
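A minimal sketch of the GLUE procedure described above, with a toy first-order decay model standing in for the ASM-based WWTP simulator. The informal likelihood and the behavioural threshold are illustrative choices; GLUE deliberately leaves both to the analyst.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model standing in for the WWTP simulator: first-order decay of a
# substrate, with the rate k as the single uncertain parameter (illustrative).
t = np.linspace(0.0, 10.0, 20)
def model(k):
    return 10.0 * np.exp(-k * t)

k_true = 0.3
observed = model(k_true) + rng.normal(0.0, 0.3, t.size)

# GLUE: Monte Carlo sampling from the prior parameter range, an informal
# likelihood (inverse error sum of squares) and a behavioural threshold.
n_runs = 10000
k_samples = rng.uniform(0.05, 1.0, n_runs)
sse = np.array([np.sum((model(k) - observed) ** 2) for k in k_samples])
likelihood = 1.0 / sse
behavioural = sse < 2.0 * sse.min()           # crude acceptance threshold

k_beh = k_samples[behavioural]
w = likelihood[behavioural] / likelihood[behavioural].sum()

# Likelihood-weighted 5-95% uncertainty bounds on the prediction at an
# intermediate time step (index 10 of the series).
pred = np.array([model(k)[10] for k in k_beh])
order = np.argsort(pred)
cdf = np.cumsum(w[order])
lo, hi = pred[order][np.searchsorted(cdf, [0.05, 0.95])]
print(f"prediction bounds at t={t[10]:.1f}: [{lo:.2f}, {hi:.2f}]")
```

The same weighting applied to every output time step yields the uncertainty bands that GLUE studies typically report.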
Critical loads - assessment of uncertainty
Energy Technology Data Exchange (ETDEWEB)
Barkman, A.
1998-10-01
The effects of data uncertainty in applications of the critical loads concept were investigated on different spatial resolutions in Sweden and the northern Czech Republic. Critical loads of acidity (CL) were calculated for Sweden using the biogeochemical model PROFILE. Three methods with different structural complexity were used to estimate the adverse effects of SO{sub 2} concentrations in the northern Czech Republic. Data uncertainties in the calculated critical loads/levels and exceedances (EX) were assessed using Monte Carlo simulations. Uncertainties within cumulative distribution functions (CDF) were aggregated by accounting for the overlap between site-specific confidence intervals. Aggregation of data uncertainties within CDFs resulted in lower CL and higher EX best estimates in comparison with percentiles represented by individual sites. Data uncertainties were consequently found to advocate larger deposition reductions to achieve non-exceedance based on low critical loads estimates on 150 x 150 km resolution. Input data were found to impair the level of differentiation between geographical units at all investigated resolutions. Aggregation of data uncertainty within CDFs involved more constrained confidence intervals for a given percentile. Differentiation as well as identification of grid cells on 150 x 150 km resolution subjected to EX was generally improved. Calculation of the probability of EX was shown to preserve the possibility to differentiate between geographical units. Re-aggregation of the 95%-ile EX on 50 x 50 km resolution generally increased the confidence interval for each percentile. Significant relationships were found between forest decline and the three methods addressing risks induced by SO{sub 2} concentrations. Modifying SO{sub 2} concentrations by accounting for the length of the vegetation period was found to constitute the most useful trade-off between structural complexity, data availability and effects of data uncertainty. Data
Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.
2012-04-01
Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models where each model is believed to be realistic representation of the given site, based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from
Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology
DEFF Research Database (Denmark)
Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk
2008-01-01
of combined sewer overflow. The GLUE methodology is used to test different conceptual setups in order to determine if one model setup gives a better goodness of fit, conditional on the observations, than the other. Moreover, different methodological investigations of GLUE are conducted in order to test......In the present paper an uncertainty analysis of an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology, the model is conditioned on observation time series from two flow gauges as well as the occurrence...... if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it proved quite difficult to get good fits of the whole time series....
Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz
2016-04-01
Water for agriculture is strongly limited in arid and semi-arid regions and often of low quality in terms of salinity. The application of saline waters for irrigation increases the salt load in the rooting zone and has to be managed by leaching to maintain a healthy soil, i.e. to wash out salts by additional irrigation. Dynamic simulation models are helpful tools to calculate the root zone water fluxes and soil salinity content in order to investigate best management practices. However, there is little information on structural and parameter uncertainty for simulations regarding the water and salt balance of saline irrigation. Hence, we established a multi-model system with four different models (AquaCrop, RZWQM, SWAP, Hydrus1D/UNSATCHEM) to analyze the structural and parameter uncertainty by using the Global Likelihood and Uncertainty Estimation (GLUE) method. Hydrus1D/UNSATCHEM and SWAP were set up with multiple sets of different implemented functions (e.g. matric and osmotic stress for root water uptake) which results in a broad range of different model structures. The simulations were evaluated against soil water and salinity content observations. The posterior distribution of the GLUE analysis gives behavioral parameters sets and reveals uncertainty intervals for parameter uncertainty. Throughout all of the model sets, most parameters accounting for the soil water balance show a low uncertainty, only one or two out of five to six parameters in each model set displays a high uncertainty (e.g. pore-size distribution index in SWAP and Hydrus1D/UNSATCHEM). The differences between the models and model setups reveal the structural uncertainty. The highest structural uncertainty is observed for deep percolation fluxes between the model sets of Hydrus1D/UNSATCHEM (~200 mm) and RZWQM (~500 mm) that are more than twice as high for the latter. The model sets show a high variation in uncertainty intervals for deep percolation as well, with an interquartile range (IQR) of
Directory of Open Access Journals (Sweden)
R. H. Moore
2013-04-01
Full Text Available We use the Global Modelling Initiative (GMI) chemical transport model with a cloud droplet parameterisation adjoint to quantify the sensitivity of cloud droplet number concentration to uncertainties in predicting CCN concentrations. Published CCN closure uncertainties for six different sets of simplifying compositional and mixing state assumptions are used as proxies for modelled CCN uncertainty arising from application of those scenarios. It is found that cloud droplet number concentrations (Nd) are fairly insensitive to the number concentration (Na) of aerosol which acts as CCN over the continents (∂lnNd/∂lnNa ~10–30%), but the sensitivities exceed 70% in pristine regions such as the Alaskan Arctic and remote oceans. This means that CCN concentration uncertainties of 4–71% translate into only 1–23% uncertainty in cloud droplet number, on average. Since most of the anthropogenic indirect forcing is concentrated over the continents, this work shows that the application of Köhler theory and attendant simplifying assumptions in models is not a major source of uncertainty in predicting cloud droplet number or anthropogenic aerosol indirect forcing for the liquid, stratiform clouds simulated in these models. However, it does highlight the sensitivity of some remote areas to pollution brought into the region via long-range transport (e.g., biomass burning) or from seasonal biogenic sources (e.g., phytoplankton as a source of dimethylsulfide in the southern oceans). Since these transient processes are not captured well by the climatological emissions inventories employed by current large-scale models, the uncertainties in aerosol-cloud interactions during these events could be much larger than those uncovered here. This finding motivates additional measurements in these pristine regions, for which few observations exist, to quantify the impact (and associated uncertainty) of transient aerosol processes on cloud properties.
Model Uncertainty via the Integration of Hormesis and LNT as the Default in Cancer Risk Assessment.
Calabrese, Edward J
2015-01-01
On June 23, 2015, the US Nuclear Regulatory Commission (NRC) issued a formal notice in the Federal Register that it would consider whether "it should amend its 'Standards for Protection Against Radiation' regulations from the linear non-threshold (LNT) model of radiation protection to the hormesis model." The present commentary supports this recommendation based on (1) the flawed and deceptive history of the adoption of LNT by the US National Academy of Sciences (NAS) in 1956; (2) the documented capacity of hormesis to make more accurate predictions of biological responses for diverse biological end points in the low-dose zone; (3) the occurrence of extensive hormetic data from the peer-reviewed biomedical literature revealing that hormetic responses are highly generalizable, being independent of biological model, end point measured, inducing agent, level of biological organization, and mechanism; and (4) the integration of hormesis and LNT models via a model uncertainty methodology that optimizes public health responses at 10⁻⁴. Thus, both LNT and hormesis can be integratively used for risk assessment purposes, and this integration defines the so-called "regulatory sweet spot."
Essays on model uncertainty in financial models
Li, Jing
2018-01-01
This dissertation studies model uncertainty, particularly in financial models. It consists of two empirical chapters and one theoretical chapter. The first empirical chapter (Chapter 2) classifies model uncertainty into parameter uncertainty and misspecification uncertainty. It investigates the
Directory of Open Access Journals (Sweden)
Simon van Mourik
2014-06-01
Full Text Available Multi-parameter models in systems biology are typically ‘sloppy’: some parameters or combinations of parameters may be hard to estimate from data, whereas others are not. One might expect that parameter uncertainty automatically leads to uncertain predictions, but this is not the case. We illustrate this by showing that the prediction uncertainty of each of six sloppy models varies enormously among different predictions. Statistical approximations of parameter uncertainty may lead to dramatic errors in prediction uncertainty estimation. We argue that prediction uncertainty assessment must therefore be performed on a per-prediction basis using a full computational uncertainty analysis. In practice this is feasible by providing a model with a sample or ensemble representing the distribution of its parameters. Within a Bayesian framework, such a sample may be generated by a Markov Chain Monte Carlo (MCMC) algorithm that infers the parameter distribution based on experimental data. Matlab code for generating the sample (with the Differential Evolution Markov Chain sampler) and for the subsequent uncertainty analysis using such a sample is supplied as Supplemental Information.
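The per-prediction uncertainty analysis described above can be sketched with a basic random-walk Metropolis sampler standing in for the Differential Evolution Markov Chain sampler; the two-parameter decay model and all numbers are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy two-parameter model (illustrative): y = a * exp(-b * t). The point is
# that uncertainty is propagated per prediction via a parameter sample, not
# summarized by a single parameter standard deviation.
t = np.linspace(0, 5, 15)
a_true, b_true, sigma = 5.0, 0.8, 0.2
data = a_true * np.exp(-b_true * t) + rng.normal(0, sigma, t.size)

def log_post(theta):
    a, b = theta
    if a <= 0 or b <= 0:                  # flat prior on the positive quadrant
        return -np.inf
    resid = data - a * np.exp(-b * t)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis sampler over the posterior.
theta = np.array([1.0, 0.1])
lp = log_post(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05, 2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain[5000:])            # discard burn-in

# Prediction uncertainty at two different prediction points: the spread can
# differ greatly between predictions, which is the paper's central point.
for t_new in (0.5, 6.0):
    preds = chain[:, 0] * np.exp(-chain[:, 1] * t_new)
    print(f"t={t_new}: mean {preds.mean():.2f}, sd {preds.std():.3f}")
```

Each prediction is pushed through the whole parameter sample, so its uncertainty is assessed on a per-prediction basis rather than inferred from parameter error bars.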
Model uncertainty: Probabilities for models?
International Nuclear Information System (INIS)
Winkler, R.L.
1994-01-01
Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work underway to address these questions looks very promising
Uncertainties in repository modeling
Energy Technology Data Exchange (ETDEWEB)
Wilson, J.R.
1996-12-31
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling.
Uncertainties in repository modeling
International Nuclear Information System (INIS)
Wilson, J.R.
1996-01-01
The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified due to uncertainties in dating, calibration, and modeling
Assessing uncertainties of water footprints using an ensemble of crop growth models on winter wheat
Czech Academy of Sciences Publication Activity Database
Kersebaum, K. C.; Kroes, J.; Gobin, A.; Takáč, J.; Hlavinka, Petr; Trnka, Miroslav; Ventrella, D.; Giglio, L.; Ferrise, R.; Moriondo, M.; Marta, A. D.; Luo, Q.; Eitzinger, Josef; Mirschel, W.; Weigel, H-J.; Manderscheid, R.; Hofmann, M.; Nejedlík, P.; Hösch, J.
2016-01-01
Vol. 8, No. 12 (2016), Article No. 571. ISSN 2073-4441 R&D Projects: GA MŠk(CZ) LO1415; GA MŠk(CZ) LD13030 Institutional support: RVO:67179843 Keywords: water footprint * uncertainty * model ensemble * wheat Subject RIV: DA - Hydrology; Limnology Impact factor: 1.832, year: 2016
International Nuclear Information System (INIS)
Nelson, R.W.; Jacobson, E.A.; Conbere, W.
1985-06-01
There is a growing awareness of the need to quantify uncertainty in groundwater flow and transport model results. Regulatory organizations are beginning to request the statistical distributions of predicted contaminant arrival to the biosphere, so that realistic confidence intervals can be obtained for the modeling results. To meet these needs, methods are being developed to quantify uncertainty in the subsurface flow and transport analysis sequence. A method for evaluating this uncertainty, described in this paper, considers uncertainty in material properties and was applied to an example field problem. Our analysis begins by using field measurements of transmissivity and hydraulic head in a regional parameter estimation method to obtain a calibrated fluid flow model and a covariance matrix of the parameter estimation errors. The calibrated model and the covariance matrix are next used in a conditional simulation mode to generate a large number of 'head realizations.' The specific pore water velocity distribution for each realization is calculated from the effective porosity, the aquifer parameter realization, and the associated head values. Each velocity distribution is used to obtain a transport solution for a contaminant originating from the same source for all realizations. The results are the statistical distributions for the outflow arrival times. The confidence intervals for contamination reaching the biosphere are obtained from the outflow statistical distributions. 20 refs., 12 figs.
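The propagation step of the analysis sequence described above (parameter realizations, velocities, arrival-time distribution, confidence interval) can be sketched as follows. Only the Monte Carlo propagation is shown, with invented numbers standing in for the calibrated model and its error covariance; the paper's conditional simulation of spatially correlated head fields is reduced here to independent lognormal draws.

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative inputs: parameter estimation is assumed to supply a calibrated
# mean log-transmissivity and an estimation-error standard deviation.
n_real = 5000
mean_logT = np.log(1e-4)          # transmissivity, m^2/s (assumed value)
sd_logT = 0.3                     # from the estimation-error covariance
porosity, gradient, distance = 0.25, 0.01, 1000.0   # travel distance in m

# Realizations of the uncertain parameter.
logT = rng.normal(mean_logT, sd_logT, n_real)

# Crude conversion to conductivity (assumed 10 m aquifer thickness), then to
# specific pore-water velocity and arrival time at the accessible environment.
K = np.exp(logT) / 10.0           # hydraulic conductivity, m/s
v = K * gradient / porosity       # pore-water velocity, m/s
arrival_years = distance / v / (365.25 * 24 * 3600)

# Confidence interval on contaminant arrival from the output distribution.
lo, hi = np.percentile(arrival_years, [5, 95])
print(f"90% interval for arrival time: {lo:.0f} - {hi:.0f} years")
```

The full method replaces the single lognormal draw with conditionally simulated head fields and runs a transport solution per realization, but the statistical post-processing of arrival times is the same.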
Energy Technology Data Exchange (ETDEWEB)
Rider, William J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Witkowski, Walter R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Mousseau, Vincent Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2016-04-13
The importance of credible, trustworthy numerical simulations is obvious especially when using the results for making high-consequence decisions. Determining the credibility of such numerical predictions is much more difficult and requires a systematic approach to assessing predictive capability, associated uncertainties and overall confidence in the computational simulation process for the intended use of the model. This process begins with an evaluation of the computational modeling of the identified, important physics of the simulation for its intended use. This is commonly done through a Phenomena Identification Ranking Table (PIRT). Then an assessment of the evidence basis supporting the ability to computationally simulate these physics can be performed using various frameworks such as the Predictive Capability Maturity Model (PCMM). Several critical activities follow in the areas of code and solution verification, validation, and uncertainty quantification; these are described in detail in the following sections. Here, we introduce the subject matter for general applications, but specifics are given for the failure prediction project. In addition, the first task that must be completed in the verification & validation procedure is to perform a credibility assessment to fully understand the requirements and limitations of the current computational simulation capability for the specific application intended use. The PIRT and PCMM are tools used at Sandia National Laboratories (SNL) to provide a consistent manner to perform such an assessment. Ideally, all stakeholders should be represented and contribute to perform an accurate credibility assessment. PIRTs and PCMMs are both described in brief detail below and the resulting assessments for an example project are given.
Sun, Mei; Zhang, Xiaolin; Huo, Zailin; Feng, Shaoyuan; Huang, Guanhua; Mao, Xiaomin
2016-03-01
Quantitatively ascertaining and analyzing the effects of model uncertainty on model reliability is a focal point for agricultural-hydrological models because of the many uncertainties in their inputs and processes. In this study, the generalized likelihood uncertainty estimation (GLUE) method with Latin hypercube sampling (LHS) was used to evaluate the uncertainty of the RZWQM-DSSAT (RZWQM2) model output responses and the sensitivity of 25 parameters related to soil properties, nutrient transport, and crop genetics. To avoid the one-sided risk of model prediction caused by using a single calibration criterion, a combined likelihood (CL) function integrating information on water, nitrogen, and crop production was introduced in the GLUE analysis for the predictions of the following four model output responses: the total amount of water content (T-SWC) and nitrate nitrogen (T-NIT) within the 1-m soil profile, and the seed yields of waxy maize (Y-Maize) and winter wheat (Y-Wheat). In the process of evaluating RZWQM2, measurements and meteorological data were obtained from a field experiment involving a winter wheat and waxy maize crop rotation system conducted from 2003 to 2004 in southern Beijing. The calibration and validation results indicated that the RZWQM2 model can be used to simulate crop growth and water-nitrogen migration and transformation in a wheat-maize crop rotation planting system. The results of the uncertainty analysis using the GLUE method showed that T-NIT was sensitive to parameters related to the nitrification coefficient, maize growth characteristics during the seedling period, the wheat vernalization period, and the wheat photoperiod. Parameters for soil saturated hydraulic conductivity, nitrogen nitrification and denitrification, and urea hydrolysis played an important role in the crop yield components. The prediction errors for RZWQM2 outputs with the CL function were lower and more uniform than those for likelihood functions composed of individual calibration criteria. This
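A minimal sketch of the GLUE-with-LHS procedure described above, using a toy linear model in place of RZWQM2; the likelihood form and behavioral threshold are illustrative assumptions, not the paper's choices.

```python
import random

def latin_hypercube(n_samples, bounds, rng):
    """Stratified LHS: one sample per equal-probability stratum per dimension."""
    columns = []
    for lo, hi in bounds:
        strata = [lo + (hi - lo) * (i + rng.random()) / n_samples
                  for i in range(n_samples)]
        rng.shuffle(strata)
        columns.append(strata)
    return list(zip(*columns))  # n_samples parameter tuples

def glue(model, params, observed, threshold):
    """Keep 'behavioral' parameter sets whose likelihood exceeds the threshold."""
    behavioral = []
    for p in params:
        sim = model(p)
        sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
        like = 1.0 / (1.0 + sse)   # inverse-error likelihood, one common choice
        if like >= threshold:
            behavioral.append((p, like))
    return behavioral

rng = random.Random(42)
xs = [1.0, 2.0, 3.0]
model = lambda p: [p[0] * x + p[1] for x in xs]   # toy stand-in model
observed = [2.1, 4.0, 6.2]                        # roughly a=2, b=0
params = latin_hypercube(200, [(0.0, 4.0), (-2.0, 2.0)], rng)
kept = glue(model, params, observed, threshold=0.5)
print(len(kept), "behavioral sets of", len(params))
```

Behavioral sets would then be used, likelihood-weighted, to form prediction bounds for each output response.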
Tompkins, A. M.; Thomson, M. C.
2017-12-01
Simulations of the impact of climate variations on a vector-borne disease such as malaria are subject to a number of sources of uncertainty. These include the model structure and parameter settings in addition to errors in the climate data and the neglect of their spatial heterogeneity, especially over complex terrain. We use a constrained genetic algorithm to confront these two sources of uncertainty for malaria transmission in the highlands of Kenya. The technique calibrates the parameter settings of a process-based, mathematical model of malaria transmission to vary within their assessed level of uncertainty and also allows the calibration of the driving climate data. The simulations show that in highland settings close to the threshold for sustained transmission, the uncertainty in climate is more important to address than the malaria model uncertainty. Applications of the coupled climate-malaria modelling system are briefly presented.
Tauxe, J.; Black, P.; Carilli, J.; Catlett, K.; Crowe, B.; Hooten, M.; Rawlinson, S.; Schuh, A.; Stockton, T.; Yucel, V.
2002-12-01
The disposal of low-level radioactive waste (LLW) in the United States (U.S.) is a highly regulated undertaking. The U.S. Department of Energy (DOE), itself a large generator of such wastes, requires a substantial amount of analysis and assessment before permitting disposal of LLW at its facilities. One of the requirements that must be met in assessing the performance of a disposal site and technology is that a Performance Assessment (PA) demonstrate "reasonable expectation" that certain performance objectives, such as dose to a hypothetical future receptor, not be exceeded. The phrase "reasonable expectation" implies recognition of uncertainty in the assessment process. In order for this uncertainty to be quantified and communicated to decision makers, the PA computer model must accept probabilistic (uncertain) input (parameter values) and produce results which reflect that uncertainty as it is propagated through the model calculations. The GoldSim modeling software was selected for the task due to its unique facility with both probabilistic analysis and radioactive contaminant transport. Probabilistic model parameters range from the water content and other physical properties of alluvium, to the activity of the radionuclides disposed, to the amount of time a future resident might be expected to spend tending a garden. Although these parameters govern processes which are defined in isolation as rather simple differential equations, the complex interaction of coupled processes makes for a highly nonlinear system with often unanticipated results. The decision maker has the difficult job of evaluating the uncertainty of modeling results in the context of granting permission for LLW disposal. This job also involves the evaluation of alternatives, such as the selection of disposal technologies. Various scenarios can be evaluated in the model, so that the effects of, for example, using a thicker soil cap over the waste cell can be assessed. This ability to evaluate mitigation
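The probabilistic-input approach described above can be sketched generically: sample each uncertain parameter from its distribution, run the model, and summarize the output distribution. The "dose model" and all distributions below are hypothetical stand-ins, not the GoldSim PA model.

```python
import random, statistics

def toy_dose_model(water_content, activity, exposure_hours):
    """Hypothetical stand-in for the PA leaching/transport/exposure chain."""
    leach = activity * (0.5 + water_content)   # more water, more leaching
    return leach * exposure_hours * 1e-4       # arbitrary dose conversion factor

rng = random.Random(0)
doses = []
for _ in range(10_000):
    wc = rng.uniform(0.05, 0.35)           # volumetric water content of alluvium
    act = rng.lognormvariate(0.0, 0.5)     # disposed activity (relative units)
    hrs = rng.triangular(50, 400, 150)     # gardening time, mode 150 h/yr
    doses.append(toy_dose_model(wc, act, hrs))

doses.sort()
mean = statistics.fmean(doses)
p95 = doses[int(0.95 * len(doses))]
print(f"mean dose {mean:.4f}, 95th percentile {p95:.4f}")
```

Decision makers would compare the upper percentile, not just the mean, against the performance objective.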
International Nuclear Information System (INIS)
Meyer, Philip D.; Gee, Glendon W.; Nicholson, Thomas J.
1999-01-01
This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases
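The Bayesian updating step of the methodology above, combining a generic regional parameter distribution with site-specific measurements, has a minimal conjugate normal-normal form; the prior and data values below are invented for illustration.

```python
def normal_update(prior_mean, prior_var, data, data_var):
    """Conjugate normal-normal update of a parameter's distribution,
    assuming known measurement variance data_var."""
    n = len(data)
    xbar = sum(data) / n
    post_prec = 1.0 / prior_var + n / data_var     # precisions add
    post_var = 1.0 / post_prec
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# generic (regional) prior for log10 hydraulic conductivity, hypothetical numbers
prior_mean, prior_var = -5.0, 1.0
site_data = [-4.6, -4.8, -4.5]      # site-specific measurements
m, v = normal_update(prior_mean, prior_var, site_data, data_var=0.25)
print(f"posterior mean {m:.3f}, variance {v:.3f}")
```

The posterior mean lands between the generic prior and the site data, and the variance shrinks, which is the intended effect of incorporating site-specific information.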
International Nuclear Information System (INIS)
Meyer, Philip D.; Gee, Glendon W.
2000-01-01
This report addresses issues related to the analysis of uncertainty in dose assessments conducted as part of decommissioning analyses. The analysis is limited to the hydrologic aspects of the exposure pathway involving infiltration of water at the ground surface, leaching of contaminants, and transport of contaminants through the groundwater to a point of exposure. The basic conceptual models and mathematical implementations of three dose assessment codes are outlined along with the site-specific conditions under which the codes may provide inaccurate, potentially nonconservative results. In addition, the hydrologic parameters of the codes are identified and compared. A methodology for parameter uncertainty assessment is outlined that considers the potential data limitations and modeling needs of decommissioning analyses. This methodology uses generic parameter distributions based on national or regional databases, sensitivity analysis, probabilistic modeling, and Bayesian updating to incorporate site-specific information. Data sources for best-estimate parameter values and parameter uncertainty information are also reviewed. A follow-on report will illustrate the uncertainty assessment methodology using decommissioning test cases
Olea, Ricardo A.; Luppens, James A.
2014-01-01
Standards for the public disclosure of mineral resources and reserves do not require the use of any specific methodology for estimating the reliability of the resources. Unbeknownst to most intended recipients of resource appraisals, such freedom commonly results in subjective opinions or estimations based on suboptimal approaches, such as the use of distance methods. This report presents the results of a study of the third of three coal deposits in which drilling density was increased by one order of magnitude in three stages. Applying geostatistical simulation, the densest dataset was used to check the results obtained by modeling the sparser drillings. We have developed two summary displays of results based on the same simulations, which individually and combined provide a better assessment of uncertainty than traditional qualitative resource classifications: (a) a display of the cell 90 percent confidence interval versus cumulative cell tonnage, and (b) a histogram of total resources. The first graph allows classification of the data into any number of bins, with dividers to be decided by the assessor on the basis of a discriminating variable that is statistically accepted as a measure of uncertainty, thereby improving the quality and flexibility of the modeling. The second display expands the scope of the modeling by providing a quantitative measure of uncertainty for total tonnage, which is a fundamental concern for stockholders, geologists, and decision makers. Our approach correctly models uncertainty issues that distance methods cannot predict, such as (a) different levels of uncertainty for individual beds with the same pattern and density of drill holes, (b) different local degrees of reduction of uncertainty with drilling densification, reflecting fluctuation in the complexity of the geology, (c) average reduction in uncertainty at a disproportionately lesser rate than the reduction in area per drill hole, (d) the proportional
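Both summary displays described above can be computed directly from a set of geostatistical realizations; a sketch with synthetic realizations in place of the coal data (all numbers invented).

```python
import random, statistics

rng = random.Random(7)
n_cells, n_sims = 50, 500
# synthetic geostatistical realizations of tonnage per cell (lognormal-ish)
reals = [[rng.lognormvariate(3.0, 0.4) for _ in range(n_sims)]
         for _ in range(n_cells)]

def interval_90(values):
    """Width of the central 90 percent interval of a sample."""
    s = sorted(values)
    return s[int(0.95 * len(s))] - s[int(0.05 * len(s))]

# display (a): per-cell 90% interval width, to plot against cumulative tonnage
ci_widths = [interval_90(cell) for cell in reals]
mean_tons = [statistics.fmean(cell) for cell in reals]

# display (b): distribution (histogram) of total resources across realizations
totals = [sum(reals[c][s] for c in range(n_cells)) for s in range(n_sims)]
print(f"median total {statistics.median(totals):.0f}, "
      f"total 90% interval width {interval_90(totals):.0f}")
```

Note the relative spread of the total is much smaller than that of individual cells, since cell-level errors partially average out, which is exactly the kind of effect distance methods cannot quantify.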
Wang, W.; Dungan, J. L.; Hashimoto, H.; Michaelis, A.; Milesi, C.; Ichii, K.; Nemani, R. R.
2009-12-01
We are conducting an ensemble modeling exercise using the Terrestrial Observation and Prediction System (TOPS) to characterize structural uncertainty in carbon flux and stock estimates from different ecosystem models. The experiment uses public-domain versions of Biome-BGC, LPJ, TOPS-BGC, and CASA, driven by a consistent set of climate fields for North America at 8km resolution and daily/monthly time steps over the period 1982-2006. A set of diagnostics is developed to characterize the behavior of the models in the climate (temperature-precipitation) space and to evaluate the simulated carbon cycle in an integrated way. The key findings of this study include the following: (relative) optimal primary production is generally found in climate regions where the relationship between annual temperature (T, oC) and precipitation (P, mm) is defined by P = 50*T+500; the ratios between NPP and GPP are close to 50% on average, yet can vary between models and in different climate regions; the allocation of carbon to leaf growth represents a positive feedback to primary production, and different approaches to constraining this process have significant impacts on the simulated carbon cycle; substantial differences in biomass stocks may be induced by small differences in the tissue turnover rate and plant mortality; the mean residence time of soil carbon pools is strongly influenced by the scheme of temperature regulation; non-respiratory disturbances (e.g., fires) are the main driver of NEP, yet their magnitudes vary between models. Overall, these findings indicate that although the structures of the models are similar, the uncertainties among them can be large, highlighting the problem inherent in relying on only one modeling approach to map surface carbon fluxes or to assess vegetation-climate interactions.
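The inter-model (structural) spread described above can be summarized with a simple ensemble diagnostic: the standard deviation of model outputs relative to the ensemble mean at each location. The model names come from the abstract, but all numbers below are invented for illustration.

```python
import statistics

# hypothetical annual NPP estimates (gC/m2/yr) from four models at three sites
npp = {
    "Biome-BGC": [620, 410, 180],
    "LPJ":       [700, 380, 250],
    "TOPS-BGC":  [580, 450, 160],
    "CASA":      [640, 360, 210],
}

for s in range(3):
    vals = [series[s] for series in npp.values()]
    mean = statistics.fmean(vals)
    spread = statistics.pstdev(vals)   # inter-model (structural) spread
    print(f"site {s}: ensemble mean {mean:.0f}, relative spread {spread / mean:.2f}")
```

Mapping this relative spread across the temperature-precipitation space is one way to locate the climate regimes where structural uncertainty dominates.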
International Nuclear Information System (INIS)
Lee, Yong Suk; Bang, Young Suk; Chung, Chang Hyun; Jeong, Ji Hwan
2004-01-01
Since the International Nuclear Safety Advisory Group (INSAG) introduced the term 'safety culture', it has been widely recognized that safety culture plays an important role in the safety of nuclear power plants. Research on safety culture can be divided into the following two parts: 1) assessment of safety culture (by interview, questionnaire, etc.), and 2) assessment of the link between safety culture and the safety of nuclear power plants. There is a substantial body of literature that addresses the first part, but much less work addresses the second. To address the second part, most work has focused on the development of models incorporating safety culture into Probabilistic Safety Assessment (PSA). One of the most advanced methodologies for incorporating safety culture quantitatively into PSA is the System Dynamics (SD) model developed by Kwak et al. It can show interactions among the various factors which affect employees' productivity and job quality. Various situations in a nuclear power plant can also be simulated, and time-dependent risk can be recalculated with this model. However, this model does not consider minimal cut set (MCS) dependency or the uncertainty of risk. Another well-known methodology is the Work Process Analysis Model (WPAM) developed by Davoudian. It considers MCS dependency by modifying conditional probability values using the SLI methodology. However, we found that the modified conditional probability values in WPAM are somewhat artificial and have no sound basis. WPAM tends to overestimate the conditional probability of hardware failure because it uses the SLI methodology, which is normally used in Human Reliability Analysis (HRA). WPAM also does not consider the uncertainty of risk. In this study, we propose a methodology to incorporate safety culture quantitatively into PSA that can deal with MCS dependency and the uncertainty of risk by applying the Common Uncertainty Source (CUS) model developed by Zhang. A CUS is an uncertainty source that is common to basic events, and this can be physical
Prinn, R. G.
2013-12-01
The world is facing major challenges that create tensions between human development and environmental sustenance. In facing these challenges, computer models are invaluable tools for addressing the need for probabilistic approaches to forecasting. To illustrate this, I use the MIT Integrated Global System Model framework (IGSM; http://globalchange.mit.edu ). The IGSM consists of a set of coupled sub-models of global economic and technological development and resultant emissions, and of physical, dynamical and chemical processes in the atmosphere, land, ocean and ecosystems (natural and managed). Some of the sub-models have both complex and simplified versions available, with the choice of which version to use being guided by the questions being addressed. Some sub-models (e.g. urban air pollution) are reduced forms of complex ones created by probabilistic collocation with polynomial chaos bases. Given the significant uncertainties in the model components, it is highly desirable that forecasts be probabilistic. We achieve this by running 400-member ensembles (Latin hypercube sampling) with different choices for key uncertain variables and processes within the human and natural system model components (pdfs of inputs estimated by model-observation comparisons, literature surveys, or expert elicitation). The IGSM has recently been used for probabilistic forecasts of climate, each using 400-member ensembles: one ensemble assumes no explicit climate mitigation policy and others assume increasingly stringent policies involving stabilization of greenhouse gases at various levels. These forecasts indicate clearly that the greatest effect of these policies is to lower the probability of extreme changes. The value of such probability analyses for policy decision-making lies in their ability to compare relative (not just absolute) risks of various policies, which are less affected by the earth system model uncertainties. Given the uncertainties in forecasts, it is also clear that
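The stated effect of policy on extreme outcomes can be illustrated generically by comparing tail probabilities across two 400-member ensembles. The distributions and threshold below are hypothetical stand-ins, not IGSM output.

```python
import random

rng = random.Random(1)

def ensemble(mu, sigma, n=400):
    """Hypothetical 400-member ensemble of warming outcomes (deg C)."""
    return [rng.gauss(mu, sigma) for _ in range(n)]

no_policy = ensemble(3.5, 1.2)   # no explicit mitigation policy
stringent = ensemble(1.8, 0.6)   # stringent stabilization policy

def p_extreme(members, threshold=4.0):
    """Fraction of ensemble members exceeding the 'extreme change' threshold."""
    return sum(1 for m in members if m > threshold) / len(members)

print(f"P(>4C) no policy: {p_extreme(no_policy):.2f}, "
      f"stringent: {p_extreme(stringent):.2f}")
```

The policy ensemble's tail probability collapses far more than its mean shifts, which is the point the forecasts make.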
Energy Technology Data Exchange (ETDEWEB)
Prather, Michael J. [Univ. of California, Irvine, CA (United States); Hsu, Juno [Univ. of California, Irvine, CA (United States); Nicolau, Alex [Univ. of California, Irvine, CA (United States); Veidenbaum, Alex [Univ. of California, Irvine, CA (United States); Smith, Philip Cameron [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bergmann, Dan [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2014-11-07
Atmospheric chemistry controls the abundances and hence climate forcing of important greenhouse gases including N_{2}O, CH_{4}, HFCs, CFCs, and O_{3}. Attributing climate change to human activities requires, at a minimum, accurate models of the chemistry and circulation of the atmosphere that relate emissions to abundances. This DOE-funded research provided realistic, yet computationally optimized and affordable, photochemical modules to the Community Earth System Model (CESM) that augment the CESM capability to explore the uncertainty in future stratospheric-tropospheric ozone, stratospheric circulation, and thus the lifetimes of chemically controlled greenhouse gases from climate simulations. To this end, we successfully implemented the Fast-J (radiation algorithm determining key chemical photolysis rates) and Linoz v3.0 (linearized photochemistry for interactive O_{3}, N_{2}O, NO_{y} and CH_{4}) packages in LLNL-CESM and for the first time demonstrated how a change in the O_{2} photolysis rate within its uncertainty range can significantly impact the stratospheric climate and ozone abundances. From the UCI side, this project also helped LLNL develop a CAM-Superfast Chemistry model that was implemented for the IPCC AR5 and contributed chemical-climate simulations to CMIP5.
Assessment of SFR Wire Wrap Simulation Uncertainties
Energy Technology Data Exchange (ETDEWEB)
Delchini, Marc-Olivier G. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Popov, Emilian L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Pointer, William David [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Reactor and Nuclear Systems Division; Swiler, Laura P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2016-09-30
Predictive modeling and simulation of nuclear reactor performance and fuel are challenging due to the large number of coupled physical phenomena that must be addressed. Models that will be used for design or operational decisions must be analyzed for uncertainty to ascertain impacts to safety or performance. Rigorous, structured uncertainty analyses are performed by characterizing the model’s input uncertainties and then propagating the uncertainties through the model to estimate output uncertainty. This project is part of the ongoing effort to assess modeling uncertainty in Nek5000 simulations of flow configurations relevant to the advanced reactor applications of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program. Three geometries are under investigation in these preliminary assessments: a 3-D pipe, a 3-D 7-pin bundle, and a single pin from the Thermal-Hydraulic Out-of-Reactor Safety (THORS) facility. Initial efforts have focused on gaining an understanding of Nek5000 modeling options and integrating Nek5000 with Dakota. These tasks are being accomplished by demonstrating the use of Dakota to assess parametric uncertainties in a simple pipe flow problem. This problem is used to optimize performance of the uncertainty quantification strategy and to estimate computational requirements for assessments of complex geometries. A sensitivity analysis with respect to three turbulence models was conducted for turbulent flow in a single wire-wrapped pin (THORS) geometry. Section 2 briefly describes the software tools used in this study and provides appropriate references. Section 3 presents the coupling interface between Dakota and a computational fluid dynamics (CFD) code (Nek5000 or STAR-CCM+), with details on the workflow, the scripts used for setting up the run, and the scripts used for post-processing the output files. In Section 4, the meshing methods used to generate the THORS and 7-pin bundle meshes are explained. Sections 5, 6 and 7 present numerical results
Smith, A. A.; Welch, C.; Stadnyk, T. A.
2018-05-01
Evapotranspiration (ET) partitioning is a growing field of research in hydrology due to the significant fraction of watershed water loss it represents. The use of tracer-aided models has improved understanding of watershed processes and has significant potential for identifying time-variable partitioning of evaporation (E) from ET. A tracer-aided model was used to establish a time series of E/ET using differences in riverine δ18O and δ2H in four northern Canadian watersheds (lower Nelson River, Manitoba, Canada). On average, E/ET follows a parabolic trend, ranging from 0.7 in the spring and autumn to 0.15 (three watersheds) and 0.5 (fourth watershed) during the summer growing season. In the fourth watershed, wetlands and shrubs dominate the land cover. During the summer, E/ET ratios are highest in wetlands for three watersheds (10% higher than unsaturated soil storage), while lowest for the fourth watershed (20% lower than unsaturated soil storage). The uncertainty of the ET partition parameters is strongly influenced by storage volumes, with large storage volumes increasing the partition uncertainty. In addition, higher simulated soil moisture increases the estimated E/ET. Although unsaturated soil storage accounts for larger surface areas in these watersheds than wetlands, riverine isotopic composition is more strongly affected by E from wetlands. Comparisons of E/ET to measurement-intensive studies in similar ecoregions indicate that the methodology proposed here adequately partitions ET.
Uncertainty analysis comes to integrated assessment models for climate change and conversely
Cooke, R.M.
2012-01-01
This article traces the development of uncertainty analysis through three generations punctuated by large methodology investments in the nuclear sector. Driven by a very high perceived legitimation burden, these investments aimed at strengthening the scientific basis of uncertainty quantification.
Garrigues, S.; Olioso, A.; Calvet, J.-C.; Lafont, S.; Martin, E.; Chanzy, A.; Marloie, O.; Bertrand, N.; Desfonds, V.; Renard, D.
2012-04-01
Vegetation productivity and water balance of Mediterranean regions will be particularly affected by climate and land-use changes. In order to analyze and predict these changes through land surface models, a critical step is to quantify the uncertainties associated with these models (processes, parameters) and their implementation over a long period of time. Besides, uncertainties attached to the data used to force these models (atmospheric forcing, vegetation and soil characteristics, crop management practices...), which are generally available at coarse spatial resolution (>1-10 km) and for a limited number of plant functional types, need to be evaluated. This paper aims at assessing the uncertainties in water (evapotranspiration) and energy fluxes estimated from a Soil Vegetation Atmosphere Transfer (SVAT) model over a Mediterranean agricultural site. While similar past studies focused on particular crop types and limited periods of time, the originality of this paper consists in implementing the SVAT model and assessing its uncertainties over a long period of time (10 years), encompassing several cycles of distinct crops (wheat, sorghum, sunflower, peas). The impacts on the SVAT simulations of the following sources of uncertainty are characterized: - Uncertainties in atmospheric forcing are assessed by comparing simulations forced with local meteorological measurements and simulations forced with a re-analysis atmospheric dataset (SAFRAN database). - Uncertainties in key surface characteristics (soil, vegetation, crop management practices) are tested by comparing simulations fed with standard values from global databases (e.g. ECOCLIMAP) and simulations based on in situ or site-calibrated values. - Uncertainties due to the implementation of the SVAT model over a long period of time are analyzed with regard to crop rotation. The SVAT model analyzed in this paper is ISBA in its A-gs version, which simulates photosynthesis and its coupling with the stomata
Scale changes in air quality modelling and assessment of associated uncertainties
International Nuclear Information System (INIS)
Korsakissok, Irene
2009-01-01
After an introduction of issues related to a scale change in the field of air quality (existing scales for emissions, transport, turbulence and loss processes, hierarchy of data and models, methods of scale change), the author first presents Gaussian models which have been implemented within the Polyphemus modelling platform. These models are assessed by comparison with experimental observations and with other commonly used Gaussian models. The second part reports the coupling of the puff-based Gaussian model with the Eulerian Polair3D model for the sub-mesh processing of point sources. This coupling is assessed at the continental scale for a passive tracer, and at the regional scale for photochemistry. Different statistical methods are assessed
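The plume formulation underlying the Gaussian models mentioned above can be sketched in its simplest steady-state form. The sigma parameterizations (e.g. stability-class curves as functions of downwind distance) are not modelled here, and all inputs below are arbitrary.

```python
import math

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Steady-state Gaussian plume concentration, with ground-reflection term.
    q: emission rate, u: wind speed, h: effective release height;
    sigma_y, sigma_z come from a downwind-distance parameterization."""
    lateral = math.exp(-y ** 2 / (2 * sigma_y ** 2))
    vertical = (math.exp(-(z - h) ** 2 / (2 * sigma_z ** 2))
                + math.exp(-(z + h) ** 2 / (2 * sigma_z ** 2)))
    return q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# ground-level centreline vs 50 m off-axis, arbitrary inputs
c0 = gaussian_plume(q=1.0, u=5.0, y=0.0, z=0.0, h=10.0, sigma_y=20.0, sigma_z=10.0)
c1 = gaussian_plume(q=1.0, u=5.0, y=50.0, z=0.0, h=10.0, sigma_y=20.0, sigma_z=10.0)
print(f"centreline {c0:.2e}, off-axis {c1:.2e}")
```

A puff model evaluates essentially the same kernel around each released puff's centre, which is what makes the sub-mesh coupling with an Eulerian grid tractable.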
Treatment of uncertainty in low-level waste performance assessment
International Nuclear Information System (INIS)
Kozak, M.W.; Olague, N.E.; Gallegos, D.P.; Rao, R.R.
1991-01-01
Uncertainties arise from a number of different sources in low-level waste performance assessment. In this paper the types of uncertainty are reviewed, and existing methods for quantifying and reducing each type of uncertainty are discussed. These approaches are examined in the context of the current low-level radioactive waste regulatory performance objectives, which are deterministic. The types of uncertainty discussed in this paper are model uncertainty, uncertainty about future conditions, and parameter uncertainty. The advantages and disadvantages of available methods for addressing uncertainty in low-level waste performance assessment are presented. 25 refs
Assessing the DICE model: uncertainty associated with the emission and retention of greenhouse gases
International Nuclear Information System (INIS)
Kaufmann, R.K.
1997-01-01
Analysis of the DICE model indicates that it contains unsupported assumptions, simple extrapolations, and misspecifications that cause it to understate the rate at which economic activity emits greenhouse gases and the rate at which the atmosphere retains greenhouse gases. The model assumes a world population that is 2 billion people lower than the 'base case' projected by demographers. The model extrapolates a decline in the quantity of greenhouse gases emitted per unit of economic activity that is possible only if there is a structural break in the economic and engineering factors that have determined this ratio over the last century. The model uses a single equation to simulate the rate at which greenhouse gases accumulate in the atmosphere. The forecast for the airborne fraction generated by this equation contradicts forecasts generated by models that represent the physical and chemical processes which determine the movement of carbon from the atmosphere to the ocean. When these unsupported assumptions, simple extrapolations, and misspecifications are remedied with simple fixes, the economic impact of global climate change increases severalfold. Similarly, these remedies increase the impact of uncertainty on estimates of the economic impact of global climate change. Together, these results indicate that considerable scientific and economic research is needed before the threat of climate change can be dismissed with any degree of certainty. 23 refs., 3 figs
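The single-equation atmospheric retention that the critique targets can be written as a constant airborne-fraction accumulation; all numbers below are illustrative, not DICE's calibrated values.

```python
def atmospheric_stock(emissions, airborne_fraction=0.45, initial_gtc=750.0):
    """Single-equation accumulation: a fixed fraction of each year's emissions
    stays in the atmosphere (the reduced form critiqued above). Process-based
    carbon-cycle models instead let the airborne fraction evolve with ocean
    uptake chemistry."""
    stock, path = initial_gtc, []
    for e in emissions:
        stock += airborne_fraction * e
        path.append(stock)
    return path

# hypothetical constant 8 GtC/yr emissions for a decade
path = atmospheric_stock([8.0] * 10)
print(f"stock after 10 yr: {path[-1]:.1f} GtC")
```

The critique is that holding this fraction fixed contradicts process-based models in which ocean uptake saturates, so the retained fraction rises as emissions grow.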
International Nuclear Information System (INIS)
Flari, Villie; Chaudhry, Qasim; Neslo, Rabin; Cooke, Roger
2011-01-01
Currently, risk assessment of nanotechnology-enabled food products is considered difficult due to the large number of uncertainties involved. We developed an approach which could address some of the main uncertainties through the use of expert judgment. Our approach employs a multi-criteria decision model, based on probabilistic inversion, that enables capturing experts’ preferences regarding the safety of nanotechnology-enabled food products and identifying their opinions on the significance of key criteria that are important in determining the safety of such products. An advantage of these sample-based techniques is that they provide out-of-sample validation and therefore a robust scientific basis. This validation in turn adds predictive power to the model developed. We achieved out-of-sample validation in two ways: (1) a portion of the expert preference data was excluded from the model’s fitting and was then predicted by the model fitted on the remaining rankings, and (2) a (partially) different set of experts generated new scenarios, using the same criteria employed in the model, and ranked them; their ranks were compared with the ranks predicted by the model. The degree of validation in each method was less than perfect but reasonably substantial. The validated model captured and modelled experts’ preferences regarding the safety of hypothetical nanotechnology-enabled food products. Such an approach therefore appears to provide a promising route to explore further for assessing the risk of nanotechnology-enabled food products.
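The out-of-sample validation step, comparing model-predicted rankings with held-out expert rankings, can be sketched with a hand-rolled Spearman rank correlation; the scores and ranks below are invented.

```python
def ranks(values):
    """Rank positions (1 = smallest); ties broken by input order."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order, start=1):
        r[i] = pos
    return r

def spearman(a, b):
    """Spearman rank correlation via the sum-of-squared-rank-differences formula."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d2 = sum((x - y) ** 2 for x, y in zip(ra, rb))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# hypothetical: model-predicted safety scores vs a held-out expert ranking
model_scores = [0.9, 0.4, 0.7, 0.2, 0.6]
expert_ranks = [5, 2, 4, 1, 3]          # 1 = least safe
rho = spearman(model_scores, expert_ranks)
print(f"rank agreement rho = {rho:.2f}")
```

A rho near 1 over held-out scenarios is the kind of "less than perfect but reasonably substantial" agreement the validation reports.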
Rosenzweig, C.; Hatfield, J.; Jones, J. W.; Ruane, A. C.
2012-12-01
The Agricultural Model Intercomparison and Improvement Project (AgMIP) is an international effort to assess the state of global agricultural modeling and to understand climate impacts on the agricultural sector. AgMIP connects the climate science, crop modeling, and agricultural economic modeling communities to generate probabilistic projections of current and future climate impacts. The goals of AgMIP are to improve substantially the characterization of risk of hunger and world food security due to climate change and to enhance adaptation capacity in both developing and developed countries. This presentation will describe the general approach of AgMIP, highlight AgMIP efforts to evaluate climate, crop, and economic models, and discuss AgMIP uncertainty assessments. Model evaluation efforts will be outlined using examples from various facets of AgMIP, including climate scenario generation, the wheat crop model intercomparison, and the global agricultural economics model intercomparison being led in collaboration with the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP). Strategies developed to quantify uncertainty in each component of AgMIP, as well as the propagation of uncertainty through the climate-crop-economic modeling framework, will be detailed and preliminary uncertainty assessments that highlight crucial areas requiring improved models and data collection will be introduced.
Chemical model reduction under uncertainty
Najm, Habib; Galassi, R. Malpica; Valorani, M.
2016-01-01
We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.
Chemical model reduction under uncertainty
Najm, Habib
2016-01-05
We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.
Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling
Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.
2017-12-01
Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation - the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity - most influences the predictive uncertainty. We also seek to identify data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany. The Steinlach River meander is an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). We then conduct Monte Carlo simulations with these models to estimate the predictive uncertainty. Results indicate that: Uncertainty in HETT is relatively small for early times and increases with transit times. Uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution. Introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias. Hydraulic head observations alone cannot constrain the uncertainty of HETT; however, an estimate of hyporheic exchange flux proves to be more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model (`virtual reality') is then developed based on that conceptual model
Velázquez, J. A.; Schmid, J.; Ricard, S.; Muerth, M. J.; Gauvin St-Denis, B.; Minville, M.; Chaumont, D.; Caya, D.; Ludwig, R.; Turcotte, R.
2012-06-01
Over recent years, several research efforts have investigated the impact of climate change on water resources for different regions of the world. The projection of future river flows is affected by different sources of uncertainty in the hydro-climatic modelling chain. One of the aims of the QBic3 project (Québec-Bavarian International Collaboration on Climate Change) is to assess the contribution of hydrological models to this uncertainty by using an ensemble of hydrological models of varying structural complexity (i.e. lumped, semi-distributed and distributed models). The study investigates two humid, mid-latitude catchments with natural flow conditions: one located in Southern Québec (Canada) and one in Southern Bavaria (Germany). Daily flow is simulated with four different hydrological models, forced by outputs from regional climate models driven by a number of GCM members over a reference (1971-2000) and a future (2041-2070) period. The results show that the choice of hydrological model strongly affects the climate change response of selected hydrological indicators, especially those related to low flows. Indicators related to high flows seem less sensitive to the choice of hydrological model. Therefore, the computationally less demanding models (usually simple, lumped and conceptual) can be used with a significant level of trust for high and overall mean flows.
Risk Assessment Uncertainties in Cybersecurity Investments
Directory of Open Access Journals (Sweden)
Andrew Fielder
2018-06-01
When undertaking cybersecurity risk assessments, it is important to be able to assign numeric values to metrics to compute the final expected loss that represents the risk an organization is exposed to due to cyber threats. Even if risk assessment is motivated by real-world observations and data, there is always a high chance of assigning inaccurate values due to the different uncertainties involved (e.g., the evolving threat landscape, human errors and the natural difficulty of quantifying risk). Existing models empower organizations to compute optimal cybersecurity strategies given their financial constraints, i.e., the available cybersecurity budget. Further, a general game-theoretic model with uncertain payoffs (probability-distribution-valued payoffs) shows that such uncertainty can be incorporated in the game-theoretic model by allowing payoffs to be random. This paper extends previous work in the field to tackle uncertainties in risk assessment that affect cybersecurity investments. The findings from simulated examples indicate that although uncertainties in cybersecurity risk assessment lead, on average, to different cybersecurity strategies, they do not play a significant role in the final expected loss of the organization when utilising a game-theoretic model and methodology to derive these strategies. The model determines robust defending strategies even when knowledge regarding risk assessment values is not accurate. As a result, it is possible to show that the cybersecurity investments’ tool is capable of providing effective decision support.
Uncertainty assessment of 3D instantaneous velocity model from stack velocities
Emanuele Maesano, Francesco; D'Ambrogi, Chiara
2015-04-01
3D modelling is a powerful tool that is finding increasing application in data analysis and dissemination. At the same time, quantitative uncertainty evaluation is strongly demanded in many aspects of the geological sciences and by stakeholders. In many cases the starting point for 3D model building is the interpretation of seismic profiles, which provide indirect information about the geology of the subsurface in the time domain. The most problematic step in 3D model construction is the conversion of the horizons and faults interpreted in the time domain to the depth domain. In this step the dominant variable, which can lead to significantly different results, is the velocity. Knowledge of subsurface velocities comes mainly from point data (sonic logs) that are often sparsely distributed across the areas covered by the seismic interpretation. The extrapolation of velocity information to widely extended horizons is thus a critical step in obtaining a 3D depth model that can be used for predictive purposes. In the EU-funded GeoMol Project, the availability of a dense network of seismic lines (confidentially provided by ENI S.p.A.) in the Central Po Plain is paired with the presence of 136 well logs, but few of them have sonic logs, and in some portions of the area the wells are very widely spaced. The depth conversion of the 3D model in the time domain has been performed by testing different strategies for the use and interpolation of velocity data. The final model has been obtained using a four-layer-cake 3D instantaneous velocity model that considers both the initial velocity (v0) at every reference horizon and the gradient of velocity variation with depth (k). Using this method it is possible to consider both the geological constraint given by the geometries of the horizons and the geo-statistical approach to the interpolation of velocities and gradients. Here we present an experiment based on the use of a set of pseudo-wells obtained from the
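The depth-conversion step described above can be sketched with a linear instantaneous-velocity law. The function name, the constant-velocity fallback, and the sample values below are illustrative assumptions, not the GeoMol workflow itself.

```python
import math

def twt_to_depth(twt_s, v0, k):
    """Depth of a horizon from two-way travel time, assuming a linear
    instantaneous-velocity law v(z) = v0 + k*z within one layer.
    Integrating dz/dt = v0 + k*z over one-way time t = twt/2 gives
    z = (v0 / k) * (exp(k * t) - 1)."""
    t = twt_s / 2.0                      # one-way travel time in seconds
    if abs(k) < 1e-12:                   # constant-velocity limit (k -> 0)
        return v0 * t
    return (v0 / k) * (math.exp(k * t) - 1.0)

# Illustrative values: v0 = 2000 m/s at the reference horizon, k = 0.5 1/s.
depth = twt_to_depth(1.0, 2000.0, 0.5)
```

In a layer-cake scheme this conversion is applied layer by layer, with v0 and k interpolated laterally from the (pseudo-)well control points.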
Uncertainties in risk assessment and decision making
International Nuclear Information System (INIS)
Starzec, Peter; Purucker, Tom; Stewart, Robert
2008-02-01
The general concept for risk assessment in accordance with the Swedish model for contaminated soil implies that the toxicological reference value for a given receptor is first back-calculated to a corresponding concentration of a compound in soil and (if applicable) then modified with respect to e.g. background levels, acute toxicity, and a factor of safety. This results in a guideline value that is subsequently compared to the observed concentration levels. Many sources of uncertainty exist when assessing whether the risk for a receptor is significant or not. In this study, the uncertainty aspects have been addressed from three standpoints: 1. Uncertainty in the comparison between the level of contamination (source) and a given risk criterion (e.g. a guideline value) and possible implications for subsequent decisions. This type of uncertainty is considered to be most important in situations where a contaminant is expected to be spatially heterogeneous without any tendency to form isolated clusters (hotspots) that can be easily delineated, i.e. where mean values are appropriate to compare to the risk criterion. 2. Uncertainty in the spatial distribution of a contaminant. Spatial uncertainty should be accounted for when hotspots are to be delineated and the volume of soil contaminated with levels above a stated decision criterion has to be quantified. 3. Uncertainty in an ecological exposure model with regard to the movement pattern of a receptor in relation to the spatial distribution of the contaminant in question. The study points out that the choice of methodology to characterize the relation between contaminant concentration and a pre-defined risk criterion is governed by a conceptual perception of the contaminant's spatial distribution and also depends on the structure of the collected data (observations). How uncertainty in the transition from contaminant concentration to risk criterion can be quantified was demonstrated by applying hypothesis tests and the concept of
Gelati, Emiliano; Decharme, Bertrand; Calvet, Jean-Christophe; Minvielle, Marie; Polcher, Jan; Fairbairn, David; Weedon, Graham P.
2018-04-01
Physically consistent descriptions of land surface hydrology are crucial for planning human activities that involve freshwater resources, especially in light of the expected climate change scenarios. We assess how atmospheric forcing data uncertainties affect land surface model (LSM) simulations by means of an extensive evaluation exercise using a number of state-of-the-art remote sensing and station-based datasets. For this purpose, we use the CO2-responsive ISBA-A-gs LSM coupled with the CNRM version of the Total Runoff Integrated Pathways (CTRIP) river routing model. We perform multi-forcing simulations over the Euro-Mediterranean area (25-75.5° N, 11.5° W-62.5° E, at 0.5° resolution) from 1979 to 2012. The model is forced using four atmospheric datasets. Three of them are based on the ERA-Interim reanalysis (ERA-I). The fourth dataset is independent from ERA-Interim: PGF, developed at Princeton University. The hydrological impacts of atmospheric forcing uncertainties are assessed by comparing simulated surface soil moisture (SSM), leaf area index (LAI) and river discharge against observation-based datasets: SSM from the European Space Agency's Water Cycle Multi-mission Observation Strategy and Climate Change Initiative projects (ESA-CCI), LAI of the Global Inventory Modeling and Mapping Studies (GIMMS), and Global Runoff Data Centre (GRDC) river discharge. The atmospheric forcing data are also compared to reference datasets. Precipitation is the most uncertain forcing variable across datasets, while the most consistent are air temperature and SW and LW radiation. At the monthly timescale, SSM and LAI simulations are relatively insensitive to forcing uncertainties. Some discrepancies with ESA-CCI appear to be forcing-independent and may be due to different assumptions underlying the LSM and the remote sensing retrieval algorithm. All simulations overestimate average summer and early-autumn LAI. Forcing uncertainty impacts on simulated river discharge are
Assessing flood forecast uncertainty with fuzzy arithmetic
Directory of Open Access Journals (Sweden)
de Bruyn Bertrand
2016-01-01
Forecasts of flow rates and water levels during floods must be accompanied by uncertainty estimates, and these forecasts have several sources of uncertainty. For hydrological (rainfall-runoff) forecasts performed using a deterministic hydrological model with basic physics, two main sources can be identified. The first, obvious source is the forcing data: rainfall forecast data are supplied in real time by meteorological forecasting services to the Flood Forecasting Service as a range between a lowest and a highest predicted discharge. These two values define an uncertainty interval for the rainfall variable provided over a given watershed. The second source of uncertainty is related to the complexity of the modelled system (the catchment impacted by the hydro-meteorological phenomenon) and to the number of variables that may describe the problem and their spatial and temporal variability. The model simplifies the system by reducing the number of variables to a few parameters, and thus contains an intrinsic uncertainty. This model uncertainty is assessed by comparing simulated and observed flow rates for a large number of hydro-meteorological events. We propose a method based on fuzzy arithmetic to estimate the possible range of flow rates (and water levels) in a forecast, combining the possible rainfalls provided by the forcing with the model uncertainty, here expressed as a range of possible values. Both rainfall and model uncertainties are combined with fuzzy arithmetic, which allows the prediction uncertainty range to be evaluated. The Flood Forecasting Service of the Oise and Aisne rivers, in particular, monitors the upstream watershed of the Oise at Hirson. This watershed's area is 310 km2 and its response time is about 10 hours. Several hydrological models are calibrated for flood forecasting in this watershed and use the rainfall forecast. This method has the advantage of being easily implemented. Moreover, it permits to be carried out
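Combining a rainfall interval with an interval-valued model error can be illustrated with α-cuts of triangular fuzzy numbers. The function names and the numbers below are invented for illustration; they are not the operational Oise forecasting code.

```python
def alpha_cut(tfn, alpha):
    """Interval obtained by cutting a triangular fuzzy number
    (low, mode, high) at membership level alpha in [0, 1]."""
    lo, mode, hi = tfn
    return (lo + alpha * (mode - lo), hi - alpha * (hi - mode))

def add_intervals(a, b):
    """Fuzzy addition reduces to interval addition on each alpha-cut."""
    return (a[0] + b[0], a[1] + b[1])

# Illustrative: a forecast flow (m3/s) derived from the rainfall interval,
# plus a model-error term expressed as a fuzzy number centred on zero.
flow = alpha_cut((80.0, 100.0, 130.0), 0.5)
error = alpha_cut((-15.0, 0.0, 10.0), 0.5)
flow_with_uncertainty = add_intervals(flow, error)
```

Sweeping alpha from 0 to 1 yields nested intervals, i.e. the full fuzzy membership function of the forecast flow.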
Shamaii, Azin; Omidvari, Manouchehr; Lotfi, Farhad Hosseinzadeh
2017-01-01
Performance assessment is a critical objective of management systems. As a result of the non-deterministic and qualitative nature of performance indicators, assessments are likely to be influenced by evaluators' personal judgments. Furthermore, in developing countries, performance assessments by the Health, Safety and Environment (HSE) department are based solely on the number of accidents. A questionnaire is used to conduct the study in one of the largest steel production companies in Iran. With respect to health, safety, and environment, the results revealed that control of disease, fire hazards, and air pollution are of paramount importance, with coefficients of 0.057, 0.062, and 0.054, respectively. Furthermore, health and environment indicators were found to be the most common causes of poor performance. Finally, it was shown that HSE management systems can affect the majority of performance safety indicators in the short run, whereas health and environment indicators require longer periods of time. The objective of this study is to present an HSE-MS unit performance assessment model in steel industries. Moreover, we seek to answer the following question: what are the factors that affect HSE unit system in the steel industry? Also, for each factor, the extent of impact on the performance of the HSE management system in the organization is determined.
Uncertainties in Nuclear Proliferation Modeling
International Nuclear Information System (INIS)
Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok
2015-01-01
There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warning for the international community to prevent nuclear proliferation activities. However, there are still large debates over the robustness of the estimated effects of determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling works. This paper analyzes the uncertainties in past approaches and suggests future work with respect to proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models, and fundamental problems in modeling will remain even if other, more advanced modeling methods are developed. Before developing elaborate models based on hypotheses of time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models to solve the fundamental problems of nuclear proliferation modeling. The uncertainty from different codings of proliferation history is small. The serious problems stem from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when using the same dataset, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies
Incorporating uncertainty in predictive species distribution modelling.
Beale, Colin M; Lennon, Jack J
2012-01-19
Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which is often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and for assessing the significance of model covariates.
Some remarks on modeling uncertainties
International Nuclear Information System (INIS)
Ronen, Y.
1983-01-01
Several topics related to the question of modeling uncertainties are considered. The first topic is related to the use of the generalized bias operator method for modeling uncertainties. The method is expanded to a more general form of operators. The generalized bias operator is also used in the inverse problem and applied to determine the anisotropic scattering law. The last topic discussed is related to the question of the limit to accuracy and how to establish its value. (orig.) [de
Uncertainty and validation. Effect of model complexity on uncertainty estimates
International Nuclear Information System (INIS)
Elert, M.
1996-09-01
In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models, needed to consider all aspects of radionuclide transport in a soil with a variable hydrology, are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
Modeling Uncertainty in Climate Change: A Multi-Model Comparison
Energy Technology Data Exchange (ETDEWEB)
Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul
2015-10-01
The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO_{2} concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.
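A minimal sketch of the kind of parametric Monte Carlo propagation used here: sample uncertain inputs from assumed pdfs, push them through a stylized damage relation, and read percentiles off the output distribution. The distributions and the damage formula are invented for illustration and are not taken from any of the participating integrated assessment models.

```python
import random
import statistics

def damage_distribution(n=20000, seed=42):
    """Propagate parametric input uncertainty through a toy damage relation
    and return the 5th percentile, median, and 95th percentile."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        growth = rng.gauss(0.02, 0.01)        # assumed productivity-growth pdf
        sens = rng.lognormvariate(1.1, 0.3)   # assumed climate-sensitivity pdf
        out.append(0.005 * sens ** 2 * (1.0 + growth) ** 50)
    out.sort()
    return out[int(0.05 * n)], statistics.median(out), out[int(0.95 * n)]
```

Comparing such output pdfs across several structurally different models is what separates parametric uncertainty from model-structure uncertainty in the study above.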
Uncertainty in ecological risk assessment: A statistician's view
International Nuclear Information System (INIS)
Smith, E.P.
1995-01-01
Uncertainty is a topic that has different meanings for researchers, modelers, managers and policy makers. The perspective of this presentation is the modeling view of uncertainty and its quantitative assessment. The goal is to provide some insight into how a statistician visualizes and addresses the issue of uncertainty in ecological risk assessment problems. In ecological risk assessment, uncertainty arises from many sources and is of different types depending on what is studied, where it is studied and how it is studied. Some major sources and their impacts are described. A variety of quantitative approaches to modeling uncertainty are characterized and a general taxonomy is given. Examples of risk assessments of lake acidification, power plant impact assessment and the setting of standards for chemicals are used to discuss approaches to the quantitative assessment of uncertainty and some of the potential difficulties
Freni, G; La Loggia, G; Notaro, V
2010-01-01
Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases is in most cases overcome by combining the output of urban drainage models with damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become standard practice in hydraulic research and application. Flood damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancement, pushed forward by increasing computer capacity. The details of the flood propagation process on the surface and of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly dependent on data availability; this remains the main bottleneck in expected flood damage estimation. Such functions are usually affected by significant uncertainty intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the damage-depth function with the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly
Energy Technology Data Exchange (ETDEWEB)
Hoffman, F O; Schwarz, G; Killough, G G [Oak Ridge National Lab., TN (USA)
1980-08-01
Concern is expressed regarding the use of the robustness index, as proposed in ICRP 29, to characterise the uncertainties associated with a model's predictions. Results of a Monte Carlo simulation employing a model of the grass-cow-milk-infant pathway for 131I are used to elucidate the authors' criticisms. It is recommended that the robustness index should be carefully examined to appraise its possible usefulness and potential dangers. Alternative methods for the analysis of uncertainty are proposed.
International Nuclear Information System (INIS)
King, Fearghal; Fu, Miao; Kelly, J. Andrew
2011-01-01
National outlooks of emission levels are important components of international environmental policymaking and associated national policy development. This is the case for both greenhouse gas emissions and transboundary air pollutants. However, there is uncertainty inherent in the production of forecasts. In the climate context, IPCC guidelines have been established to support national teams in quantifying uncertainty within national inventory reporting of historic emissions. These are presented to indicate the potential range of deviation from reported values and to offer added evidence for policy decisions. However, the method and practice of accounting for uncertainty amongst emission forecasts is both less clear and less common. This paper posits that the role of forecasts in setting international targets and planning policy action renders the management of ‘forecast’ uncertainty as important as addressing uncertainty in the context of inventory and compliance work. Failure to explicitly present uncertainty in forecasting delivers an implicit and misplaced confidence in a given future scenario, irrespective of parallel work on other scenarios and sensitivities. However, it is acknowledged that approaches to uncertainty analyses within the literature are often highly technical and the models used are both computationally demanding and time-intensive. This can limit broader adoption where national capacities are limited and scenario development is frequent. This paper describes an approach to presenting uncertainty, where the aim is to balance the technical and temporal demands of uncertainty estimation against a means of delivering regular and practical estimation and presentation of uncertainty for any given scenario. In turn this methodology should help formalise the recognition of the uncertainty dimension in emissions forecasts, for all stakeholders engaged.
A Bayesian approach to model uncertainty
International Nuclear Information System (INIS)
Buslik, A.
1994-01-01
A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
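For a finite set of alternative models, the Bayesian treatment reduces to updating prior model probabilities with each model's likelihood of the observed data, exactly as one would update a discrete parameter. The sketch below assumes the likelihoods have already been computed.

```python
def posterior_model_probs(priors, likelihoods):
    """P(M_i | D) is proportional to P(D | M_i) * P(M_i), normalized over the
    finite model set, so model uncertainty behaves like parameter uncertainty
    over a discrete parameter."""
    joint = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(joint)
    return [j / total for j in joint]

# Two candidate models, equal priors, data four times as likely under M2:
probs = posterior_model_probs([0.5, 0.5], [0.2, 0.8])  # approximately [0.2, 0.8]
```

Predictions are then averaged over the models with these posterior weights, which is how the finite-model case folds model uncertainty into an ordinary probabilistic assessment.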
Uncertainty and validation. Effect of model complexity on uncertainty estimates
Energy Technology Data Exchange (ETDEWEB)
Elert, M. [Kemakta Konsult AB, Stockholm (Sweden)] [ed.
1996-09-01
In the Model Complexity subgroup of BIOMOVS II, models of varying complexity have been applied to the problem of downward transport of radionuclides in soils. A scenario describing a case of surface contamination of a pasture soil was defined. Three different radionuclides with different environmental behavior and radioactive half-lives were considered: Cs-137, Sr-90 and I-129. The intention was to give a detailed specification of the parameters required by different kinds of model, together with reasonable values for the parameter uncertainty. A total of seven modelling teams participated in the study using 13 different models. Four of the modelling groups performed uncertainty calculations using nine different modelling approaches. The models used range in complexity from analytical solutions of a 2-box model using annual average data to numerical models coupling hydrology and transport using data varying on a daily basis. The complex models, needed to consider all aspects of radionuclide transport in a soil with a variable hydrology, are often impractical to use in safety assessments. Instead simpler models, often box models, are preferred. The comparison of predictions made with the complex models and the simple models for this scenario shows that the predictions in many cases are very similar, e.g. in the predictions of the evolution of the root zone concentration. However, in other cases differences of many orders of magnitude can appear. One example is the prediction of the flux to the groundwater of radionuclides being transported through the soil column. Some issues that have come into focus in this study: There are large differences in the predicted soil hydrology and, as a consequence, also in the radionuclide transport, which suggests that there are large uncertainties in the calculation of effective precipitation and evapotranspiration. The approach used for modelling the water transport in the root zone has an impact on the predictions of the decline in root
Numerical modeling of economic uncertainty
DEFF Research Database (Denmark)
Schjær-Jacobsen, Hans
2007-01-01
Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons are made between alternative modeling methods, and characteristics of the methods are discussed.
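An interval-analysis view of discounted cash flow, one of the methods compared above, can be sketched as follows. The monotonicity shortcut assumes all future cash flows are nonnegative, and the figures are illustrative rather than taken from the study.

```python
def npv(cashflows, rate):
    """Net present value of cash flows indexed by period (t = 0, 1, ...)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def interval_npv(cashflows, rate_lo, rate_hi):
    """With nonnegative future cash flows, NPV is monotone decreasing in the
    discount rate, so interval bounds come from the two rate endpoints."""
    return (npv(cashflows, rate_hi), npv(cashflows, rate_lo))

# Illustrative project: outlay of 100, then two inflows of 60, with the
# discount rate known only to lie between 5% and 10%.
low, high = interval_npv([-100.0, 60.0, 60.0], 0.05, 0.10)
```

Triple estimates extend this idea by carrying (low, most likely, high) values through the same arithmetic instead of a single interval.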
Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut
2016-01-01
We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS
Muthu, Satish; Childress, Amy; Brant, Jonathan
2014-08-15
Membrane fouling is assessed from a fundamental standpoint within the context of the Derjaguin-Landau-Verwey-Overbeek (DLVO) model. The DLVO model requires that the properties of the membrane and foulant(s) be quantified. Membrane surface charge (zeta potential) and free energy values are characterized using streaming potential and contact angle measurements, respectively. Comparing theoretical assessments of membrane-colloid interactions between research groups requires that the variability of the measured inputs be established. The impact of such variability in input values on the outcome of interfacial models must be quantified to determine an acceptable variance in inputs. An interlaboratory study was conducted to quantify the variability in streaming potential and contact angle measurements when using standard protocols. The propagation of uncertainty from these errors was evaluated in terms of their impact on the quantitative and qualitative conclusions about the extended DLVO (XDLVO) calculated interaction terms. The error introduced into XDLVO calculated values was of the same magnitude as the calculated free energy values at contact and at any given separation distance. For two independent laboratories to draw similar quantitative conclusions regarding membrane-foulant interfacial interactions, the standard error in contact angle values must be ⩽2.5°, while that for the zeta potential values must be ⩽7 mV.
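How measurement variability feeds through to a derived quantity can be sketched with first-order (delta-method) error propagation. This generic formula is not the XDLVO expressions themselves, and the sensitivity coefficients below are invented for illustration.

```python
import math

def propagated_sigma(partials, sigmas):
    """First-order propagation of independent measurement errors:
    sigma_f = sqrt( sum_i (df/dx_i)^2 * sigma_i^2 )."""
    return math.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Illustrative: a free-energy term with an assumed sensitivity of
# 1.2 mJ/m2 per degree of contact angle and 0.4 mJ/m2 per mV of zeta
# potential, using the standard errors quoted above (2.5 degrees, 7 mV).
sigma_free_energy = propagated_sigma([1.2, 0.4], [2.5, 7.0])
```

Inverting the same relation (fixing an acceptable sigma_f and solving for the input sigmas) is one way to arrive at input-error tolerances like those quoted in the study.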
DEFF Research Database (Denmark)
Sin, Gürkan; Meyer, Anne S.; Gernaey, Krist
2010-01-01
The reliability of cellulose hydrolysis models is studied using the NREL model. An identifiability analysis revealed that only 6 out of 26 parameters are identifiable from the available data (typical hydrolysis experiments). Attempting to identify a higher number of parameters (as done in the ori…
Baustert, P.M.; Benetto, E.
2017-01-01
The evolution of life cycle assessment (LCA) from a merely comparative tool for the assessment of products to a policy analysis tool proceeds by incorporating increasingly complex modelling approaches. In more recent studies of complex systems, such as the agriculture sector or mobility, agent-based
Analysis of uncertainty in modeling perceived risks
International Nuclear Information System (INIS)
Melnyk, R.; Sandquist, G.M.
2005-01-01
Expanding on a mathematical model developed for quantifying and assessing perceived risks, the distribution functions, variances, and uncertainties associated with estimating the model parameters are quantified. The analytical model permits the identification and assignment of any number of quantifiable risk perception factors that can be incorporated within standard risk methodology. Those risk perception factors associated with major technical issues are modeled using lognormal probability density functions to span the potentially large uncertainty variations associated with these risk perceptions. The model quantifies the logic of public risk perception and provides an effective means for measuring and responding to perceived risks. (authors)
Model uncertainty in growth empirics
Prüfer, P.
2008-01-01
This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high
Dam break modelling, risk assessment and uncertainty analysis for flood mitigation
Zagonjolli, M.
2007-01-01
In this thesis a range of modelling techniques is explored to deal effectively with flood risk management. In particular, attention is paid to floods caused by failure of hydraulic structures such as dams and dikes. The methods considered here are applied for simulating dam and dike failure events,
Advanced LOCA code uncertainty assessment
International Nuclear Information System (INIS)
Wickett, A.J.; Neill, A.P.
1990-11-01
This report describes a pilot study that identified, quantified and combined uncertainties for the LOBI BL-02 3% small break test. A 'dials' version of TRAC-PF1/MOD1, called TRAC-F, was used. (author)
Meteorological Uncertainty of atmospheric Dispersion model results (MUD)
DEFF Research Database (Denmark)
Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik
The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario.
Raleigh, M. S.; Smyth, E.; Small, E. E.
2017-12-01
The spatial distribution of snow water equivalent (SWE) is not sufficiently monitored with either remotely sensed or ground-based observations for water resources management. Recent applications of airborne Lidar have yielded basin-wide mapping of SWE when combined with a snow density model. However, in the absence of snow density observations, the uncertainty in these SWE maps is dominated by uncertainty in modeled snow density rather than in Lidar measurement of snow depth. Available observations tend to have a bias in physiographic regime (e.g., flat open areas) and are often insufficient in number to support testing of models across a range of conditions. Thus, there is a need for targeted sampling strategies and controlled model experiments to understand where and why different snow density models diverge. This will enable identification of robust model structures that represent dominant processes controlling snow densification, in support of basin-scale estimation of SWE with remotely-sensed snow depth datasets. The NASA SnowEx mission is a unique opportunity to evaluate sampling strategies of snow density and to quantify and reduce uncertainty in modeled snow density. In this presentation, we present initial field data analyses and modeling results over the Colorado SnowEx domain in the 2016-2017 winter campaign. We detail a framework for spatially mapping the uncertainty in snowpack density, as represented across multiple models. Leveraging the modular SUMMA model, we construct a series of physically-based models to assess systematically the importance of specific process representations to snow density estimates. We will show how models and snow pit observations characterize snow density variations with forest cover in the SnowEx domains. Finally, we will use the spatial maps of density uncertainty to evaluate the selected locations of snow pits, thereby assessing the adequacy of the sampling strategy for targeting uncertainty in modeled snow density.
Directory of Open Access Journals (Sweden)
L. Altarejos-García
2012-07-01
This paper addresses the use of reliability techniques such as Rosenblueth's Point-Estimate Method (PEM) as a practical alternative to more precise Monte Carlo approaches to get estimates of the mean and variance of the uncertain flood parameters water depth and velocity. These parameters define the flood severity, which is a concept used for decision-making in the context of flood risk assessment. The method proposed is particularly useful when the degree of complexity of the hydraulic models makes Monte Carlo inapplicable in terms of computing time, but when a measure of the variability of these parameters is still needed. The capacity of PEM, which is a special case of numerical quadrature based on orthogonal polynomials, to evaluate the first two moments of performance functions such as the water depth and velocity is demonstrated in the case of a single river reach using a 1-D HEC-RAS model. It is shown that in some cases, using a simple variable transformation, statistical distributions of both water depth and velocity approximate the lognormal. As this distribution is fully defined by its mean and variance, PEM can be used to define the full probability distribution function of these flood parameters, thus allowing for probability estimations of flood severity. Then, an application of the method to the same river reach using a 2-D Shallow Water Equations (SWE) model is performed. Flood maps of mean and standard deviation of water depth and velocity are obtained, and uncertainty in the extension of flooded areas with different severity levels is assessed. It is recognized, though, that whenever application of the Monte Carlo method is practically feasible, it is a preferred approach.
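The two-point estimate scheme described in this abstract is simple enough to sketch. Below is a minimal illustration of Rosenblueth's PEM for uncorrelated, symmetrically distributed inputs, evaluating the performance function at the 2^n corner points mu_i ± sigma_i; the hypothetical performance function (a Manning-type depth formula) and all parameter values are invented for illustration and are not from the paper:

```python
import itertools
import math

def rosenblueth_pem(g, means, stds):
    """Two-point estimate of the mean and variance of g(x) for
    uncorrelated, symmetrically distributed inputs (Rosenblueth's PEM).
    Evaluates g at the 2**n corner points mu_i +/- sigma_i, each with
    equal weight 1/2**n."""
    n = len(means)
    values = []
    for signs in itertools.product((-1.0, 1.0), repeat=n):
        x = [m + s * sd for m, s, sd in zip(means, signs, stds)]
        values.append(g(*x))
    w = 1.0 / len(values)
    mean = sum(values) * w
    var = sum((v - mean) ** 2 for v in values) * w
    return mean, var

# Hypothetical performance function: normal water depth for a wide
# rectangular channel from Manning's equation, h = (n*q / sqrt(S))**(3/5).
def depth(n_manning, q, slope):
    return (n_manning * q / math.sqrt(slope)) ** 0.6

# Illustrative means and standard deviations for roughness, unit
# discharge and bed slope (placeholders, not the paper's data):
m, v = rosenblueth_pem(depth, means=[0.035, 12.0, 0.001], stds=[0.005, 2.0, 0.0002])
```

With 3 uncertain inputs this costs 8 model runs instead of the thousands a Monte Carlo estimate would need, which is the computational argument made in the abstract.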
Uncertainty modeling and decision support
International Nuclear Information System (INIS)
Yager, Ronald R.
2004-01-01
We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function.
Uncertainty in hydrological change modelling
DEFF Research Database (Denmark)
Seaby, Lauren Paige
Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta… applied at the grid scale. Flux and state hydrological outputs, which integrate responses over time and space, showed more sensitivity to mean spatial biases in precipitation and less so to extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current…
Avoiding climate change uncertainties in Strategic Environmental Assessment
DEFF Research Database (Denmark)
Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick Arthur
2013-01-01
This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies...
Plósz, Benedek Gy; De Clercq, Jeriffa; Nopens, Ingmar; Benedetti, Lorenzo; Vanrolleghem, Peter A
2011-01-01
In WWTP models, the accurate assessment of solids inventory in bioreactors equipped with solid-liquid separators, mostly described using one-dimensional (1-D) secondary settling tank (SST) models, is the most fundamental requirement of any calibration procedure. Scientific knowledge on characterising particulate organics in wastewater and on bacteria growth is well-established, whereas 1-D SST models and their impact on biomass concentration predictions are still poorly understood. A rigorous assessment of two 1-D SST models is thus presented: one based on hyperbolic (the widely used Takács-model) and one based on parabolic (the more recently presented Plósz-model) partial differential equations. The former model, using numerical approximation to yield realistic behaviour, is currently the most widely used by wastewater treatment process modellers. The latter is a convection-dispersion model that is solved in a numerically sound way. First, the explicit dispersion in the convection-dispersion model and the numerical dispersion for both SST models are calculated. Second, simulation results of effluent suspended solids concentration (XTSS,Eff), sludge recirculation stream (XTSS,RAS) and sludge blanket height (SBH) are used to demonstrate the distinct behaviour of the models. A thorough scenario analysis is carried out using SST feed flow rate, solids concentration, and overflow rate as degrees of freedom, spanning a broad loading spectrum. A comparison between the measurements and the simulation results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer
Uncertainty quantification for environmental models
Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming
2012-01-01
Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
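Several of the screening methods listed in this abstract can be captured in a few lines. The sketch below implements a simplified one-at-a-time elementary-effects screening in the spirit of the method of Morris; the trajectory design is reduced to independent random base points, and the toy model and parameter bounds are invented for illustration:

```python
import random

def oat_screening(f, bounds, r=20, delta=0.25, rng=None):
    """Simplified one-at-a-time (Morris-style) screening sketch.
    For r random base points in the unit hypercube, each parameter is
    perturbed in turn by delta and the absolute elementary effects
    |f(x + delta*e_i) - f(x)| / delta are averaged (a mu*-like statistic)."""
    rng = rng or random.Random(0)
    k = len(bounds)
    mu_star = [0.0] * k
    for _ in range(r):
        # Random base point in unit-scaled space, leaving room for +delta.
        u = [rng.uniform(0.0, 1.0 - delta) for _ in range(k)]
        x = [lo + ui * (hi - lo) for ui, (lo, hi) in zip(u, bounds)]
        y0 = f(x)
        for i in range(k):
            xp = list(x)
            xp[i] = bounds[i][0] + (u[i] + delta) * (bounds[i][1] - bounds[i][0])
            mu_star[i] += abs((f(xp) - y0) / delta)
    return [m / r for m in mu_star]

# Toy model whose output is dominated by the first parameter:
effects = oat_screening(lambda x: 10 * x[0] + x[1] + 0.1 * x[2],
                        bounds=[(0, 1), (0, 1), (0, 1)], r=50)
```

For this linear toy model the averaged effects recover the coefficients (10, 1, 0.1), illustrating how screening ranks parameters by influence before spending model runs on a full uncertainty analysis.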
Bond, Alan; Morrison-Saunders, Angus; Gunn, Jill A E; Pope, Jenny; Retief, Francois
2015-03-15
In the context of continuing uncertainty, ambiguity and ignorance in impact assessment (IA) prediction, the case is made that existing IA processes are based on false 'normal' assumptions that science can solve problems and transfer knowledge into policy. Instead, a 'post-normal science' approach is needed that acknowledges the limits of current levels of scientific understanding. We argue that this can be achieved through embedding evolutionary resilience into IA; using participatory workshops; and emphasising adaptive management. The goal is an IA process capable of informing policy choices in the face of uncertain influences acting on socio-ecological systems. We propose a specific set of process steps to operationalise this post-normal science approach which draws on work undertaken by the Resilience Alliance. This process differs significantly from current models of IA, as it has a far greater focus on avoidance of, or adaptation to (through incorporating adaptive management subsequent to decisions), unwanted future scenarios rather than a focus on the identification of the implications of a single preferred vision. Implementing such a process would represent a culture change in IA practice as a lack of knowledge is assumed and explicit, and forms the basis of future planning activity, rather than being ignored. Copyright © 2014 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Mélanie Trudel
2017-03-01
Low-flow is the flow of water in a river during prolonged dry weather. This paper investigated the uncertainty originating from hydrological model calibration and structure in low-flow simulations under climate change conditions. Two hydrological models of contrasting complexity, GR4J and SWAT, were applied to four sub-watersheds of the Yamaska River, Canada. The two models were calibrated using seven different objective functions including the Nash-Sutcliffe coefficient (NSEQ) and six other objective functions more related to low flows. The uncertainty in the model parameters was evaluated using a PARAmeter SOLutions procedure (PARASOL). Twelve climate projections from different combinations of General Circulation Models (GCMs) and Regional Circulation Models (RCMs) were used to simulate low-flow indices in a reference (1970–2000) and future (2040–2070) horizon. Results indicate that the NSEQ objective function does not properly represent low-flow indices for either model. The NSE objective function applied to the log of the flows shows the lowest total variance for all sub-watersheds. In addition, these hydrological models should be used with care for low-flow studies, since they both show some inconsistent results. The uncertainty is higher for SWAT than for GR4J. With GR4J, the uncertainties in the simulations for the 7Q2 index (the 7-day low-flow value with a 2-year return period) are lower for the future period than for the reference period. This can be explained by the analysis of hydrological processes. In the future horizon, a significant worsening of low-flow conditions was projected.
Chemical model reduction under uncertainty
Malpica Galassi, Riccardo
2017-03-06
A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
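The uncertainty treatment described here, random Arrhenius rate parameters with prescribed uncertainty factors, can be illustrated with a toy sampler. Perturbing the pre-exponential factor log-uniformly within its uncertainty factor keeps every sampled rate within [k0/UF, k0*UF]; the reaction parameters and uncertainty factor below are placeholders, not values from the n-butane mechanism:

```python
import math
import random

R = 8.314  # gas constant, J/(mol K)

def arrhenius(A, b, Ea, T):
    """Modified Arrhenius rate: k = A * T**b * exp(-Ea / (R*T))."""
    return A * T ** b * math.exp(-Ea / (R * T))

def sample_rate(A0, b, Ea, T, uf, rng):
    """Sample k with the pre-exponential factor perturbed log-uniformly
    within its uncertainty factor uf, i.e. A in [A0/uf, A0*uf]."""
    A = A0 * uf ** rng.uniform(-1.0, 1.0)
    return arrhenius(A, b, Ea, T)

rng = random.Random(42)
# Illustrative single reaction at 1000 K with uncertainty factor 3
# (parameters are invented, not from the 176-species mechanism):
k_nominal = arrhenius(1.0e13, 0.0, 1.5e5, 1000.0)
samples = [sample_rate(1.0e13, 0.0, 1.5e5, 1000.0, 3.0, rng)
           for _ in range(2000)]
```

Propagating such samples through an ignition simulation and recording how often each reaction survives the reduction step yields the inclusion probabilities the abstract describes.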
Uncertainty analysis in the applications of nuclear probabilistic risk assessment
International Nuclear Information System (INIS)
Le Duy, T.D.
2011-01-01
The aim of this thesis is to propose an approach to model parameter and model uncertainties affecting the results of risk indicators used in the applications of nuclear Probabilistic Risk Assessment (PRA). After studying the limitations of the traditional probabilistic approach to represent uncertainty in PRA models, a new approach based on the Dempster-Shafer theory has been proposed. The uncertainty analysis process of the proposed approach consists of five main steps. The first step aims to model input parameter uncertainties by belief and plausibility functions according to the data of the PRA model. The second step involves the propagation of parameter uncertainties through the risk model to lay out the uncertainties associated with output risk indicators. The model uncertainty is then taken into account in the third step by considering possible alternative risk models. The fourth step is intended firstly to provide decision makers with information needed for decision making under uncertainty (parametric and model) and secondly to identify the input parameters that have significant uncertainty contributions on the result. The final step allows the process to be continued in a loop by studying the updating of belief functions given new data. The proposed methodology was implemented on a real but simplified application of a PRA model. (author)
Methodology for characterizing modeling and discretization uncertainties in computational simulation
Energy Technology Data Exchange (ETDEWEB)
Alvin, Kenneth F.; Oberkampf, William L.; Rutherford, Brian M.; Diegert, Kathleen V.
2000-03-01
This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.
Puncher, M; Zhang, W; Harrison, J D; Wakeford, R
2017-06-26
Assessments of risk to a specific population group resulting from internal exposure to a particular radionuclide can be used to assess the reliability of the appropriate International Commission on Radiological Protection (ICRP) dose coefficients used as a radiation protection device for the specified exposure pathway. An estimate of the uncertainty on the associated risk is important for informing judgments on reliability; a derived uncertainty factor, UF, is an estimate of the 95% probable geometric difference between the best risk estimate and the nominal risk and is a useful tool for making this assessment. This paper describes the application of parameter uncertainty analysis to quantify uncertainties resulting from internal exposures to radioiodine by members of the public, specifically 1, 10 and 20-year old females from the population of England and Wales. Best estimates of thyroid cancer incidence risk (lifetime attributable risk) are calculated for ingestion or inhalation of 129I and 131I, accounting for uncertainties in biokinetic model and cancer risk model parameter values. These estimates are compared with the equivalent ICRP derived nominal age-, sex- and population-averaged estimates of excess thyroid cancer incidence to obtain UFs. Derived UF values for ingestion or inhalation of 131I for 1-year, 10-year and 20-year olds are around 28, 12 and 6, respectively, when compared with ICRP Publication 103 nominal values, and 9, 7 and 14, respectively, when compared with ICRP Publication 60 values. Broadly similar results were obtained for 129I. The uncertainties on risk estimates are largely determined by uncertainties on risk model parameters rather than uncertainties on biokinetic model parameters. An examination of the sensitivity of the results to the risk models and populations used in the calculations shows variations in the central estimates of risk of a factor of around 2-3. It is assumed that the direct proportionality of excess thyroid cancer
An evaluation of uncertainties in radioecological models
International Nuclear Information System (INIS)
Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III
1978-01-01
The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for fresh water fish. Approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of 131I2. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP)
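A multiplicative chain of lognormal factors, like the air-pasture-cow-milk-thyroid pathway above, has a convenient closed form: the product is again lognormal and the log-variances of the factors add, so percentile-to-median ratios and variance contributions follow directly. The sketch below uses illustrative geometric standard deviations, not the report's fitted values:

```python
import math

# Dose = product of chain factors (air concentration, pasture transfer,
# milk transfer, dose conversion factor), each lognormal. Then
# ln(dose) is normal with sigma^2 = sum of the factors' log-variances.
z99 = 2.326  # 99th percentile of the standard normal distribution

gsds = [1.5, 2.0, 1.8, 1.6]            # illustrative geometric std. devs.
sigmas = [math.log(g) for g in gsds]   # log-space standard deviations
sigma_tot = math.sqrt(sum(s * s for s in sigmas))

# Ratio of the 99th percentile to the median of the predicted dose:
ratio_99_to_median = math.exp(z99 * sigma_tot)

# Fraction of the total log-variance contributed by each chain factor
# (the kind of attribution behind the "72%" statement in the abstract):
shares = [s * s / sigma_tot ** 2 for s in sigmas]
```

With these placeholder spreads the 99th-percentile-to-median ratio comes out around an order of magnitude, the same qualitative finding the abstract reports for the iodine pathway.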
Verburg, P.H.; Tabeau, A.A.; Hatna, E.
2013-01-01
Land change model outcomes are vulnerable to multiple types of uncertainty, including uncertainty in input data, structural uncertainties in the model and uncertainties in model parameters. In coupled model systems the uncertainties propagate between the models. This paper assesses uncertainty of
Climate change decision-making: Model & parameter uncertainties explored
Energy Technology Data Exchange (ETDEWEB)
Dowlatabadi, H.; Kandlikar, M.; Linville, C.
1995-12-31
A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analyses of these uncertainties serve to inform decision makers about the likely outcome of policy initiatives, and help set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change, impacts from these changes, and policies for emissions mitigation and adaptation to change. The model has over 800 objects, of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, the relative contribution of climatic uncertainties is most important, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties we find that the choice of policy is often dominated by model structure choice, rather than parameter uncertainties.
Applied research in uncertainty modeling and analysis
Ayyub, Bilal
2005-01-01
Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...
Estimating uncertainty of data limited stock assessments
DEFF Research Database (Denmark)
Kokkalis, Alexandros; Eikeset, Anne Maria; Thygesen, Uffe Høgsbro
2017-01-01
-limited. Particular emphasis is put on providing uncertainty estimates of the data-limited assessment. We assess four cod stocks in the North-East Atlantic and compare our estimates of stock status (F/Fmsy) with the official assessments. The estimated stock status of all four cod stocks followed the established stock...
Some illustrative examples of model uncertainty
International Nuclear Information System (INIS)
Bier, V.M.
1994-01-01
In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2009-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests to combine the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating… to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario-based graphs which function as risk-related decision support for the appraised transport infrastructure project.
Directory of Open Access Journals (Sweden)
Richard M. Palin
2016-07-01
Pseudosection modelling is rapidly becoming an essential part of a petrologist's toolkit and often forms the basis of interpreting the tectonothermal evolution of a rock sample, outcrop, or geological region. Of the several factors that can affect the accuracy and precision of such calculated phase diagrams, "geological" uncertainty related to natural petrographic variation at the hand sample- and/or thin section-scale is rarely considered. Such uncertainty influences the sample's bulk composition, which is the primary control on its equilibrium phase relationships and thus the interpreted pressure–temperature (P–T) conditions of formation. Two case study examples, a garnet–cordierite granofels and a garnet–staurolite–kyanite schist, are used to compare the relative importance that geological uncertainty has on bulk compositions determined via (1) X-ray fluorescence (XRF) or (2) point counting techniques. We show that only minor mineralogical variation at the thin-section scale propagates through the phase equilibria modelling procedure and affects the absolute P–T conditions at which key assemblages are stable. Absolute displacements of equilibria can approach ±1 kbar for only a moderate degree of modal proportion uncertainty, thus being essentially similar to the magnitudes reported for analytical uncertainties in conventional thermobarometry. Bulk compositions determined from multiple thin sections of a heterogeneous garnet–staurolite–kyanite schist show a wide range in major-element oxides, owing to notable variation in mineral proportions. Pseudosections constructed for individual point count-derived bulks accurately reproduce this variability on a case-by-case basis, though averaged proportions do not correlate with those calculated at equivalent peak P–T conditions for a whole-rock XRF-derived bulk composition. The main discrepancies relate to varying proportions of matrix phases (primarily mica relative to
On the relationship between aerosol model uncertainty and radiative forcing uncertainty.
Lee, Lindsay A; Reddington, Carly L; Carslaw, Kenneth S
2016-05-24
The largest uncertainty in the historical radiative forcing of climate is caused by the interaction of aerosols with clouds. Historical forcing is not a directly measurable quantity, so reliable assessments depend on the development of global models of aerosols and clouds that are well constrained by observations. However, there has been no systematic assessment of how reduction in the uncertainty of global aerosol models will feed through to the uncertainty in the predicted forcing. We use a global model perturbed parameter ensemble to show that tight observational constraint of aerosol concentrations in the model has a relatively small effect on the aerosol-related uncertainty in the calculated forcing between preindustrial and present-day periods. One factor is the low sensitivity of present-day aerosol to natural emissions that determine the preindustrial aerosol state. However, the major cause of the weak constraint is that the full uncertainty space of the model generates a large number of model variants that are equally acceptable compared to present-day aerosol observations. The narrow range of aerosol concentrations in the observationally constrained model gives the impression of low aerosol model uncertainty. However, these multiple "equifinal" models predict a wide range of forcings. To make progress, we need to develop a much deeper understanding of model uncertainty and ways to use observations to constrain it. Equifinality in the aerosol model means that tuning of a small number of model processes to achieve model-observation agreement could give a misleading impression of model robustness.
A review of uncertainty research in impact assessment
International Nuclear Information System (INIS)
Leung, Wanda; Noble, Bram; Gunn, Jill; Jaeger, Jochen A.G.
2015-01-01
This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We
A review of uncertainty research in impact assessment
Energy Technology Data Exchange (ETDEWEB)
Leung, Wanda, E-mail: wanda.leung@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Noble, Bram, E-mail: b.noble@usask.ca [Department of Geography and Planning, School of Environment and Sustainability, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Gunn, Jill, E-mail: jill.gunn@usask.ca [Department of Geography and Planning, University of Saskatchewan, 117 Science Place, Saskatoon, Saskatchewan S7N 5A5 (Canada); Jaeger, Jochen A.G., E-mail: jochen.jaeger@concordia.ca [Department of Geography, Planning and Environment, Concordia University, 1455 de Maisonneuve W., Suite 1255, Montreal, Quebec H3G 1M8 (Canada); Loyola Sustainability Research Centre, Concordia University, 7141 Sherbrooke W., AD-502, Montreal, Quebec H4B 1R6 (Canada)
2015-01-15
This paper examines uncertainty research in Impact Assessment (IA) and the focus of attention of the IA scholarly literature. We do so by first exploring ‘outside’ the IA literature, identifying three main themes of uncertainty research, and then apply these themes to examine the focus of scholarly research on uncertainty ‘inside’ IA. Based on a search of the database Scopus, we identified 134 journal papers published between 1970 and 2013 that address uncertainty in IA, 75% of which were published since 2005. We found that 90% of IA research addressing uncertainty focused on uncertainty in the practice of IA, including uncertainty in impact predictions, models and managing environmental impacts. Notwithstanding early guidance on uncertainty treatment in IA from the 1980s, we found no common, underlying conceptual framework that was guiding research on uncertainty in IA practice. Considerably less attention, only 9% of papers, focused on uncertainty communication, disclosure and decision-making under uncertain conditions, the majority of which focused on the need to disclose uncertainties as opposed to providing guidance on how to do so and effectively use that information to inform decisions. Finally, research focused on theory building for explaining human behavior with respect to uncertainty avoidance constituted only 1% of the IA published literature. We suggest the need for further conceptual framework development for researchers focused on identifying and addressing uncertainty in IA practice; the need for guidance on how best to communicate uncertainties in practice, versus criticizing practitioners for not doing so; research that explores how best to interpret and use disclosures about uncertainty when making decisions about project approvals, and the implications of doing so; and academic theory building and exploring the utility of existing theories to better understand and explain uncertainty avoidance behavior in IA. - Highlights: • We
Assessment of uncertainties in Neutron Multiplicity Counting
International Nuclear Information System (INIS)
Peerani, P.; Marin Ferrer, M.
2008-01-01
This paper describes a methodology for a complete and correct assessment of how the uncertainty of each individual component contributes to the error in the final result. A general methodology accounting for all the main sources of error (both type-A and type-B) is outlined. To better illustrate the method, a practical example is given, applying it to the uncertainty estimation for a special case of multiplicity counter, the SNMC developed at the JRC.
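The type-A/type-B combination described above follows the standard GUM prescription of adding independent standard uncertainties in quadrature. A minimal sketch of that combination (the count rates and type-B terms below are hypothetical, not taken from the paper):

```python
import math

def type_a_uncertainty(readings):
    """Type-A: standard uncertainty of the mean from repeated readings."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((x - mean) ** 2 for x in readings) / (n - 1)
    return mean, math.sqrt(var / n)

def combined_uncertainty(u_type_a, u_type_b_list):
    """Combine type-A and type-B standard uncertainties in quadrature."""
    return math.sqrt(u_type_a ** 2 + sum(u ** 2 for u in u_type_b_list))

# Hypothetical repeated count rates (s^-1) from a multiplicity counter
mean, u_a = type_a_uncertainty([120.1, 119.8, 120.4, 120.0, 119.9])
# Hypothetical type-B terms, e.g. dead-time and detection-efficiency components
u_c = combined_uncertainty(u_a, [0.3, 0.15])
```

The combined value `u_c` would then be multiplied by a coverage factor to obtain an expanded uncertainty.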
Honti, Mark; Reichert, Peter; Scheidegger, Andreas; Stamm, Christian
2013-04-01
Climate change impact assessments have become more and more popular in hydrology since the mid-1980s, with another boost after the publication of the IPCC AR4 report. Over hundreds of impact studies a quasi-standard methodology emerged, which is mainly shaped by the growing public demand for predicting how water resources management or flood protection should change in the near future. The "standard" workflow considers future climate under a specific IPCC emission scenario simulated by global circulation models (GCMs), possibly downscaled by a regional climate model (RCM) and/or a stochastic weather generator. The output from the climate models is typically corrected for bias before feeding it into a calibrated hydrological model, which is run on the past and future meteorological data to analyse the impacts of climate change on the hydrological indicators of interest. The impact predictions are as uncertain as any forecast that tries to describe the behaviour of an extremely complex system decades into the future. Future climate predictions are uncertain due to scenario uncertainty and to GCM model uncertainty, which is obvious at resolutions finer than the continental scale. As in any hierarchical model system, uncertainty propagates through the descendant components. Downscaling adds uncertainty through the deficiencies of RCMs and/or weather generators. Bias correction adds a strong deterministic shift to the input data. Finally, the predictive uncertainty of the hydrological model ends the cascade that leads to the total uncertainty of the hydrological impact assessment. There is an emerging consensus among many studies on the relative importance of the different uncertainty sources. The prevailing perception is that GCM uncertainty dominates hydrological impact studies. Only a few studies have found that the predictive uncertainty of hydrological models can be in the same range as, or even larger than, the climatic uncertainty. We carried out a
Uncertainty quantification in flood risk assessment
Blöschl, Günter; Hall, Julia; Kiss, Andrea; Parajka, Juraj; Perdigão, Rui A. P.; Rogger, Magdalena; Salinas, José Luis; Viglione, Alberto
2017-04-01
Uncertainty is inherent to flood risk assessments because of the complexity of the human-water system, which is characterised by nonlinearities and interdependencies, because of limited knowledge about system properties and because of cognitive biases in human perception and decision-making. On top of the uncertainty associated with the assessment of the existing risk to extreme events, additional uncertainty arises because of temporal changes in the system due to climate change, modifications of the environment, population growth and the associated increase in assets. Novel risk assessment concepts are needed that take into account all these sources of uncertainty. They should be based on the understanding of how flood extremes are generated and how they change over time. They should also account for the dynamics of risk perception of decision makers and population in the floodplains. In this talk we discuss these novel risk assessment concepts through examples from Flood Frequency Hydrology, Socio-Hydrology and Predictions Under Change. We believe that uncertainty quantification in flood risk assessment should lead to a robust approach of integrated flood risk management aiming at enhancing resilience rather than searching for optimal defense strategies.
Avoiding climate change uncertainties in Strategic Environmental Assessment
Energy Technology Data Exchange (ETDEWEB)
Larsen, Sanne Vammen, E-mail: sannevl@plan.aau.dk [The Danish Centre for Environmental Assessment, Aalborg University-Copenhagen, A.C. Meyers Vænge 15, 2450 København SV (Denmark); Kørnøv, Lone, E-mail: lonek@plan.aau.dk [The Danish Centre for Environmental Assessment, Aalborg University, Skibbrogade 5, 1. Sal, 9000 Aalborg (Denmark); Driscoll, Patrick, E-mail: patrick@plan.aau.dk [The Danish Centre for Environmental Assessment, Aalborg University-Copenhagen, A.C. Meyers Vænge 15, 2450 København SV (Denmark)
2013-11-15
This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty.
Avoiding climate change uncertainties in Strategic Environmental Assessment
International Nuclear Information System (INIS)
Larsen, Sanne Vammen; Kørnøv, Lone; Driscoll, Patrick
2013-01-01
This article is concerned with how Strategic Environmental Assessment (SEA) practice handles climate change uncertainties within the Danish planning system. First, a hypothetical model is set up for how uncertainty is handled and not handled in decision-making. The model incorporates the strategies ‘reduction’ and ‘resilience’, ‘denying’, ‘ignoring’ and ‘postponing’. Second, 151 Danish SEAs are analysed with a focus on the extent to which climate change uncertainties are acknowledged and presented, and the empirical findings are discussed in relation to the model. The findings indicate that despite incentives to do so, climate change uncertainties were systematically avoided or downplayed in all but 5 of the 151 SEAs that were reviewed. Finally, two possible explanatory mechanisms are proposed to explain this: conflict avoidance and a need to quantify uncertainty
Uncertainty and its propagation in dynamics models
International Nuclear Information System (INIS)
Devooght, J.
1994-01-01
The purpose of this paper is to bring together some characteristics of uncertainty in dynamic models and, therefore, of the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows us to define a "subdynamics" in which the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision.
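For reference, the Chapman-Kolmogorov identity invoked above takes, for a continuous-state Markov process, the standard form (the notation here is generic and not necessarily Devooght's):

```latex
p(x_3, t_3 \mid x_1, t_1) \;=\; \int p(x_3, t_3 \mid x_2, t_2)\, p(x_2, t_2 \mid x_1, t_1)\, \mathrm{d}x_2 ,
\qquad t_1 < t_2 < t_3 .
```

Marginalising over the intermediate state in this way is what permits an evolution equation for a reduced ("sub") dynamics that carries the uncertainty along.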
Uncertainties in risk assessment at USDOE facilities
Energy Technology Data Exchange (ETDEWEB)
Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.
1994-01-01
The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.
Uncertainties in risk assessment at USDOE facilities
International Nuclear Information System (INIS)
Hamilton, L.D.; Holtzman, S.; Meinhold, A.F.; Morris, S.C.; Rowe, M.D.
1994-01-01
The United States Department of Energy (USDOE) has embarked on an ambitious program to remediate environmental contamination at its facilities. Decisions concerning cleanup goals, choices among cleanup technologies, and funding prioritization should be largely risk-based. Risk assessments will be used more extensively by the USDOE in the future. USDOE needs to develop and refine risk assessment methods and fund research to reduce major sources of uncertainty in risk assessments at USDOE facilities. The terms "risk assessment" and "risk management" are frequently confused. The National Research Council (1983) and the United States Environmental Protection Agency (USEPA, 1991a) described risk assessment as a scientific process that contributes to risk management. Risk assessment is the process of collecting, analyzing and integrating data and information to identify hazards, assess exposures and dose responses, and characterize risks. Risk characterization must include a clear presentation of "... the most significant data and uncertainties ..." in an assessment. Significant data and uncertainties are "... those that define and explain the main risk conclusions". Risk management integrates risk assessment information with other considerations, such as risk perceptions, socioeconomic and political factors, and statutes, to make and justify decisions. Risk assessments, as scientific processes, should be made independently of the other aspects of risk management (USEPA, 1991a), but current methods for assessing health risks are based on conservative regulatory principles, causing unnecessary public concern and misallocation of funds for remediation.
Modelling of Transport Projects Uncertainties
DEFF Research Database (Denmark)
Salling, Kim Bang; Leleur, Steen
2012-01-01
This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario......-based graphs which functions as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....
Uncertainty Quantification in Geomagnetic Field Modeling
Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.
2017-12-01
Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high-quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, as in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.
A Framework for Understanding Uncertainty in Seismic Risk Assessment.
Foulser-Piggott, Roxane; Bowman, Gary; Hughes, Martin
2017-10-11
A better understanding of the uncertainty that exists in models used for seismic risk assessment is critical to improving risk-based decisions pertaining to earthquake safety. Current models estimating the probability of collapse of a building do not consider comprehensively the nature and impact of uncertainty. This article presents a model framework to enhance seismic risk assessment and thus gives decision-makers a fuller understanding of the nature and limitations of the estimates. This can help ensure that risks are not over- or underestimated and the value of acquiring accurate data is appreciated fully. The methodology presented provides a novel treatment of uncertainties in input variables, their propagation through the model, and their effect on the results. The study presents ranges of possible annual collapse probabilities for different case studies on buildings in different parts of the world, exposed to different levels of seismicity, and with different vulnerabilities. A global sensitivity analysis was conducted to determine the significance of uncertain variables. Two key outcomes are (1) that the uncertainty in ground-motion conversion equations has the largest effect on the uncertainty in the calculation of annual collapse probability; and (2) the vulnerability of a building appears to have an effect on the range of annual collapse probabilities produced, i.e., the level of uncertainty in the estimate of annual collapse probability, with less vulnerable buildings having a smaller uncertainty. © 2017 Society for Risk Analysis.
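The propagation of uncertain inputs to a range of collapse probabilities can be illustrated with a simple Monte Carlo sketch; the lognormal fragility form and every number below are generic assumptions for illustration, not the article's actual model:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_collapse(im, median, beta):
    """Lognormal fragility curve: P(collapse | intensity measure im)."""
    return norm_cdf(math.log(im / median) / beta)

random.seed(0)
# Hypothetical inputs: fixed hazard level im = 0.4 g; uncertain fragility
# median (mean 0.9 g, sd 0.15 g, floored to stay positive); dispersion 0.5.
samples = [p_collapse(0.4, max(0.05, random.gauss(0.9, 0.15)), 0.5)
           for _ in range(10_000)]
low, high = min(samples), max(samples)  # spread of collapse probabilities
```

The width of `[low, high]` is the kind of range the article reports per case study; a variance-based sensitivity analysis would then attribute that spread to the individual uncertain inputs.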
Reyes, J. J.; Adam, J. C.; Tague, C.
2016-12-01
Grasslands play an important role in agricultural production as forage for livestock; they also provide a diverse set of ecosystem services including soil carbon (C) storage. The partitioning of C between above and belowground plant compartments (i.e. allocation) is influenced by both plant characteristics and environmental conditions. The objectives of this study are to 1) develop and evaluate a hybrid C allocation strategy suitable for grasslands, and 2) apply this strategy to examine the importance of various parameters related to biogeochemical cycling, photosynthesis, allocation, and soil water drainage on above and belowground biomass. We include allocation as an important process in quantifying the model parameter uncertainty, which identifies the most influential parameters and what processes may require further refinement. For this, we use the Regional Hydro-ecologic Simulation System, a mechanistic model that simulates coupled water and biogeochemical processes. A Latin hypercube sampling scheme was used to develop parameter sets for calibration and evaluation of allocation strategies, as well as parameter uncertainty analysis. We developed the hybrid allocation strategy to integrate both growth-based and resource-limited allocation mechanisms. When evaluating the new strategy simultaneously for above and belowground biomass, it produced a larger number of less biased parameter sets: 16% more compared to resource-limited and 9% more compared to growth-based. This also demonstrates its flexible application across diverse plant types and environmental conditions. We found that higher parameter importance corresponded to sub- or supra-optimal resource availability (i.e. water, nutrients) and temperature ranges (i.e. too hot or cold). For example, photosynthesis-related parameters were more important at sites warmer than the theoretical optimal growth temperature. Therefore, larger values of parameter importance indicate greater relative sensitivity in
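A Latin hypercube scheme like the one used above to generate parameter sets can be sketched with the standard library alone; the two parameters and their bounds below are hypothetical:

```python
import random

def latin_hypercube(n_samples, bounds, seed=42):
    """One stratified draw per interval for each parameter, shuffled independently."""
    rng = random.Random(seed)
    columns = []
    for lo, hi in bounds:
        # stratify [0, 1) into n_samples equal intervals, one draw per interval
        pts = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(pts)  # decorrelate strata across parameters
        columns.append([lo + p * (hi - lo) for p in pts])
    return list(zip(*columns))  # n_samples parameter sets

# Hypothetical bounds for two parameters (e.g. a conductance and an allocation ratio)
sets = latin_hypercube(100, [(0.1, 10.0), (0.0, 1.0)])
```

Each parameter set would then drive one model run, and the spread of simulated biomass across runs yields the parameter-importance statistics.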
DEFF Research Database (Denmark)
Mäntyniemi, Samu; Uusitalo, Laura; Peltonen, Heikki
2013-01-01
We developed a generic, age-structured, state-space stock assessment model that can be used as a platform for including information elicited from stakeholders. The model tracks the mean size-at-age and then uses it to explain rates of natural and fishing mortality. The fishery selectivity is divided...... to two components, which makes it possible to model the active seeking of the fleet for certain sizes of fish, as well as the selectivity of the gear itself. The model can account for uncertainties that are not currently accounted for in state-of-the-art models for integrated assessments: (i) The form...... of the stock–recruitment function is considered uncertain and is accounted for by using Bayesian model averaging. (ii) In addition to recruitment variation, process variation in natural mortality, growth parameters, and fishing mortality can also be treated as uncertain parameters...
Monier, E.; Scott, J. R.; Sokolov, A. P.; Forest, C. E.; Schlosser, C. A.
2013-12-01
This paper describes a computationally efficient framework for uncertainty studies in global and regional climate change. In this framework, the Massachusetts Institute of Technology (MIT) Integrated Global System Model (IGSM), an integrated assessment model that couples an Earth system model of intermediate complexity to a human activity model, is linked to the National Center for Atmospheric Research (NCAR) Community Atmosphere Model (CAM). Since the MIT IGSM-CAM framework (version 1.0) incorporates a human activity model, it is possible to analyze uncertainties in emissions resulting from both uncertainties in the underlying socio-economic characteristics of the economic model and in the choice of climate-related policies. Another major feature is the flexibility to vary key climate parameters controlling the climate system response to changes in greenhouse gases and aerosols concentrations, e.g., climate sensitivity, ocean heat uptake rate, and strength of the aerosol forcing. The IGSM-CAM is not only able to realistically simulate the present-day mean climate and the observed trends at the global and continental scale, but it also simulates ENSO variability with realistic time scales, seasonality and patterns of SST anomalies, albeit with stronger magnitudes than observed. The IGSM-CAM shares the same general strengths and limitations as the Coupled Model Intercomparison Project Phase 3 (CMIP3) models in simulating present-day annual mean surface temperature and precipitation. Over land, the IGSM-CAM shows similar biases to the NCAR Community Climate System Model (CCSM) version 3, which shares the same atmospheric model. This study also presents 21st century simulations based on two emissions scenarios (unconstrained scenario and stabilization scenario at 660 ppm CO2-equivalent) similar to, respectively, the Representative Concentration Pathways RCP8.5 and RCP4.5 scenarios, and three sets of climate parameters. Results of the simulations with the chosen
Directory of Open Access Journals (Sweden)
Dianfa Wu
2018-05-01
Full Text Available The transformation of the power generation industry from coal-based to more sustainable energy sources is an irreversible trend. In China, the coal-fired power plant, as the main electric power supply facility at present, needs to know its own sustainability level to face future competition. A hybrid multi-criteria decision making (MCDM) model is proposed in this paper to assess the sustainability levels of the existing Chinese coal-fired power units. The areal grey relational analysis (AGRA) method is involved in the hybrid model, and a combined weighting method is used to determine the priorities of the criteria. The combined weight fuses the fuzzy rough set (FRS) and entropy objective weighting methods together with the analytic hierarchy process (AHP) subjective weighting method by game theory. Moreover, an AHP weighting uncertainty analysis using Monte Carlo (MC) simulation is introduced to measure the uncertainty of the results, and a 95 percent confidence interval (CI) is defined as the uncertainty measurement of the alternatives. A case study of eight coal-fired power units is carried out with a criteria system that contains five aspects from an operational perspective: flexibility, economic, environmental, reliability and technical criteria. The sustainability assessment is performed at the unit level, and the results give a priority rank of the eight alternatives; additionally, the uncertainty analysis supplies extra information from a statistical perspective. This work extends a novel hybrid MCDM method to the sustainability assessment of power generation systems, and it may benefit energy enterprises in assessing sustainability at the unit level and enhance their ability for future sustainable development.
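The entropy objective weighting step mentioned above is a standard MCDM technique: criteria whose values vary more across the alternatives carry more information and receive more weight. A minimal sketch, with a hypothetical decision matrix of benefit-type, positive-valued criteria (not the paper's actual data):

```python
import math

def entropy_weights(matrix):
    """Entropy weighting: more dispersion across alternatives -> more weight."""
    m, n = len(matrix), len(matrix[0])
    divergences = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        probs = [c / total for c in col]
        # normalized Shannon entropy of the column, in [0, 1]
        e = -sum(p * math.log(p) for p in probs if p > 0) / math.log(m)
        divergences.append(max(0.0, 1.0 - e))  # degree of divergence
    s = sum(divergences)
    return [d / s for d in divergences]

# Hypothetical matrix: 4 power units x 3 criteria
w = entropy_weights([[0.9, 120, 3.1],
                     [0.7, 150, 2.8],
                     [0.8, 110, 3.0],
                     [0.6, 160, 2.9]])
```

In the paper's hybrid scheme these objective weights would be fused with FRS and AHP weights via game theory before feeding the AGRA ranking.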
Flood modelling : Parameterisation and inflow uncertainty
Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.
2014-01-01
This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve
International Nuclear Information System (INIS)
Dumas, P.
2006-01-01
The aim of this research is to introduce new elements for the assessment of damages due to climate change within the framework of compact decision-aiding models. Two types of methodologies are used: stochastic sequential optimisation models and stochastic simulation models using optimal assessment methods. The author first defines the damages, characterizes their different categories, and reviews the existing assessments. Notably, he makes the distinction between damages due to climate change and damages due to its rate. He then presents the different models used in this study and the numerical solutions, and gives a rough estimate of the importance of the considered phenomena. By introducing a new category of capital in an optimal growth model, he tries to establish a framework allowing the representation of adaptation and of its costs. He introduces inertia in macroeconomic evolutions, climatic variability, detection of climate change, and damages due to climate hazards
Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.
2012-04-01
Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement of forecasting skills depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture content throughout the river basin. Space-borne remote sensing may provide this information with a high temporal and spatial resolution and with a global coverage. Currently three microwave soil moisture products are available: AMSR-E, ASCAT and SMOS. The quality of these satellite-based products is often assessed by comparing them with in-situ observations of soil moisture. This comparison is however hampered by the difference in spatial and temporal support (i.e., resolution, scale), because the spatial resolution of microwave satellites is rather low compared to in-situ field measurements. Thus, the aim of this study is to derive a method to assess the uncertainty of microwave satellite soil moisture products at the correct spatial support. To overcome the difference in support size between in-situ soil moisture observations and remote sensed soil moisture, we used a stochastic, distributed unsaturated zone model (SWAP, van Dam (2000)) that is upscaled to the support of different satellite products. A detailed assessment of the SWAP model uncertainty is included to ensure that the uncertainty in satellite soil moisture is not overestimated due to an underestimation of the model uncertainty. We simulated unsaturated water flow up to a depth of 1.5 m with a vertical resolution of 1 to 10 cm and on a horizontal grid of 1 km² for the period Jan 2010–Jun 2011. The SWAP model was first calibrated and validated on in-situ data of the REMEDHUS soil moisture network (Spain). Next, to evaluate the satellite products, the model was run for areas in the proximity of 79 meteorological stations in Spain, where model results were aggregated to the correct support of the satellite
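Aggregating a fine-resolution model field to the coarser support of a satellite footprint amounts to block averaging. A minimal sketch (the 4×4 field and 2×2 block size are illustrative, not the study's actual grids or footprints):

```python
def aggregate_to_support(grid, block):
    """Average fine-resolution cells into coarse blocks (e.g. 1 km cells
    averaged up to a coarser satellite footprint)."""
    nrows, ncols = len(grid), len(grid[0])
    coarse = []
    for i in range(0, nrows, block):
        row = []
        for j in range(0, ncols, block):
            cells = [grid[a][b]
                     for a in range(i, min(i + block, nrows))
                     for b in range(j, min(j + block, ncols))]
            row.append(sum(cells) / len(cells))
        coarse.append(row)
    return coarse

# Hypothetical 4x4 field of soil moisture (m3/m3), aggregated in 2x2 blocks
coarse = aggregate_to_support([[0.10, 0.12, 0.20, 0.22],
                               [0.14, 0.16, 0.24, 0.26],
                               [0.30, 0.32, 0.40, 0.42],
                               [0.34, 0.36, 0.44, 0.46]], 2)
```

Comparing the satellite retrieval against `coarse` rather than against point measurements is what puts the evaluation on the correct spatial support.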
Reusable launch vehicle model uncertainties impact analysis
Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng
2018-03-01
Reusable launch vehicles (RLVs) have the typical characteristics of a complex aerodynamic shape coupled with the propulsion system, and the flight environment is highly complicated and intensely changeable. The model therefore has large uncertainty, which makes the nominal system quite different from the real system. Studying the influence of these uncertainties on the stability of the control system is thus of great significance for controller design. In order to improve the performance of RLVs, this paper proposes an approach for analyzing the influence of the model uncertainties. For a typical RLV, the coupled dynamic and kinematic models are built. The different factors that introduce uncertainties during modelling are then analyzed and summarized, after which the model uncertainties are expressed according to the additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to show how strongly each uncertainty factor influences the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary and important to take the model uncertainties into consideration before designing the controller of this kind of aircraft (like RLVs).
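The boundary-model idea in this abstract can be sketched numerically: sample additive uncertainty matrices Delta = G_real - G_nominal and take the largest maximum singular value as the boundary. This is a hypothetical illustration only; the dimensions, perturbation magnitudes and sampling scheme below are invented, not the paper's RLV model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample additive uncertainty matrices Delta = G_real - G_nominal over many
# assumed perturbations of the model factors (all magnitudes invented).
deltas = [0.05 * rng.standard_normal((2, 2)) for _ in range(1000)]

# Boundary model: the largest maximum singular value over all samples.
sigma_bar = max(np.linalg.svd(d, compute_uv=False)[0] for d in deltas)

# The matrix norm of each sample ranks how strongly a given uncertainty
# factor perturbs the nominal system.
norms = [np.linalg.norm(d) for d in deltas]

print(f"uncertainty boundary (max singular value): {sigma_bar:.3f}")
print(f"mean Frobenius norm of samples: {np.mean(norms):.3f}")
```

A robust-control analysis would then check that the closed loop tolerates any additive perturbation whose maximum singular value stays below this boundary.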
Assessing student understanding of measurement and uncertainty
Jirungnimitsakul, S.; Wattanakasiwich, P.
2017-09-01
The objectives of this study were to develop and assess student understanding of measurement and uncertainty. A test was adapted and translated from the Laboratory Data Analysis Instrument (LDAI) test; it consists of 25 questions focused on three topics: measures of central tendency, experimental errors and uncertainties, and fitting regression lines. The content validity of the test was evaluated by three experts in teaching physics laboratories. In the pilot study, the Thai LDAI was administered to 93 freshmen enrolled in a fundamental physics laboratory course. The final draft of the test was administered to three groups—45 freshmen taking fundamental physics laboratory, 16 sophomores taking intermediate physics laboratory and 21 juniors taking advanced physics laboratory at Chiang Mai University. We found that the freshmen had difficulties with experimental errors and uncertainties, and most students had problems with fitting regression lines. These results will be used to improve the teaching and learning of physics laboratory courses for physics students in the department.
Model Uncertainty for Bilinear Hysteretic Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Thoft-Christensen, Palle
1984-01-01
The statistical uncertainty due to lack of information can e.g. be taken into account by describing the variables by predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis...... is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space, then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...
International Nuclear Information System (INIS)
Kalinich, D. A.; Wilson, M. L.
2001-01-01
Seepage into the repository drifts is an important factor in total-system performance. Uncertainty and spatial variability are considered in the seepage calculations. The base-case results show 13.6% of the waste packages (WPs) have seepage. For 5th percentile uncertainty, 4.5% of the WPs have seepage and the seepage flow decreased by a factor of 2. For 95th percentile uncertainty, 21.5% of the WPs have seepage and the seepage flow increased by a factor of 2. Ignoring spatial variability resulted in seepage on 100% of the WPs, with a factor of 3 increase in the seepage flow
New challenges on uncertainty propagation assessment of flood risk analysis
Martins, Luciano; Aroca-Jiménez, Estefanía; Bodoque, José M.; Díez-Herrero, Andrés
2016-04-01
Natural hazards, such as floods, cause considerable damage to human life and to material and functional assets every year around the world. Risk assessment procedures carry a set of uncertainties, mainly of two types: natural, derived from the stochastic character inherent in flood process dynamics; and epistemic, associated with lack of knowledge or with inadequate procedures employed in the study of these processes. There is abundant scientific and technical literature on estimating uncertainties in each step of flood risk analysis (e.g. rainfall estimates, hydraulic modelling variables), but very little experience with propagating uncertainties along the whole flood risk assessment. Epistemic uncertainties are therefore the main goal of this work: in particular, understanding how far uncertainties propagate throughout the process, from inundation studies to risk analysis, and how much they can alter a proper flood risk analysis. Methodologies such as Polynomial Chaos Theory (PCT), the Method of Moments or Monte Carlo simulation are used to evaluate different sources of error, such as data records (precipitation gauges, flow gauges...), hydrologic and hydraulic modelling (inundation estimation) and socio-demographic data (damage estimation), in order to evaluate the uncertainty propagation (UP) in design flood risk estimation, both in numerical and cartographic expression. In order to account for the total uncertainty and to understand which factors contribute most to the final uncertainty, we used Polynomial Chaos Theory (PCT). It represents an interesting way to handle the inclusion of uncertainty in the modelling and simulation process: PCT allows the development of a probabilistic model of the system in a deterministic setting, using random variables and polynomials to handle the effects of uncertainty. Results of applying the method are more robust than those of traditional analysis
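As a minimal illustration of the PCT idea mentioned in this abstract (a probabilistic model built from random variables and orthogonal polynomials), the sketch below expands a toy response f(xi) = exp(xi), with xi standard normal, in probabilists' Hermite polynomials and recovers the mean and variance directly from the expansion coefficients. This is a textbook toy, not the flood-risk application itself.

```python
import math

import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval

P = 8                                # truncation order of the expansion
x, w = hermegauss(40)                # Gauss nodes/weights, weight exp(-x^2/2)
w = w / np.sqrt(2 * np.pi)           # renormalize to the standard-normal pdf

f = np.exp(x)                        # the uncertain response f(xi) = exp(xi)

# Spectral coefficients c_k = E[f(xi) He_k(xi)] / k!
coeffs = []
for k in range(P + 1):
    basis = np.zeros(P + 1)
    basis[k] = 1.0
    He_k = hermeval(x, basis)        # evaluate He_k at the quadrature nodes
    coeffs.append(np.sum(w * f * He_k) / math.factorial(k))

# Mean and variance fall out of the coefficients directly:
# mean = c_0, variance = sum_{k>=1} c_k^2 k!
mean = coeffs[0]
var = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)

print(f"PCE mean {mean:.6f} (exact {np.exp(0.5):.6f})")
print(f"PCE variance {var:.6f} (exact {np.e**2 - np.e:.6f})")
```

The appeal for uncertainty propagation is visible even in this toy: once the coefficients are computed, moments of the output are available without any further sampling.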
Aspects of uncertainty analysis in accident consequence modeling
International Nuclear Information System (INIS)
Travis, C.C.; Hoffman, F.O.
1981-01-01
Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data
International Nuclear Information System (INIS)
Ramarohetra, Johanna; Pohl, Benjamin; Sultan, Benjamin
2015-01-01
The challenge of estimating the potential impacts of climate change has led to an increasing use of dynamical downscaling to produce fine spatial-scale climate projections for impact assessments. In this work, we analyze if and to what extent the bias in the simulated crop yield can be reduced by using the Weather Research and Forecasting (WRF) regional climate model to downscale ERA-Interim (European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis) rainfall and radiation data. Then, we evaluate the uncertainties resulting from both the choice of the physical parameterizations of the WRF model and its internal variability. Impact assessments were performed at two sites in Sub-Saharan Africa and by using two crop models to simulate Niger pearl millet and Benin maize yields. We find that the use of the WRF model to downscale ERA-Interim climate data generally reduces the bias in the simulated crop yield, yet this reduction in bias strongly depends on the choices in the model setup. Among the physical parameterizations considered, we show that the choice of the land surface model (LSM) is of primary importance. When there is no coupling with a LSM, or when the LSM is too simplistic, the simulated precipitation, and hence the simulated yield, is null or very low; coupling with a LSM is therefore necessary. The convective scheme is the second most influential scheme for yield simulation, followed by the shortwave radiation scheme. The uncertainties related to the internal variability of the WRF model are also significant and reach up to 30% of the simulated yields. These results suggest that regional models need to be used more carefully in order to improve the reliability of impact assessments. (letter)
Subspace-based Inverse Uncertainty Quantification for Nuclear Data Assessment
Energy Technology Data Exchange (ETDEWEB)
Khuwaileh, B.A., E-mail: bakhuwai@ncsu.edu; Abdel-Khalik, H.S.
2015-01-15
Safety analysis and design optimization depend on the accurate prediction of various reactor attributes. Predictions can be enhanced by reducing the uncertainty associated with the attributes of interest. An inverse problem can be defined and solved to assess the sources of uncertainty, and experimental effort can be subsequently directed to further improve the uncertainty associated with these sources. In this work a subspace-based algorithm for inverse sensitivity/uncertainty quantification (IS/UQ) has been developed to enable analysts to account for all sources of nuclear data uncertainties in support of target accuracy assessment-type analysis. An approximate analytical solution of the optimization problem is used to guide the search for the dominant uncertainty subspace. By limiting the search to a subspace, the degrees of freedom available for the optimization search are significantly reduced. A quarter PWR fuel assembly is modeled and the accuracy of the multiplication factor and the fission reaction rate are used as reactor attributes whose uncertainties are to be reduced. Numerical experiments are used to demonstrate the computational efficiency of the proposed algorithm. Our ongoing work is focusing on extending the proposed algorithm to account for various forms of feedback, e.g., thermal-hydraulics and depletion effects.
Meteorological Uncertainty of atmospheric Dispersion model results (MUD)
DEFF Research Database (Denmark)
Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik
The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the ‘most likely’ di...
Wastewater treatment modelling: dealing with uncertainties
DEFF Research Database (Denmark)
Belia, E.; Amerlinck, Y.; Benedetti, L.
2009-01-01
This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...
Urban drainage models - making uncertainty analysis simple
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2012-01-01
in each measured/observed datapoint; an issue which is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter......There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here...
Uncertainty Assessments in Fast Neutron Activation Analysis
International Nuclear Information System (INIS)
W. D. James; R. Zeisler
2000-01-01
Fast neutron activation analysis (FNAA) carried out with the use of small accelerator-based neutron generators is routinely used for major/minor element determinations in industry, mineral and petroleum exploration, and to some extent in research. While the method shares many of the operational procedures, and therefore errors, inherent to conventional thermal neutron activation analysis, its unique implementation gives rise to additional specific concerns that can result in errors or increased uncertainties in measured quantities. The authors were involved in a recent effort to evaluate irreversible incorporation of oxygen into a standard reference material (SRM) by direct measurement of oxygen by FNAA. That project required determination of oxygen in bottles of the SRM stored in varying environmental conditions and a comparison of the results. We recognized the need to accurately describe the total uncertainty of the measurements in order to characterize any differences in the resulting average concentrations. It is our intent here to discuss the breadth of parameters that can contribute to the random and nonrandom errors of the method and to provide estimates of the magnitude of uncertainty introduced. In addition, we will discuss the steps taken in this recent FNAA project to control quality, assess the uncertainty of the measurements, and evaluate results based on statistical reproducibility.
Modelling and propagation of uncertainties in the German Risk Study
International Nuclear Information System (INIS)
Hofer, E.; Krzykacz, B.
1982-01-01
Risk assessments are generally subject to uncertainty considerations because of the various estimates that are involved. The paper points out those estimates in the so-called phase A of the German Risk Study for which uncertainties were quantified. It explains the probabilistic models applied in the assessment and their impact on the findings of the study. Finally, the resulting subjective confidence intervals of the study results are presented and their sensitivity to these probabilistic models is investigated
Study on Uncertainty and Contextual Modelling
Czech Academy of Sciences Publication Activity Database
Klimešová, Dana; Ocelíková, E.
2007-01-01
Roč. 1, č. 1 (2007), s. 12-15 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords : Knowledge * contextual modelling * temporal modelling * uncertainty * knowledge management Subject RIV: BD - Theory of Information
Collaborative framework for PIV uncertainty quantification: comparative assessment of methods
International Nuclear Information System (INIS)
Sciacchitano, Andrea; Scarano, Fulvio; Neal, Douglas R; Smith, Barton L; Warner, Scott O; Vlachos, Pavlos P; Wieneke, Bernhard
2015-01-01
A posteriori uncertainty quantification of particle image velocimetry (PIV) data is essential to obtain accurate estimates of the uncertainty associated with a given experiment. This is particularly relevant when measurements are used to validate computational models or in design and decision processes. In spite of the importance of the subject, the first PIV uncertainty quantification (PIV-UQ) methods have been developed only in the last three years. The present work is a comparative assessment of four approaches recently proposed in the literature: the uncertainty surface method (Timmins et al 2012), the particle disparity approach (Sciacchitano et al 2013), the peak ratio criterion (Charonko and Vlachos 2013) and the correlation statistics method (Wieneke 2015). The analysis is based upon experiments conducted for this specific purpose, where several measurement techniques are employed simultaneously. The performances of the above approaches are surveyed across different measurement conditions and flow regimes. (paper)
Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning
Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.
2016-12-01
Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate
Model uncertainties in top-quark physics
Seidel, Markus
2014-01-01
The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.
Assessment and uncertainty analysis of groundwater risk.
Li, Fawen; Zhu, Jingzhao; Deng, Xiyuan; Zhao, Yong; Li, Shaofei
2018-01-01
Groundwater with relatively stable quantity and quality is commonly used by human beings. However, with the over-mining of groundwater, problems such as groundwater funnels, land subsidence and salt-water intrusion have emerged. In order to avoid further deterioration of hydrogeological problems in over-mining regions, it is necessary to conduct an assessment of groundwater risk. In this paper, risks of shallow and deep groundwater in the water intake area of the South-to-North Water Transfer Project in Tianjin, China, were evaluated. Firstly, two sets of four-level evaluation index systems were constructed based on the different characteristics of shallow and deep groundwater. Secondly, based on the normalized factor values and the synthetic weights, the risk values of shallow and deep groundwater were calculated. Lastly, the uncertainty of the groundwater risk assessment was analyzed by the indicator kriging method. The results meet the decision maker's demand for risk information and overcome previous risk assessment results expressed in the form of deterministic point estimations, which ignore the uncertainty of risk assessment. Copyright © 2017 Elsevier Inc. All rights reserved.
Assessing uncertainty and risk in exploited marine populations
International Nuclear Information System (INIS)
Fogarty, M.J.; Mayo, R.K.; O'Brien, L.; Serchuk, F.M.; Rosenberg, A.A.
1996-01-01
The assessment and management of exploited fish and invertebrate populations is subject to several types of uncertainty. This uncertainty translates into risk to the population in the development and implementation of fishery management advice. Here, we define risk as the probability that exploitation rates will exceed a threshold level where long-term sustainability of the stock is threatened. We distinguish among several sources of error or uncertainty due to (a) stochasticity in demographic rates and processes, particularly in survival rates during the early life stages; (b) measurement error resulting from sampling variation in the determination of population parameters or in model estimation; and (c) the lack of complete information on population and ecosystem dynamics. The first represents a form of aleatory uncertainty, while the latter two factors represent forms of epistemic uncertainty. To illustrate these points, we evaluate the recent status of the Georges Bank cod stock in a risk assessment framework. Short-term stochastic projections are made accounting for uncertainty in population size and for random variability in the number of young surviving to enter the fishery. We show that recent declines in this cod stock can be attributed to exploitation rates that have substantially exceeded sustainable levels
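The short-term stochastic projection described in this abstract can be sketched as a Monte Carlo exercise: propagate an uncertain stock size with random recruitment and read off the probability that the realized exploitation rate exceeds a threshold. All numbers below are invented for illustration and are not Georges Bank cod values.

```python
import numpy as np

rng = np.random.default_rng(42)

n_sims, years = 10_000, 3
F_threshold = 0.4                  # hypothetical sustainable exploitation rate
catch = 250_000                    # planned annual catch (illustrative)

# Uncertainty in the current population size (measurement error)
N = rng.lognormal(mean=np.log(1e6), sigma=0.3, size=n_sims)

for _ in range(years):
    # Stochastic recruitment (variability in early-life-stage survival)
    recruits = rng.lognormal(np.log(2e5), 0.6, size=n_sims)
    N = np.maximum(N - catch + recruits, 1.0)

# Risk: fraction of projections where exploitation exceeds the threshold
F = catch / N
risk = float(np.mean(F > F_threshold))
print(f"P(exploitation rate > {F_threshold}) = {risk:.3f}")
```

Separating the two noise sources in this way mirrors the abstract's distinction: the lognormal draw on the initial stock size is measurement (epistemic) error, while the yearly recruitment draws are aleatory variability.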
Uncertainty management in integrated modelling, the IMAGE case
International Nuclear Information System (INIS)
Van der Sluijs, J.P.
1995-01-01
Integrated assessment models of global environmental problems play an increasingly important role in decision making. This use demands a good insight regarding the reliability of these models. In this paper we analyze uncertainty management in the IMAGE-project (Integrated Model to Assess the Greenhouse Effect). We use a classification scheme comprising type and source of uncertainty. Our analysis shows reliability analysis as main area for improvement. We briefly review a recently developed methodology, NUSAP (Numerical, Unit, Spread, Assessment and Pedigree), that systematically addresses the strength of data in terms of spread, reliability and scientific status (pedigree) of information. This approach is being tested through interviews with model builders. 3 tabs., 20 refs
Methodology for qualitative uncertainty assessment of climate impact indicators
Otto, Juliane; Keup-Thiel, Elke; Rechid, Diana; Hänsler, Andreas; Pfeifer, Susanne; Roth, Ellinor; Jacob, Daniela
2016-04-01
The FP7 project "Climate Information Portal for Copernicus" (CLIPC) is developing an integrated platform of climate data services to provide a single point of access for authoritative scientific information on climate change and climate change impacts. In this project, the Climate Service Center Germany (GERICS) has been in charge of developing a methodology for assessing the uncertainties related to climate impact indicators. Existing climate data portals mainly treat the uncertainties in two ways: either they provide generic guidance, or they express the quantifiable fraction of the uncertainty with statistical measures. However, none of the climate data portals gives users qualitative guidance on how confident they can be in the validity of the displayed data. The need for such guidance was identified in CLIPC user consultations. Therefore, we aim to provide an uncertainty assessment that gives users climate-impact-indicator-specific guidance on the degree to which they can trust the outcome. We will present an approach that provides information on the importance of different sources of uncertainty associated with a specific climate impact indicator and on how these sources affect the overall 'degree of confidence' of the respective indicator. To meet users' requirements for the effective communication of uncertainties, their feedback has been incorporated during the development of the methodology. Assessing and visualising the quantitative component of uncertainty is part of the qualitative guidance. As a visual analysis method, we apply the Climate Signal Maps (Pfeifer et al. 2015), which highlight only those areas with robust climate change signals. Here, robustness is defined as a combination of model agreement and the significance of the individual model projections. Reference Pfeifer, S., Bülow, K., Gobiet, A., Hänsler, A., Mudelsee, M., Otto, J., Rechid, D., Teichmann, C. and Jacob, D.: Robustness of Ensemble Climate Projections
Hanson, Niklas; Stark, John D
2012-04-01
Traditionally, ecological risk assessments (ERA) of pesticides have been based on risk ratios, where the predicted concentration of the chemical is compared to the concentration that causes biological effects. The concentration that causes biological effects is mostly determined from laboratory experiments using endpoints at the level of the individual (e.g., mortality and reproduction). However, the protection goals are mostly defined at the population level. To deal with the uncertainty in the necessary extrapolations, safety factors are used. Major disadvantages of this simplified approach are that it is difficult to relate a risk ratio to the environmental protection goals, and that the use of fixed safety factors can result in over- as well as underprotective assessments. To reduce uncertainty and increase value relevance in ERA, it has been argued that population models should be used more frequently. In the present study, we have used matrix population models for 3 daphnid species (Ceriodaphnia dubia, Daphnia magna, and D. pulex) to reduce uncertainty and increase value relevance in the ERA of a pesticide (spinosad). The survival rates in the models were reduced in accordance with data from traditional acute mortality tests. As no data on reproductive effects were available, the conservative assumption that no reproduction occurred during the exposure period was made. The models were used to calculate the minimum population size and the time to recovery. These endpoints can be related to the European Union (EU) protection goals for aquatic ecosystems in the vicinity of agricultural fields, which state that reversible population level effects are acceptable if there is recovery within an acceptable (undefined) time frame. The results of the population models were compared to the acceptable (according to EU documents) toxicity exposure ratio (TER) that was based on the same data. At the acceptable TER, which was based on the most sensitive species (C. dubia
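A minimal version of the matrix population model approach can be sketched as follows: project a stage-structured population with reduced survival and zero reproduction during exposure, then read off the minimum population size and the time to recovery. All vital rates below are invented for illustration; they are not the daphnid parameters of the study.

```python
import numpy as np

# Stage-structured projection matrix (juvenile, subadult, adult):
# fecundities on the first row, survival/transition rates below.
A = np.array([[0.0, 2.0, 4.0],
              [0.6, 0.0, 0.0],
              [0.0, 0.7, 0.8]])

def project(A, n0, steps):
    """Project stage abundances forward: n_{t+1} = A n_t."""
    traj = [np.asarray(n0, dtype=float)]
    for _ in range(steps):
        traj.append(A @ traj[-1])
    return np.array(traj)

n0 = np.array([100.0, 50.0, 25.0])

# Exposure period: survival reduced per acute mortality data, and (the
# conservative assumption made in the abstract) no reproduction.
A_exposed = A.copy()
A_exposed[0, :] = 0.0          # no reproduction during exposure
A_exposed[1:, :] *= 0.5        # 50% extra mortality (illustrative)

during = project(A_exposed, n0, 3)          # exposure lasts 3 time steps
after = project(A, during[-1], 50)          # recovery phase

totals = np.concatenate([during, after[1:]]).sum(axis=1)
min_pop = totals.min()                        # minimum population size
post = totals[len(during) - 1:]               # from end of exposure onward
recovery = int(np.argmax(post >= totals[0]))  # steps until back to start size
print(f"minimum population: {min_pop:.1f}, recovery after {recovery} steps")
```

These two endpoints (minimum size and time to recovery) are exactly the quantities the abstract relates to the EU protection goal of recovery within an acceptable time frame.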
Rivera, Diego; Rivas, Yessica; Godoy, Alex
2015-02-01
Hydrological models are simplified representations of natural processes and subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model-architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, which is known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%), using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3s-1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5. Despite the criticisms against the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
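The GLUE workflow summarized above (sample parameters, score each run with a likelihood measure, retain behavioural sets above a threshold, form uncertainty bounds from the behavioural ensemble) can be sketched on a toy linear-reservoir model. Everything below, including the model, the Nash-Sutcliffe threshold and the synthetic observations, is illustrative rather than the Chillan River setup.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(k, rain):
    """Toy linear reservoir: outflow is a fraction k of current storage."""
    S, Q = 0.0, []
    for r in rain:
        S += r
        q = k * S
        S -= q
        Q.append(q)
    return np.array(Q)

# Synthetic "observations" from a known parameter plus measurement noise
rain = rng.gamma(2.0, 5.0, size=100)
Q_obs = simulate(0.3, rain) + rng.normal(0.0, 0.5, size=100)

# GLUE: sample parameters, score each run with Nash-Sutcliffe efficiency
ks = rng.uniform(0.05, 0.9, size=2000)
sims = np.array([simulate(k, rain) for k in ks])
ss_tot = np.sum((Q_obs - Q_obs.mean()) ** 2)
nses = 1.0 - np.sum((sims - Q_obs) ** 2, axis=1) / ss_tot

behavioural = nses > 0.7                      # behavioural threshold
ensemble = sims[behavioural]
lower, upper = np.percentile(ensemble, [5, 95], axis=0)  # uncertainty bounds

print(f"{behavioural.sum()} behavioural parameter sets")
print(f"mean uncertainty-bound width: {np.mean(upper - lower):.2f}")
```

Narrowing the admissible parameter range, as the study does after fixing the areal-precipitation parameter, shrinks the behavioural ensemble and therefore the width of these bounds.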
Uncertainty analysis for a field-scale P loss model
Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...
Quantification of uncertainties of modeling and simulation
International Nuclear Information System (INIS)
Ma Zhibo; Yin Jianwei
2012-01-01
The principles of Modeling and Simulation (M and S) are interpreted by a functional relation, from which the total uncertainties of M and S are identified and sorted into three parts considered to vary along with the conceptual models' parameters. Following the idea of verification and validation, the parameter space is partitioned into verified and applied domains; uncertainties in the verified domain are quantified by comparison between numerical and standard results, and those in the applied domain are quantified by a newly developed extrapolation method. Examples are presented to demonstrate and qualify the ideas, aimed at building a framework to quantify the uncertainties of M and S. (authors)
Empirical Bayesian inference and model uncertainty
International Nuclear Information System (INIS)
Poern, K.
1994-01-01
This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes' formula. This two-stage Bayesian approach is an example where modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability.
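The two-stage idea in this abstract (a mixture prior over the Poisson intensity whose component weights are themselves updated via Bayes' formula) can be sketched with a two-component gamma mixture: each gamma component updates conjugately, and the mixture weights update in proportion to their marginal likelihoods. The prior parameters and observed data below are invented for illustration.

```python
import math

def log_marginal(k, T, a, b):
    """log P(k events in time T) under a Gamma(a, b) prior on the intensity
    (the gamma-Poisson, i.e. negative-binomial, marginal likelihood)."""
    return (a * math.log(b) - (a + k) * math.log(b + T)
            + math.lgamma(a + k) - math.lgamma(a) - math.lgamma(k + 1)
            + k * math.log(T))

def update(prior, k, T):
    """Posterior for a gamma-mixture prior on a Poisson intensity."""
    logw = [math.log(w) + log_marginal(k, T, a, b) for w, a, b in prior]
    mx = max(logw)                         # log-sum-exp for stability
    ws = [math.exp(l - mx) for l in logw]
    Z = sum(ws)
    return [(wi / Z, a + k, b + T) for wi, (_, a, b) in zip(ws, prior)]

# Nominal component plus a "contamination" component with higher intensity
prior = [(0.9, 2.0, 1000.0),   # nominal: mean intensity 2e-3 per hour
         (0.1, 0.5, 10.0)]     # contamination: mean intensity 5e-2 per hour
post = update(prior, k=4, T=500.0)   # observed 4 events in 500 hours

mean = sum(w * a / b for w, a, b in post)
print(f"posterior mean intensity: {mean:.4g} per hour")
```

The updated weights play the role of the posterior probabilities assigned to each individual model of the intensity, which is the essence of the empirical Bayesian treatment of model uncertainty described above.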
Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...
International Nuclear Information System (INIS)
Jin Hosang; Palta, Jatinder R.; Kim, You-Hyun; Kim, Siyong
2010-01-01
Purpose: To analyze dose uncertainty using a previously published dose-uncertainty model, and to assess potential dosimetric risks existing in prostate intensity-modulated radiotherapy (IMRT). Methods and Materials: The dose-uncertainty model provides a three-dimensional (3D) dose-uncertainty distribution at a given confidence level. For 8 retrospectively selected patients, dose-uncertainty maps were constructed using the dose-uncertainty model at the 95% confidence level. In addition to uncertainties inherent to the radiation treatment planning system, four scenarios of spatial errors were considered: machine only (S1), S1 + intrafraction, S1 + interfraction, and S1 + both intrafraction and interfraction errors. To evaluate the potential risks of the IMRT plans, three dose-uncertainty-based plan evaluation tools were introduced: the confidence-weighted dose-volume histogram, the confidence-weighted dose distribution, and the dose-uncertainty-volume histogram. Results: Dose uncertainty caused by interfraction setup error was more significant than that caused by intrafraction motion error. The maximum dose uncertainty (95% confidence) of the clinical target volume (CTV) was smaller than 5% of the prescribed dose in all but two cases (13.9% and 10.2%). The dose uncertainty for 95% of the CTV volume ranged from 1.3% to 2.9% of the prescribed dose. Conclusions: The dose uncertainty in prostate IMRT could be evaluated using the dose-uncertainty model. Prostate IMRT plans satisfying the same plan objectives could generate significantly different dose uncertainties because of a complex interplay of many uncertainty sources. The uncertainty-based plan evaluation contributes to generating reliable and error-resistant treatment plans.
Uncertainty estimation in nuclear power plant probabilistic safety assessment
International Nuclear Information System (INIS)
Guarro, S.B.; Cummings, G.E.
1989-01-01
Probabilistic Risk Assessment (PRA) was introduced in the nuclear industry and the nuclear regulatory process in 1975 with the publication of the Reactor Safety Study by the U.S. Nuclear Regulatory Commission. Almost fifteen years later, the state-of-the-art in this field has been expanded and sharpened in many areas, and about thirty-five plant-specific PRAs (Probabilistic Risk Assessments) have been performed by the nuclear utility companies or by the U.S. Nuclear Regulatory Commission. Among the areas where the most evident progress has been made in PRA and PSA (Probabilistic Safety Assessment, as these studies are more commonly referred to in the international community outside the U.S.) is the development of a consistent framework for the identification of sources of uncertainty and the estimation of their magnitude as it impacts various risk measures. Techniques to propagate uncertainty in reliability data through the risk models and display its effect on the top-level risk estimates were developed in the early PRAs. The Seismic Safety Margin Research Program (SSMRP) study was the first major risk study to develop an approach to deal explicitly with uncertainty in risk estimates introduced not only by uncertainty in component reliability data, but by the incomplete state of knowledge of the assessor(s) with regard to basic phenomena that may trigger and drive a severe accident. More recently NUREG-1150, another major study of reactor risk sponsored by the NRC, has expanded risk uncertainty estimation and analysis into the realm of model uncertainty related to the relatively poorly known post-core-melt phenomena which determine the behavior of the molten core and of the reactor containment structures.
Evaluating variability and uncertainty in radiological impact assessment using SYMBIOSE
International Nuclear Information System (INIS)
Simon-Cornu, M.; Beaugelin-Seiller, K.; Boyer, P.; Calmon, P.; Garcia-Sanchez, L.; Mourlon, C.; Nicoulaud, V.; Sy, M.; Gonze, M.A.
2015-01-01
SYMBIOSE is a modelling platform that accounts for variability and uncertainty in radiological impact assessments, when simulating the environmental fate of radionuclides and assessing doses to human populations. The default database of SYMBIOSE is partly based on parameter values that are summarized within International Atomic Energy Agency (IAEA) documents. To characterize uncertainty in the transfer parameters, 331 Probability Distribution Functions (PDFs) were defined from the summary statistics provided within the IAEA documents (i.e. sample size, minimum and maximum values, arithmetic and geometric means, standard and geometric standard deviations) and are made available as spreadsheet files. The methods used to derive the PDFs without complete data sets, but merely the summary statistics, are presented. Then, a simple case-study illustrates the use of the database in a second-order Monte Carlo calculation, separating parametric uncertainty and inter-individual variability. - Highlights: • Parametric uncertainty in radioecology was derived from IAEA documents. • 331 Probability Distribution Functions were defined for transfer parameters. • Parametric uncertainty and inter-individual variability were propagated
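The second-order Monte Carlo scheme this abstract describes, with an outer loop over parametric (epistemic) uncertainty and an inner loop over inter-individual (aleatory) variability, can be sketched as follows. The transfer-factor statistics and the intake model are assumed placeholders, not values from the SYMBIOSE database:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical transfer parameter: lognormal PDF reconstructed from
# summary statistics (geometric mean and geometric standard deviation),
# the form typically reported in IAEA tables. Values are assumed.
gm, gsd = 0.02, 2.5
mu, sigma = np.log(gm), np.log(gsd)

n_outer = 200   # outer loop: parametric uncertainty (epistemic)
n_inner = 1000  # inner loop: inter-individual variability (aleatory)

doses = np.empty((n_outer, n_inner))
for i in range(n_outer):
    # Outer draw: one plausible value of the uncertain transfer factor.
    tf = rng.lognormal(mu, sigma)
    # Inner draws: intake variability across individuals (assumed CV = 30%).
    intake = rng.normal(1.0, 0.3, n_inner).clip(min=0)
    doses[i] = tf * intake  # simplistic dose = transfer factor x intake

# Each outer iteration yields one variability distribution; the spread of
# its percentiles across outer iterations expresses parametric uncertainty.
p95_per_outer = np.percentile(doses, 95, axis=1)
print(np.percentile(p95_per_outer, [5, 50, 95]))
```

Separating the two loops is what distinguishes second-order from ordinary Monte Carlo: collapsing them into one loop would mix variability and uncertainty into a single, uninterpretable distribution.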
Bayesian uncertainty analyses of probabilistic risk models
International Nuclear Information System (INIS)
Pulkkinen, U.
1989-01-01
Applications of Bayesian principles to uncertainty analysis are discussed in this paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method of quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed.
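As a concrete illustration of the maximum-entropy principle mentioned in this abstract: when the only available constraint is a mean value on a discrete support, the maximum-entropy prior has the exponential form p_i ∝ exp(-λ x_i), with λ chosen to match the mean. The failure-rate support and target mean below are assumed for illustration only:

```python
import numpy as np
from scipy.optimize import brentq

# Discrete failure-rate support and a known mean (assumed values):
x = np.linspace(0.001, 0.01, 50)
target_mean = 0.004

# Maximum-entropy distribution under a mean constraint has the form
# p_i ∝ exp(-lam * x_i); solve for lam so the mean matches.
def mean_given(lam):
    w = np.exp(-lam * x)
    p = w / w.sum()
    return (p * x).sum()

# mean_given is monotonically decreasing in lam, so a root-bracketing
# solver finds the unique Lagrange multiplier.
lam = brentq(lambda l: mean_given(l) - target_mean, -5000, 5000)
p = np.exp(-lam * x)
p /= p.sum()
print(round((p * x).sum(), 6))  # matches target_mean
```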
Model Uncertainty Quantification Methods In Data Assimilation
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
Modeling of uncertainties in statistical inverse problems
International Nuclear Information System (INIS)
Kaipio, Jari
2008-01-01
In all real world problems, the models that tie the measurements to the unknowns of interest are at best only approximations of reality. While moderate modeling and approximation errors can be tolerated with stable problems, inverse problems are a notorious exception. Typical modeling errors include inaccurate geometry, unknown boundary and initial data, properties of noise and other disturbances, and simply the numerical approximations of the physical models. In principle, the Bayesian approach to inverse problems, in which all uncertainties are modeled as random variables, is capable of handling these uncertainties. Depending on the type of uncertainties, however, different strategies may be adopted. In this paper we give an overview of typical modeling errors and related strategies within the Bayesian framework.
UNCERTAINTIES IN TRICHLOROETHYLENE PHARMACOKINETIC MODELS
Understanding the pharmacokinetics of a chemical (its absorption, distribution, metabolism, and excretion in humans and laboratory animals) is critical to the assessment of its human health risks. For trichloroethylene (TCE), numerous physiologically-based pharmacokinetic (PBPK)...
Uncertainty modeling process for semantic technology
Directory of Open Access Journals (Sweden)
Rommel N. Carvalho
2016-08-01
The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.
International Nuclear Information System (INIS)
Cranwell, R.M.
1987-01-01
Uncertainties in the performance assessment of geologic radioactive waste repositories have several sources. The more important ones include: 1) uncertainty in the conditions of a disposal system over the temporal scales set forth in regulations, 2) uncertainty in the conceptualization of the geohydrologic system, 3) uncertainty in the theoretical description of a given conceptual model of the system, 4) uncertainty in the development of computer codes to implement the solution of a mathematical model, and 5) uncertainty in the parameters and data required in the models and codes used to assess the long-term performance of the disposal system. This paper discusses each of these uncertainties and outlines methods for addressing them.
Quinn, J. D.; Zeng, Z.; Shoemaker, C. A.; Woodard, J.
2014-12-01
In sub-Saharan Africa, where the majority of the population earns their living from agriculture, government expenditures in many countries are being re-directed to the sector to increase productivity and decrease poverty. However, many of these investments are seeing low returns because they are poorly targeted. A geographic tool that accounts for spatial heterogeneity and temporal variability in the factors of production would allow governments and donors to optimize their investments by directing them to farmers for whom they are most profitable. One application for which this is particularly relevant is fertilizer recommendations. It is well-known that soil fertility in much of sub-Saharan Africa is declining due to insufficient nutrient inputs to replenish those lost through harvest. Since fertilizer application rates in sub-Saharan Africa are several times smaller than in other developing countries, it is often assumed that African farmers are under-applying fertilizer. However, this assumption ignores the risk farmers face in choosing whether or how much fertilizer to apply. Simply calculating the benefit/cost ratio of applying a given level of fertilizer in a particular year over a large, aggregated region (as is often done) overlooks the variability in yield response seen at different sites within the region, and at the same site from year to year. Using Ethiopia as an example, we are developing a 1 km resolution fertilizer distribution tool that provides pre-season fertilizer recommendations throughout the agricultural regions of the country, conditional on seasonal climate forecasts. By accounting for spatial heterogeneity in soil, climate, market and travel conditions, as well as uncertainty in climate and output prices at the time a farmer must purchase fertilizer, this stochastic optimization tool gives better recommendations to governments, fertilizer companies, and aid organizations looking to optimize the welfare benefits achieved by their
Chemical model reduction under uncertainty
Malpica Galassi, Riccardo; Valorani, Mauro; Najm, Habib N.; Safta, Cosmin; Khalil, Mohammad; Ciottoli, Pietro P.
2017-01-01
A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis
Denys Yemshanov; Frank H Koch; Mark Ducey
2015-01-01
Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker's perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...
Modelling of data uncertainties on hybrid computers
Energy Technology Data Exchange (ETDEWEB)
Schneider, Anke (ed.)
2016-06-15
The codes d{sup 3}f and r{sup 3}t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d{sup 3}f and r{sup 3}t began more than 20 years ago. Since that time, significant advances have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d{sup 3}f and r{sup 3}t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d{sup 3}f and r{sup 3}t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d{sup 3}f and r{sup 3}t were combined into one conjoint code d{sup 3}f++. A direct estimation of uncertainties for complex groundwater flow models with the
Uncertainty of Energy Consumption Assessment of Domestic Buildings
DEFF Research Database (Denmark)
Brohus, Henrik; Heiselberg, Per; Simonsen, A.
2009-01-01
In order to assess the influence of energy reduction initiatives, to determine the expected annual cost, to calculate life cycle cost, emission impact, etc. it is crucial to be able to assess the energy consumption reasonably accurately. The present work undertakes a theoretical and empirical study...... of the uncertainty of energy consumption assessment of domestic buildings. The calculated energy consumption of a number of almost identical domestic buildings in Denmark is compared with the measured energy consumption. Furthermore, the uncertainty is determined by means of stochastic modelling based on input...... to correspond reasonably well; however, it is also found that significant differences may occur between calculated and measured energy consumption due to the spread and due to the fact that the result can only be determined with a certain probability. It is found that occupants' behaviour is the major...
Taisne, B.; Pansino, S.; Manta, F.; Tay Wen Jing, C.
2017-12-01
Have you ever dreamed about continuous, high resolution InSAR data? Have you ever dreamed about a transparent earth allowing you to see what is actually going on under a volcano? Well, you likely dreamed about an analogue facility that allows you to scale down the natural system to fit into a room, with a controlled environment and complex visualisation system. Analogue modeling has been widely used to understand magmatic processes and, thanks to a transparent analogue for the elastic Earth's crust, we can see, as it evolves with time, the migration of a dyke, the volume change of a chamber or the rise of a bubble in a conduit. All those phenomena are modeled theoretically or numerically, with their own simplifications. Therefore, how well are we really constraining the physical parameters describing the evolution of a dyke or a chamber? Getting access to those parameters, in real time and with a high level of confidence, is of paramount importance while dealing with unrest at volcanoes. The aim of this research is to estimate the uncertainties of the widely used Okada and Mogi models. To do so, we design a set of analogue experiments allowing us to explore different elastic properties of the medium, the characteristics of the fluid injected into the medium, as well as the depth, size and volume change of a reservoir. The associated surface deformation is extracted using an array of synchronised cameras, using digital image correlation and structure from motion for horizontal and vertical deformation, respectively. The surface deformations are then inverted to retrieve the controlling parameters (e.g. location and volume change of a chamber, or orientation, position, length, breadth and opening of a dyke). By comparing those results with the known parameters, which we can see and measure independently, we estimate the uncertainties of the models themselves, and the associated level of confidence for each of the inverted parameters.
Estimating Coastal Digital Elevation Model (DEM) Uncertainty
Amante, C.; Mesick, S.
2017-12-01
Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.
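Propagating a cell-level DEM uncertainty surface into a downstream analysis, as this abstract proposes, can be sketched with a Monte Carlo loop over perturbed elevation realizations. The toy DEM, the uniform 0.3 m standard deviation, and the 0.5 m sea-level-rise scenario below are all assumptions for illustration, not NCEI data:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy DEM (m, negative = below sea level) with a per-cell uncertainty
# surface (standard deviation, m) -- both assumed for illustration.
dem = np.array([[1.5, 0.8, -0.2],
                [0.9, 0.3, -0.6],
                [0.4, -0.1, -1.0]])
sigma = np.full(dem.shape, 0.3)

sea_level_rise = 0.5  # m, assumed scenario
n = 5000

# Monte Carlo: perturb each cell by its estimated error, count inundations.
inundated = np.zeros(dem.shape)
for _ in range(n):
    realization = dem + rng.normal(0.0, sigma)
    inundated += realization < sea_level_rise
prob_inundated = inundated / n
# Each cell now carries an inundation probability instead of a yes/no flag.
print(np.round(prob_inundated, 2))
```

Cells whose elevation sits within a few standard deviations of the flooding threshold end up with intermediate probabilities, which is exactly the information a deterministic DEM analysis discards.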
Uncertainty quantification in wind farm flow models
DEFF Research Database (Denmark)
Murcia Leon, Juan Pablo
uncertainties through a model chain are presented and applied to several wind energy related problems such as: annual energy production estimation, wind turbine power curve estimation, wake model calibration and validation, and estimation of lifetime equivalent fatigue loads on a wind turbine. Statistical...
Barbetta, Silvia; Coccia, Gabriele; Moramarco, Tommaso; Brocca, Luca; Todini, Ezio
2017-08-01
This work extends the multi-temporal approach of the Model Conditional Processor (MCP-MT) to the multi-model case and to the four Truncated Normal Distributions (TNDs) approach, demonstrating the improvement over the single-temporal one. The study is framed in the context of probabilistic Bayesian decision-making that is appropriate for taking rational decisions on uncertain future outcomes. As opposed to the direct use of deterministic forecasts, the probabilistic forecast identifies a predictive probability density function that represents a fundamental knowledge on future occurrences. The added value of MCP-MT is the identification of the probability that a critical situation will happen within the forecast lead-time and when, more likely, it will occur. MCP-MT is thoroughly tested for both single-model and multi-model configurations at a gauged site on the Tiber River, central Italy. The stages forecasted by two operative deterministic models, STAFOM-RCM and MISDc, are considered for the study. The dataset used for the analysis consists of hourly data from 34 flood events selected on a time series of six years. MCP-MT improves over the original models' forecasts: the peak overestimation and the delayed rising-limb forecast, characterizing MISDc and STAFOM-RCM respectively, are significantly mitigated, with a reduced mean error on peak stage from 45 to 5 cm and an increased coefficient of persistence from 0.53 up to 0.75. The results show that MCP-MT outperforms the single-temporal approach and is potentially useful for supporting decision-making because the exceedance probability of hydrometric thresholds within a forecast horizon and the most probable flooding time can be estimated.
Åberg, Isabelle
2017-01-01
Expansion of cities and major infrastructure projects lead to changes in land use and river flows. The probability of flooding is expected to increase in the future as a result of these changes in combination with climate change. Hydraulic models can be used to obtain simulated water levels to investigate the risk of flooding and identify areas that might potentially be flooded due to climate change. Since a model is a simplification of the reality it is important to be aware of a model’s unc...
Uncertainty in biology a computational modeling approach
Gomez-Cabrero, David
2016-01-01
Computational modeling of biomedical processes is gaining more and more weight in the current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: Modeling establishment under uncertainty Model selection and parameter fitting Sensitivity analysis and model adaptation Model predictions under uncertainty In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...
Bellanti, Francesco
2015-01-01
Growing awareness about the relevance of formal evaluation of the efficacy and safety in children has resulted into important changes in the requirements for the approval of medicines for children. In this thesis a model-based approach is proposed to ensure more efficient use of the evidence
Partitioning uncertainty in streamflow projections under nonstationary model conditions
Chawla, Ila; Mujumdar, P. P.
2018-02-01
Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contribution from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Generally, most of the impact assessment studies are carried out with unchanging hydrologic model parameters in future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression based methodology is presented to obtain the hydrologic model parameters with changing land use and climate scenarios in future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin, under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that model stationarity assumption and GCMs along with their interactions with emission scenarios, act as dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine stationarity assumption of models before considering them
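The ANOVA segregation of variance among uncertainty sources described in this abstract can be sketched on a synthetic full-factorial design: each main-effect sum of squares, divided by the total sum of squares, gives that source's share of the projection uncertainty. The factor counts and effect magnitudes below are invented; the study's actual GCM/scenario/land-use ensemble would replace them:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical streamflow projections (mm/yr) on a full factorial design:
gcms, scens, lus = 3, 2, 2
effects_g = rng.normal(0, 30, gcms)  # assumed GCM effects (largest source)
effects_s = rng.normal(0, 10, scens)  # assumed scenario effects
effects_l = rng.normal(0, 5, lus)     # assumed land-use effects
y = np.empty((gcms, scens, lus))
for g, s, l in itertools.product(range(gcms), range(scens), range(lus)):
    y[g, s, l] = 500 + effects_g[g] + effects_s[s] + effects_l[l] + rng.normal(0, 2)

grand = y.mean()
ss_total = ((y - grand) ** 2).sum()
# Main-effect sums of squares (balanced-design ANOVA decomposition):
ss_g = scens * lus * ((y.mean(axis=(1, 2)) - grand) ** 2).sum()
ss_s = gcms * lus * ((y.mean(axis=(0, 2)) - grand) ** 2).sum()
ss_l = gcms * scens * ((y.mean(axis=(0, 1)) - grand) ** 2).sum()
for name, ss in [("GCM", ss_g), ("scenario", ss_s), ("land use", ss_l)]:
    print(f"{name}: {ss / ss_total:.1%} of variance")
```

The remainder of ss_total not covered by the main effects corresponds to interaction terms, which the paper identifies (GCM x scenario in particular) as a further dominant source.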
Return Predictability, Model Uncertainty, and Robust Investment
DEFF Research Database (Denmark)
Lukas, Manuel
Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....
Sensitivity to Uncertainty in Asteroid Impact Risk Assessment
Mathias, D.; Wheeler, L.; Prabhu, D. K.; Aftosmis, M.; Dotson, J.; Robertson, D. K.
2015-12-01
The Engineering Risk Assessment (ERA) team at NASA Ames Research Center is developing a physics-based impact risk model for probabilistically assessing threats from potential asteroid impacts on Earth. The model integrates probabilistic sampling of asteroid parameter ranges with physics-based analyses of entry, breakup, and impact to estimate damage areas and casualties from various impact scenarios. Assessing these threats is a highly coupled, dynamic problem involving significant uncertainties in the range of expected asteroid characteristics, how those characteristics may affect the level of damage, and the fidelity of various modeling approaches and assumptions. The presented model is used to explore the sensitivity of impact risk estimates to these uncertainties in order to gain insight into what additional data or modeling refinements are most important for producing effective, meaningful risk assessments. In the extreme cases of very small or very large impacts, the results are generally insensitive to many of the characterization and modeling assumptions. However, the nature of the sensitivity can change across moderate-sized impacts. Results will focus on the value of additional information in this critical, mid-size range, and how this additional data can support more robust mitigation decisions.
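The probabilistic-sampling core of such an impact risk model can be sketched as a simple Monte Carlo over asteroid parameters, propagating each sample through a damage estimate and summarizing the resulting distribution. The parameter ranges and the cube-root blast-scaling constant below are rough assumptions for illustration, not the ERA team's actual models:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Assumed parameter ranges, for illustration only:
diameter = rng.uniform(20, 200, n)      # m
density = rng.uniform(1500, 3500, n)    # kg/m^3
velocity = rng.uniform(12e3, 25e3, n)   # m/s

# Kinetic energy of a spherical asteroid, converted to megatons of TNT.
mass = density * (np.pi / 6) * diameter ** 3
energy_mt = 0.5 * mass * velocity ** 2 / 4.184e15

# Crude blast-damage radius via a cube-root scaling law (assumed constant).
damage_radius_km = 2.0 * energy_mt ** (1 / 3)

print(f"median damage radius: {np.median(damage_radius_km):.1f} km")
print(f"95th percentile:      {np.percentile(damage_radius_km, 95):.1f} km")
```

A production model would replace the uniform ranges with observationally informed distributions and the scaling law with physics-based entry, breakup, and ground-damage analyses, but the sampling structure is the same.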
Quantifying uncertainties in wind energy assessment
Patlakas, Platon; Galanis, George; Kallos, George
2015-04-01
The constant rise of wind energy production and its subsequent penetration into global energy markets during the last decades have led to the selection of new sites that present various types of problems. Such problems arise due to the variability and the uncertainty of wind speed. The study of the lower and upper tails of the wind speed distribution may support the quantification of these uncertainties. Such approaches, focused on extreme wind conditions or periods below the energy production threshold, are necessary for a better management of operations. Towards this direction, different methodologies are presented for the credible evaluation of potential non-frequent/extreme values for these environmental conditions. The approaches used take into consideration the structural design of the wind turbines according to their lifespan, the turbine failures, the time needed for repairs, as well as the energy production distribution. In this work, a multi-parametric approach for studying extreme wind speed values is discussed based on tools of Extreme Value Theory. In particular, the study is focused on extreme wind speed return periods and the persistence of no energy production, based on a weather modeling system/hindcast/10-year dataset. More specifically, two methods (Annual Maxima and Peaks Over Threshold) were used for the estimation of extreme wind speeds and their recurrence intervals. Additionally, two different methodologies (intensity given duration and duration given intensity, both based on the Annual Maxima method) were applied to calculate the duration of extreme events, combined with their intensity as well as the event frequency. The obtained results prove that the proposed approaches converge, at least on the main findings, for each case. It is also remarkable that, despite the moderate wind speed climate of the area, several consecutive days of no energy production are observed.
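The Annual Maxima method mentioned in this abstract fits a Generalized Extreme Value (GEV) distribution to yearly maxima and reads return levels off its quantiles: the T-year return level is the (1 - 1/T) quantile of the annual-maximum distribution. The synthetic record below stands in for the hindcast dataset; the GEV parameters used to generate it are arbitrary:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
# Synthetic record of annual maximum wind speeds (m/s); real use would
# take the yearly maxima from the hindcast dataset instead.
annual_max = genextreme.rvs(c=0.1, loc=25, scale=3, size=50, random_state=rng)

# Maximum-likelihood fit of the GEV shape, location, and scale.
c, loc, scale = genextreme.fit(annual_max)

# T-year return level: the speed exceeded once in T years on average,
# i.e. the (1 - 1/T) quantile of the annual-maximum distribution.
for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-yr return level: {level:.1f} m/s")
```

The Peaks Over Threshold alternative named in the abstract would instead fit a Generalized Pareto distribution to all exceedances above a high threshold, which uses more of the data at the cost of choosing the threshold.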
Energy Technology Data Exchange (ETDEWEB)
Scott, Michael J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Daly, Don S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Hathaway, John E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lansing, Carina S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Liu, Ying [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); McJeon, Haewon C. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moss, Richard H. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Patel, Pralit L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Peterson, Marty J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Rice, Jennie S. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhou, Yuyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2014-12-06
This report presents data and assumptions employed in an application of PNNL’s Global Change Assessment Model with a newly-developed Monte Carlo analysis capability. The model is used to analyze the impacts of more aggressive U.S. residential and commercial building-energy codes and equipment standards on energy consumption and energy service costs at the state level, explicitly recognizing uncertainty in technology effectiveness and cost, socioeconomics, presence or absence of carbon prices, and climate impacts on energy demand. The report provides a summary of how residential and commercial buildings are modeled, together with assumptions made for the distributions of state-level population, Gross Domestic Product (GDP) per worker, efficiency and cost of residential and commercial energy equipment by end use, and efficiency and cost of residential and commercial building shells. The cost and performance of equipment and of building shells are reported separately for current building and equipment efficiency standards and for more aggressive standards. The report also details assumptions concerning future improvements brought about by projected trends in technology.
Coping with uncertainty in environmental impact assessments: Open techniques
Chivatá Cárdenas, Ibsen; Halman, Johannes I.M.
2016-01-01
Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take
Modelling ecosystem service flows under uncertainty with stochastic SPAN
Johnson, Gary W.; Snapp, Robert R.; Villa, Ferdinando; Bagstad, Kenneth J.
2012-01-01
Ecosystem service models are increasingly in demand for decision making. However, the data required to run these models are often patchy, missing, outdated, or untrustworthy. Further, communication of data and model uncertainty to decision makers is often either absent or unintuitive. In this work, we introduce a systematic approach to addressing both the data gap and the difficulty in communicating uncertainty through a stochastic adaptation of the Service Path Attribution Networks (SPAN) framework. The SPAN formalism assesses ecosystem services through a set of up to 16 maps, which characterize the services in a study area in terms of flow pathways between ecosystems and human beneficiaries. Although the SPAN algorithms were originally defined deterministically, we present them here in a stochastic framework which combines probabilistic input data with a stochastic transport model in order to generate probabilistic spatial outputs. This enables a novel feature among ecosystem service models: the ability to spatially visualize uncertainty in the model results. The stochastic SPAN model can analyze areas where data limitations are prohibitive for deterministic models. Greater uncertainty in the model inputs (including missing data) should lead to greater uncertainty expressed in the model’s output distributions. By using Bayesian belief networks to fill data gaps and expert-provided trust assignments to augment untrustworthy or outdated information, we can account for uncertainty in input data, producing a model that is still able to run and provide information where strictly deterministic models could not. Taken together, these attributes enable more robust and intuitive modelling of ecosystem services under uncertainty.
Communicating uncertainties in assessments of future sea level rise
Wikman-Svahn, P.
2013-12-01
How uncertainty should be managed and communicated in policy-relevant scientific assessments is directly connected to the role of science and the responsibility of scientists. These fundamentally philosophical issues influence how scientific assessments are made and how scientific findings are communicated to policymakers. It is therefore of high importance to discuss implicit assumptions and value judgments that are made in policy-relevant scientific assessments. The present paper examines these issues for the case of scientific assessments of future sea level rise. The magnitude of future sea level rise is very uncertain, mainly due to poor scientific understanding of all physical mechanisms affecting the great ice sheets of Greenland and Antarctica, which together hold enough land-based ice to raise sea levels more than 60 meters if completely melted. There has been much confusion among policymakers about how different assessments of future sea levels should be interpreted. Much of this confusion is probably due to how uncertainties are characterized and communicated in these assessments. The present paper draws on the recent philosophical debate on the so-called "value-free ideal of science" - the view that science should not be based on social and ethical values. Issues related to how uncertainty is handled in scientific assessments are central to this debate. This literature has focused largely on how uncertainty in data, parameters or models implies that choices have to be made, which can have social consequences. However, less emphasis has been placed on how uncertainty is characterized when communicating the findings of a study, which is the focus of the present paper. The paper argues that there is a tension between, on the one hand, the value-free ideal of science and, on the other hand, usefulness for practical applications in society. This means that even if the value-free ideal could be upheld in theory, by carefully constructing and hedging statements characterizing
Including model uncertainty in risk-informed decision making
International Nuclear Information System (INIS)
Reinert, Joshua M.; Apostolakis, George E.
2006-01-01
Model uncertainties can have a significant impact on decisions regarding licensing basis changes. We present a methodology to identify basic events in the risk assessment that have the potential to change the decision and are known to have significant model uncertainties. Because we work with basic event probabilities, this methodology is not appropriate for analyzing uncertainties that cause a structural change to the model, such as success criteria. We use the risk achievement worth (RAW) importance measure with respect to both the core damage frequency (CDF) and the change in core damage frequency (ΔCDF) to identify potentially important basic events. We cross-check these with generically important model uncertainties. Then, sensitivity analysis is performed on the basic event probabilities, which are used as a proxy for the model parameters, to determine how much error in these probabilities would need to be present in order to impact the decision. A previously submitted licensing basis change is used as a case study. Analysis using the SAPHIRE program identifies 20 basic events as important, four of which have model uncertainties that have been identified in the literature as generally important. The decision is fairly insensitive to uncertainties in these basic events. In three of these cases, one would need to show that model uncertainties would lead to basic event probabilities that would be between two and four orders of magnitude larger than modeled in the risk assessment before they would become important to the decision. More detailed analysis would be required to determine whether these higher probabilities are reasonable. Methods to perform this analysis from the literature are reviewed and an example is demonstrated using the case study
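The screening step described above, ranking basic events by risk achievement worth before cross-checking against known model uncertainties, can be sketched on a toy fault-tree model. The model structure and probabilities below are hypothetical, not the SAPHIRE case-study model:

```python
def cdf(p_a, p_b, p_c):
    # Hypothetical core-damage model: initiator A together with the
    # failure of mitigation system B or C (invented structure).
    return p_a * (p_b + p_c - p_b * p_c)

base = dict(p_a=1e-2, p_b=1e-3, p_c=5e-3)

def raw(event):
    """Risk achievement worth: CDF with the event probability set to 1.0,
    divided by the baseline CDF."""
    worst = dict(base)
    worst[event] = 1.0
    return cdf(**worst) / cdf(**base)

# Events with the largest RAW are the candidates whose model
# uncertainties could potentially change the decision.
ranking = sorted(base, key=raw, reverse=True)
```

A sensitivity analysis as in the paper would then ask how large each basic event probability must become before the decision metric crosses its acceptance threshold.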
Uncertainty propagation in probabilistic risk assessment: A comparative study
International Nuclear Information System (INIS)
Ahmed, S.; Metcalf, D.R.; Pegram, J.W.
1982-01-01
Three uncertainty propagation techniques, namely method of moments, discrete probability distribution (DPD), and Monte Carlo simulation, generally used in probabilistic risk assessment, are compared and conclusions drawn in terms of the accuracy of the results. For small uncertainty in the basic event unavailabilities, the three methods give similar results. For large uncertainty, the method of moments is in error, and the appropriate method is to propagate uncertainty in the discrete form either by DPD method without sampling or by Monte Carlo. (orig.)
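The conclusion above, that moment-style point propagation holds for small uncertainty but errs for large uncertainty, can be illustrated with a toy two-component series system. The lognormal unavailabilities and their parameters are hypothetical, and the point estimate below is a simplified stand-in for the full method of moments:

```python
import math
import random
import statistics

def top_event(qa, qb):
    # Minimal fault tree: the system is unavailable if A OR B is unavailable.
    return qa + qb - qa * qb

def mc_mean(mu, sigma, n=50000, seed=3):
    """Propagate lognormal unavailabilities in discrete (sampled) form."""
    rng = random.Random(seed)
    return statistics.mean(
        top_event(rng.lognormvariate(mu, sigma), rng.lognormvariate(mu, sigma))
        for _ in range(n))

point_estimate = top_event(math.exp(-7.0), math.exp(-7.0))  # medians only

small_spread = mc_mean(-7.0, 0.1)  # narrow uncertainty: methods agree
large_spread = mc_mean(-7.0, 1.5)  # wide uncertainty: point estimate biased low
```

With a narrow distribution the Monte Carlo mean tracks the point estimate closely; with a wide lognormal the propagated mean is several times larger, which is why the distributional (DPD or Monte Carlo) form is needed.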
Indian Academy of Sciences (India)
To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and the chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...
Meteorological uncertainty of atmospheric dispersion model results (MUD)
Energy Technology Data Exchange (ETDEWEB)
Havskov Soerensen, J.; Amstrup, B.; Feddersen, H. [Danish Meteorological Institute, Copenhagen (Denmark)] [and others]
2013-08-15
The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)
Meteorological uncertainty of atmospheric dispersion model results (MUD)
International Nuclear Information System (INIS)
Havskov Soerensen, J.; Amstrup, B.; Feddersen, H.
2013-08-01
The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)
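The ensemble procedure described above, perturbing the initial state within observational uncertainty and deriving probabilities from the spread of the resulting forecasts, can be sketched with a toy forecast function. All functions and numbers here are illustrative assumptions, not an NWP or dispersion model:

```python
import random

def perturb(state, rng, scale=0.1):
    # Perturb the initial state in agreement with observational uncertainty.
    return [x + rng.gauss(0.0, scale) for x in state]

def forecast_rain(state):
    # Toy stand-in for an NWP integration returning forecast rainfall (mm).
    return max(0.0, 2.0 * state[0] - state[1])

def rain_probability(state, threshold, members=500, seed=11):
    """P(rain > threshold) estimated from a perturbed-initial-state ensemble."""
    rng = random.Random(seed)
    hits = sum(forecast_rain(perturb(state, rng)) > threshold
               for _ in range(members))
    return hits / members

p_rain = rain_probability([1.0, 1.5], threshold=0.5)
```

Driving a dispersion model with each ensemble member in the same way would yield, per grid cell, a distribution of concentration and deposition rather than a single 'most likely' value.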
Brakefield, Linzy K.; White, Jeremy T.; Houston, Natalie A.; Thomas, Jonathan V.
2015-01-01
In 2010, the U.S. Geological Survey, in cooperation with the San Antonio Water System, began a study to assess the brackish-water movement within the Edwards aquifer (more specifically the potential for brackish-water encroachment into wells near the interface between the freshwater and brackish-water transition zones, referred to in this report as the transition-zone interface) and effects on spring discharge at Comal and San Marcos Springs under drought conditions using a numerical model. The quantitative targets of this study are to predict the effects of higher-than-average groundwater withdrawals from wells and drought-of-record rainfall conditions of 1950–56 on (1) dissolved-solids concentration changes at production wells near the transition-zone interface, (2) total spring discharge at Comal and San Marcos Springs, and (3) the groundwater head (head) at Bexar County index well J-17. The predictions of interest, and the parameters implemented into the model, were evaluated to quantify their uncertainty so the results of the predictions could be presented in terms of a 95-percent credible interval.
Uncertainty in reactive transport geochemical modelling
International Nuclear Information System (INIS)
Oedegaard-Jensen, A.; Ekberg, C.
2005-01-01
Full text of publication follows: Geochemical modelling is one way of predicting the transport of, e.g., radionuclides in a rock formation. In a rock formation there will be fractures in which water and dissolved species can be transported. The composition of the water and the rock can either increase or decrease the mobility of the transported entities. When doing simulations on the mobility or transport of different species, one has to know the exact water composition, the exact flow rates in the fracture and in the surrounding rock, the porosity, and which minerals the rock is composed of. The problem with simulations on rocks is that the rock itself is not uniform, i.e., there are larger fractures in some areas and smaller ones in others, which can give different water flows. The rock composition can also differ between areas. In addition to this variance in the rock, there are problems with measuring the physical parameters used in a simulation. All measurements will perturb the rock, and this perturbation will result in more or less correct values of the parameters of interest. The analytical methods used are also encumbered with uncertainties, which in this case are added to the uncertainty from the perturbation of the analysed parameters. When doing simulations, the effect of the uncertainties must be taken into account. As computers get faster, the complexity of simulated systems increases, which also increases the uncertainty in the results of the simulations. In this paper we show how the uncertainty in the different parameters affects the solubility and mobility of different species. Small uncertainties in the input parameters can result in large uncertainties in the end. (authors)
Parametric uncertainty in optical image modeling
Potzick, James; Marx, Egon; Davidson, Mark
2006-10-01
Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
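The procedure described here, perturbing each input parameter about its nominal value and combining the resulting linewidth contributions, can be sketched as follows. The linewidth function and the uncertainty values are hypothetical stand-ins for a real optical image model:

```python
import math

def linewidth(wavelength, na, focus):
    # Hypothetical smooth response of apparent linewidth (nm) to the
    # optical parameters; a real image simulation would replace this.
    return (100.0 + 50.0 * (wavelength / 193.0 - 1.0)
            - 30.0 * (na - 0.9) + 5.0 * focus ** 2)

nominal = dict(wavelength=193.0, na=0.9, focus=0.0)
u_param = dict(wavelength=0.1, na=0.005, focus=0.05)  # assumed 1-sigma values

def sensitivity(name, delta=1e-3):
    """Central finite difference dL/dp about the nominal point."""
    hi, lo = dict(nominal), dict(nominal)
    hi[name] += delta
    lo[name] -= delta
    return (linewidth(**hi) - linewidth(**lo)) / (2.0 * delta)

# Combine the parametric contributions in quadrature: a lower limit on
# the linewidth measurement uncertainty from these parameters alone.
u_linewidth = math.sqrt(sum((sensitivity(p) * u) ** 2
                            for p, u in u_param.items()))
```

Note that a parameter at a response minimum (here, focus at best focus) contributes nothing at first order, which is why the paper's caveat about second-order interactions matters.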
Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.
2014-12-01
Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e., the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has advanced greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but the effect that these new components have over existing components (interactions, non-linear responses). Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping
Optical Model and Cross Section Uncertainties
Energy Technology Data Exchange (ETDEWEB)
Herman, M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.
2009-10-05
Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV, and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.
Uncertainty quantification and stochastic modeling with Matlab
Souza de Cursi, Eduardo
2015-01-01
Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no
Representing uncertainty on model analysis plots
Directory of Open Access Journals (Sweden)
Trevor I. Smith
2016-09-01
Full Text Available Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
Evaluation of uncertainties in selected environmental dispersion models
International Nuclear Information System (INIS)
Little, C.A.; Miller, C.W.
1979-01-01
Compliance with standards of radiation dose to the general public has necessitated the use of dispersion models to predict radionuclide concentrations in the environment due to releases from nuclear facilities. Because these models are only approximations of reality and because of inherent variations in the input parameters used in these models, their predictions are subject to uncertainty. Quantification of this uncertainty is necessary to assess the adequacy of these models for use in determining compliance with protection standards. This paper characterizes the capabilities of several dispersion models to predict accurately pollutant concentrations in environmental media. Three types of models are discussed: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations
UNCERTAINTIES IN GALACTIC CHEMICAL EVOLUTION MODELS
International Nuclear Information System (INIS)
Côté, Benoit; Ritter, Christian; Herwig, Falk; O’Shea, Brian W.; Pignatari, Marco; Jones, Samuel; Fryer, Chris L.
2016-01-01
We use a simple one-zone galactic chemical evolution model to quantify the uncertainties generated by the input parameters in numerical predictions for a galaxy with properties similar to those of the Milky Way. We compiled several studies from the literature to gather the current constraints for our simulations regarding the typical value and uncertainty of the following seven basic parameters: the lower and upper mass limits of the stellar initial mass function (IMF), the slope of the high-mass end of the stellar IMF, the slope of the delay-time distribution function of Type Ia supernovae (SNe Ia), the number of SNe Ia per M ⊙ formed, the total stellar mass formed, and the final mass of gas. We derived a probability distribution function to express the range of likely values for every parameter, which were then included in a Monte Carlo code to run several hundred simulations with randomly selected input parameters. This approach enables us to analyze the predicted chemical evolution of 16 elements in a statistical manner by identifying the most probable solutions, along with their 68% and 95% confidence levels. Our results show that the overall uncertainties are shaped by several input parameters that individually contribute at different metallicities, and thus at different galactic ages. The level of uncertainty then depends on the metallicity and is different from one element to another. Among the seven input parameters considered in this work, the slope of the IMF and the number of SNe Ia are currently the two main sources of uncertainty. The thicknesses of the uncertainty bands bounded by the 68% and 95% confidence levels are generally within 0.3 and 0.6 dex, respectively. When looking at the evolution of individual elements as a function of galactic age instead of metallicity, those same thicknesses range from 0.1 to 0.6 dex for the 68% confidence levels and from 0.3 to 1.0 dex for the 95% confidence levels. The uncertainty in our chemical evolution model
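The Monte Carlo procedure described above, sampling the input parameters from their distributions and reading 68% and 95% confidence bands off the resulting output distribution, can be sketched with a toy stand-in for the one-zone model. The functional form and the parameter distributions are invented for illustration:

```python
import random

def chemical_model(imf_slope, n_snia):
    # Invented stand-in for the one-zone model: a single abundance-like
    # output responding to two of the uncertain input parameters.
    return -1.0 + 0.3 * (imf_slope - 2.35) + 0.5 * n_snia

def monte_carlo_bands(n=2000, seed=5):
    """Run the model with randomly drawn inputs; report empirical bands."""
    rng = random.Random(seed)
    out = sorted(chemical_model(rng.gauss(2.35, 0.2), rng.gauss(1.0, 0.3))
                 for _ in range(n))
    pct = lambda q: out[int(q * (n - 1))]
    return {"median": pct(0.5),
            "68%": (pct(0.16), pct(0.84)),
            "95%": (pct(0.025), pct(0.975))}

bands = monte_carlo_bands()
```

The band widths directly quantify how the input parameter uncertainties (here, the IMF slope and the SN Ia rate) propagate into the predicted abundances.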
Uncertainty visualisation in the Model Web
Gerharz, L. E.; Autermann, C.; Hopmann, H.; Stasch, C.; Pebesma, E.
2012-04-01
Visualisation of geospatial data as maps is a common way to communicate spatially distributed information. If temporal and, furthermore, uncertainty information are included in the data, efficient visualisation methods are required. For uncertain spatial and spatio-temporal data, numerous visualisation methods have been developed and proposed, but only a few tools for visualisation of data in a standardised way exist. Furthermore, they are usually realised as thick clients and lack functionality for handling data coming from web services as envisaged in the Model Web. We present an interactive web tool for visualisation of uncertain spatio-temporal data developed in the UncertWeb project. The client is based on the OpenLayers JavaScript library. OpenLayers provides standard map windows and navigation tools, i.e. pan and zoom in/out, to allow interactive control for the user. Further interactive methods are implemented using jStat, a JavaScript library for statistics plots developed in UncertWeb, and flot. To integrate the uncertainty information into existing standards for geospatial data, the Uncertainty Markup Language (UncertML) was applied in combination with OGC Observations&Measurements 2.0 and JavaScript Object Notation (JSON) encodings for vector data and NetCDF for raster data. The client offers methods to visualise uncertain vector and raster data with temporal information. Uncertainty information considered for the tool are probabilistic and quantified attribute uncertainties, which can be provided as realisations or samples, full probability distribution functions and statistics. Visualisation is supported for uncertain continuous and categorical data. In the client, the visualisation is realised using a combination of different methods. Based on previously conducted usability studies, a differentiation between expert (in statistics or mapping) and non-expert users has been indicated as useful. Therefore, two different modes are realised together in the tool
On the proper use of Ensembles for Predictive Uncertainty assessment
Todini, Ezio; Coccia, Gabriele; Ortiz, Enrique
2015-04-01
uncertainty of the ensemble mean and that of the ensemble spread. The results of this new approach are illustrated by using data and forecasts from an operational real time flood forecasting. Coccia, G. and Todini, E. 2011. Recent developments in predictive uncertainty assessment based on the Model Conditional Processor approach. Hydrology and Earth System Sciences, 15, 3253-3274. doi:10.5194/hess-15-3253-2011. Krzysztofowicz, R. 1999 Bayesian theory of probabilistic forecasting via deterministic hydrologic model, Water Resour. Res., 35, 2739-2750. Raftery, A. E., T. Gneiting, F. Balabdaoui, and M. Polakowski, 2005. Using Bayesian model averaging to calibrate forecast ensembles, Mon. Weather Rev., 133, 1155-1174. Reggiani, P., Renner, M., Weerts, A., and van Gelder, P., 2009. Uncertainty assessment via Bayesian revision of ensemble streamflow predictions in the operational river Rhine forecasting system, Water Resour. Res., 45, W02428, doi:10.1029/2007WR006758. Todini E. 2004. Role and treatment of uncertainty in real-time flood forecasting. Hydrological Processes 18(14), 2743_2746 Todini, E. 2008. A model conditional processor to assess predictive uncertainty in flood forecasting. Intl. J. River Basin Management, 6(2): 123-137.
An educational model for ensemble streamflow simulation and uncertainty analysis
Directory of Open Access Journals (Sweden)
A. AghaKouchak
2013-02-01
Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
Quantifying uncertainty and trade-offs in resilience assessments
Directory of Open Access Journals (Sweden)
Craig R. Allen
2018-03-01
Full Text Available Several frameworks have been developed to assess the resilience of social-ecological systems, but most require substantial data inputs, time, and technical expertise. Stakeholders and practitioners often lack the resources for such intensive efforts. Furthermore, most end with problem framing and fail to explicitly address trade-offs and uncertainty. To remedy this gap, we developed a rapid survey assessment that compares the relative resilience of social-ecological systems with respect to a number of resilience properties. This approach generates large amounts of information relative to stakeholder inputs. We targeted four stakeholder categories: government (policy, regulation, management), end users (farmers, ranchers, landowners, industry), agency/public science (research, university, extension), and NGOs (environmental, citizen, social justice) in four North American watersheds, to assess social-ecological resilience through surveys. Conceptually, social-ecological systems are composed of components ranging from strictly human to strictly ecological, but that relate directly or indirectly to one another. They have soft boundaries and several important dimensions or axes that together describe the nature of social-ecological interactions, e.g., variability, diversity, modularity, slow variables, feedbacks, capital, innovation, redundancy, and ecosystem services. There is no absolute measure of resilience, so our design takes advantage of cross-watershed comparisons and therefore focuses on relative resilience. Our approach quantifies and compares the relative resilience across watershed systems and potential trade-offs among different aspects of the social-ecological system, e.g., between social, economic, and ecological contributions. This approach permits explicit assessment of several types of uncertainty (e.g., self-assigned uncertainty for stakeholders; uncertainty across respondents, watersheds, and subsystems, and subjectivity in
Robustness for slope stability modelling under deep uncertainty
Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten
2015-04-01
Landslides can have large negative societal and economic impacts, such as loss of life and damage to infrastructure. However, the ability of slope stability assessment to guide management is limited by high levels of uncertainty in model predictions. Many of these uncertainties cannot be easily quantified, such as those linked to climate change and other future socio-economic conditions, restricting the usefulness of traditional decision analysis tools. Deep uncertainty can be managed more effectively by developing robust, but not necessarily optimal, policies that are expected to perform adequately under a wide range of future conditions. Robust strategies are particularly valuable when the consequences of taking a wrong decision are high, as is often the case when managing natural hazard risks such as landslides. In our work a physically based numerical model of hydrologically induced slope instability (the Combined Hydrology and Stability Model - CHASM) is applied together with robust decision making to evaluate the most important uncertainties (storm events, groundwater conditions, surface cover, slope geometry, material strata and geotechnical properties) affecting slope stability. Specifically, impacts of climate change on long-term slope stability are incorporated, accounting for the deep uncertainty in future climate projections. Our findings highlight the potential of robust decision making to aid decision support for landslide hazard reduction and risk management under conditions of deep uncertainty.
Energy Technology Data Exchange (ETDEWEB)
Rankinen, K.; Granlund, K. [Finnish Environmental Inst., Helsinki (Finland); Futter, M. N. [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden)
2013-11-01
The semi-distributed, dynamic INCA-N model was used to simulate the behaviour of dissolved inorganic nitrogen (DIN) in two Finnish research catchments. Parameter sensitivity and model structural uncertainty were analysed using generalized sensitivity analysis. The Mustajoki catchment is a forested upstream catchment, while the Savijoki catchment represents intensively cultivated lowlands. In general, there were more influential parameters in Savijoki than Mustajoki. Model results were sensitive to N-transformation rates, vegetation dynamics, and soil and river hydrology. Values of the sensitive parameters were based on long-term measurements covering both warm and cold years. The highest measured DIN concentrations fell between minimum and maximum values estimated during the uncertainty analysis. The lowest measured concentrations fell outside these bounds, suggesting that some retention processes may be missing from the current model structure. The lowest concentrations occurred mainly during low flow periods; so effects on total loads were small. (orig.)
Realising the Uncertainty Enabled Model Web
Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.
2012-12-01
The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address
Qian, Y.; Wang, C.; Huang, M.; Berg, L. K.; Duan, Q.; Feng, Z.; Shrivastava, M. B.; Shin, H. H.; Hong, S. Y.
2016-12-01
This study aims to quantify the relative importance and uncertainties of different physical processes and parameters in affecting simulated surface fluxes and land-atmosphere coupling strength over the Amazon region. We used two-legged coupling metrics, which include both terrestrial (soil moisture to surface fluxes) and atmospheric (surface fluxes to atmospheric state or precipitation) legs, to diagnose the land-atmosphere interaction and coupling strength. Observations made using the Department of Energy's Atmospheric Radiation Measurement (ARM) Mobile Facility during the GoAmazon field campaign together with satellite and reanalysis data are used to evaluate model performance. To quantify the uncertainty in physical parameterizations, we performed a 120-member ensemble of simulations with the WRF model using a stratified experimental design including 6 cloud microphysics, 3 convection, 6 PBL and surface layer, and 3 land surface schemes. A multiple-way analysis of variance approach is used to quantitatively analyze the inter- and intra-group (scheme) means and variances. To quantify parameter sensitivity, we conducted an additional 256 WRF simulations in which an efficient sampling algorithm is used to explore the multiple-dimensional parameter space. Three uncertainty quantification approaches are applied for sensitivity analysis (SA) of multiple variables of interest to 20 selected parameters in the YSU PBL and MM5 surface layer schemes. Results show consistent parameter sensitivity across the different SA methods. We found that 5 out of 20 parameters contribute more than 90% of the total variance, and that first-order effects dominate compared to the interaction effects. Results of this uncertainty quantification study serve as guidance for better understanding the roles of different physical processes in land-atmosphere interactions, quantifying model uncertainties from various sources such as physical processes, parameters and structural errors, and providing insights for
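The multi-way analysis-of-variance idea in the abstract above — attributing ensemble spread to choices of physics scheme — can be sketched with a first-order (between-group over total variance) measure. This is a minimal stdlib-only illustration under invented assumptions: the scheme names ("pbl", "mp") and the toy response model are not from the study.

```python
import itertools
import random
import statistics
from collections import defaultdict

def variance_fraction(runs, factor):
    """Fraction of total ensemble variance explained by one factor:
    population variance of group means (between-group) over total variance.
    A first-order measure; interaction terms are ignored in this sketch."""
    values = [r["value"] for r in runs]
    total = statistics.pvariance(values)
    groups = defaultdict(list)
    for r in runs:
        groups[r[factor]].append(r["value"])
    grand = statistics.mean(values)
    between = sum(len(g) * (statistics.mean(g) - grand) ** 2
                  for g in groups.values()) / len(values)
    return between / total

# Toy stratified ensemble: the PBL scheme shifts the output strongly,
# the microphysics scheme barely at all (coefficients are invented).
rng = random.Random(0)
runs = [{"pbl": p, "mp": m,
         "value": 10.0 * p + 0.1 * m + rng.gauss(0, 0.5)}
        for p, m in itertools.product(range(3), range(4))
        for _ in range(5)]

pbl_share = variance_fraction(runs, "pbl")  # close to 1: dominant factor
mp_share = variance_fraction(runs, "mp")    # close to 0: negligible factor
```

With a balanced design like this, the between-group shares for all factors plus interactions and noise sum to one, which is what lets an ensemble ranking of schemes by "share of total variance" be read off directly.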
International Nuclear Information System (INIS)
Hammonds, J.S.; Hoffman, F.O.; Bartell, S.M.
1994-12-01
This report presents guidelines for evaluating uncertainty in mathematical equations and computer models applied to assess human health and environmental risk. Uncertainty analyses involve the propagation of uncertainty in model parameters and model structure to obtain confidence statements for the estimate of risk and to identify the model components of dominant importance. Uncertainty analyses are required when there is no a priori knowledge about uncertainty in the risk estimate and when there is a chance that failure to assess uncertainty may lead to the selection of wrong options for risk reduction. Uncertainty analyses are effective when they are conducted in an iterative mode. When the uncertainty in the risk estimate is intolerable for decision-making, additional data are acquired for the dominant model components that contribute most to uncertainty. This process is repeated until the level of residual uncertainty can be tolerated. Analytical and numerical methods for error propagation are presented, along with methods for identifying the most important contributors to uncertainty. Monte Carlo simulation with either Simple Random Sampling (SRS) or Latin Hypercube Sampling (LHS) is proposed as the most robust method for propagating uncertainty through either simple or complex models. A distinction is made between simulating a stochastically varying assessment endpoint (i.e., the distribution of individual risks in an exposed population) and quantifying uncertainty due to lack of knowledge about a fixed but unknown quantity (e.g., a specific individual, the maximally exposed individual, or the mean, median, or 95th percentile of the distribution of exposed individuals). Emphasis is placed on the need for subjective judgement to quantify uncertainty when relevant data are absent or incomplete
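The SRS-versus-LHS propagation contrast in the abstract above can be sketched in a few lines of stdlib Python. The two-parameter product "risk model" is an invented stand-in, not the report's model; both samplers push uniform(0,1) parameter draws through it.

```python
import random
import statistics

def latin_hypercube(n, dims, rng):
    """One stratified [0,1) draw per equal-probability stratum per dimension,
    with each dimension's column independently shuffled."""
    columns = []
    for _ in range(dims):
        col = [(k + rng.random()) / n for k in range(n)]
        rng.shuffle(col)
        columns.append(col)
    return list(zip(*columns))  # n points, each with `dims` coordinates

def propagate(model, n, dims, rng, lhs=True):
    """Push parameter samples through `model`; LHS stratifies each marginal,
    SRS draws each point independently."""
    if lhs:
        points = latin_hypercube(n, dims, rng)
    else:
        points = [tuple(rng.random() for _ in range(dims)) for _ in range(n)]
    return [model(*p) for p in points]

# Toy risk model (invented): product of two uncertain factors, E[a*b] = 0.25
risk = lambda a, b: a * b

rng = random.Random(42)
out_lhs = propagate(risk, 200, 2, rng, lhs=True)
out_srs = propagate(risk, 200, 2, rng, lhs=False)
```

The payoff of LHS is that every marginal stratum is sampled exactly once, which typically tightens the estimate of the output mean for a given sample size; the iterative workflow in the report would then refine inputs whose strata dominate the output spread.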
Parametric uncertainty modeling for robust control
DEFF Research Database (Denmark)
Rasmussen, K.H.; Jørgensen, Sten Bay
1999-01-01
The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics is modeled non-conservatively as parametric uncertainty in linear time-invariant models. The obtained uncertainty description makes it possible...... to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed...... point changes. It is shown that a diagonal PI control structure provides robust performance towards variations in feed flow rate or feed concentrations. However, when both liquid and vapor flow delays are included, robust performance specifications cannot be satisfied with this simple diagonal control structure...
International Nuclear Information System (INIS)
Scott, Michael J.; Daly, Don S.; Zhou, Yuyu; Rice, Jennie S.; Patel, Pralit L.; McJeon, Haewon C.; Page Kyle, G.; Kim, Son H.; Eom, Jiyong
2014-01-01
Improving the energy efficiency of building stock, commercial equipment, and household appliances can have a major positive impact on energy use, carbon emissions, and building services. Sub-national regions such as the U.S. states wish to increase energy efficiency, reduce carbon emissions, or adapt to climate change. Evaluating sub-national policies to reduce energy use and emissions is difficult because of the large uncertainties in socioeconomic factors, technology performance and cost, and energy and climate policies. Climate change itself may undercut such policies. However, assessing all of the uncertainties of large-scale energy and climate models by performing thousands of model runs can be a significant modeling effort with its accompanying computational burden. By applying fractional–factorial methods to the GCAM-USA 50-state integrated-assessment model in the context of a particular policy question, this paper demonstrates how a decision-focused sensitivity analysis strategy can greatly reduce computational burden in the presence of uncertainty and reveal the important drivers for decisions and more detailed uncertainty analysis. - Highlights: • We evaluate building energy codes and standards for climate mitigation. • We use an integrated assessment model and fractional factorial methods. • Decision criteria are energy use, CO2 emitted, and building service cost. • We demonstrate sensitivity analysis for three states. • We identify key variables to propagate with Monte Carlo or surrogate models
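The fractional-factorial screening described above can be illustrated with the textbook construction: enumerate a full two-level factorial in a base set of factors and derive each additional factor from a generator (a product of base columns), deliberately confounding it with an interaction. The factor names and the single generator below are illustrative, not taken from the GCAM-USA study.

```python
from itertools import product

def fractional_factorial(base_factors, generated):
    """Build a 2^(k-p) design: full factorial over `base_factors` at levels
    -1/+1, then derive each generated factor as the product of the base
    columns named in its generator."""
    runs = []
    for levels in product((-1, 1), repeat=len(base_factors)):
        run = dict(zip(base_factors, levels))
        for name, generator in generated.items():
            value = 1
            for factor in generator:
                value *= run[factor]
            run[name] = value
        runs.append(run)
    return runs

# Half-fraction of a 2^3 design: C is confounded with the A*B interaction,
# so 4 runs (instead of 8) still estimate all three main effects.
design = fractional_factorial(["A", "B"], {"C": ["A", "B"]})
```

The saving is exactly the point made in the abstract: with many uncertain drivers, a fraction of the full factorial reveals which variables matter for the decision, and only those are carried forward into the expensive Monte Carlo or surrogate-model analysis.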
DEFF Research Database (Denmark)
Plósz, Benedek; De Clercq, Jeriffa; Nopens, Ingmar
2011-01-01
In WWTP models, the accurate assessment of solids inventory in bioreactors equipped with solid-liquid separators, mostly described using one-dimensional (1-D) secondary settling tank (SST) models, is the most fundamental requirement of any calibration procedure. Scientific knowledge...... of the solids settling behaviour is investigated. It is found that the settler behaviour, simulated by the hyperbolic model, can introduce significant errors into the approximation of the solids retention time and thus the solids inventory of the system. We demonstrate that these impacts can potentially cause...
Particle Swarm Optimization and Uncertainty Assessment in Inverse Problems
Directory of Open Access Journals (Sweden)
José L. G. Pallero
2018-01-01
Most inverse problems in industry (and particularly in geophysical exploration) are highly underdetermined, because the number of model parameters is too high to achieve accurate data predictions and because the sampling of the data space is scarce and incomplete; it is always affected by different kinds of noise. Additionally, the physics of the forward problem is a simplification of reality. All these facts mean that the inverse problem solution is not unique; that is, there are different inverse solutions (called equivalent) compatible with the prior information that fit the observed data within similar error bounds. In the case of nonlinear inverse problems, these equivalent models are located in disconnected flat curvilinear valleys of the cost-function topography. The uncertainty analysis consists of obtaining a representation of this complex topography via different sampling methodologies. In this paper, we focus on the use of a particle swarm optimization (PSO) algorithm to sample the region of equivalence in nonlinear inverse problems. Although this methodology is general purpose, we show its application to the uncertainty assessment of the solution of a geophysical problem concerning gravity inversion in sedimentary basins, showing that it is possible to efficiently perform this task in a sampling-while-optimizing mode. In particular, we explain how to use and analyze the geophysical models sampled by exploratory PSO family members to infer different descriptors of nonlinear uncertainty.
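The sampling-while-optimizing idea above can be sketched with a basic PSO in which every visited model whose misfit falls below a tolerance is retained as a member of the equivalence region. The swarm parameters, the quadratic "flat valley" misfit, and the tolerance are all invented for illustration; a real gravity-inversion misfit would replace them.

```python
import random

def pso_sample(cost, bounds, n_particles=30, iters=60, tol=1.0, seed=1):
    """Basic PSO that also collects every visited position with cost < tol,
    so the optimizer doubles as a sampler of the equivalence region."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    gbest = min(zip(pbest_cost, pbest))[1][:]
    equivalent = []                   # models compatible with the data within tol
    w, c1, c2 = 0.7, 1.5, 1.5         # inertia / cognitive / social weights (typical values)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = cost(pos[i])
            if c < tol:
                equivalent.append(pos[i][:])
            if c < pbest_cost[i]:
                pbest_cost[i], pbest[i] = c, pos[i][:]
        gbest = min(zip(pbest_cost, pbest))[1][:]
    return gbest, equivalent

# Toy misfit with a flat valley: any (x, y) with x + y ≈ 1 "fits the data"
misfit = lambda m: 100.0 * (m[0] + m[1] - 1.0) ** 2
best, region = pso_sample(misfit, [(-2, 2), (-2, 2)])
```

The spread of `region` along the valley, rather than the single `best` model, is what carries the uncertainty information the abstract refers to.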
Country-Level Climate Uncertainty for Risk Assessments: Volume 1.
Energy Technology Data Exchange (ETDEWEB)
Backus, George A.; Lowry, Thomas Stephen; Jones, Shannon M; Walker, La Tonya Nicole; Roberts, Barry L; Malczynski, Leonard A.
2017-06-01
This report uses the CMIP5 series of climate model simulations to produce country-level uncertainty distributions for use in socioeconomic risk assessments of climate change impacts. It provides appropriate probability distributions, by month, for 169 countries and autonomous areas on temperature, precipitation, maximum temperature, maximum wind speed, humidity, runoff, soil moisture and evaporation for the historical period (1976-2005), and for decadal time periods to 2100. It also provides historical and future distributions for the Arctic region on ice concentration, ice thickness, age of ice, and ice ridging in 15-degree longitude arc segments from the Arctic Circle to 80 degrees latitude, plus two polar semicircular regions from 80 to 90 degrees latitude. The uncertainty is meant to describe the lack of knowledge rather than imprecision in the physical simulation, because the emphasis is on unfalsified risk and its use to determine potential socioeconomic impacts. The full report is contained in 27 volumes.
Reliability assessment of complex electromechanical systems under epistemic uncertainty
International Nuclear Information System (INIS)
Mi, Jinhua; Li, Yan-Feng; Yang, Yuan-Jian; Peng, Weiwen; Huang, Hong-Zhong
2016-01-01
The appearance of macro-engineering and mega-projects has led to the increasing complexity of modern electromechanical systems (EMSs). The complexity of the system structure and failure mechanisms makes the reliability assessment of these systems more difficult. Uncertainty, dynamic and nonlinear characteristics always exist in engineering systems due to the complexity introduced by changing environments, lack of data and random interference. This paper presents a comprehensive study on the reliability assessment of complex systems. In view of the dynamic characteristics within the system, it makes use of the advantages of the dynamic fault tree (DFT) for characterizing system behaviors. The lifetimes of system units can be expressed as bounded closed intervals by incorporating field failures, test data and design expertise. Then the coefficient of variation (COV) method is employed to estimate the parameters of life distributions. An extended probability-box (P-Box) is proposed to convey the presence of epistemic uncertainty induced by the incomplete information about the data. By mapping the DFT into an equivalent Bayesian network (BN), relevant reliability parameters and indexes have been calculated. Furthermore, the Monte Carlo (MC) simulation method is utilized to compute the DFT model with consideration of the system replacement policy. The results show that this integrated approach is more flexible and effective for assessing the reliability of complex dynamic systems. - Highlights: • A comprehensive study on the reliability assessment of complex systems is presented. • An extended probability-box is proposed to convey the presence of epistemic uncertainty. • The dynamic fault tree model is built. • Bayesian network and Monte Carlo simulation methods are used. • The reliability assessment of a complex electromechanical system is performed.
Physical and Model Uncertainty for Fatigue Design of Composite Material
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...
Uncertainty propagation through dynamic models of assemblies of mechanical structures
International Nuclear Information System (INIS)
Daouk, Sami
2016-01-01
When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Return on experience shows however that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies through setting up a dynamic connector model that takes account of different types and sources of uncertainty on stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R&D, which aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to an assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)
Information Uncertainty to Compare Qualitative Reasoning Security Risk Assessment Results
Energy Technology Data Exchange (ETDEWEB)
Chavez, Gregory M [Los Alamos National Laboratory; Key, Brian P [Los Alamos National Laboratory; Zerkle, David K [Los Alamos National Laboratory; Shevitz, Daniel W [Los Alamos National Laboratory
2009-01-01
The security risk associated with malevolent acts such as terrorism is often void of the historical data required for a traditional PRA. Most information available to conduct security risk assessments for these malevolent acts is obtained from subject matter experts as subjective judgements. Qualitative reasoning approaches such as approximate reasoning and evidential reasoning are useful for modeling the predicted risk from information provided by subject matter experts. Absent from these approaches is a consistent means to compare the security risk assessment results. Associated with each predicted risk reasoning result is a quantifiable amount of information uncertainty which can be measured and used to compare the results. This paper explores using entropy measures to quantify the information uncertainty associated with conflict and non-specificity in the predicted reasoning results. The measured quantities of conflict and non-specificity can ultimately be used to compare qualitative reasoning results, which is important in triage studies and ultimately resource allocation. Straightforward extensions of previous entropy measures are presented here to quantify the non-specificity and conflict associated with security risk assessment results obtained from qualitative reasoning models.
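The non-specificity and conflict measures discussed above can be sketched for a simple body of evidence (a basic probability assignment over focal sets of risk levels). The Hartley-based non-specificity is standard; the conflict measure here is a simplified Shannon entropy over evenly redistributed singleton mass, which is one common choice rather than the paper's specific extension, and the risk-level frame is invented.

```python
import math

def nonspecificity(bpa):
    """Hartley-based non-specificity of a body of evidence:
    N(m) = sum over focal sets A of m(A) * log2(|A|)."""
    return sum(mass * math.log2(len(focal)) for focal, mass in bpa.items())

def conflict(bpa):
    """Shannon-like conflict (discord): spread each focal set's mass evenly
    over its elements, then take the entropy of the resulting distribution."""
    singleton = {}
    for focal, mass in bpa.items():
        for elem in focal:
            singleton[elem] = singleton.get(elem, 0.0) + mass / len(focal)
    return -sum(p * math.log2(p) for p in singleton.values() if p > 0)

# Illustrative assessments over an invented frame of risk levels;
# frozensets are the focal elements of each body of evidence.
vague = {frozenset({"low", "medium", "high"}): 1.0}          # maximal non-specificity
precise = {frozenset({"high"}): 1.0}                          # no uncertainty at all
split = {frozenset({"low"}): 0.5, frozenset({"high"}): 0.5}   # pure conflict
```

Two assessments with the same headline risk ranking can then be separated by whether their residual uncertainty is mostly vagueness (`nonspecificity`) or mostly disagreement among the evidence (`conflict`), which is the comparison the paper needs for triage.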
International Nuclear Information System (INIS)
Wiborgh, M.; Elert, M.; Hoeglund, L.O.; Jones, C.; Grundfelt, B.; Skagius, K.; Bengtsson, A.
1992-06-01
Radioactive waste disposal systems for spent nuclear fuel are designed to isolate the radioactive waste from the human environment for long periods of time. The isolation is provided by a combination of engineered and natural barriers. Safety assessments are performed to describe and quantify the performance of the individual barriers and the disposal system over long time periods. These assessments will always be associated with uncertainties. Uncertainties can originate from the variability of natural systems, and will also be introduced in the predictive modelling performed to quantitatively evaluate the behaviour of the disposal system, as a consequence of the incomplete knowledge about the governing processes. Uncertainties in safety assessments can partly be reduced by additional measurements and research. The aim of this study has been to identify uncertainties in assessments of radiological consequences from the disposal of spent nuclear fuel based on the Swedish KBS-3 concept. The identified uncertainties have been classified with respect to their origin, i.e. into conceptual, modelling and data uncertainties. The possibilities to reduce the uncertainties are also commented upon. In assessments it is important to decrease uncertainties which are of major importance for the performance of the disposal system. These could to some extent be identified by uncertainty analysis. However, conceptual uncertainties and some types of model uncertainty are difficult to evaluate. To be able to decrease uncertainties in conceptual models, it is essential that the processes describing and influencing the radionuclide transport in the engineered and natural barriers are sufficiently understood. In this study a qualitative approach has been used. The importance of different barriers and processes is indicated by their influence on the release of some representative radionuclides. (122 refs.) (au)
Assessment of volcanic hazards, vulnerability, risk and uncertainty (Invited)
Sparks, R. S.
2009-12-01
There are many sources of uncertainty in forecasting the areas that volcanic activity will affect and the severity of the effects. Uncertainties arise from: natural variability, inadequate data, biased data, incomplete data, lack of understanding of the processes, limitations to predictive models, ambiguity, and unknown unknowns. The description of volcanic hazards is thus necessarily probabilistic and requires assessment of the attendant uncertainties. Several issues arise from the probabilistic nature of volcanic hazards and the intrinsic uncertainties. Although zonation maps require well-defined boundaries for administrative pragmatism, such boundaries cannot divide areas that are completely safe from those that are unsafe. Levels of danger or safety need to be defined to decide on and justify boundaries through the concepts of vulnerability and risk. More data, better observations and improved models may reduce uncertainties, but they can also increase uncertainties and may lead to re-appraisal of zone boundaries. Probabilities inferred by statistical techniques are hard to communicate. Expert elicitation is an emerging methodology for risk assessment and uncertainty evaluation. The method has been applied at one major volcanic crisis (Soufrière Hills Volcano, Montserrat), and is being applied in planning for volcanic crises at Vesuvius.
Assessing framing of uncertainties in water management practice
Isendahl, N.; Dewulf, A.; Brugnach, M.; Francois, G.; Möllenkamp, S.; Pahl-Wostl, C.
2009-01-01
Dealing with uncertainties in water management is an important issue and is one which will only increase in light of global changes, particularly climate change. So far, uncertainties in water management have mostly been assessed from a scientific point of view, and in quantitative terms. In this
Dealing with uncertainties in environmental burden of disease assessment
Directory of Open Access Journals (Sweden)
van der Sluijs Jeroen P
2009-04-01
Disability Adjusted Life Years (DALYs) combine the number of people affected by disease or mortality in a population and the duration and severity of their condition into one number. The environmental burden of disease is the number of DALYs that can be attributed to environmental factors. Environmental burden of disease estimates enable policy makers to evaluate, compare and prioritize dissimilar environmental health problems or interventions. These estimates often have various uncertainties and assumptions which are not always made explicit. Besides statistical uncertainty in input data and parameters, which is commonly addressed, a variety of other types of uncertainties may substantially influence the results of the assessment. We have reviewed how different types of uncertainties affect environmental burden of disease assessments, and we give suggestions as to how researchers could address these uncertainties. We propose the use of an uncertainty typology to identify and characterize uncertainties. Finally, we argue that uncertainties need to be identified, assessed, reported and interpreted in order for assessment results to adequately support decision making.
Energy Technology Data Exchange (ETDEWEB)
Brown, J. [National Radiological Protection Board (United Kingdom)]; Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)]; and others
1997-06-01
This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.
International Nuclear Information System (INIS)
Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.
1997-06-01
This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.
Intrinsic Uncertainties in Modeling Complex Systems.
Energy Technology Data Exchange (ETDEWEB)
Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.
2014-09-01
Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model, either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and with representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements: This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.
Modeling of uncertainties in biochemical reactions.
Mišković, Ljubiša; Hatzimanikatis, Vassily
2011-02-01
Mathematical modeling is an indispensable tool for research and development in biotechnology and bioengineering. The formulation of kinetic models of biochemical networks depends on knowledge of the kinetic properties of the enzymes of the individual reactions. However, kinetic data acquired from experimental observations bring along uncertainties due to various experimental conditions and measurement methods. In this contribution, we propose a novel way to model the uncertainty in the enzyme kinetics and to predict quantitatively the responses of metabolic reactions to changes in enzyme activities under uncertainty. The proposed methodology accounts explicitly for mechanistic properties of enzymes and physico-chemical and thermodynamic constraints, and is based on formalism from systems theory and metabolic control analysis. We achieve this by observing that kinetic responses of metabolic reactions depend: (i) on the distribution of the enzymes among their free form and all reactive states; (ii) on the equilibrium displacements of the overall reaction and those of the individual enzymatic steps; and (iii) on the net fluxes through the enzyme. Relying on this observation, we develop a novel, efficient Monte Carlo sampling procedure to generate all states within a metabolic reaction that satisfy the imposed constraints. Thus, we derive the statistics of the expected responses of the metabolic reactions to changes in enzyme levels and activities, in the levels of metabolites, and in the values of the kinetic parameters. We present aspects of the proposed framework through an example of the fundamental three-step reversible enzymatic reaction mechanism. We demonstrate that the equilibrium displacements of the individual enzymatic steps have an important influence on kinetic responses of the enzyme. Furthermore, we derive the conditions that must be satisfied by a reversible three-step enzymatic reaction operating far away from equilibrium in order to respond to
Gallagher, Daniel; Ebel, Eric D; Gallagher, Owen; Labarre, David; Williams, Michael S; Golden, Neal J; Pouillot, Régis; Dearfield, Kerry L; Kause, Janell
2013-04-01
This report illustrates how uncertainty about food safety metrics may influence the selection of a performance objective (PO). To accomplish this goal, we developed a model concerning Listeria monocytogenes in ready-to-eat (RTE) deli meats. This application used a second-order Monte Carlo model that simulates L. monocytogenes concentrations through a series of steps: the food-processing establishment, transport, retail, the consumer's home and consumption. The model accounted for growth inhibitor use and retail cross contamination, and applied an FAO/WHO dose-response model for evaluating the probability of illness. An appropriate level of protection (ALOP) risk metric was selected as the average risk of illness per serving across all consumed servings per annum, and the model was used to solve for the corresponding performance objective (PO) risk metric as the maximum allowable L. monocytogenes concentration (cfu/g) at the processing establishment where regulatory monitoring would occur. Given uncertainty about model inputs, an uncertainty distribution of the PO was estimated. Additionally, we considered how RTE deli meats contaminated at levels above the PO would be handled by the industry using three alternative approaches. Points on the PO distribution represent the probability that, if the industry complies with a particular PO, the resulting risk-per-serving is less than or equal to the target ALOP. For example, assuming (1) a target ALOP of -6.41 log10 risk of illness per serving, (2) industry concentrations above the PO that are re-distributed throughout the remaining concentration distribution and (3) no dose-response uncertainty, establishment POs of -4.98 and -4.39 log10 cfu/g would be required for 90% and 75% confidence that the target ALOP is met, respectively. The PO concentrations from this example scenario are more stringent than the current typical monitoring level of an absence in 25 g (i.e., -1.40 log10 cfu/g) or a stricter criterion of absence
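The outer (uncertainty) loop of a second-order Monte Carlo like the one above can be sketched as follows: each draw of the uncertain model inputs yields one PO that meets the target ALOP, and reading the lower tail of the resulting PO distribution gives the PO needed at a chosen confidence level. The log-linear risk model and its parameter distributions below are invented stand-ins, not the report's dose-response model.

```python
import random

def po_distribution(n_outer, rng, target_alop=-6.41):
    """Outer uncertainty loop: each draw of the uncertain slope/intercept of a
    toy log-linear model  log10(risk per serving) = slope * PO + intercept
    yields the PO that exactly meets the target ALOP."""
    pos = []
    for _ in range(n_outer):
        slope = rng.gauss(1.0, 0.1)       # uncertain dose-response slope (assumed)
        intercept = rng.gauss(-1.5, 0.2)  # uncertain baseline risk (assumed)
        pos.append((target_alop - intercept) / slope)
    return sorted(pos)

def po_at_confidence(pos, conf):
    """PO stringent enough that a fraction `conf` of the uncertainty mass meets
    the ALOP; more negative PO = stricter limit, so take the lower tail."""
    return pos[int((1.0 - conf) * len(pos))]

rng = random.Random(7)
pos = po_distribution(5000, rng)
po90 = po_at_confidence(pos, 0.90)  # stricter (more negative) than the median PO
po50 = po_at_confidence(pos, 0.50)
```

This reproduces the structure of the report's example: demanding higher confidence that the ALOP is met pushes the required establishment PO to a more stringent (lower log10 cfu/g) value.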
Uncertainty assessment for accelerator-driven systems
International Nuclear Information System (INIS)
Finck, P. J.; Gomes, I.; Micklich, B.; Palmiotti, G.
1999-01-01
The concept of a subcritical system driven by an external source of neutrons provided by an accelerator, the Accelerator Driven System (ADS), has recently been revived and is becoming more popular in the world technical community, with active programs in Europe, Russia, Japan, and the U.S. A general consensus has been reached in adopting for the subcritical component a fast-spectrum liquid-metal-cooled configuration. A lead-bismuth eutectic, sodium and gas are all being considered as coolants; each has advantages and disadvantages. The major expected advantage is that subcriticality avoids reactivity-induced transients. The potentially large subcriticality margin should also allow for the introduction of very significant quantities of waste products (minor actinides and fission products) which negatively impact the safety characteristics of standard cores. In the U.S. these arguments are the basis for the development of the Accelerator Transmutation of Waste (ATW), which has significant potential in reducing nuclear waste levels. Up to now, neutronic calculations have not attached uncertainties to the values of the main nuclear integral parameters that characterize the system. Many of these parameters (e.g., the degree of subcriticality) are crucial to demonstrate the validity and feasibility of this concept. In this paper we will consider uncertainties related to nuclear data only. The present knowledge of the cross sections of many isotopes that are not usually utilized in existing reactors (like Bi, Pb-207, Pb-208, and also minor actinides and fission products) suggests that uncertainties in the integral parameters will be significantly larger than for conventional reactor systems, and this raises concerns about the neutronic performance of those systems
Skinner, Daniel J C; Rocks, Sophie A; Pollard, Simon J T
2016-12-01
A reliable characterisation of uncertainties can aid uncertainty identification during environmental risk assessments (ERAs). However, typologies can be implemented inconsistently, causing uncertainties to go unidentified. We present an approach based on nine structured elicitations in which subject-matter experts validate and assess three dimensions of uncertainty for pesticide risks to surface-water organisms: its level (the severity of uncertainty, ranging from determinism to ignorance); nature (whether the uncertainty is epistemic or aleatory); and location (the data source or area in which the uncertainty arises). Risk characterisation contains the highest median levels of uncertainty, associated with estimating, aggregating and evaluating the magnitude of risks. Regarding the locations in which uncertainty is manifest, data uncertainty is dominant in problem formulation, exposure assessment and effects assessment. The comprehensive description of uncertainty presented here will enable risk analysts to prioritise the required phases, groups of tasks, or individual tasks within a risk analysis according to the highest levels of uncertainty, the potential for uncertainty to be reduced or quantified, or the types of location-based uncertainty, thus aiding uncertainty prioritisation during environmental risk assessments. In turn, it is expected to inform investment in uncertainty reduction or targeted risk management action. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Uncertainties in model-based outcome predictions for treatment planning
International Nuclear Information System (INIS)
Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry
2001-01-01
Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model predictions based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan, we generate a histogram of alternative model results by computing the model-predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
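The bootstrap procedure this abstract describes (resample the patients, refit the model, predict for the new plan, then add residual noise) can be sketched on synthetic data. The exponential dose-outcome model and all numbers below are invented for illustration; they are not the authors' fitted salivary-function model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented training data: mean parotid dose (Gy) vs. relative salivary function
dose = rng.uniform(10, 60, 40)
true_fn = np.exp(-0.04 * dose)
observed = np.clip(true_fn + rng.normal(0, 0.05, dose.size), 0.01, None)

def fit(d, y):
    # Log-linear least-squares fit of an exponential outcome model
    A = np.vstack([np.ones_like(d), d]).T
    coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
    return coef

def prediction_histogram(new_dose, n_boot=1000):
    """Bootstrap-resample patients, refit for each resample, predict the
    outcome for the new plan, and add residual noise drawn from the
    original fit's residuals."""
    base = fit(dose, observed)
    residuals = observed - np.exp(base[0] + base[1] * dose)
    preds = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, dose.size, dose.size)
        coef = fit(dose[idx], observed[idx])
        preds[i] = np.exp(coef[0] + coef[1] * new_dose) + rng.choice(residuals)
    return preds

p = prediction_histogram(30.0)
print(np.percentile(p, [5, 50, 95]))
```

The spread of the resulting histogram is exactly the patient-specific prediction reliability the abstract refers to: a wide histogram flags an unreliable plan-specific prediction.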
A structured analysis of uncertainty surrounding modeled impacts of groundwater-extraction rules
Guillaume, Joseph H. A.; Qureshi, M. Ejaz; Jakeman, Anthony J.
2012-08-01
Integrating economic and groundwater models for groundwater-management can help improve understanding of trade-offs involved between conflicting socioeconomic and biophysical objectives. However, there is significant uncertainty in most strategic decision-making situations, including in the models constructed to represent them. If not addressed, this uncertainty may be used to challenge the legitimacy of the models and decisions made using them. In this context, a preliminary uncertainty analysis was conducted of a dynamic coupled economic-groundwater model aimed at assessing groundwater extraction rules. The analysis demonstrates how a variety of uncertainties in such a model can be addressed. A number of methods are used including propagation of scenarios and bounds on parameters, multiple models, block bootstrap time-series sampling and robust linear regression for model calibration. These methods are described within the context of a theoretical uncertainty management framework, using a set of fundamental uncertainty management tasks and an uncertainty typology.
Sensitivity and uncertainty analysis for a field-scale P loss model
Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that there are inherent uncertainties with model predictions, limited studies have addressed model prediction uncertainty. In this study we assess the effect of model input error on predict...
Dimitriadis, Panayiotis; Tegos, Aristoteles; Oikonomou, Athanasios; Pagana, Vassiliki; Koukouvinos, Antonios; Mamassis, Nikos; Koutsoyiannis, Demetris; Efstratiadis, Andreas
2016-03-01
One-dimensional and quasi-two-dimensional hydraulic freeware models (HEC-RAS, LISFLOOD-FP and FLO-2d) are widely used for flood inundation mapping. These models are tested on a benchmark test with a mixed rectangular-triangular channel cross section. Using a Monte Carlo approach, we employ extended sensitivity analysis by simultaneously varying the input discharge, longitudinal and lateral gradients and roughness coefficients, as well as the grid cell size. Based on statistical analysis of three output variables of interest, i.e. water depths at the inflow and outflow locations and total flood volume, we investigate the uncertainty embedded in different model configurations and flow conditions, without the influence of errors and other assumptions on topography, channel geometry and boundary conditions. Moreover, we estimate the uncertainty associated with each input variable and compare it to the overall one. The outcomes of the benchmark analysis are further highlighted by applying the three models to real-world flood propagation problems, in the context of two challenging case studies in Greece.
Model uncertainty from a regulatory point of view
International Nuclear Information System (INIS)
Abramson, L.R.
1994-01-01
This paper discusses model uncertainty in the larger context of knowledge and random uncertainty. It explores some regulatory implications of model uncertainty and argues that, from a regulator's perspective, a conservative approach must be taken. As a consequence of this perspective, averaging over model results is ruled out
Uncertainties in life cycle assessment of waste management systems
DEFF Research Database (Denmark)
Clavreul, Julie; Christensen, Thomas Højlund
2011-01-01
Life cycle assessment has been used to assess environmental performances of waste management systems in many studies. The uncertainties inherent to its results are often pointed out but not always quantified, which should be the case to ensure a good decision-making process. This paper proposes a method to assess all parameter uncertainties and quantify the overall uncertainty of the assessment. The method is exemplified in a case study, where the goal is to determine if anaerobic digestion of organic waste is more beneficial than incineration in Denmark, considering only the impact on global warming. The sensitivity analysis pointed out ten parameters particularly highly influencing the result of the study. In the uncertainty analysis, the distributions of these ten parameters were used in a Monte Carlo analysis, which concluded that incineration appeared more favourable than anaerobic digestion…
Uncertainty associated with selected environmental transport models
International Nuclear Information System (INIS)
Little, C.A.; Miller, C.W.
1979-11-01
A description is given of the capabilities of several models to predict accurately either pollutant concentrations in environmental media or radiological dose to human organs. The models are discussed in three sections: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations. This procedure is infeasible for food chain models and, therefore, the uncertainty embodied in the models' input parameters, rather than the model output, is estimated. Aquatic transport models are divided into one-dimensional, longitudinal-vertical, and longitudinal-horizontal models. Several conclusions were made about the ability of the Gaussian plume atmospheric dispersion model to predict accurately downwind air concentrations from releases under several sets of conditions. It is concluded that no validation study has been conducted to test the predictions of either aquatic or terrestrial food chain models. Using the aquatic pathway from water to fish to an adult for 137Cs as an example, a 95% one-tailed confidence limit interval for the predicted exposure is calculated by examining the distributions of the input parameters. Such an interval is found to be 16 times the value of the median exposure. A similar one-tailed limit for the air-grass-cow-milk-thyroid pathway for 131I and infants was 5.6 times the median dose. Of the three model types discussed in this report, the aquatic transport models appear to do the best job of predicting observed concentrations. However, this conclusion is based on many fewer aquatic validation data than were available for atmospheric model validation
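The final calculation described here, deriving a one-tailed confidence limit for a multiplicative pathway model from the distributions of its input parameters, can be reproduced in miniature. The geometric standard deviations below are assumed values, chosen only to show how a product of lognormal inputs yields a 95th-percentile-to-median ratio of the order the report cites.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Hypothetical lognormal input parameters of a multiplicative pathway model:
# exposure = water concentration x bioconcentration factor x intake x dose factor.
# Medians are normalized to 1; only the spreads (geometric std devs) matter
# for the ratio of the 95th percentile to the median.
gsd = [3.0, 2.5, 1.8, 1.5]  # assumed geometric standard deviations
factors = [rng.lognormal(0.0, np.log(g), n) for g in gsd]
exposure = np.prod(factors, axis=0)

ratio = np.percentile(exposure, 95) / np.median(exposure)
print(ratio)  # a factor of the same order as the report's "16 times the median"
```

Because a product of lognormals is again lognormal, the ratio is exp(1.645·σ) with σ² the sum of the squared log geometric standard deviations; the Monte Carlo sample simply confirms this.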
An assessment of uncertainty in forest carbon budget projections
Linda S. Heath; James E. Smith
2000-01-01
Estimates of uncertainty are presented for projections of forest carbon inventory and average annual net carbon flux on private timberland in the US using the model FORCARB. Uncertainty in carbon inventory was approximately ±9% (2000 million metric tons) of the estimated median in the year 2000, rising to 11% (2800 million metric tons) in projection year 2040...
Bayesian uncertainty quantification in linear models for diffusion MRI.
Sjölund, Jens; Eklund, Anders; Özarslan, Evren; Herberthson, Magnus; Bånkestad, Maria; Knutsson, Hans
2018-03-29
Diffusion MRI (dMRI) is a valuable tool in the assessment of tissue microstructure. By fitting a model to the dMRI signal it is possible to derive various quantitative features. Several of the most popular dMRI signal models are expansions in an appropriately chosen basis, where the coefficients are determined using some variation of least squares. However, such approaches lack any notion of uncertainty, which could be valuable in e.g. group analyses. In this work, we use a probabilistic interpretation of linear least-squares methods to recast popular dMRI models as Bayesian ones. This makes it possible to quantify the uncertainty of any derived quantity. In particular, for quantities that are affine functions of the coefficients, the posterior distribution can be expressed in closed form. We simulated measurements from single- and double-tensor models, where the correct values of several quantities are known, to validate that the theoretically derived quantiles agree with those observed empirically. We included results from residual bootstrap for comparison and found good agreement. The validation employed several different models: Diffusion Tensor Imaging (DTI), Mean Apparent Propagator MRI (MAP-MRI) and Constrained Spherical Deconvolution (CSD). We also used in vivo data to visualize maps of quantitative features and corresponding uncertainties, and to show how our approach can be used in a group analysis to downweight subjects with high uncertainty. In summary, we convert successful linear models for dMRI signal estimation to probabilistic models, capable of accurate uncertainty quantification. Copyright © 2018 Elsevier Inc. All rights reserved.
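The closed-form posterior for affine functions of the coefficients that the abstract mentions follows from standard Bayesian linear regression. A minimal sketch on invented data, with the prior and noise precisions (`alpha`, `beta`) as assumed hyperparameters rather than anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Probabilistic reinterpretation of linear least squares: y = X w + noise,
# with Gaussian prior w ~ N(0, alpha^{-1} I) and noise ~ N(0, beta^{-1} I).
def posterior(X, y, alpha=1e-3, beta=25.0):
    S_inv = alpha * np.eye(X.shape[1]) + beta * X.T @ X
    S = np.linalg.inv(S_inv)        # posterior covariance of the coefficients
    m = beta * S @ X.T @ y          # posterior mean
    return m, S

# Any affine quantity q = a^T w then has a closed-form Gaussian posterior:
def affine_quantity(a, m, S):
    return a @ m, a @ S @ a         # posterior mean and variance of q

X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + rng.normal(0, 0.2, 50)

m, S = posterior(X, y)
a = np.array([1.0, 1.0, 1.0])       # e.g. the sum of the coefficients
mu, var = affine_quantity(a, m, S)
print(mu, np.sqrt(var))
```

This is the mechanism that lets the paper attach uncertainties to derived scalar features without any resampling: the posterior of each affine feature is Gaussian with mean `a @ m` and variance `a @ S @ a`.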
Managing geological uncertainty in CO2-EOR reservoir assessments
Welkenhuysen, Kris; Piessens, Kris
2014-05-01
therefore not suited for cost-benefit analysis. They are likely to yield overly optimistic results, because onshore configurations are cheaper and differ from offshore ones. We propose to translate the detailed US data to the North Sea, retaining their uncertainty ranges. In a first step, a general cost correction can be applied to account for costs specific to the EU and the offshore setting. In a second step, site-specific data, including laboratory tests and reservoir modelling, are used to further adapt the EOR ratio values, taking into account all available geological reservoir-specific knowledge. Lastly, an evaluation of the field configuration will influence both the cost and the local geology dimension, because e.g. horizontal drilling is needed (cost) to improve injectivity (geology). As such, a dataset of the EOR field is obtained which contains all aspects and their uncertainty ranges. With these, a geologically realistic basis is obtained for further cost-benefit analysis of a specific field, where the uncertainties are accounted for using a stochastic evaluation. Such an ad-hoc evaluation of geological parameters will provide a better assessment of the CO2-EOR potential of the North Sea oil fields.
DEFF Research Database (Denmark)
Arnbjerg-Nielsen, Karsten; Zhou, Qianqian
2014-01-01
There has been a significant increase in climatic extremes in many regions. In Central and Northern Europe, this has led to more frequent and more severe floods. Along with improved flood modelling technologies, this has enabled development of economic assessment of climate change adaptation to increasing urban flood risk. Assessment of adaptation strategies often requires a comprehensive risk-based economic analysis of current risk, drivers of change of risk over time, and measures to reduce the risk. However, such studies are often associated with large uncertainties. The uncertainties arise from … … uncertainty analysis, which can assess and quantify the overall uncertainty in relation to climate change adaptation to urban flash floods. The analysis is based on an uncertainty cascade that, by means of Monte Carlo simulations of flood risk assessments, incorporates climate change impacts as a key driver
Uncertainty and Preference Modelling for Multiple Criteria Vehicle Evaluation
Directory of Open Access Journals (Sweden)
Qiuping Yang
2010-12-01
A general framework for vehicle assessment is proposed based on both mass survey information and the evidential reasoning (ER) approach. Several methods for uncertainty and preference modelling are developed within the framework, including the measurement of uncertainty caused by missing information, the estimation of missing information in original surveys, the use of nonlinear functions for data mapping, and the use of nonlinear functions as utility functions to combine distributed assessments into a single index. The results of the investigation show that various measures can be used to represent the different preferences of decision makers towards the same feedback from respondents. Based on the ER approach, credible and informative analysis can be conducted through a complete understanding of the assessment problem in question and full exploration of the available information.
Dittes, Beatrice; Špačková, Olga; Ebrahimian, Negin; Kaiser, Maria; Rieger, Wolfgang; Disse, Markus; Straub, Daniel
2017-04-01
Flood risk estimates are subject to significant uncertainties, e.g. due to limited records of historic flood events, uncertainty in flood modeling, uncertain impact of climate change or uncertainty in the exposure and loss estimates. In traditional design of flood protection systems, these uncertainties are typically accounted for only implicitly, based on engineering judgment. In the AdaptRisk project, we develop a fully quantitative framework for planning of flood protection systems under current and future uncertainties using quantitative pre-posterior Bayesian decision analysis. In this contribution, we focus on the quantification of the uncertainties and study their relative influence on the flood risk estimate and on the planning of flood protection systems. The following uncertainty components are included using a Bayesian approach: 1) inherent and statistical (i.e. limited record length) uncertainty; 2) climate uncertainty that can be learned from an ensemble of GCM-RCM models; 3) estimates of climate uncertainty components not covered in 2), such as bias correction, incomplete ensemble, local specifics not captured by the GCM-RCM models; 4) uncertainty in the inundation modelling; 5) uncertainty in damage estimation. We also investigate how these uncertainties may be reduced in the future when new evidence - such as new climate models, observed extreme events, and socio-economic data - becomes available. Finally, we look into how this new evidence influences the risk assessment and the effectiveness of flood protection systems. We demonstrate our methodology for a pre-alpine catchment in southern Germany: the Mangfall catchment in Bavaria, which includes the city of Rosenheim, which suffered significant losses during the 2013 flood event.
Ecosystem Services Mapping Uncertainty Assessment: A Case Study in the Fitzroy Basin Mining Region
Directory of Open Access Journals (Sweden)
Zhenyu Wang
2018-01-01
Ecosystem services mapping is becoming increasingly popular through the use of various readily available mapping tools; however, uncertainties in assessment outputs are commonly ignored. Uncertainties from different sources have the potential to lower the accuracy of mapping outputs and reduce their reliability for decision-making. Using a case study in an Australian mining region, this paper assessed the impact of uncertainties on the modelling of the hydrological ecosystem service, water provision. Three types of uncertainty were modelled using multiple uncertainty scenarios: (1) spatial data sources; (2) modelling scales (temporal and spatial); and (3) parameterization and model selection. We found that the mapping scales can induce significant changes to the spatial pattern of outputs and annual totals of water provision. In addition, differences in parameterization using differing sources from the literature also led to obvious differences in base flow. However, the impact of the uncertainties associated with differences in spatial data sources was not as great. The results of this study demonstrate the importance of uncertainty assessment and highlight that any conclusions drawn from ecosystem services mapping, such as the impacts of mining, are likely to also be a property of the uncertainty in ecosystem services mapping methods.
Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty
DEFF Research Database (Denmark)
Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens
… more reliable predictions with a new and improved set of model parameters for GC (group contribution) based and CI (atom connectivity index) based models, and to quantify the uncertainties in the estimated property values from a process design point of view. This includes: (i) parameter estimation using … The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related available data. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column …) … the developed methodology can be used to quantify the sensitivity of process design to uncertainties in property estimates; obtain rationally the risk/safety factors in process design; and identify additional experimentation needs in order to reduce the most critical uncertainties …
Classification and moral evaluation of uncertainties in engineering modeling.
Murphy, Colleen; Gardoni, Paolo; Harris, Charles E
2011-09-01
Engineers must deal with risks and uncertainties as a part of their professional work and, in particular, uncertainties are inherent to engineering models. Models play a central role in engineering. Models often represent an abstract and idealized version of the mathematical properties of a target. Using models, engineers can investigate and acquire understanding of how an object or phenomenon will perform under specified conditions. This paper defines the different stages of the modeling process in engineering, classifies the various sources of uncertainty that arise in each stage, and discusses the categories into which these uncertainties fall. The paper then considers the way uncertainty and modeling are approached in science and the criteria for evaluating scientific hypotheses, in order to highlight the very different criteria appropriate for the development of models and the treatment of the inherent uncertainties in engineering. Finally, the paper puts forward nine guidelines for the treatment of uncertainty in engineering modeling.
The uncertainty analysis of model results a practical guide
Hofer, Eduard
2018-01-01
This book is a practical guide to the uncertainty analysis of computer model applications. Used in many areas, such as engineering, ecology and economics, computer models are subject to various uncertainties at the level of model formulations, parameter values and input data. Naturally, it would be advantageous to know the combined effect of these uncertainties on the model results as well as whether the state of knowledge should be improved in order to reduce the uncertainty of the results most effectively. The book supports decision-makers, model developers and users in their argumentation for an uncertainty analysis and assists them in the interpretation of the analysis results.
Uncertainty management in radioactive waste repository site assessment
International Nuclear Information System (INIS)
Baldwin, J.F.; Martin, T.P.; Tocatlidou
1994-01-01
The problem of assessing the performance of a site to serve as a repository for the final disposal of radioactive waste involves different types of uncertainties. Their main sources include the large temporal and spatial scales over which safety of the system has to be ensured, our inability to completely understand and describe a very complex structure such as the repository system, lack of precision in the measured information, etc. These issues underlie most of the problems faced when rigid probabilistic approaches are used. Nevertheless, a framework is needed that would allow for an optimal aggregation of the available knowledge and an efficient management of the various types of uncertainty involved. In this work a knowledge-based model of the repository selection process is proposed that, through a consequence analysis, evaluates the potential impact that hypothetical scenarios will have on a candidate site. The model is organised around a hierarchical structure relating the scenarios with the possible events and processes that characterise them, and with the site parameters. The scheme provides for both crisp and fuzzy parameter values and uses fuzzy semantic unification and evidential support logic inference mechanisms. It is implemented using the artificial intelligence language FRIL, and interaction with the user is performed through a windows interface
How to: understanding SWAT model uncertainty relative to measured results
Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...
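A minimal sketch of the GLUE procedure mentioned above, applied to a one-parameter toy model rather than SWAT itself; the observations, prior range and acceptance threshold are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic observations and a one-parameter toy model (stand-in for SWAT)
obs = np.array([1.1, 1.9, 3.2, 3.9, 5.1])
x = np.arange(1, 6, dtype=float)

def model(slope):
    return slope * x

# 1. Sample parameter sets from a prior range
slopes = rng.uniform(0.5, 1.5, 5000)
# 2. Score each set with an informal likelihood (here: rescaled sum of squares)
errors = np.array([np.sum((model(s) - obs) ** 2) for s in slopes])
likelihood = 1.0 - errors / errors.max()
# 3. Keep only "behavioural" parameter sets above an acceptance threshold
behavioural = likelihood > 0.8
preds = np.array([model(s) for s in slopes[behavioural]])
# 4. Report the prediction bounds spanned by the behavioural simulations
lo, hi = preds[:, -1].min(), preds[:, -1].max()
print(lo, hi)
```

The choice of likelihood measure and acceptance threshold is subjective, which is both GLUE's flexibility and its most common criticism; in practice likelihood-weighted quantiles are reported rather than raw min/max bounds.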
Assessing concentration uncertainty estimates from passive microwave sea ice products
Meier, W.; Brucker, L.; Miller, J. A.
2017-12-01
Sea ice concentration is an essential climate variable and passive microwave derived estimates of concentration are one of the longest satellite-derived climate records. However, until recently uncertainty estimates were not provided. Numerous validation studies provided insight into general error characteristics, but the studies have found that concentration error varied greatly depending on sea ice conditions. Thus, an uncertainty estimate from each observation is desired, particularly for initialization, assimilation, and validation of models. Here we investigate three sea ice products that include an uncertainty for each concentration estimate: the NASA Team 2 algorithm product, the EUMETSAT Ocean and Sea Ice Satellite Application Facility (OSI-SAF) product, and the NOAA/NSIDC Climate Data Record (CDR) product. Each product estimates uncertainty with a completely different approach. The NASA Team 2 product derives uncertainty internally from the algorithm method itself. The OSI-SAF uses atmospheric reanalysis fields and a radiative transfer model. The CDR uses spatial variability from two algorithms. Each approach has merits and limitations. Here we evaluate the uncertainty estimates by comparing the passive microwave concentration products with fields derived from the NOAA VIIRS sensor. The results show that the relationship between the product uncertainty estimates and the concentration error (relative to VIIRS) is complex. This may be due to the sea ice conditions, the uncertainty methods, as well as the spatial and temporal variability of the passive microwave and VIIRS products.
Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation
Machguth, H.; Purves, R.S.; Oerlemans, J.; Hoelzle, M.; Paul, F.
2008-01-01
By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was
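The kind of Monte Carlo uncertainty assessment described here can be illustrated with a toy degree-day mass-balance model. The study itself used an energy balance model of intermediate complexity; everything below, including the parameter distributions, is an invented stand-in.

```python
import numpy as np

rng = np.random.default_rng(4)

def cumulative_balance(ddf, precip_scale, n_days=400):
    """Toy degree-day mass-balance model (illustrative only): melt when
    air temperature exceeds 0 C, accumulation from scaled solid
    precipitation on cold days. Returns cumulative balance in m w.e."""
    t = np.arange(n_days)
    temp = 5.0 * np.sin(2 * np.pi * t / 365) + 2.0   # synthetic temperature (C)
    precip = 0.004                                    # m w.e. per day, constant
    melt = ddf * np.maximum(temp, 0.0)
    accum = precip_scale * precip * (temp < 1.5)
    return np.sum(accum - melt)

# Monte Carlo: sample the uncertain degree-day factor and precipitation
# scaling, and propagate them to the 400-day cumulative balance
samples = [cumulative_balance(rng.normal(0.005, 0.001),
                              rng.normal(1.0, 0.2)) for _ in range(2000)]
print(np.mean(samples), np.std(samples))
```

The standard deviation of `samples` is the modelled uncertainty in cumulative mass balance attributable to the sampled input uncertainties, which is the quantity the study evaluates at a point on the glacier tongue.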
Representing and managing uncertainty in qualitative ecological models
Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.
2009-01-01
Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete
Statistical approach for uncertainty quantification of experimental modal model parameters
DEFF Research Database (Denmark)
Luczak, M.; Peeters, B.; Kahsin, M.
2014-01-01
Composite materials are widely used in the manufacture of aerospace and wind energy structural components. These load-carrying structures are subjected to dynamic time-varying loading conditions. Robust structural dynamics identification procedures impose tight constraints on the quality of modal models. This paper aims at a systematic approach for uncertainty quantification of the parameters of modal models estimated from experimentally obtained data. Statistical analysis of modal parameters is implemented to derive an assessment of the entire modal model uncertainty measure. Investigated structures represent different complexity levels, ranging from coupon through sub-component up to fully assembled aerospace and wind energy structural components made of composite materials. The proposed method is demonstrated on two application cases of a small and a large wind turbine blade.
International Nuclear Information System (INIS)
Smith, G.M.; Pinedo, P.; Cancio, D.
1997-01-01
The purpose of this paper is to raise some issues concerning uncertainties in the estimation of doses of ionizing radiation arising from waste management practices and the contribution to those uncertainties arising from dosimetry modelling. The intentions are: (a) to provide perspective on the relative uncertainties in the different aspects of radiological assessments of waste management; (b) to give pointers as to where resources could best be targeted as regards reduction in overall uncertainties; and (c) to provide regulatory insight to decisions on low dose management as related to waste management practices. (author)
Uncertainty evaluation methods for waste package performance assessment
International Nuclear Information System (INIS)
Wu, Y.T.; Nair, P.K.; Journel, A.G.; Abramson, L.R.
1991-01-01
This report identifies and investigates methodologies to deal with uncertainties in assessing high-level nuclear waste package performance. Four uncertainty evaluation methods (probability-distribution approach, bounding approach, expert judgment, and sensitivity analysis) are suggested as the elements of a methodology that, without either diminishing or enhancing the input uncertainties, can evaluate performance uncertainty. Such a methodology can also help identify critical inputs as a guide to reducing uncertainty so as to provide reasonable assurance that the risk objectives are met. This report examines the current qualitative waste containment regulation and shows how, in conjunction with the identified uncertainty evaluation methodology, a framework for a quantitative probability-based rule can be developed that takes account of the uncertainties. Current US Nuclear Regulatory Commission (NRC) regulation requires that the waste packages provide "substantially complete containment" (SCC) during the containment period. The term "SCC" is ambiguous and subject to interpretation. This report, together with an accompanying report that describes the technical considerations that must be addressed to satisfy high-level waste containment requirements, provides a basis for a third report to develop recommendations for regulatory uncertainty reduction in the "containment" requirement of 10 CFR Part 60. 25 refs., 3 figs., 2 tabs
Qualitative uncertainty analysis in probabilistic safety assessment context
International Nuclear Information System (INIS)
Apostol, M.; Constantin, M; Turcu, I.
2007-01-01
In the Probabilistic Safety Assessment (PSA) context, an uncertainty analysis is performed either to estimate the uncertainty in the final results (the risk to public health and safety) or to estimate the uncertainty in some intermediate quantities (the core damage frequency, the radionuclide release frequency or the fatality frequency). The identification and evaluation of uncertainty are important tasks because they lend credibility to the results and help in the decision-making process. Uncertainty analysis can be performed qualitatively or quantitatively. This paper performs a preliminary qualitative uncertainty analysis, by identification of the major uncertainties at the PSA Level 1-Level 2 interface and in the other two major procedural steps of a Level 2 PSA, i.e. the analysis of accident progression and of the containment, and the analysis of the source term for severe accidents. One should mention that a Level 2 PSA for a Nuclear Power Plant (NPP) involves the evaluation and quantification of the mechanisms, amounts and probabilities of subsequent radioactive material releases from the containment. According to NUREG-1150, an important task in source term analysis is fission product transport analysis. The uncertainties related to the distribution of isotopes in the CANDU NPP primary circuit and to the isotope masses transferred into the containment, computed using the SOPHAEROS module of the ASTEC computer code, will also be presented. (authors)
Coping with uncertainty in environmental impact assessments: Open techniques
Energy Technology Data Exchange (ETDEWEB)
Cardenas, Ibsen C., E-mail: c.cardenas@utwente.nl [IceBridge Research Institute, Universiteit Twente, P.O. Box 217, 7500 AE Enschede (Netherlands); Halman, Johannes I.M., E-mail: J.I.M.Halman@utwente.nl [Universiteit Twente, P.O. Box 217, 7500 AE Enschede (Netherlands)
2016-09-15
Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take place within an EIA setting. More specifically, we have identified uncertainties involved in each decision-making step and discussed the extent to which these can be treated and managed in the context of an activity or project that may have environmental impacts. To further demonstrate the relevance of the techniques identified, we have examined the extent to which the EIA guidelines currently used in Colombia consider and provide guidance on managing the uncertainty involved in these assessments. Some points that should be considered in order to provide greater robustness in impact assessments in Colombia have been identified. These include the management of stakeholder values, the systematic generation of project options and their associated impacts as well as the associated management actions, and the evaluation of uncertainties and assumptions. We believe that the relevant and specific techniques reported here can be a reference for future evaluations of other EIA guidelines in different countries. - Highlights: • uncertainty is unavoidable in environmental impact assessments, EIAs; • we have identified some open techniques for treating and managing uncertainty in EIAs; • points for improvement that should be considered in order to provide greater robustness in EIAs in Colombia have been identified; • the paper provides a substantiated reference for possible examinations of EIA guidelines in other countries.
Coping with uncertainty in environmental impact assessments: Open techniques
International Nuclear Information System (INIS)
Cardenas, Ibsen C.; Halman, Johannes I.M.
2016-01-01
Uncertainty is virtually unavoidable in environmental impact assessments (EIAs). From the literature related to treating and managing uncertainty, we have identified specific techniques for coping with uncertainty in EIAs. Here, we have focused on basic steps in the decision-making process that take place within an EIA setting. More specifically, we have identified uncertainties involved in each decision-making step and discussed the extent to which these can be treated and managed in the context of an activity or project that may have environmental impacts. To further demonstrate the relevance of the techniques identified, we have examined the extent to which the EIA guidelines currently used in Colombia consider and provide guidance on managing the uncertainty involved in these assessments. Some points that should be considered in order to provide greater robustness in impact assessments in Colombia have been identified. These include the management of stakeholder values, the systematic generation of project options and their associated impacts as well as the associated management actions, and the evaluation of uncertainties and assumptions. We believe that the relevant and specific techniques reported here can be a reference for future evaluations of other EIA guidelines in different countries. - Highlights: • uncertainty is unavoidable in environmental impact assessments, EIAs; • we have identified some open techniques for treating and managing uncertainty in EIAs; • points for improvement that should be considered in order to provide greater robustness in EIAs in Colombia have been identified; • the paper provides a substantiated reference for possible examinations of EIA guidelines in other countries.
A simplified model of choice behavior under uncertainty
Directory of Open Access Journals (Sweden)
Ching-Hung Lin
2016-08-01
The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal, owing to some incompatible results between our behavioral and modeling data. This study modified the Ahn et al. (2008) PU model into a simplified model and collected 145 subjects' IGT performance as benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the influence of the parameters α, λ, and A on the goodness-of-fit of the PU model follows a hierarchical order. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay, loss-shift rather than foreseeing the long-term outcome. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not have been found. In short, the best model for predicting choice behavior under dynamic-uncertainty situations remains to be evaluated.
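The value function and expectancy update at the core of such PU models can be sketched as follows. This is an illustrative reconstruction, not the authors' exact formulation: the functional forms and the example parameter values for α, λ, and A are assumptions based on common IGT modeling practice.

```python
def pu_utility(x, alpha=0.5, lam=2.0):
    """Prospect-utility value function (illustrative form):
    gains are raised to alpha; losses are additionally weighted
    by the loss-aversion parameter lam."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

def update_expectancy(ev, outcome, A=0.3, alpha=0.5, lam=2.0):
    """Delta-rule expectancy update for one deck, with learning rate A."""
    return ev + A * (pu_utility(outcome, alpha, lam) - ev)

# As alpha -> 0, every gain maps to roughly +1 and every loss to
# roughly -lam, so outcome magnitudes stop mattering -- consistent
# with the reported gain-stay, loss-shift behavior.
```

This also makes the reported parameter interaction visible: with α near zero the utilities collapse to two values, so λ only rescales losses uniformly and A loses leverage over choice ordering.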
An introductory guide to uncertainty analysis in environmental and health risk assessment
International Nuclear Information System (INIS)
Hoffman, F.O.; Hammonds, J.S.
1992-10-01
To compensate for the potential for overly conservative estimates of risk using standard US Environmental Protection Agency methods, an uncertainty analysis should be performed as an integral part of each risk assessment. Uncertainty analyses allow one to obtain quantitative results in the form of confidence intervals that will aid in decision making and will provide guidance for the acquisition of additional data. To perform an uncertainty analysis, one must frequently rely on subjective judgment in the absence of data to estimate the range and a probability distribution describing the extent of uncertainty about a true but unknown value for each parameter of interest. This information is formulated from professional judgment based on an extensive review of literature, analysis of the data, and interviews with experts. Various analytical and numerical techniques are available to allow statistical propagation of the uncertainty in the model parameters to a statement of uncertainty in the risk to a potentially exposed individual. Although analytical methods may be straightforward for relatively simple models, they rapidly become complicated for more involved risk assessments. Because of the tedious efforts required to mathematically derive analytical approaches to propagate uncertainty in complicated risk assessments, numerical methods such as Monte Carlo simulation should be employed. The primary objective of this report is to provide an introductory guide for performing uncertainty analysis in risk assessments being performed for Superfund sites.
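The Monte Carlo propagation the report recommends can be sketched in a few lines: sample each uncertain parameter from its subjective distribution, run the risk model per sample, and read a confidence interval off the sorted outputs. The multiplicative risk model and the lognormal parameters below are hypothetical placeholders, not taken from the report.

```python
import random

random.seed(1)  # reproducible sketch

def risk_model(intake, dose_coeff, risk_factor):
    # Toy multiplicative risk model (hypothetical form and units)
    return intake * dose_coeff * risk_factor

# Subjective lognormal distributions standing in for expert-derived
# parameter uncertainty (all distribution parameters hypothetical)
samples = []
for _ in range(10000):
    samples.append(risk_model(
        random.lognormvariate(0.0, 0.5),
        random.lognormvariate(-2.0, 0.3),
        random.lognormvariate(-4.0, 0.4),
    ))

samples.sort()
lo = samples[int(0.025 * len(samples))]
hi = samples[int(0.975 * len(samples))]
# [lo, hi] is a 95% subjective confidence interval on the risk
```

The same skeleton extends to any number of parameters; only the sampling lines and the model body change.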
Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops
Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said
2017-11-01
The installation of solar panels on Australian rooftops has been on the rise in the last few years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, which is factorized by a probability distribution, with special attention paid to Australia through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with four goodness-of-fit techniques. In addition, the quasi-Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiance data for the city of Adelaide were used for this assessment, with the outcome indicating satisfactory agreement between the actual data variation and the model.
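A minimal sketch of the diffuse-fraction step is shown below. It uses the widely cited Erbs et al. piecewise correlation as a stand-in; the paper itself uses a best-fit correlation for Australian conditions, and the tilt geometry factors here are purely hypothetical placeholders for a real transposition model.

```python
def diffuse_fraction(kt):
    """Diffuse fraction as a function of the clearness index kt,
    using the Erbs et al. piecewise correlation as a stand-in for
    the paper's Australian best-fit correlation."""
    if kt <= 0.22:
        return 1.0 - 0.09 * kt
    if kt <= 0.80:
        return (0.9511 - 0.1604 * kt + 4.388 * kt**2
                - 16.638 * kt**3 + 12.336 * kt**4)
    return 0.165

def tilted_irradiance(ghi, kt, beam_factor=1.1, diffuse_factor=0.95):
    """Split global horizontal irradiance (ghi) into beam and diffuse
    components and re-project each onto the tilted panel; the two
    geometry factors are hypothetical."""
    diffuse = ghi * diffuse_fraction(kt)
    beam = ghi - diffuse
    return beam * beam_factor + diffuse * diffuse_factor
```

Feeding `kt` values drawn from a fitted probability distribution through `tilted_irradiance` is then the uncertainty-propagation step the abstract describes.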
Energy Technology Data Exchange (ETDEWEB)
Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others]
1997-06-01
This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.
International Nuclear Information System (INIS)
Brown, J.; Goossens, L.H.J.; Kraan, B.C.P.
1997-06-01
This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses
Vector network analyzer (VNA) measurements and uncertainty assessment
Shoaib, Nosherwan
2017-01-01
This book describes vector network analyzer measurements and uncertainty assessments, particularly in waveguide test-set environments, in order to establish their compatibility to the International System of Units (SI) for accurate and reliable characterization of communication networks. It proposes a fully analytical approach to measurement uncertainty evaluation, while also highlighting the interaction and the linear propagation of different uncertainty sources to compute the final uncertainties associated with the measurements. The book subsequently discusses the dimensional characterization of waveguide standards and the quality of the vector network analyzer (VNA) calibration techniques. The book concludes with an in-depth description of the novel verification artefacts used to assess the performance of the VNAs. It offers a comprehensive reference guide for beginners to experts, in both academia and industry, whose work involves the field of network analysis, instrumentation and measurements.
Risk assessment under deep uncertainty: A methodological comparison
International Nuclear Information System (INIS)
Shortridge, Julie; Aven, Terje; Guikema, Seth
2017-01-01
Probabilistic Risk Assessment (PRA) has proven to be an invaluable tool for evaluating risks in complex engineered systems. However, there is increasing concern that PRA may not be adequate in situations with little underlying knowledge to support probabilistic representation of uncertainties. As analysts and policy makers turn their attention to deeply uncertain hazards such as climate change, a number of alternatives to traditional PRA have been proposed. This paper systematically compares three diverse approaches for risk analysis under deep uncertainty (qualitative uncertainty factors, probability bounds, and robust decision making) in terms of their representation of uncertain quantities, analytical output, and implications for risk management. A simple example problem is used to highlight differences in the way that each method relates to the traditional risk assessment process and fundamental issues associated with risk assessment and description. We find that the implications for decision making are not necessarily consistent between approaches, and that differences in the representation of uncertain quantities and analytical output suggest contexts in which each method may be most appropriate. Finally, each methodology demonstrates how risk assessment can inform decision making in deeply uncertain contexts, informing more effective responses to risk problems characterized by deep uncertainty. - Highlights: • We compare three diverse approaches to risk assessment under deep uncertainty. • A simple example problem highlights differences in analytical process and results. • Results demonstrate how methodological choices can impact risk assessment results.
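Of the three approaches compared, probability bounds analysis is the easiest to sketch: uncertain probabilities are carried as intervals rather than point values, and the bounds are propagated through the system model. A minimal example for a two-component series system follows; it is an illustration of the general technique, not the paper's example problem.

```python
def series_failure_bounds(p1_bounds, p2_bounds):
    """Failure-probability bounds for a two-component series system.
    P_fail = 1 - (1 - p1)(1 - p2) is monotone increasing in each p,
    so the interval endpoints give the exact bounds (independence
    between components is assumed)."""
    (a1, b1), (a2, b2) = p1_bounds, p2_bounds
    lower = 1.0 - (1.0 - a1) * (1.0 - a2)
    upper = 1.0 - (1.0 - b1) * (1.0 - b2)
    return lower, upper

# Component failure probabilities known only to within an interval:
bounds = series_failure_bounds((0.01, 0.05), (0.02, 0.10))
```

Because only bounds are produced, the output deliberately refuses to commit to a single risk number, which is exactly the behavior that distinguishes this method from traditional PRA under deep uncertainty.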
International Nuclear Information System (INIS)
Pourgol-Mohamad, Mohammad; Modarres, Mohammad; Mosleh, Ali
2013-01-01
This paper discusses an approach called Integrated Methodology for Thermal-Hydraulics Uncertainty Analysis (IMTHUA) to characterize and integrate a wide range of uncertainties associated with the best estimate models and complex system codes used for nuclear power plant safety analyses. Examples of applications include complex thermal hydraulic and fire analysis codes. In identifying and assessing uncertainties, the proposed methodology treats the complex code as a 'white box', thus explicitly treating internal sub-model uncertainties in addition to the uncertainties related to the inputs to the code. The methodology accounts for uncertainties related to experimental data used to develop such sub-models, and efficiently propagates all uncertainties during best estimate calculations. Uncertainties are formally analyzed and probabilistically treated using a Bayesian inference framework. This comprehensive approach presents the results in a form usable in most other safety analyses such as the probabilistic safety assessment. The code output results are further updated through additional Bayesian inference using any available experimental data, for example from thermal hydraulic integral test facilities. The approach includes provisions to account for uncertainties associated with user-specified options, for example for choices among alternative sub-models, or among several different correlations. Complex time-dependent best-estimate calculations are computationally intense. The paper presents approaches to minimize computational intensity during the uncertainty propagation. Finally, the paper will report effectiveness and practicality of the methodology with two applications to a complex thermal-hydraulics system code as well as a complex fire simulation code. In case of multiple alternative models, several techniques, including dynamic model switching, user-controlled model selection, and model mixing, are discussed. (authors)
Assessment of dose measurement uncertainty using RisoScan
International Nuclear Information System (INIS)
Helt-Hansen, Jakob; Miller, Arne
2006-01-01
The dose measurement uncertainty of the dosimeter system RisoScan, office scanner and Riso B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisoScan for electron energy measurement is shown to give results that are equivalent to the measurements with a scanning spectrophotometer
Assessment of dose measurement uncertainty using RisøScan
DEFF Research Database (Denmark)
Helt-Hansen, J.; Miller, A.
2006-01-01
The dose measurement uncertainty of the dosimeter system RisøScan, office scanner and Risø B3 dosimeters has been assessed by comparison with spectrophotometer measurements of the same dosimeters. The reproducibility and the combined uncertainty were found to be approximately 2% and 4%, respectively, at one standard deviation. The subroutine in RisøScan for electron energy measurement is shown to give results that are equivalent to the measurements with a scanning spectrophotometer. (c) 2006 Elsevier Ltd. All rights reserved.
Uncertainty in Impact Assessment – EIA in Denmark
DEFF Research Database (Denmark)
Larsen, Sanne Vammen
as problematic, as this is important information for decision makers and public actors. Taking its point of departure in these issues, this paper seeks to add to the discussions by presenting the results of a study on the handling of uncertainty in Environmental Impact Assessment (EIA) reports in Denmark. The study is based on an analysis of 100 EIA reports. The results shed light on the extent to which uncertainties are addressed in EIA in Denmark and discuss how the practice can be categorised.
Quantifying remarks to the question of uncertainties of the 'general dose assessment fundamentals'
International Nuclear Information System (INIS)
Brenk, H.D.; Vogt, K.J.
1982-12-01
Dose prediction models are always subject to uncertainties due to a number of factors, including deficiencies in the model structure and uncertainties in the model input parameter values. In lieu of validation experiments, the evaluation of these uncertainties is restricted to scientific judgement. Several attempts have been made in the literature to evaluate the uncertainties of current dose assessment models resulting from uncertainties in the model input parameter values using stochastic approaches. Less attention, however, has been paid to potential sources of systematic over- and underestimation of the predicted doses due to deficiencies in the model structure. The present study addresses this aspect with regard to dose assessment models currently used for regulatory purposes. The influence of a number of basic simplifications and conservative assumptions has been investigated. Our systematic approach is exemplified by a comparison of doses evaluated on the basis of the regulatory guide model and a more realistic model, respectively, for three critical exposure pathways. As a result of this comparison it can be concluded that the currently used regulatory-type models include significant safety factors, resulting in a systematic overprediction of dose to man by up to two orders of magnitude. For this reason there are some indications that these models usually more than compensate for the bulk of the stochastic uncertainties caused by the variability of the input parameter values. (orig.) [de]
Statistically based uncertainty assessments in nuclear risk analysis
International Nuclear Information System (INIS)
Spencer, F.W.; Diegert, K.V.; Easterling, R.G.
1987-01-01
Over the last decade, the problems of estimation and uncertainty assessment in probabilistic risk assessments (PRAs) have been addressed in a variety of NRC and industry-sponsored projects. These problems have received attention because of a recognition that major uncertainties in risk estimation exist, which can be reduced by collecting more and better data and other information, and because of a recognition that better methods for assessing these uncertainties are needed. In particular, a clear understanding of the nature and magnitude of the various sources of uncertainty is needed to facilitate decision-making on possible plant changes and research options. Recent PRAs have employed methods of probability propagation, sometimes involving the use of Bayes' theorem, intended to formalize the use of ''engineering judgment'' or ''expert opinion.'' All sources, or feelings, of uncertainty are expressed probabilistically, so that uncertainty analysis becomes simply a matter of probability propagation. Alternatives to forcing a probabilistic framework at all stages of a PRA are, however, a major concern in this paper.
Uncertainty Assessment: What Good Does it Do? (Invited)
Oreskes, N.; Lewandowsky, S.
2013-12-01
the public debate or advance public policy. We argue that attempts to address public doubts by improving uncertainty assessment are bound to fail, insofar as the motives for doubt-mongering are independent of scientific uncertainty and therefore remain unaffected even as those uncertainties are diminished. We illustrate this claim by considering the evolution of the debate over the past ten years on the relationship between hurricanes and anthropogenic climate change. We suggest that scientists should pursue uncertainty assessment if such assessment improves scientific understanding, but not as a means to reduce public doubts or advance public policy in relation to anthropogenic climate change.
Workshop on Model Uncertainty and its Statistical Implications
1988-01-01
In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.
Integration of expert knowledge and uncertainty in natural risk assessment
Baruffini, Mirko; Jaboyedoff, Michel
2010-05-01
Natural hazards occurring in alpine regions during the last decades have clearly shown that interruptions of the Swiss railway power supply and closures of the Gotthard highway due to such events have increased the awareness of infrastructure vulnerability in Switzerland and illustrate the potential impacts of failures on the performance of infrastructure systems. This calls for a high level of surveillance and preservation along the transalpine lines. Traditional simulation models are only partially capable of predicting the behaviour of complex systems, and the protection strategies designed and implemented on their basis cannot mitigate the full spectrum of risk consequences. They are costly, and maximal protection is most probably not economically feasible. In addition, quantitative risk assessment approaches such as fault tree analysis, event tree analysis and equivalent annual fatality analysis rely heavily on statistical information. Collecting sufficient data on which to base a statistical probability of risk is costly and, in many situations, such data do not exist; thus, expert knowledge and experience or engineering judgment can be exploited to estimate risk qualitatively. To overcome this lack of statistics, we used models based on expert knowledge to make qualitative predictions from linguistic appreciations, which are more expressive and natural in risk assessment. Fuzzy reasoning (FR) provides a mechanism of computing with words (Zadeh, 1965) for modelling qualitative human thought processes in analyzing complex systems and decisions. Uncertainty in predicting the risk levels arises in such situations because no fully formalized knowledge is available. Another possibility is to use probabilities based on a triangular probability density function (T-PDF), which can follow the same flow-chart as FR. We implemented the Swiss natural hazard recommendations using both FR and T-PDF-based probabilities in order to obtain hazard zoning and
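The T-PDF alternative mentioned above can be sketched directly with the standard library: an expert's (minimum, most likely, maximum) judgment defines a triangular density, which is then sampled in place of missing statistics. The hazard variable and the three elicited values below are hypothetical.

```python
import random

random.seed(42)

# Expert judgment encoded as (minimum, most likely, maximum) --
# hypothetical values for a hazard-intensity variable on a 0-1 scale
low, mode, high = 0.1, 0.4, 0.9

# random.triangular samples from the corresponding triangular PDF
samples = [random.triangular(low, high, mode) for _ in range(10000)]
mean = sum(samples) / len(samples)
# The analytical triangular mean is (low + mode + high) / 3
```

The appeal of the T-PDF is exactly what the code shows: three linguistic anchors from an expert are enough to produce a full sampling distribution for downstream risk computation.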
Uncertainty on faecal analysis on dose assessment
Energy Technology Data Exchange (ETDEWEB)
Juliao, Ligia M.Q.C.; Melo, Dunstana R.; Sousa, Wanderson de O.; Santos, Maristela S.; Fernandes, Paulo Cesar P. [Instituto de Radioprotecao e Dosimetria, Comissao Nacional de Energia Nuclear, Av. Salvador Allende s/n. Via 9, Recreio, CEP 22780-160, Rio de Janeiro, RJ (Brazil)
2007-07-01
Monitoring programmes for internal dose assessment may need to combine bioassay techniques, e.g. urine and faecal analysis, especially in workplaces where compounds of different solubilities are handled and also in cases of accidental intakes. Faecal analysis may provide important data for the assessment of committed effective dose due to exposure to insoluble compounds, since the activity excreted in urine may not be detectable unless a very sensitive measurement system is available. This paper discusses the variability of the daily faecal excretion based on data from a single daily collection and from collection over three consecutive days, with samples analysed individually and as a pool. The results suggest that just one day's collection is not appropriate for dose assessment, since the 24 h uranium excretion may vary by a factor of 40. On the basis of this analysis, the recommendation should be faecal collection over three consecutive days, with the samples analysed as a pool, which is more economical and faster. (authors)
Uncertainty and sensitivity analysis in nuclear accident consequence assessment
International Nuclear Information System (INIS)
Karlberg, Olof.
1989-01-01
This report contains the results of a four-year project under research contracts with the Nordic Cooperation in Nuclear Safety and the National Institute for Radiation Protection. An uncertainty/sensitivity analysis methodology consisting of Latin hypercube sampling and regression analysis was applied to an accident consequence model. A number of input parameters were selected, and the uncertainties related to these parameters were estimated by a Nordic group of experts. Individual doses, collective dose, health effects and their related uncertainties were then calculated for three release scenarios and for a representative sample of meteorological situations. Two of the scenarios simulated the acute phase after an accident, and one the long-term consequences. The most significant parameters were identified. The outer limits of the calculated uncertainty distributions are large and grow to several orders of magnitude for the low-probability consequences. The uncertainty in the expectation values is typically a factor of 2-5 (1 sigma). The variation in the model responses due to the variation of the weather parameters is roughly equal to the variation induced by the parameter uncertainties. The most important parameters turned out to be different for each pathway of exposure, as could be expected. Overall, however, the most important parameters are the wet deposition coefficient and the shielding factors. A general discussion of the usefulness of uncertainty analysis in consequence analysis is also given. (au)
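The Latin hypercube sampling and regression-based sensitivity steps can be sketched as follows. The two-parameter consequence model is a toy stand-in for the accident consequence model, and sample correlations substitute for the study's full regression analysis; the parameter names echo the abstract's findings but the functional form is assumed.

```python
import random

random.seed(0)

def latin_hypercube(n, dims):
    """One stratified sample per equal-probability bin of [0, 1),
    independently permuted in each dimension."""
    cols = []
    for _ in range(dims):
        col = [(i + random.random()) / n for i in range(n)]
        random.shuffle(col)
        cols.append(col)
    return list(zip(*cols))  # n points in [0, 1)^dims

def consequence(deposition, shielding):
    # Toy consequence model: dose rises with wet deposition and
    # falls with shielding (hypothetical functional form)
    return 10.0 * deposition / (0.5 + shielding)

pts = latin_hypercube(200, 2)
y = [consequence(d, s) for d, s in pts]

def corr(xs, ys):
    """Sample correlation, a simple stand-in for the study's
    regression-based sensitivity measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    return sxy / (sxx * syy) ** 0.5

dep_sens = corr([p[0] for p in pts], y)     # strongly positive
shield_sens = corr([p[1] for p in pts], y)  # negative
```

Compared with plain Monte Carlo, the stratification guarantees that every probability bin of each input is visited exactly once, which is why far fewer model runs suffice.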
The role of uncertainty analysis in dose reconstruction and risk assessment
International Nuclear Information System (INIS)
Hoffman, F.O.; Simon, S.L.; Thiessen, K.M.
1996-01-01
Dose reconstruction and risk assessment rely heavily on the use of mathematical models to extrapolate information beyond the realm of direct observation. Because models are merely approximations of real systems, their predictions are inherently uncertain. As a result, full disclosure of uncertainty in dose and risk estimates is essential to achieve scientific credibility and to build public trust. The need for formal analysis of uncertainty in model predictions was presented during the nineteenth annual meeting of the NCRP. At that time, quantitative uncertainty analysis was considered a relatively new and difficult subject practiced by only a few investigators. Today, uncertainty analysis has become synonymous with the assessment process itself. When an uncertainty analysis is used iteratively within the assessment process, it can guide experimental research to refine dose and risk estimates, deferring potentially high cost or high consequence decisions until uncertainty is either acceptable or irreducible. Uncertainty analysis is now mandated for all ongoing dose reconstruction projects within the United States, a fact that distinguishes dose reconstruction from other types of exposure and risk assessments. 64 refs., 6 figs., 1 tab
Review of strategies for handling geological uncertainty in groundwater flow and transport modeling
DEFF Research Database (Denmark)
Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.
2012-01-01
parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and moment equation approach, are briefly described with emphasis...
Hou, Dibo; Ge, Xiaofan; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo
2014-01-01
A real-time, dynamic, early-warning model (EP-risk model) is proposed to cope with sudden water quality pollution accidents affecting downstream areas with raw-water intakes (denoted as EPs). The EP-risk model outputs the risk level of water pollution at the EP by calculating the likelihood of pollution and evaluating the impact of pollution. A generalized form of the EP-risk model for river pollution accidents based on Monte Carlo simulation, the analytic hierarchy process (AHP) method, and the risk matrix method is proposed. The likelihood of water pollution at the EP is calculated by the Monte Carlo method, which is used for uncertainty analysis of pollutants' transport in rivers. The impact of water pollution at the EP is evaluated by expert knowledge and the results of Monte Carlo simulation based on the analytic hierarchy process. The final risk level of water pollution at the EP is determined by the risk matrix method. A case study of the proposed method is illustrated with a phenol spill accident in China.
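The final risk-matrix step of the EP-risk model can be sketched as a simple lookup that combines a Monte Carlo likelihood estimate with an AHP-weighted impact score. The class boundaries and the matrix entries below are illustrative assumptions, not the paper's calibrated values.

```python
def likelihood_level(prob):
    """Map a Monte Carlo exceedance probability to a 3-level class
    (thresholds are illustrative)."""
    if prob < 0.05:
        return 0  # unlikely
    if prob < 0.5:
        return 1  # possible
    return 2      # likely

def impact_level(score):
    """Map an AHP-weighted impact score in [0, 1] to a 3-level class
    (thresholds are illustrative)."""
    if score < 0.33:
        return 0
    if score < 0.66:
        return 1
    return 2

# Rows: likelihood class; columns: impact class (illustrative entries)
RISK_MATRIX = [
    ["low",    "low",    "medium"],
    ["low",    "medium", "high"],
    ["medium", "high",   "high"],
]

def ep_risk(prob, impact_score):
    """Risk level of water pollution at a raw-water intake (EP)."""
    return RISK_MATRIX[likelihood_level(prob)][impact_level(impact_score)]
```

In the paper's framework, `prob` would come from Monte Carlo simulation of pollutant transport and `impact_score` from the AHP aggregation of expert judgments; the matrix then issues the warning level.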
Uncertainty propagation in urban hydrology water quality modelling
Torres Matallana, Arturo; Leopold, U.; Heuvelink, G.B.M.
2016-01-01
Uncertainty is often ignored in urban hydrology modelling. Engineering practice typically ignores uncertainties and uncertainty propagation. This can have large impacts, such as the wrong dimensioning of urban drainage systems and the inaccurate estimation of pollution in the environment caused
Elevation uncertainty in coastal inundation hazard assessments
Gesch, Dean B.; Cheval, Sorin
2012-01-01
Coastal inundation has been identified as an important natural hazard that affects densely populated and built-up areas (Subcommittee on Disaster Reduction, 2008). Inundation, or coastal flooding, can result from various physical processes, including storm surges, tsunamis, intense precipitation events, and extreme high tides. Such events cause quickly rising water levels. When rapidly rising water levels overwhelm flood defenses, especially in heavily populated areas, the potential of the hazard is realized and a natural disaster results. Two noteworthy recent examples of such natural disasters resulting from coastal inundation are the Hurricane Katrina storm surge in 2005 along the Gulf of Mexico coast in the United States, and the tsunami in northern Japan in 2011. Longer term, slowly varying processes such as land subsidence (Committee on Floodplain Mapping Technologies, 2007) and sea-level rise also can result in coastal inundation, although such conditions do not have the rapid water level rise associated with other flooding events. Geospatial data are a critical resource for conducting assessments of the potential impacts of coastal inundation, and geospatial representations of the topography in the form of elevation measurements are a primary source of information for identifying the natural and human components of the landscape that are at risk. Recently, the quantity and quality of elevation data available for the coastal zone have increased markedly, and this availability facilitates more detailed and comprehensive hazard impact assessments.
Uncertainty in a spatial evacuation model
Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De
2017-08-01
Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk seeking, risk averse and risk neutral behaviors of such agents via certain game theory notions. We found that risk averse agents tend to achieve faster evacuation times whenever the time delay in conflicts is longer. The results of our simulations are also consistent with previous work, confirming that the evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study shows that the more impatient the agents are, the slower the egress time.
Quality in environmental science for policy: Assessing uncertainty as a component of policy analysis
International Nuclear Information System (INIS)
Maxim, Laura; Sluijs, Jeroen P. van der
2011-01-01
The sheer number of attempts to define and classify uncertainty reveals an awareness of its importance in environmental science for policy, though the nature of uncertainty is often misunderstood. The interdisciplinary field of uncertainty analysis is unstable; there are currently several incomplete notions of uncertainty leading to different and incompatible uncertainty classifications. One of the most salient shortcomings of present-day practice is that most of these classifications focus on quantifying uncertainty while ignoring the qualitative aspects that tend to be decisive in the interface between science and policy. Consequently, the current practices of uncertainty analysis contribute to increasing the perceived precision of scientific knowledge, but do not adequately address its lack of socio-political relevance. The 'positivistic' uncertainty analysis models (like those that dominate the fields of climate change modelling and nuclear or chemical risk assessment) have little social relevance, as they do not influence negotiations between stakeholders. From the perspective of the science-policy interface, the current practices of uncertainty analysis are incomplete and incorrectly focused. We argue that although scientific knowledge produced and used in a context of political decision-making embodies traditional scientific characteristics, it also holds additional properties linked to its influence on social, political, and economic relations. Therefore, the significance of uncertainty cannot be assessed based on quality criteria that refer to the scientific content only; uncertainty must also include quality criteria specific to the properties and roles of this scientific knowledge within political, social, and economic contexts and processes. We propose a conceptual framework designed to account for such substantive, contextual, and procedural criteria of knowledge quality. At the same time, the proposed framework includes and synthesizes the various
Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models
Energy Technology Data Exchange (ETDEWEB)
Ahmed Hassan; Jenny Chapman
2006-02-01
The 2002 model of the Amchitka underground nuclear tests has been verified, and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. The newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the salinity and porosity structure of the subsurface, and bathymetric surveys to map the areas offshore from the Long Shot and Cannikin sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment: instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between the new data and the original model, and conditioning on all available data using the MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
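The backward-then-forward logic of Bayesian MCMC conditioning can be illustrated with a minimal random-walk Metropolis sampler. The porosity-decay model, prior range, and every number below are assumptions for illustration, not the CRESP data or the Amchitka model:

```python
import math
import random

random.seed(1)

# Synthetic porosity-depth observations: phi(z) = phi0 * exp(-z / L) + noise,
# mimicking "porosity values decaying with depth". phi0 is assumed known.
phi0, L_true, sigma = 0.4, 800.0, 0.01
depths = [100.0 * i for i in range(1, 11)]
obs = [phi0 * math.exp(-z / L_true) + random.gauss(0, sigma) for z in depths]

def log_likelihood(L):
    return -sum((y - phi0 * math.exp(-z / L)) ** 2
                for z, y in zip(depths, obs)) / (2 * sigma ** 2)

# Random-walk Metropolis on the decay length L with a uniform prior on
# [200, 2000] (an assumed prior range): uncertainty flows backward from
# the data onto the parameter.
def metropolis(n=5000, L=500.0):
    samples, ll = [], log_likelihood(L)
    for _ in range(n):
        prop = L + random.gauss(0, 50.0)
        if 200.0 <= prop <= 2000.0:
            ll_prop = log_likelihood(prop)
            if math.log(random.random()) < ll_prop - ll:
                L, ll = prop, ll_prop
        samples.append(L)
    return samples[1000:]          # discard burn-in

post = metropolis()
L_mean = sum(post) / len(post)
# Forward step: push posterior samples through the model to predict
# porosity at an unobserved depth, giving a posterior predictive spread.
pred = [phi0 * math.exp(-1500.0 / L) for L in post]
print(L_mean)
```

The posterior mean recovers the decay length used to generate the data, and the spread of `pred` is the prediction uncertainty after conditioning, which is what distinguishes this from a one-way Monte Carlo propagation.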
Characterization uncertainty and its effects on models and performance
International Nuclear Information System (INIS)
Rautman, C.A.; Treadway, A.H.
1991-01-01
Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially statistically indistinguishable from the others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization
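The replicate-model idea can be sketched as follows. This is not a conditional geostatistical simulation; each "replicate" here is simply an unconditional 1-D realization of lognormal hydraulic conductivity sharing the same statistics (the "known information"), and all parameter values are assumed:

```python
import math
import random
import statistics

random.seed(7)

# One replicate model: a realization of log-conductivity along a 1-D flow
# path. The ground water travel time is the performance measure computed
# from that realization (Darcy velocity = K * gradient / porosity).
def travel_time(n_cells=50, dx=20.0, gradient=0.01, porosity=0.1):
    t = 0.0
    for _ in range(n_cells):
        K = math.exp(random.gauss(-11.0, 1.0))   # m/s, lognormal conductivity
        v = K * gradient / porosity              # seepage velocity
        t += dx / v
    return t / (3600 * 24 * 365)                 # convert seconds to years

# Variability of the performance measure across replicates quantifies the
# performance uncertainty arising from characterization uncertainty.
times = [travel_time() for _ in range(500)]
t_mean = statistics.mean(times)
t_sd = statistics.stdev(times)
print(t_mean, t_sd)
```

In a real study the realizations would be conditioned on borehole data so that every replicate honours the measurements exactly; only the unmeasured locations vary between replicates.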
Energy Technology Data Exchange (ETDEWEB)
Freixa, Jordi, E-mail: jordi.freixa-terradas@upc.edu; Alfonso, Elsa de, E-mail: elsa.de.alfonso@upc.edu; Reventós, Francesc, E-mail: francesc.reventos@upc.edu
2016-08-15
Highlights: • Uncertainty of physical models is a key issue in best estimate plus uncertainty analysis. • Estimation of uncertainties of physical models of thermal-hydraulic system codes. • Comparison of the CIRCÉ and FFTBM methodologies. • Simulation of reflood experiments in order to evaluate the uncertainty of physical models related to the reflood scenario. - Abstract: The increasing importance of Best-Estimate Plus Uncertainty (BEPU) analyses in nuclear safety and licensing processes has led to several international activities. The latest findings highlighted the uncertainties of physical models as one of the most controversial aspects of BEPU. This type of uncertainty is an important contributor to the total uncertainty of NPP BE calculations. Due to the complexity of estimating this uncertainty, it is often assessed solely by engineering judgment. The present study comprises a comparison of two different state-of-the-art methodologies, CIRCÉ and IPREM (FFTBM), capable of quantifying the uncertainty of physical models. Similarities and differences in their results are discussed through the observation of probability distribution functions and envelope calculations. In particular, the analyzed scenario is core reflood. Experimental data from the FEBA and PERICLES test facilities are employed, while the thermal hydraulic simulations are carried out with RELAP5/mod3.3. This work is undertaken under the framework of the PREMIUM (Post-BEMUSE Reflood Model Input Uncertainty Methods) benchmark.
Communicating uncertainty: lessons learned and suggestions for climate change assessment
International Nuclear Information System (INIS)
Patt, A.; Dessai, S.
2005-01-01
Assessments of climate change face the task of making information about uncertainty accessible and useful to decision-makers. The literature in behavioral economics provides many examples of how people making decisions under conditions of uncertainty rely on inappropriate heuristics, leading to inconsistent and counterproductive choices. Modern risk communication practice suggests a number of methods to overcome these hurdles, which have been recommended for the Intergovernmental Panel on Climate Change (IPCC) assessment reports. This paper evaluates the success of the most recent IPCC approach to uncertainty communication, based on a controlled survey of climate change experts. Evaluating the results from this survey, and from a similar survey recently conducted among university students, the paper suggests that the most recent IPCC approach leaves open the possibility of biased and inconsistent responses to the information. The paper concludes by suggesting ways to improve the approach for future IPCC assessment reports. (authors)
Identification and communication of uncertainties of phenomenological models in PSA
International Nuclear Information System (INIS)
Pulkkinen, U.; Simola, K.
2001-11-01
This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues. This holds regardless of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)
Uncertainty and sensitivity analysis on probabilistic safety assessment of an experimental facility
International Nuclear Information System (INIS)
Burgazzi, L.
2000-01-01
The aim of this work is to perform an uncertainty and sensitivity analysis on the probabilistic safety assessment of the International Fusion Materials Irradiation Facility (IFMIF), in order to assess the effect on the final risk values of the uncertainties associated with the generic data used for the initiating events and component reliability, and to identify the key quantities contributing to this uncertainty. The analysis is conducted on the expected frequencies calculated for the accident sequences defined through event tree (ET) modeling. This is done to lend further credibility to the ET model quantification, to calculate frequency distributions for the occurrence of events and, consequently, to assess whether sequences have been correctly selected from a probability standpoint, and finally to verify the fulfillment of the safety conditions. The uncertainty and sensitivity analyses are performed using Monte Carlo sampling and an importance parameter technique, respectively. (author)
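The Monte Carlo part of such an analysis can be sketched for a minimal two-branch event tree. The lognormal distributions and all parameter values below are illustrative generic-data assumptions, not IFMIF values:

```python
import random
import statistics

random.seed(3)

# One accident sequence of a two-branch event tree:
#   sequence frequency = IE frequency x P(system A fails) x P(system B fails).
# Lognormal distributions are a common choice for generic reliability data;
# the medians and error factors here are invented for illustration.
def sample_sequence_frequency():
    ie = random.lognormvariate(-7.0, 0.8)   # initiating events per year
    pa = random.lognormvariate(-4.0, 1.0)   # system A failure probability
    pb = random.lognormvariate(-3.0, 0.5)   # system B failure probability
    return ie * min(pa, 1.0) * min(pb, 1.0)  # clip probabilities at 1

# Propagate the data uncertainty to obtain a frequency distribution for
# the sequence rather than a single point estimate.
freqs = sorted(sample_sequence_frequency() for _ in range(20000))
freq_mean = statistics.mean(freqs)
p95 = freqs[int(0.95 * len(freqs))]
print(freq_mean, p95)
```

A sensitivity step would then correlate each sampled input with the output (e.g., rank correlations) to identify which generic-data quantity dominates the spread.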
The role of sensitivity analysis in assessing uncertainty
International Nuclear Information System (INIS)
Crick, M.J.; Hill, M.D.
1987-01-01
Outside the specialist world of those carrying out performance assessments, considerable confusion has arisen about the meanings of sensitivity analysis and uncertainty analysis. In this paper we attempt to reduce this confusion. We then review approaches to sensitivity analysis within the context of assessing uncertainty, and outline the types of test available to identify sensitive parameters, together with their advantages and disadvantages. The views expressed in this paper are those of the authors; they have not been formally endorsed by the National Radiological Protection Board and should not be interpreted as Board advice
Assessment of uncertainties in severe accident management strategies
International Nuclear Information System (INIS)
Kastenberg, W.E.; Apostolakis, G.; Catton, I.; Dhir, V.K.; Okrent, D.
1990-01-01
Recent progress on the development of Probabilistic Risk Assessment (PRA) as a tool for qualifying nuclear reactor safety and on research devoted to severe accident phenomena has made severe accident management an achievable goal. Severe accident management strategies may involve operational changes, modification and/or addition of hardware, and institutional changes. In order to achieve the goal of managing severe accidents, a method for assessment of strategies must be developed which integrates PRA methodology and our current knowledge concerning severe accident phenomena, including uncertainty. The research project presented in this paper is aimed at delineating uncertainties in severe accident progression and their impact on severe accident management strategies
Quantifying uncertainty in LCA-modelling of waste management systems
DEFF Research Database (Denmark)
Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund
2012-01-01
Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...
Doroszkiewicz, Joanna; Romanowicz, Renata
2016-04-01
Uncertainty in the results of a hydraulic model is associated not only with the limitations of that model and the shortcomings of its data. An important factor with a major impact on the uncertainty of flood risk assessment under changing climate conditions is the uncertainty of future climate scenarios (IPCC WG I, 2013). Future climate projections provided by global climate models are used to generate the future runoff required as an input to the hydraulic models applied in the derivation of flood risk maps. The Biala Tarnowska catchment, situated in southern Poland, is used as a case study. Future discharges at the input to the hydraulic model are obtained using the HBV model and climate projections obtained from the EURO-CORDEX project. The study describes a cascade of uncertainty related to the different stages of the process of deriving flood risk maps under changing climate conditions. In this context it takes into account the uncertainty of future climate projections, the uncertainty of the flow routing model, the propagation of that uncertainty through the hydraulic model, and finally, the uncertainty related to the derivation of flood risk maps. One of the aims of this study is an assessment of the relative impact of different sources of uncertainty on the uncertainty of flood risk maps. Due to the complexity of the process, an assessment of the total uncertainty of maps of inundation probability may be very demanding in computer time. As a way forward, we present an application of a hydraulic model simulator based on a nonlinear transfer function model for chosen locations along the river reach. The transfer function model parameters are estimated based on simulations of the hydraulic model at each of the model cross-sections. The study shows that the application of the simulator substantially reduces the computer requirements related to the derivation of flood risk maps under future climate conditions. Acknowledgements: This work was supported by the
Assessing performance of flaw characterization methods through uncertainty propagation
Miorelli, R.; Le Bourdais, F.; Artusi, X.
2018-04-01
In this work, we assess the inversion performance in terms of crack characterization and localization based on synthetic signals associated to ultrasonic and eddy current physics. More precisely, two different standard iterative inversion algorithms are used to minimize the discrepancy between measurements (i.e., the tested data) and simulations. Furthermore, in order to speed up the computational time and get rid of the computational burden often associated to iterative inversion algorithms, we replace the standard forward solver by a suitable metamodel fit on a database built offline. In a second step, we assess the inversion performance by adding uncertainties on a subset of the database parameters and then, through the metamodel, we propagate these uncertainties within the inversion procedure. The fast propagation of uncertainties enables efficiently evaluating the impact due to the lack of knowledge on some parameters employed to describe the inspection scenarios, which is a situation commonly encountered in the industrial NDE context.
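The metamodel idea can be sketched with a piecewise-linear surrogate standing in for the expensive forward solver. The "physics", the liftoff nuisance parameter, and every number below are invented for illustration and are not the CIVA-style models of the study:

```python
import bisect
import random

random.seed(5)

# Stand-in "expensive" forward model: signal amplitude vs crack depth d (mm),
# with a probe liftoff nuisance parameter (purely illustrative physics).
def forward(d, liftoff=0.1):
    return (1.0 - liftoff) * d ** 1.5 / (1.0 + 0.2 * d)

# Offline database -> cheap metamodel (piecewise-linear interpolation),
# built once so the iterative inversion never calls `forward` online.
grid = [0.1 * i for i in range(1, 51)]            # depths 0.1..5.0 mm
table = [forward(d) for d in grid]

def metamodel(d):
    i = min(bisect.bisect_left(grid, d), len(grid) - 1)
    if i == 0:
        return table[0]
    d0, d1 = grid[i - 1], grid[i]
    w = (d - d0) / (d1 - d0)
    return (1 - w) * table[i - 1] + w * table[i]

# Inversion: brute-force scan of the metamodel for the best-matching depth
# (a stand-in for the iterative minimization of the data-simulation misfit).
def invert(measurement):
    return min(grid, key=lambda d: abs(metamodel(d) - measurement))

true_depth = 2.0
# Uncertainty propagation: each sample perturbs the liftoff, and the fast
# metamodel makes repeated re-inversion cheap, yielding the spread of the
# characterized depth caused by the poorly known parameter.
estimates = [invert(forward(true_depth, liftoff=random.gauss(0.1, 0.02)))
             for _ in range(200)]
print(min(estimates), max(estimates))
```

The spread of `estimates` around the true depth is exactly the kind of performance degradation the paper assesses: how much the lack of knowledge of an inspection parameter blurs the inverted crack size.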
Confronting Uncertainty in Life Cycle Assessment Used for Decision Support
DEFF Research Database (Denmark)
Herrmann, Ivan Tengbjerg; Hauschild, Michael Zwicky; Sohn, Michael D.
2014-01-01
the decision maker (DM) in making the best possible choice for the environment. At present, some DMs do not trust the LCA to be a reliable decisionsupport tool—often because DMs consider the uncertainty of an LCA to be too large. The standard evaluation of uncertainty in LCAs is an ex-post approach that can...... regarding which type of LCA study to employ for the decision context at hand. This taxonomy enables the derivation of an LCA classification matrix to clearly identify and communicate the type of a given LCA. By relating the LCA classification matrix to statistical principles, we can also rank the different......The aim of this article is to help confront uncertainty in life cycle assessments (LCAs) used for decision support. LCAs offer a quantitative approach to assess environmental effects of products, technologies, and services and are conducted by an LCA practitioner or analyst (AN) to support...
International Nuclear Information System (INIS)
Hoseyni, Seyed Mohsen; Pourgol-Mohammad, Mohammad; Tehranifard, Ali Abbaspour; Yousefpour, Faramarz
2014-01-01
This paper describes a systematic framework for characterizing important phenomena and quantifying the degree of contribution of each parameter to the output in severe accident uncertainty assessment. The proposed methodology comprises qualitative as well as quantitative phases. The qualitative part, the so-called Modified PIRT, a more robust PIRT process for more precise quantification of uncertainties, is a two-step process for identifying and ranking severe accident phenomena based on uncertainty importance. In this process, the identified severe accident phenomena are ranked according to their effect on the figure of merit and their level of knowledge. The Analytic Hierarchy Process (AHP) serves here as a systematic approach to severe accident phenomena ranking. A formal uncertainty importance technique is used to estimate the degree of credibility of the severe accident model(s) used to represent the important phenomena. The methodology uses subjective justification for this step, evaluating available information and data from experiments and code predictions. The quantitative part utilizes uncertainty importance measures to quantify the effect of each input parameter on the output uncertainty. A response surface fitting approach is proposed for estimating the associated uncertainties at lower computational cost. The quantitative results are used to guide the reduction of epistemic uncertainty in the output variable(s). The application of the proposed methodology is demonstrated for the ACRR MP-2 severe accident test facility. - Highlights: • A two-stage framework for severe accident uncertainty analysis is proposed. • Modified PIRT qualitatively identifies and ranks uncertainty sources more precisely. • An uncertainty importance measure quantitatively calculates the effect of each uncertainty source. • The methodology is applied successfully to the ACRR MP-2 severe accident test facility
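The AHP ranking step can be sketched with a power-iteration computation of priority weights from a pairwise comparison matrix (Saaty scale). The matrix below is an assumed example of three phenomena compared for uncertainty importance, not the paper's expert judgments:

```python
# Pairwise comparison matrix: A[i][j] is how much more important
# phenomenon i is than phenomenon j (reciprocal entries by construction).
A = [[1.0,   3.0,   5.0],
     [1/3.0, 1.0,   2.0],
     [1/5.0, 1/2.0, 1.0]]

# AHP priorities are the principal eigenvector of A; power iteration
# converges to it for positive matrices like this one.
def ahp_weights(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        w = [x / s for x in w]   # normalize so weights sum to 1
    return w

w = ahp_weights(A)
print(w)
```

A full AHP application would also compute the consistency ratio of the matrix to check that the expert judgments are not self-contradictory before the ranking is used.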
Imprecision and Uncertainty in the UFO Database Model.
Van Gyseghem, Nancy; De Caluwe, Rita
1998-01-01
Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects,…
Numerical Modelling of Structures with Uncertainties
Directory of Open Access Journals (Sweden)
Kahsin Maciej
2017-04-01
Full Text Available The nature of environmental interactions, as well as large dimensions and complex structure of marine offshore objects, make designing, building and operation of these objects a great challenge. This is the reason why a vast majority of investment cases of this type include structural analysis, performed using scaled laboratory models and complemented by extended computer simulations. The present paper focuses on FEM modelling of the offshore wind turbine supporting structure. Then problem is studied using the modal analysis, sensitivity analysis, as well as the design of experiment (DOE and response surface model (RSM methods. The results of modal analysis based simulations were used for assessing the quality of the FEM model against the data measured during the experimental modal analysis of the scaled laboratory model for different support conditions. The sensitivity analysis, in turn, has provided opportunities for assessing the effect of individual FEM model parameters on the dynamic response of the examined supporting structure. The DOE and RSM methods allowed to determine the effect of model parameter changes on the supporting structure response.
Han, Feng; Zheng, Yi
2018-06-01
Significant input uncertainty is a major source of error in watershed water quality (WWQ) modeling. It remains challenging to address input uncertainty in a rigorous Bayesian framework. This study develops the Bayesian Analysis of Input and Parametric Uncertainties (BAIPU), an approach for the joint analysis of input and parametric uncertainties through a tight coupling of Markov Chain Monte Carlo (MCMC) analysis and Bayesian Model Averaging (BMA). The formal likelihood function for this approach is derived considering a lag-1 autocorrelated, heteroscedastic, and Skew Exponential Power (SEP) distributed error model. A series of numerical experiments were performed based on a synthetic nitrate pollution case and on a real study case in the Newport Bay Watershed, California. The Soil and Water Assessment Tool (SWAT) and Differential Evolution Adaptive Metropolis (DREAM(ZS)) were used as the representative WWQ model and MCMC algorithm, respectively. The major findings include the following: (1) the BAIPU can be implemented and used to appropriately identify the uncertain parameters and characterize the predictive uncertainty; (2) the compensation effect between the input and parametric uncertainties can seriously mislead modeling-based management decisions if the input uncertainty is not explicitly accounted for; (3) the BAIPU accounts for the interaction between the input and parametric uncertainties and therefore provides more accurate calibration and uncertainty results than a sequential analysis of the uncertainties; and (4) the BAIPU quantifies the credibility of different input assumptions on a statistical basis and can be implemented as an effective inverse modeling approach to the joint inference of parameters and inputs.
Modelling sensitivity and uncertainty in a LCA model for waste management systems - EASETECH
DEFF Research Database (Denmark)
Damgaard, Anders; Clavreul, Julie; Baumeister, Hubert
2013-01-01
In the new model, EASETECH, developed for LCA modelling of waste management systems, a general approach for sensitivity and uncertainty assessment for waste management studies has been implemented. First general contribution analysis is done through a regular interpretation of inventory and impact...
Multi-scenario modelling of uncertainty in stochastic chemical systems
International Nuclear Information System (INIS)
Evans, R. David; Ricardez-Sandoval, Luis A.
2014-01-01
Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems because they are stochastic in nature and carry a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two-gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution and the system under investigation. -- Highlights: •A method to model uncertainty in stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
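The composite-state idea can be sketched with a stochastic simulation algorithm (Gillespie SSA) for the isomerization example, sampling an uncertain rate constant for each realization and averaging over realizations. This sampling approach is a stand-in for the paper's CME-based propagation, and all rates and copy numbers are assumed:

```python
import random

random.seed(11)

# Gillespie SSA for the isomerization A <-> B; returns the copy number
# of A at time t_end for one stochastic trajectory.
def ssa(k1, k2, nA=100, nB=0, t_end=10.0):
    t = 0.0
    while True:
        a1, a2 = k1 * nA, k2 * nB      # reaction propensities
        a0 = a1 + a2
        if a0 == 0.0:
            return nA
        t += random.expovariate(a0)    # time to the next reaction event
        if t > t_end:
            return nA
        if random.random() * a0 < a1:  # A -> B fired
            nA, nB = nA - 1, nB + 1
        else:                          # B -> A fired
            nA, nB = nA + 1, nB - 1

# Parametric uncertainty: draw the forward rate k1 from an assumed
# lognormal distribution for each realization; the average over
# realizations approximates the uncertain "composite" state.
finals = [ssa(k1=random.lognormvariate(-0.7, 0.3), k2=0.5) for _ in range(200)]
avg_A = sum(finals) / len(finals)
print(avg_A)
```

The spread of `finals` mixes intrinsic stochastic noise with the parametric uncertainty in `k1`; separating the two contributions is exactly the kind of question the CME-based model addresses more cheaply than brute-force kinetic Monte Carlo.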
Improving default risk prediction using Bayesian model uncertainty techniques.
Kazemi, Reza; Mosleh, Ali
2012-11-01
Credit risk is the potential exposure of a creditor to an obligor's failure or refusal to repay the debt in principal or interest. This potential exposure is measured in terms of the probability of default. Many models have been developed to estimate credit risk, with rating agencies' assessments dating back to the 19th century. The agencies provide their assessments of the probability of default and the transition probabilities of various firms in their annual reports. Regulatory capital requirements for credit risk outlined by the Basel Committee on Banking Supervision have made it essential for banks and financial institutions to develop sophisticated models in an attempt to measure credit risk with higher accuracy. The Bayesian framework proposed in this article uses techniques developed in the physical sciences and engineering for dealing with model uncertainty and expert accuracy to obtain improved estimates of credit risk and associated uncertainties. The approach uses estimates from one or more rating agencies and incorporates their historical accuracy (past performance data) in estimating future default risk and transition probabilities. Several examples demonstrate that the proposed methodology can assess default probability with accuracy exceeding that of all the individual models. Moreover, the methodology accounts for potentially significant departures from "nominal predictions" due to "upsetting events" such as the 2008 global banking crisis. © 2012 Society for Risk Analysis.
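A minimal version of combining an agency estimate with observed data in a Bayesian way is the conjugate beta-binomial update, where the prior strength stands in for the agency's historical accuracy. This is a simplified sketch, not the article's framework, and all numbers are illustrative:

```python
import math

# Encode the agency's published probability of default (PD) as a Beta
# prior; the prior strength (pseudo-observations) reflects how much the
# agency's historical accuracy warrants trusting its estimate.
agency_pd = 0.02
prior_strength = 200.0
alpha = agency_pd * prior_strength
beta = (1.0 - agency_pd) * prior_strength

# Update with observed cohort data: defaults out of rated firms.
defaults, firms = 9, 300
alpha_post = alpha + defaults
beta_post = beta + firms - defaults

# Posterior mean and standard deviation of the default probability.
pd_mean = alpha_post / (alpha_post + beta_post)
pd_sd = math.sqrt(alpha_post * beta_post /
                  ((alpha_post + beta_post) ** 2 * (alpha_post + beta_post + 1.0)))
print(pd_mean, pd_sd)
```

A less accurate agency would get a smaller `prior_strength`, letting the observed data dominate; with several agencies, each prior would be weighted by its own past performance before combining.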
Analytic uncertainty and sensitivity analysis of models with input correlations
Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu
2018-03-01
Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is developed for the uncertainty and sensitivity analysis of models in the presence of input correlations. With this method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
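For a linear model the analytic effect of input correlation on the output variance is explicit and can be checked against Monte Carlo sampling. The coefficients and correlation below are arbitrary illustrative values:

```python
import math
import random

random.seed(2)

# For y = a1*x1 + a2*x2 with zero-mean inputs, the output variance is
#   Var(y) = a1^2 s1^2 + a2^2 s2^2 + 2 a1 a2 rho s1 s2,
# so the correlation term is isolated analytically.
a1, a2 = 2.0, -1.0
s1, s2, rho = 0.5, 0.8, 0.6

var_indep = a1**2 * s1**2 + a2**2 * s2**2
var_corr = var_indep + 2.0 * a1 * a2 * rho * s1 * s2

# Monte Carlo check: correlated normals via the 2x2 Cholesky construction.
n = 50000
ys = []
for _ in range(n):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x1 = s1 * z1
    x2 = s2 * (rho * z1 + math.sqrt(1.0 - rho**2) * z2)
    ys.append(a1 * x1 + a2 * x2)
mean = sum(ys) / n
var_mc = sum((y - mean) ** 2 for y in ys) / (n - 1)
print(var_corr, var_mc)
```

Here the coefficients have opposite signs, so a positive input correlation reduces the output variance; ignoring the correlation (using `var_indep`) would overstate the uncertainty, which is the kind of misjudgment the analytic method lets one detect before committing to an independence assumption.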
Dealing with uncertainty arising out of probabilistic risk assessment
International Nuclear Information System (INIS)
Solomon, K.A.; Kastenberg, W.E.; Nelson, P.F.
1984-03-01
In addressing the area of safety goal implementation, the question of uncertainty arises. This report suggests that the Nuclear Regulatory Commission (NRC) should examine how other regulatory organizations have addressed the issue. Several examples are given from the chemical industry, and comparisons are made to nuclear power risks. Recommendations are made as to various considerations that the NRC should require in probabilistic risk assessments in order to properly treat uncertainties in the implementation of the safety goal policy. 40 references, 7 figures, 5 tables
Epistemic uncertainties and natural hazard risk assessment - Part 1: A review of the issues
Beven, K. J.; Aspinall, W. P.; Bates, P. D.; Borgomeo, E.; Goda, K.; Hall, J. W.; Page, T.; Phillips, J. C.; Rougier, J. T.; Simpson, M.; Stephenson, D. B.; Smith, P. J.; Wagener, T.; Watson, M.
2015-12-01
Uncertainties in natural hazard risk assessment are generally dominated by the sources arising from lack of knowledge or understanding of the processes involved. There is a lack of knowledge about frequencies, process representations, parameters, present and future boundary conditions, consequences and impacts, and the meaning of observations in evaluating simulation models. These are the epistemic uncertainties that can be difficult to constrain, especially in terms of event or scenario probabilities, even as elicited probabilities rationalized on the basis of expert judgements. This paper reviews the issues raised by trying to quantify the effects of epistemic uncertainties. Such scientific uncertainties might have significant influence on decisions that are made for risk management, so it is important to communicate the meaning of an uncertainty estimate and to provide an audit trail of the assumptions on which it is based. Some suggestions for good practice in doing so are made.
Use of quantitative uncertainty analysis for human health risk assessment
International Nuclear Information System (INIS)
Duncan, F.L.W.; Gordon, J.W.; Kelly, M.
1994-01-01
Current human health risk assessment methods for environmental risks typically use point estimates of risk accompanied by qualitative discussions of uncertainty. Alternatively, Monte Carlo simulations may be used with distributions for input parameters to estimate the resulting risk distribution and descriptive risk percentiles. These two techniques are applied to the ingestion of 1,1-dichloroethene in ground water. The results indicate that Monte Carlo simulations provide significantly more information for risk assessment and risk management than do point estimates.
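The contrast between the two techniques can be sketched with a generic ingestion-risk equation of the EPA form Risk = (C · IR · EF · ED) / (BW · AT) · SF. All parameter values and distributions below are illustrative assumptions, not the cited study's data:

```python
import numpy as np

# Hedged sketch: point estimate vs. Monte Carlo for an ingestion-risk
# equation, Risk = (C * IR * EF * ED) / (BW * AT) * SF.
# All values are illustrative, not from the 1,1-dichloroethene study.
rng = np.random.default_rng(1)
n = 100_000

C = 0.05                                   # concentration in water, mg/L
SF = 0.6                                   # slope factor, (mg/kg-day)^-1
IR = rng.lognormal(np.log(1.4), 0.3, n)    # intake rate, L/day (distributed)
BW = rng.normal(70, 10, n)                 # body weight, kg (distributed)
EF, ED, AT = 350, 30, 70 * 365             # days/yr, yr, days

risk = C * IR * EF * ED / (BW * AT) * SF   # full risk distribution

point = C * 1.4 * EF * ED / (70 * AT) * SF # single point estimate
p50, p95 = np.percentile(risk, [50, 95])
print(point, p50, p95)
```

The point estimate yields one number; the simulation additionally yields the spread between, say, the median and the 95th percentile, which is the extra information for risk management the abstract refers to.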
Assessing Power System Stability Following Load Changes and Considering Uncertainty
Directory of Open Access Journals (Sweden)
D. V. Ngo
2018-04-01
Full Text Available An increase in load capacity during the operation of a power system usually causes voltage drop and leads to system instability, so it is necessary to monitor the effect of load changes. This article presents a method of assessing the power system stability according to the load node capacity considering uncertainty factors in the system. The proposed approach can be applied to large-scale power systems for voltage stability assessment in real-time.
Uncertainty models applied to the substation planning
Energy Technology Data Exchange (ETDEWEB)
Fontoura Filho, Roberto N [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)
1994-12-31
The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as endogenous and exogenous. The endogenous uncertainties are associated with the elements of the generation, transmission and distribution systems; the exogenous uncertainties are associated with external aspects, such as the financial resources, the time needed to build the installations, the equipment prices and the load level. The load uncertainty is extremely sensitive to the behaviour of the economic conditions. Although uncertainty cannot be removed completely, the endogenous part can be treated conveniently and the exogenous part can be compensated for. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with the load increase is handled by technical analysis of scenarios and choice criteria based on decision theory. In this paper the Savage method and the fuzzy-set method were used in order to select the best medium-term reinforcement plan. (author) 7 refs., 4 figs., 6 tabs.
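The Savage method mentioned in the abstract is the minimax-regret criterion from decision theory: for each plan, compute its regret (excess cost over the best plan) in every scenario, then pick the plan whose worst-case regret is smallest. A minimal sketch with an illustrative cost matrix (not the LIGHT study's data):

```python
import numpy as np

# Savage (minimax-regret) choice among reinforcement plans under
# load-growth scenarios. Cost values are illustrative.
# Rows: candidate plans A, B, C; columns: low/medium/high load scenarios.
cost = np.array([
    [10, 14, 20],   # plan A
    [12, 13, 16],   # plan B
    [15, 15, 15],   # plan C
])

# Regret = cost minus the best achievable cost in each scenario (column).
regret = cost - cost.min(axis=0)
worst_regret = regret.max(axis=1)       # worst case for each plan
best_plan = int(worst_regret.argmin())  # minimax-regret choice

print(regret)
print(worst_regret, best_plan)  # -> plan B (index 1) here
```

Note how the criterion favors the "hedging" plan B, which is never the cheapest in any single scenario but is never far from the cheapest either.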
Risk Assessment and Decision-Making under Uncertainty in Tunnel and Underground Engineering
Directory of Open Access Journals (Sweden)
Yuanpu Xia
2017-10-01
Full Text Available The impact of uncertainty on risk assessment and decision-making is increasingly being prioritized, especially for large geotechnical projects such as tunnels, where uncertainty is often the main source of risk. Epistemic uncertainty, which can be reduced, is the focus of attention. In this study, the existing entropy-risk decision model is first discussed and analyzed, and its deficiencies are improved upon and overcome. Then, this study addresses the fact that existing studies only consider parameter uncertainty and ignore the influence of the model uncertainty. Here, focus is on the issue of model uncertainty and differences in risk consciousness with different decision-makers. The utility theory is introduced in the model. Finally, a risk decision model is proposed based on the sensitivity analysis and the tolerance cost, which can improve decision-making efficiency. This research can provide guidance or reference for the evaluation and decision-making of complex systems engineering problems, and indicate a direction for further research of risk assessment and decision-making issues.
Measurement, simulation and uncertainty assessment of implant heating during MRI
International Nuclear Information System (INIS)
Neufeld, E; Kuehn, S; Kuster, N; Szekely, G
2009-01-01
The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.
Measurement, simulation and uncertainty assessment of implant heating during MRI
Energy Technology Data Exchange (ETDEWEB)
Neufeld, E; Kuehn, S; Kuster, N [Foundation for Research on Information Technologies in Society (IT' IS), Zeughausstr. 43, 8004 Zurich (Switzerland); Szekely, G [Computer Vision Laboratory, Swiss Federal Institute of Technology (ETHZ), Sternwartstr 7, ETH Zentrum, 8092 Zurich (Switzerland)], E-mail: neufeld@itis.ethz.ch
2009-07-07
The heating of tissues around implants during MRI can pose severe health risks, and careful evaluation is required for leads to be labeled as MR conditionally safe. A recent interlaboratory comparison study has shown that different groups can produce widely varying results (sometimes with more than a factor of 5 difference) when performing measurements according to current guidelines. To determine the related difficulties and to derive optimized procedures, two different generic lead structures have been investigated in this study by using state-of-the-art temperature and dosimetric probes, as well as simulations for which detailed uncertainty budgets have been determined. The agreement between simulations and measurements is well within the combined uncertainty. The study revealed that the uncertainty can be kept below 17% if appropriate instrumentation and procedures are applied. Optimized experimental assessment techniques can be derived from the findings presented herein.
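The "detailed uncertainty budgets" referred to in the two records above are conventionally combined GUM-style, by root-sum-square of the individual standard uncertainty components. A minimal sketch; the component names and values below are illustrative assumptions, not the paper's actual budget:

```python
import math

# GUM-style combination of independent relative standard uncertainties
# in quadrature. Component values are illustrative only.
components = {                       # relative standard uncertainties, %
    "probe calibration": 5.0,
    "probe positioning": 10.0,
    "phantom properties": 6.0,
    "field/source variation": 8.0,
}

combined = math.sqrt(sum(u**2 for u in components.values()))
print(round(combined, 1))  # combined relative standard uncertainty, %
```

With these assumed components the combined uncertainty is 15%, consistent in magnitude with the sub-17% figure the study reports as achievable with appropriate instrumentation and procedures.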
Uncertainty assessment in geodetic network adjustment by combining GUM and Monte-Carlo-simulations
Niemeier, Wolfgang; Tengen, Dieter
2017-06-01
This article presents first ideas for extending the classical concept of geodetic network adjustment by introducing a new method for uncertainty assessment as a two-step analysis. In the first step, the raw data and possible influencing factors are analyzed using uncertainty modeling according to GUM (the Guide to the Expression of Uncertainty in Measurement). This approach is well established in metrology, but rarely adopted within geodesy. The second step consists of Monte Carlo simulations (MC simulations) for the complete processing chain, from raw input data and pre-processing to adjustment computations and quality assessment. To perform these simulations, possible realizations of the raw data and the influencing factors are generated, using probability distributions for all variables and the established concept of pseudo-random number generators. The final result is a point cloud which represents the uncertainty of the estimated coordinates; a confidence region can be assigned to these point clouds as well. This concept may replace the common concept of variance propagation and the quality assessment of adjustment parameters by means of their covariance matrix, and it allows a new way of uncertainty assessment in accordance with the GUM concept for uncertainty modeling and propagation. As a practical example, the local tie network at the Metsähovi Fundamental Station, Finland is used, where classical geodetic observations are combined with GNSS data.
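The two-step idea can be sketched on the simplest possible adjustment: estimating one height from three levelled observations by weighted least squares. Step 1 assigns GUM-style standard uncertainties to the observations; step 2 pushes Monte Carlo realizations through the adjustment and inspects the resulting "point cloud" of estimates. The network and uncertainty values are illustrative, not the Metsähovi data:

```python
import numpy as np

# Toy two-step analysis: GUM-style standard uncertainties on raw
# observations (step 1), then Monte Carlo through the full weighted
# least-squares adjustment (step 2). Values are illustrative.
rng = np.random.default_rng(2)
true_h = 100.0                             # true height, m
sigma = np.array([0.002, 0.003, 0.005])    # std. unc. per observation, m
w = 1.0 / sigma**2                         # weights

n_sim = 50_000
obs = true_h + rng.normal(0.0, sigma, size=(n_sim, 3))
h_hat = (obs * w).sum(axis=1) / w.sum()    # weighted mean per realization

# Empirical spread of the simulated estimates vs. the classical
# variance-propagation result 1/sqrt(sum of weights).
emp = h_hat.std()
analytic = 1.0 / np.sqrt(w.sum())
print(emp, analytic)
```

In this linear, Gaussian toy case the MC spread reproduces classical variance propagation; the benefit of the simulation route appears once non-Gaussian influence factors or nonlinear processing steps enter the chain, where the covariance-matrix shortcut no longer applies.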
Pulles, M.P.J.; Kok, H.; Quass, U.
2006-01-01
This study uses an improved emission inventory model to assess the uncertainties in emissions of dioxins and furans associated with both knowledge on the exact technologies and processes used, and with the uncertainties of both activity data and emission factors. The annual total emissions for the
International Nuclear Information System (INIS)
Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.
1995-01-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
International Nuclear Information System (INIS)
Harper, F.T.; Young, M.L.; Miller, L.A.; Hora, S.C.; Lui, C.H.; Goossens, L.H.J.; Cooke, R.M.; Paesler-Sauer, J.; Helton, J.C.
1995-01-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.
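The validation step described above, sampling elicited parameter distributions and propagating them through the plume model, can be sketched with the standard ground-level centerline Gaussian plume relation C = Q / (π · u · σy · σz). The lognormal distributions below are illustrative stand-ins for the elicited ones, not the project's actual results:

```python
import numpy as np

# Hedged sketch: sample (assumed) elicited dispersion parameters and
# propagate them through a ground-level centerline Gaussian plume model,
# C = Q / (pi * u * sigma_y * sigma_z). Distributions are illustrative.
rng = np.random.default_rng(3)
n = 20_000

Q = 1.0                                    # source term, g/s
u = rng.lognormal(np.log(4.0), 0.2, n)     # wind speed, m/s
sy = rng.lognormal(np.log(80.0), 0.3, n)   # sigma_y at the receptor, m
sz = rng.lognormal(np.log(40.0), 0.3, n)   # sigma_z at the receptor, m

conc = Q / (np.pi * u * sy * sz)           # g/m^3 at ground level

p05, p50, p95 = np.percentile(conc, [5, 50, 95])
print(p05, p50, p95)
```

Comparing percentiles of the propagated output against the aggregated elicited distributions is, in spirit, the consistency check the project teams performed.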
Energy Technology Data Exchange (ETDEWEB)
Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others
1995-01-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data along with short biographies of the 16 experts who participated in the project.
Uncertainty and endogenous technical change in climate policy models
International Nuclear Information System (INIS)
Baker, Erin; Shittu, Ekundayo
2008-01-01
Until recently endogenous technical change and uncertainty have been modeled separately in climate policy models. In this paper, we review the emerging literature that considers both these elements together. Taken as a whole the literature indicates that explicitly including uncertainty has important quantitative and qualitative impacts on optimal climate change technology policy. (author)
Appropriate spatial scales to achieve model output uncertainty goals
Booij, Martijn J.; Melching, Charles S.; Chen, Xiaohong; Chen, Yongqin; Xia, Jun; Zhang, Hailun
2008-01-01
Appropriate spatial scales of hydrological variables were determined using an existing methodology based on a balance in uncertainties from model inputs and parameters extended with a criterion based on a maximum model output uncertainty. The original methodology uses different relationships between
International Nuclear Information System (INIS)
Silva, T.A. da
1988-01-01
A comparison between the uncertainty methods recommended by the International Atomic Energy Agency (IAEA) and by the International Committee for Weights and Measures (CIPM) is presented, for the calibration of clinical dosimeters in a Secondary Standard Dosimetry Laboratory (SSDL). (C.G.C.) [pt
Urban drainage models simplifying uncertainty analysis for practitioners
DEFF Research Database (Denmark)
Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana
2013-01-01
There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here... ...in each measured/observed datapoint; an issue that is commonly overlooked in the uncertainty analysis of urban drainage models. This comparison allows the user to intuitively estimate the optimum number of simulations required to conduct uncertainty analyses. The output of the method includes parameter...
Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor
Directory of Open Access Journals (Sweden)
Jae-Han Park
2012-06-01
Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
Spatial uncertainty model for visual features using a Kinect™ sensor.
Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong
2012-01-01
This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
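The core idea in the two records above, propagating disparity-space uncertainty into Cartesian space through the mapping function, can be sketched with first-order (Jacobian) propagation for the depth component, z = f·b/d. The sensor constants and disparity noise below are illustrative assumptions, not calibrated Kinect™ parameters:

```python
# First-order propagation of disparity uncertainty to depth for a
# stereo/structured-light sensor: z = f*b/d, so sigma_z = |dz/dd|*sigma_d.
# All parameter values are illustrative, not calibrated Kinect constants.
f_px = 580.0        # focal length, pixels (assumed)
base = 0.075        # baseline, meters (assumed)
sigma_d = 0.5       # disparity measurement std. dev., pixels (assumed)

def depth_and_sigma(d_px):
    """Depth z = f*b/d and its propagated standard deviation."""
    z = f_px * base / d_px
    dz_dd = -f_px * base / d_px**2     # Jacobian of z w.r.t. disparity
    return z, abs(dz_dd) * sigma_d

z1, s1 = depth_and_sigma(58.0)         # near feature
z2, s2 = depth_and_sigma(14.5)         # far feature
print(z1, s1, z2, s2)
```

Because sigma_z = (z²/(f·b))·sigma_d, the depth uncertainty grows quadratically with range, which is why the uncertainty ellipsoids in the study stretch along the viewing direction for distant features. The full paper extends this scalar case to the 3×3 covariance matrix of (x, y, z).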
International Nuclear Information System (INIS)
Abrahamse, Augusta; Knox, Lloyd; Schmidt, Samuel; Thorman, Paul; Anthony Tyson, J.; Zhan Hu
2011-01-01
The uncertainty in the redshift distributions of galaxies has a significant potential impact on the cosmological parameter values inferred from multi-band imaging surveys. The accuracy of the photometric redshifts measured in these surveys depends not only on the quality of the flux data, but also on a number of modeling assumptions that enter into both the training set and spectral energy distribution (SED) fitting methods of photometric redshift estimation. In this work we focus on the latter, considering two types of modeling uncertainties: uncertainties in the SED template set and uncertainties in the magnitude and type priors used in a Bayesian photometric redshift estimation method. We find that SED template selection effects dominate over magnitude prior errors. We introduce a method for parameterizing the resulting ignorance of the redshift distributions, and for propagating these uncertainties to uncertainties in cosmological parameters.
Uncertainty of Modal Parameters Estimated by ARMA Models
DEFF Research Database (Denmark)
Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders
In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamic excited structures the modal parameters may be identified and provide important structural knowledge. However the uncertainty of the param...
A Bayesian approach for quantification of model uncertainty
International Nuclear Information System (INIS)
Park, Inseok; Amarchinta, Hemanth K.; Grandhi, Ramana V.
2010-01-01
In most engineering problems, more than one model can be created to represent an engineering system's behavior. Uncertainty is inevitably involved in selecting the best model from among the models that are possible. Uncertainty in model selection cannot be ignored, especially when the differences between the predictions of competing models are significant. In this research, a methodology is proposed to quantify model uncertainty using measured differences between experimental data and model outcomes under a Bayesian statistical framework. The adjustment factor approach is used to propagate model uncertainty into prediction of a system response. A nonlinear vibration system is used to demonstrate the processes for implementing the adjustment factor approach. Finally, the methodology is applied on the engineering benefits of a laser peening process, and a confidence band for residual stresses is established to indicate the reliability of model prediction.
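A loose numerical sketch of the adjustment-factor idea: weight competing models by how well each matches an experiment (a Gaussian likelihood under a Bayesian framework), then report a posterior-weighted prediction widened by the between-model spread. All numbers are illustrative, not the laser-peening application's data:

```python
import numpy as np

# Minimal sketch of Bayesian model-uncertainty quantification with an
# additive adjustment factor. Values are illustrative only.
y_obs, sigma_obs = 10.0, 0.5            # measured response and its std. dev.
y_model = np.array([9.6, 10.3, 11.0])   # predictions of three candidate models
prior = np.array([1/3, 1/3, 1/3])       # equal prior model probabilities

# Posterior model probabilities from a Gaussian likelihood of the data.
like = np.exp(-0.5 * ((y_obs - y_model) / sigma_obs) ** 2)
post = prior * like
post /= post.sum()

# Adjusted prediction: posterior-weighted mean, with the between-model
# variance serving as the adjustment-factor uncertainty.
y_adj = (post * y_model).sum()
var_between = (post * (y_model - y_adj) ** 2).sum()
print(post.round(3), y_adj, np.sqrt(var_between))
```

The between-model variance is what a single-model analysis silently sets to zero; carrying it forward is what produces the confidence band on residual stresses mentioned in the abstract.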
Modeling uncertainty in requirements engineering decision support
Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.
2005-01-01
One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support the elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.
Uncertainty and Risk Assessment in the Design Process for Wind
Energy Technology Data Exchange (ETDEWEB)
Damiani, Rick R. [National Renewable Energy Lab. (NREL), Golden, CO (United States)
2018-02-09
This report summarizes the concepts and opinions that emerged from an initial study on the subject of uncertainty in wind design, which included expert elicitation during a workshop held at the National Wind Technology Center at the National Renewable Energy Laboratory on July 12-13, 2016. In this paper, five major categories of uncertainties are identified. The first category is associated with direct impacts on turbine loads (i.e., the inflow including extreme events, the aero-hydro-servo-elastic response, soil-structure interaction, and load extrapolation). The second category encompasses material behavior and strength. Site suitability and due-diligence aspects pertain to the third category. Calibration of partial safety factors and optimal reliability levels make up the fourth. The fifth category is associated with uncertainties in computational modeling. The main sections of this paper follow this organization.
Szatmári, Gábor; Pásztor, László
2016-04-01
Uncertainty is a general term expressing our imperfect, but acknowledged, knowledge in describing an environmental process (Bárdossy and Fodor, 2004). Sampling, laboratory measurements, models and so on are all subject to uncertainty. Effective quantification and visualization of uncertainty is indispensable to stakeholders (e.g. policy makers, society). Soil-related features and their spatial models should be a particular target of uncertainty assessment, because inferences from them are further used in modelling and decision-making processes. The aim of our present study was to assess and effectively visualize the local uncertainty of the countrywide soil organic matter (SOM) spatial distribution model of Hungary using geostatistical tools and concepts. The Hungarian Soil Information and Monitoring System's SOM data (approximately 1,200 observations) and environmentally related, spatially exhaustive secondary information (i.e. a digital elevation model, climatic maps, MODIS satellite images and a geological map) were used to model the countrywide SOM spatial distribution by regression kriging. It would be common to use the calculated estimation (or kriging) variance as a measure of uncertainty; however, the normality and homoscedasticity hypotheses had to be rejected according to our preliminary analysis of the data. Therefore, a normal score transformation and a sequential stochastic simulation approach were introduced to model and assess the local uncertainty. Five hundred equally probable realizations (i.e. stochastic images) were generated. This number of stochastic images is sufficient to provide a model of uncertainty at each location, which is a complete description of uncertainty in geostatistics (Deutsch and Journel, 1998). Furthermore, these models can be applied, for example, to contour the probability of any event; such maps can be regarded as goal-oriented digital soil maps and are of interest for agricultural management and decision making as well. A
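The final step described above, turning a stack of equally probable realizations into a probability (exceedance) map, is a simple per-cell frequency count. A minimal sketch; the synthetic 500 × (20 × 20) stack and the threshold stand in for the real simulated SOM fields:

```python
import numpy as np

# From stochastic simulation to a probability map: with N equally
# probable realizations, P(SOM > threshold) at a cell is simply the
# fraction of realizations exceeding the threshold there.
# The synthetic lognormal stack is a stand-in for real SGS output.
rng = np.random.default_rng(4)
realizations = rng.lognormal(np.log(2.0), 0.4, size=(500, 20, 20))

threshold = 2.5                                        # SOM %, illustrative
prob_exceed = (realizations > threshold).mean(axis=0)  # per-cell probability

print(prob_exceed.shape, prob_exceed.min(), prob_exceed.max())
```

Contouring `prob_exceed` yields exactly the kind of goal-oriented digital soil map the abstract mentions, e.g. "areas where SOM exceeds 2.5% with at least 90% probability".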
On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori
International Nuclear Information System (INIS)
Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer
2006-01-01
Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in bias in predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for the Nevada and Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling, by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of the DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties.
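The weighting step in MLBMA has a compact form: posterior model probabilities are proportional to prior · exp(−ΔIC/2), where ΔIC is each model's information criterion value minus the minimum over all models. A minimal sketch; the KIC values, priors, and head predictions below are illustrative, not the DVRFS calibration results:

```python
import numpy as np

# MLBMA-style posterior model weights from an information criterion:
# p(M_k | D) ∝ p(M_k) * exp(-Delta_IC_k / 2). Values are illustrative.
kic = np.array([1052.0, 1049.0, 1060.0, 1055.0, 1050.0])  # five recharge models
prior = np.full(5, 0.2)                                   # elicited priors

delta = kic - kic.min()
post = prior * np.exp(-0.5 * delta)
post /= post.sum()

# Posterior mean and variance of a head prediction, averaged over models.
head = np.array([730.0, 735.0, 728.0, 741.0, 733.0])      # per-model predictions, m
mean = (post * head).sum()
var_between = (post * (head - mean) ** 2).sum()
print(post.round(3), mean, var_between)
```

The exponential makes the weights sharply sensitive to IC differences: a model only a few KIC units above the best receives a much smaller posterior probability, which is how the criterion discriminates among the five recharge models.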
Energy Technology Data Exchange (ETDEWEB)
Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others
1995-01-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
International Nuclear Information System (INIS)
Harper, F.T.; Young, M.L.; Miller, L.A.
1995-01-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.
DEFF Research Database (Denmark)
Minsley, Burke; Christensen, Nikolaj Kruse; Christensen, Steen
of airborne electromagnetic (AEM) data to estimate large-scale model structural geometry, i.e. the spatial distribution of different lithological units based on assumed or estimated resistivity-lithology relationships, and the uncertainty in those structures given imperfect measurements. Geophysically derived...... estimates of model structural uncertainty are then combined with hydrologic observations to assess the impact of model structural error on hydrologic calibration and prediction errors. Using a synthetic numerical model, we describe a sequential hydrogeophysical approach that: (1) uses Bayesian Markov chain...... Monte Carlo (McMC) methods to produce a robust estimate of uncertainty in electrical resistivity parameter values, (2) combines geophysical parameter uncertainty estimates with borehole observations of lithology to produce probabilistic estimates of model structural uncertainty over the entire AEM...
Accounting for multiple sources of uncertainty in impact assessments: The example of the BRACE study
O'Neill, B. C.
2015-12-01
Assessing climate change impacts often requires the use of multiple scenarios, types of models, and data sources, leading to a large number of potential sources of uncertainty. For example, a single study might require a choice of a forcing scenario, climate model, bias correction and/or downscaling method, societal development scenario, model (typically several) for quantifying elements of societal development such as economic and population growth, biophysical model (such as for crop yields or hydrology), and societal impact model (e.g. economic or health model). Some sources of uncertainty are reduced or eliminated by the framing of the question. For example, it may be useful to ask what an impact outcome would be conditional on a given societal development pathway, forcing scenario, or policy. However, many sources of uncertainty remain, and it is rare for all or even most of these sources to be accounted for. I use the example of a recent integrated project on the Benefits of Reduced Anthropogenic Climate changE (BRACE) to explore useful approaches to uncertainty across multiple components of an impact assessment. BRACE comprises 23 papers that assess the differences in impacts between two alternative climate futures: those associated with Representative Concentration Pathways (RCPs) 4.5 and 8.5. It quantifies differences in impacts in terms of extreme events, health, agriculture, tropical cyclones, and sea level rise. Methodologically, it includes climate modeling, statistical analysis, integrated assessment modeling, and sector-specific impact modeling. It employs alternative scenarios of both radiative forcing and societal development, but generally uses a single climate model (CESM), partially accounting for climate uncertainty by drawing heavily on large initial condition ensembles. Strengths and weaknesses of the approach to uncertainty in BRACE are assessed. Options under consideration for improving the approach include the use of perturbed physics
Performance Assessment Uncertainty Analysis for Japan's HLW Program Feasibility Study (H12)
International Nuclear Information System (INIS)
BABA, T.; ISHIGURO, K.; ISHIHARA, Y.; SAWADA, A.; UMEKI, H.; WAKASUGI, K.; WEBB, ERIK K.
1999-01-01
Most HLW programs in the world recognize that any estimate of long-term radiological performance must be couched in terms of the uncertainties derived from natural variation, changes through time and lack of knowledge about the essential processes. The Japan Nuclear Cycle Development Institute followed a relatively standard procedure to address two major categories of uncertainty. First, a FEatures, Events and Processes (FEPs) listing, screening and grouping activity was pursued in order to define the range of uncertainty in system processes as well as possible variations in engineering design. A reference and many alternative cases representing various groups of FEPs were defined and individual numerical simulations performed for each to quantify the range of conceptual uncertainty. Second, parameter distributions were developed for the reference case to represent the uncertainty in the strength of these processes, the sequencing of activities and geometric variations. Both point estimates using high and low values for individual parameters as well as a probabilistic analysis were performed to estimate parameter uncertainty. A brief description of the conceptual model uncertainty analysis is presented. This paper focuses on presenting the details of the probabilistic parameter uncertainty assessment
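The two-track parameter uncertainty assessment described above, point estimates using high and low parameter values alongside a probabilistic analysis, can be sketched with a toy performance measure. The release/dilution model and all numbers below are invented for illustration and are not taken from the H12 study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy performance measure: release rate divided by a
# dilution factor -- a stand-in for a far more complex simulator.
def performance(release, dilution):
    return release / dilution

# Point estimates: combine optimistic and pessimistic parameter values.
low  = performance(release=1e-3, dilution=1e4)   # optimistic bound
high = performance(release=1e-1, dilution=1e2)   # pessimistic bound

# Probabilistic analysis: sample parameter distributions and propagate.
release  = rng.lognormal(mean=np.log(1e-2), sigma=1.0, size=10_000)
dilution = rng.lognormal(mean=np.log(1e3),  sigma=0.5, size=10_000)
samples  = performance(release, dilution)

p05, p50, p95 = np.percentile(samples, [5, 50, 95])
print(f"point-estimate range: [{low:.2e}, {high:.2e}]")
print(f"probabilistic 5-50-95: {p05:.2e} {p50:.2e} {p95:.2e}")
```

The probabilistic run conveys how likely intermediate outcomes are, which the bare high/low bracket cannot.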
Energy Technology Data Exchange (ETDEWEB)
Olea, Ricardo A.; Luppens, James A.; Tewalt, Susan J. [U.S. Geological Survey, Reston, VA (United States)
2011-01-01
A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. (author)
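The finding that distance to the nearest drill hole is a poor proxy for uncertainty can be illustrated with a simple-kriging variance calculation: two drilling layouts with the same nearest-hole distance yield different estimation variances. The exponential covariance parameters and hole coordinates below are hypothetical:

```python
import numpy as np

# Exponential covariance model: sill * exp(-h / range_)
def cov(h, sill=1.0, range_=500.0):
    return sill * np.exp(-h / range_)

def kriging_variance(x0, data_xy, sill=1.0, range_=500.0):
    """Simple-kriging variance at x0 given data locations (n x 2)."""
    d = np.linalg.norm(data_xy - x0, axis=1)              # data-to-target
    H = np.linalg.norm(data_xy[:, None] - data_xy[None, :], axis=2)
    w = np.linalg.solve(cov(H, sill, range_), cov(d, sill, range_))
    return sill - w @ cov(d, sill, range_)

# Two layouts with the SAME nearest-hole distance (100 m) from the target:
clustered = np.array([[100.0, 0.0], [110.0, 0.0], [120.0, 0.0]])
spread    = np.array([[100.0, 0.0], [0.0, 100.0], [-100.0, 0.0]])

v_clustered = kriging_variance(np.zeros(2), clustered)
v_spread    = kriging_variance(np.zeros(2), spread)
print(v_clustered, v_spread)  # spread holes constrain the estimate better
```

Clustered holes are largely redundant, so the variance stays higher even though the nearest-hole distance is identical; this is the spatial-correlation effect that distance classes ignore.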
Olea, R.A.; Luppens, J.A.; Tewalt, S.J.
2011-01-01
A common practice for characterizing uncertainty in coal resource assessments has been the itemization of tonnage at the mining unit level and the classification of such units according to distance to drilling holes. Distance criteria, such as those used in U.S. Geological Survey Circular 891, are still widely used for public disclosure. A major deficiency of distance methods is that they do not provide a quantitative measure of uncertainty. Additionally, relying on distance between data points alone does not take into consideration other factors known to have an influence on uncertainty, such as spatial correlation, type of probability distribution followed by the data, geological discontinuities, and boundary of the deposit. Several geostatistical methods have been combined to formulate a quantitative characterization for appraising uncertainty. Drill hole datasets ranging from widespread exploration drilling to detailed development drilling from a lignite deposit in Texas were used to illustrate the modeling. The results show that distance to the nearest drill hole is almost completely unrelated to uncertainty, which confirms the inadequacy of characterizing uncertainty based solely on a simple classification of resources by distance classes. The more complex statistical methods used in this study quantify uncertainty and show good agreement between confidence intervals in the uncertainty predictions and data from additional drilling. © 2010.
Reservoir management under geological uncertainty using fast model update
Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.
2015-01-01
Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU
Energy Technology Data Exchange (ETDEWEB)
Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Energy Technology Data Exchange (ETDEWEB)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1997-12-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.
Energy Technology Data Exchange (ETDEWEB)
Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)
1998-04-01
The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequence from the accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distribution for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.
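A common way to aggregate elicited expert distributions, as in these joint NRC/CEC studies, is an equal-weight mixture: each expert's distribution is sampled with the same probability. The expert quantiles and the lognormal fit below are illustrative assumptions, not values from the elicitation panels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 5%/50%/95% quantiles elicited from three experts for some
# positive-valued consequence parameter (illustrative numbers only).
experts = {
    "A": (1e-4, 1e-3, 1e-2),
    "B": (5e-4, 2e-3, 8e-3),
    "C": (1e-5, 5e-4, 5e-3),
}

def fit_lognormal(q05, q50, q95):
    """Fit a lognormal to the median and the 5-95 span (z_0.95 = 1.645)."""
    mu = np.log(q50)
    sigma = (np.log(q95) - np.log(q05)) / (2 * 1.645)
    return mu, sigma

# Equal-weight aggregation: draw the expert index uniformly, then sample
# that expert's fitted distribution -- a simple mixture model.
n = 30_000
params = [fit_lognormal(*q) for q in experts.values()]
choice = rng.integers(0, len(params), size=n)
mus = np.array([p[0] for p in params])[choice]
sigmas = np.array([p[1] for p in params])[choice]
agg = rng.lognormal(mean=mus, sigma=sigmas)

print("aggregated 5-50-95:", np.percentile(agg, [5, 50, 95]))
```

The mixture typically has wider tails than any single expert's distribution, which is how disagreement between experts shows up in the aggregated uncertainty.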
Rohmer, Jeremy; Verdel, Thierry
2017-04-01
Uncertainty analysis is an unavoidable task in the stability analysis of any geotechnical system. Such analysis usually relies on the safety factor SF (if SF is below some specified threshold, failure is possible). The objective of the stability analysis is then to estimate the failure probability P for SF to be below the specified threshold. When dealing with uncertainties, two facets should be considered, as outlined by several authors in the domain of geotechnics, namely "aleatoric uncertainty" (also named "randomness" or "intrinsic variability") and "epistemic uncertainty" (i.e. when facing "vague, incomplete or imprecise information" such as limited databases and observations or "imperfect" modelling). The benefits of separating both facets of uncertainty can be seen from a risk management perspective because: - Aleatoric uncertainty, being a property of the system under study, cannot be reduced. However, practical actions can be taken to circumvent the potentially dangerous effects of such variability; - Epistemic uncertainty, being due to the incomplete/imprecise nature of available information, can be reduced by e.g., increasing the number of tests (lab tests or in situ surveys), improving the measurement methods or evaluating calculation procedures with model tests, or confronting more information sources (expert opinions, data from literature, etc.). Uncertainty treatment in stability analysis is usually restricted to the probabilistic framework to represent both facets of uncertainty. Yet, in the domain of geo-hazard assessments (like landslides, mine pillar collapse, rockfalls, etc.), the validity of this approach can be debatable. In the present communication, we propose to review the major criticisms available in the literature against the systematic use of probability in situations with a high degree of uncertainty. On this basis, the feasibility of using a more flexible uncertainty representation tool is then investigated, namely Possibility distributions (e
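A possibility distribution of the kind investigated here can be sketched as a triangular fuzzy number with its alpha-cuts (nested intervals of values whose possibility is at least alpha). The support and core values for SF below are invented for illustration:

```python
# Triangular possibility distribution for the safety factor SF:
# core (fully possible value) at 1.3, support [1.0, 1.8] -- hypothetical.
def possibility(x, support=(1.0, 1.8), core=1.3):
    a, b = support
    if x <= a or x >= b:
        return 0.0
    return (x - a) / (core - a) if x <= core else (b - x) / (b - core)

def alpha_cut(alpha, support=(1.0, 1.8), core=1.3):
    """Interval of SF values with possibility >= alpha."""
    a, b = support
    return (a + alpha * (core - a), b - alpha * (b - core))

# Alpha-cuts shrink from the full support (alpha=0) to the core (alpha=1);
# failure is then bracketed by a [necessity, possibility] pair instead of
# a single probability P.
for alpha in (0.0, 0.5, 1.0):
    print(alpha, alpha_cut(alpha))
```

Compared with a single probability distribution, the nested intervals make explicit how little is actually known when information is vague or imprecise.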
Plasticity models of material variability based on uncertainty quantification techniques
Energy Technology Data Exchange (ETDEWEB)
Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)
2017-11-01
The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments, using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and show how these UQ techniques can be used in model selection and in assessing the quality of calibrated physical parameters.
Flood risk assessment and robust management under deep uncertainty: Application to Dhaka City
Mojtahed, Vahid; Gain, Animesh Kumar; Giupponi, Carlo
2014-05-01
Socio-economic as well as climatic changes have been the main drivers of uncertainty in environmental risk assessment, and in particular in flood risk assessment. The uncertainty that researchers face when dealing with such problems in a future perspective, with a focus on climate change, is known as deep uncertainty (also known as Knightian uncertainty): nobody has experienced those changes before, our knowledge is limited to the extent that we have no notion of probabilities, and consolidated risk management approaches therefore have limited potential. Deep uncertainty refers to circumstances in which analysts and experts do not know, or parties to a decision cannot agree on: i) the appropriate models describing the interactions among system variables, ii) the probability distributions to represent uncertainty about key parameters in the models, and iii) how to value the desirability of alternative outcomes. The need thus emerges to assist policy-makers by providing them not with a single optimal solution to the problem at hand, such as crisp estimates for the costs of damages of the natural hazards considered, but instead with ranges of possible future costs, based on the outcomes of ensembles of assessment models and sets of plausible scenarios. Accordingly, we need to replace optimality as a decision criterion with robustness. Under conditions of deep uncertainty, decision-makers do not have statistical and mathematical bases to identify optimal solutions; instead they should prefer to implement "robust" decisions that perform relatively well over all conceivable outcomes of all future unknown scenarios. Under deep uncertainty, analysts cannot employ probability theory or other statistics that usually can be derived from observed historical data, and therefore we turn to non-statistical measures such as scenario analysis. We construct several plausible scenarios, with each scenario being a full description of what may happen
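Robustness as a replacement for optimality can be made concrete with a minimax-regret rule over scenarios, one of the standard robust decision criteria when no probabilities are available. The options, scenarios, and cost figures below are hypothetical and are not from the Dhaka case study:

```python
import numpy as np

# Hypothetical flood-damage plus protection costs (million USD) for three
# options under four plausible scenarios; no probabilities are assigned.
costs = np.array([
    # S1    S2    S3    S4
    [10.0, 40.0, 90.0, 150.0],  # do nothing
    [30.0, 35.0, 50.0,  70.0],  # levee upgrade
    [55.0, 55.0, 60.0,  65.0],  # major embankment
])

# Minimax regret: for each scenario, regret is the gap to the best option
# in that scenario; pick the option whose worst-case regret is smallest.
regret = costs - costs.min(axis=0)     # per-scenario regret
worst_regret = regret.max(axis=1)      # worst case for each option
robust_choice = int(np.argmin(worst_regret))
print(worst_regret, "-> robust option:", robust_choice)
```

Note that the robust choice (the levee upgrade here) is not the cheapest option in any single scenario; it is the one that never performs very badly, which is exactly the substitution of robustness for optimality described above.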
Incorporating parametric uncertainty into population viability analysis models
McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.
2011-01-01
Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
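The two-loop simulation structure described above, parametric uncertainty drawn in the outer replication loop and temporal variance applied in the inner time-step loop, can be sketched as follows. The growth-rate and abundance numbers are illustrative, not piping plover estimates:

```python
import numpy as np

rng = np.random.default_rng(2)

n_reps, n_years = 1000, 50
n0, quasi_ext = 200.0, 25.0      # initial abundance, quasi-extinction level

extinct = 0
for _ in range(n_reps):
    # Outer (replication) loop: draw the mean growth rate from its
    # sampling distribution -- this is the parametric uncertainty.
    mean_r = rng.normal(-0.01, 0.02)
    sd_r = 0.10                   # temporal (environmental) variance
    n = n0
    for _ in range(n_years):
        # Inner loop: year-to-year environmental stochasticity.
        n *= np.exp(rng.normal(mean_r, sd_r))
    extinct += n < quasi_ext

print("quasi-extinction risk:", extinct / n_reps)
```

Dropping the outer draw (fixing `mean_r` at its point estimate) collapses the replicate-to-replicate spread and typically understates extinction risk, which is the effect the abstract reports.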
Discussion of OECD LWR Uncertainty Analysis in Modelling Benchmark
International Nuclear Information System (INIS)
Ivanov, K.; Avramova, M.; Royer, E.; Gillford, J.
2013-01-01
The demand for best estimate calculations in nuclear reactor design and safety evaluations has increased in recent years. Uncertainty quantification has been highlighted as part of the best estimate calculations. The modelling aspects of uncertainty and sensitivity analysis are to be further developed and validated on scientific grounds in support of their performance and application to multi-physics reactor simulations. The Organization for Economic Co-operation and Development (OECD) / Nuclear Energy Agency (NEA) Nuclear Science Committee (NSC) has endorsed the creation of an Expert Group on Uncertainty Analysis in Modelling (EGUAM). Within the framework of activities of EGUAM/NSC, the OECD/NEA initiated the Benchmark for Uncertainty Analysis in Modelling for Design, Operation, and Safety Analysis of Light Water Reactors (OECD LWR UAM benchmark). The general objective of the benchmark is to propagate the predictive uncertainties of code results through complex coupled multi-physics and multi-scale simulations. The benchmark is divided into three phases, with Phase I highlighting the uncertainty propagation in stand-alone neutronics calculations, while Phases II and III are focused on uncertainty analysis of the reactor core and system, respectively. This paper discusses the progress made in Phase I calculations, the specifications for Phase II, and the upcoming challenges in defining the Phase III exercises. The main challenges of applying uncertainty quantification to complex code systems, in particular time-dependent coupled-physics models, are the large computational burden and the utilization of non-linear models (expected due to the physics coupling). (authors)
How uncertainty in socio-economic variables affects large-scale transport model forecasts
DEFF Research Database (Denmark)
Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo
2015-01-01
A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, by modelling complex systems, transport models have an inherent uncertainty which increases over time. As a consequence, the longer...... the period forecasted the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature only few studies analyze uncertainty propagation patterns over...
Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model
Energy Technology Data Exchange (ETDEWEB)
Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.
2001-11-09
Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.
Bayesian analysis for uncertainty estimation of a canopy transpiration model
Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.
2007-04-01
A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.
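The approach of extending a deterministic model with a normally distributed error term and sampling the posterior with Markov chain Monte Carlo can be sketched with a random-walk Metropolis sampler. The saturating toy model below is a stand-in for the Penman-Monteith/Jarvis model, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic data from a saturating response y = a*x/(b+x) plus normal
# error; (a, b, sigma) are the unknowns (true values 2.0, 1.5, 0.05).
x = np.linspace(0.1, 10, 80)
y = 2.0 * x / (1.5 + x) + rng.normal(0, 0.05, x.size)

def log_post(theta):
    """Log-posterior: Gaussian likelihood, flat priors on the valid region."""
    a, b, s = theta
    if b <= 0 or s <= 0:
        return -np.inf
    resid = y - a * x / (b + x)
    return -x.size * np.log(s) - 0.5 * np.sum(resid**2) / s**2

# Random-walk Metropolis over (a, b, sigma).
theta = np.array([1.0, 1.0, 0.2])
lp = log_post(theta)
chain = []
for _ in range(5000):
    prop = theta + rng.normal(0, [0.05, 0.1, 0.01])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis acceptance
        theta, lp = prop, lp_prop
    chain.append(theta)
chain = np.array(chain)[1000:]                 # discard burn-in

print("posterior means:", chain.mean(axis=0))
```

The retained chain gives joint posterior samples, so parameter and prediction uncertainties fall out of the same simulation, as in the study; residual checks against the normality assumption are done on `y` minus the fitted curve.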
Uncertainty Assessment of Space-Borne Passive Soil Moisture Retrievals
Quets, Jan; De Lannoy, Gabrielle; Reichle, Rolf; Cosh, Michael; van der Schalie, Robin; Wigneron, Jean-Pierre
2017-01-01
The uncertainty associated with passive soil moisture retrieval is hard to quantify, and known to be underlain by various, diverse, and complex causes. Factors affecting space-borne retrieved soil moisture estimation include: (i) the optimization or inversion method applied to the radiative transfer model (RTM), such as the Single Channel Algorithm (SCA) or the Land Parameter Retrieval Model (LPRM), (ii) the selection of the observed brightness temperatures (Tbs), e.g. polarization and incidence angle, (iii) the definition of the cost function and the impact of prior information in it, and (iv) the RTM parameterization (e.g. parameterizations officially used by the SMOS L2 and SMAP L2 retrieval products, the ECMWF-based SMOS assimilation product, the SMAP L4 assimilation product, and perturbations from those configurations). This study aims at disentangling the relative importance of the above-mentioned sources of uncertainty, by carrying out soil moisture retrieval experiments, using SMOS Tb observations in different settings, of which some are mentioned above. The ensemble uncertainties are evaluated at 11 reference CalVal sites, over a time period of more than 5 years. These experimental retrievals were inter-compared, and further confronted with in situ soil moisture measurements and operational SMOS L2 retrievals, using commonly used skill metrics to quantify the temporal uncertainty in the retrievals.
Modelling geological uncertainty for mine planning
Energy Technology Data Exchange (ETDEWEB)
Mitchell, M
1980-07-01
Geosimplan is an operational gaming approach used in testing a proposed mining strategy against uncertainty in geological disturbance. Geoplan is a technique which facilitates the preparation of summary analyses to give an impression of size, distribution and quality of reserves, and to assist in calculation of year by year output estimates. Geoplan concentrates on variations in seam properties and the interaction between geological information and marketing and output requirements.
Development of a Prototype Model-Form Uncertainty Knowledge Base
Green, Lawrence L.
2016-01-01
Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.
Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark
2011-01-01
Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Techniques for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.
Evaluation of uncertainty in geological framework models at Yucca Mountain, Nevada
International Nuclear Information System (INIS)
Bagtzoglou, A.C.; Stirewalt, G.L.; Henderson, D.B.; Seida, S.B.
1995-01-01
The first step towards determining compliance with the performance objectives for both the repository system and the geologic setting at Yucca Mountain requires the development of detailed geostratigraphic models. This paper proposes an approach for the evaluation of the degree of uncertainty inherent in geologic maps and associated three-dimensional geological models. Following this approach, an assessment of accuracy and completeness of the data and evaluation of conceptual uncertainties in the geological framework models can be performed.
A Model-Free Definition of Increasing Uncertainty
Grant, S.; Quiggin, J.
2001-01-01
We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences.Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a
Improved Wave-vessel Transfer Functions by Uncertainty Modelling
DEFF Research Database (Denmark)
Nielsen, Ulrik Dam; Fønss Bach, Kasper; Iseki, Toshio
2016-01-01
This paper deals with uncertainty modelling of wave-vessel transfer functions used to calculate or predict wave-induced responses of a ship in a seaway. Although transfer functions, in theory, can be calculated to exactly reflect the behaviour of the ship when exposed to waves, uncertainty in inp...
Assessing measurement uncertainty in meteorology in urban environments
International Nuclear Information System (INIS)
Curci, S; Lavecchia, C; Frustaci, G; Pilati, S; Paganelli, C; Paolini, R
2017-01-01
Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer. (paper)
On economic resolution and uncertainty in hydrocarbon exploration assessment
International Nuclear Information System (INIS)
Lerche, I.
1998-01-01
When assessment of parameters of a decision tree for a hydrocarbon exploration project can lie within estimated ranges, it is shown that the ensemble average expected value has two sorts of uncertainties: one is due to the expected value of each realization of the decision tree being different than the average; the second is due to intrinsic variance of each decision tree. The total standard error of the average expected value combines both sorts. The use of additional statistical measures, such as standard error, volatility, and cumulative probability of making a profit, provide insight into the selection process leading to a more appropriate decision. In addition, the use of relative contributions and relative importance for the uncertainty measures guides one to a better determination of those parameters that dominantly influence the total ensemble uncertainty. In this way one can concentrate resources on efforts to minimize the uncertainty ranges of such dominant parameters. A numerical illustration is provided to indicate how such calculations can be performed simply with a hand calculator. (author)
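The two-component decomposition described above can be sketched numerically with a two-branch exploration decision tree; the Monte Carlo ensemble stands in for the paper's hand calculation, and all parameter ranges and payoff figures are hypothetical:

```python
import random

def decision_tree_ev(p_success, value, cost):
    """Expected value and intrinsic variance of a two-branch exploration bet."""
    ev = p_success * value - cost
    var = p_success * (1.0 - p_success) * value ** 2  # Bernoulli payoff variance
    return ev, var

random.seed(1)
# Draw each realization's parameters from their estimated ranges (hypothetical)
realizations = [decision_tree_ev(random.uniform(0.1, 0.3),     # success chance
                                 random.uniform(80.0, 120.0),  # success value
                                 random.uniform(10.0, 20.0))   # exploration cost
                for _ in range(10000)]
evs = [ev for ev, _ in realizations]
mean_ev = sum(evs) / len(evs)
# First sort of uncertainty: spread of per-realization expected values
var_between = sum((ev - mean_ev) ** 2 for ev in evs) / len(evs)
# Second sort: average intrinsic variance of each decision tree
var_within = sum(v for _, v in realizations) / len(realizations)
total_se = (var_between + var_within) ** 0.5
print(round(mean_ev, 2), round(total_se, 2))
```

The total standard error combines both components, as the abstract describes; here the intrinsic variance dominates the ensemble spread.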
Assessing measurement uncertainty in meteorology in urban environments
Curci, S.; Lavecchia, C.; Frustaci, G.; Paolini, R.; Pilati, S.; Paganelli, C.
2017-10-01
Measurement uncertainty in meteorology has been addressed in a number of recent projects. In urban environments, uncertainty is also affected by local effects which are more difficult to deal with than for synoptic stations. In Italy, beginning in 2010, an urban meteorological network (Climate Network®) was designed, set up and managed at national level according to high metrological standards and homogeneity criteria to support energy applications. The availability of such a high-quality operative automatic weather station network represents an opportunity to investigate the effects of station siting and sensor exposure and to estimate the related measurement uncertainty. An extended metadata set was established for the stations in Milan, including siting and exposure details. Statistical analysis on an almost 3-year-long operational period assessed network homogeneity, quality and reliability. Deviations from reference mean values were then evaluated in selected low-gradient local weather situations in order to investigate siting and exposure effects. In this paper the methodology is depicted and preliminary results of its application to air temperature discussed; this allowed the setting of an upper limit of 1 °C for the added measurement uncertainty at the top of the urban canopy layer.
Uncertainty Estimate in Resources Assessment: A Geostatistical Contribution
International Nuclear Information System (INIS)
Souza, Luis Eduardo de; Costa, Joao Felipe C. L.; Koppe, Jair C.
2004-01-01
For many decades the mining industry regarded resources/reserves estimation and classification as a mere calculation requiring basic mathematical and geological knowledge. Most methods were based on geometrical procedures and spatial data distribution. Therefore, uncertainty associated with tonnages and grades was either ignored or mishandled, although various mining codes require a measure of confidence in the values reported. Traditional methods fail in reporting the level of confidence in the quantities and grades. Conversely, kriging is known to provide the best estimate and its associated variance. Among kriging methods, Ordinary Kriging (OK) is probably the most widely used one for mineral resource/reserve estimation, mainly because of its robustness and its facility in uncertainty assessment by using the kriging variance. It is also known that the OK variance is unable to recognize local data variability, an important issue when heterogeneous mineral deposits with higher and poorer grade zones are being evaluated. Alternatively, stochastic simulations are used to build local or global uncertainty about a geological attribute while respecting its statistical moments. This study investigates methods capable of incorporating uncertainty into the estimates of resources and reserves via OK and sequential Gaussian and sequential indicator simulation. The results showed that for the type of mineralization studied all methods classified the tonnages similarly. The methods are illustrated using an exploration drill hole data set from a large Brazilian coal deposit.
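The kriging variance mentioned above falls out of the ordinary kriging system directly. A minimal sketch, assuming an exponential semivariogram with hypothetical sill and range (not parameters from the study):

```python
import numpy as np

def gamma(h, sill=1.0, rng=50.0):
    """Exponential semivariogram (hypothetical sill and range)."""
    return sill * (1.0 - np.exp(-np.asarray(h) / rng))

def ordinary_kriging(xy, z, x0):
    """Return the OK estimate and kriging (error) variance at location x0."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    # OK system: semivariogram matrix bordered by the unbiasedness constraint
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    d0 = np.linalg.norm(xy - x0, axis=-1)
    b = np.append(gamma(d0), 1.0)
    sol = np.linalg.solve(A, b)
    w, mu = sol[:n], sol[n]        # kriging weights and Lagrange multiplier
    est = w @ z
    var = w @ gamma(d0) + mu       # kriging variance at x0
    return est, var

xy = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
z = np.array([1.0, 2.0, 3.0])
est, var = ordinary_kriging(xy, z, np.array([5.0, 5.0]))
print(est, var)
```

Note that `var` depends only on the data geometry and the variogram, not on the data values — exactly the insensitivity to local data variability the abstract criticizes.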
Environmental impact and risk assessments and key factors contributing to the overall uncertainties.
Salbu, Brit
2016-01-01
There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, in the short or long term after deposition, or prior to and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors contribute to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines activity concentrations and atom ratios of the radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short-term exposures of adult organisms
Bayesian models for comparative analysis integrating phylogenetic uncertainty
Directory of Open Access Journals (Sweden)
Villemereuil Pierre de
2012-06-01
Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible
Bayesian models for comparative analysis integrating phylogenetic uncertainty
2012-01-01
Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for
Uncertainty analysis of hydrological modeling in a tropical area using different algorithms
Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh
2018-01-01
Hydrological modeling outputs are subject to uncertainty arising from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty imperative in order to improve the reliability of modeling results. Uncertainty analysis must also address the difficulties of calibrating hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in central Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), the Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, coefficient of determination (R²), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO with P-factor>0.83, R-factor 0.91, NSE>0.89, and 0.18
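The GLUE algorithm named above can be sketched with a toy linear-reservoir model standing in for SWAT; the parameter range, likelihood measure (NSE), and behavioral threshold below are illustrative assumptions, not the study's settings:

```python
import random

def toy_model(k, rain):
    """Hypothetical linear-reservoir stand-in for the hydrological model."""
    store, flow = 0.0, []
    for r in rain:
        store += r
        q = k * store      # outflow proportional to storage
        store -= q
        flow.append(q)
    return flow

def nse(obs, sim):
    """Nash-Sutcliffe efficiency, used here as the GLUE likelihood measure."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

random.seed(0)
rain = [random.uniform(0.0, 10.0) for _ in range(50)]
obs = toy_model(0.3, rain)  # synthetic "observations" with known k = 0.3

# GLUE: sample parameter sets, keep 'behavioral' ones above a threshold
samples = [(k, nse(obs, toy_model(k, rain)))
           for k in (random.uniform(0.05, 0.6) for _ in range(2000))]
behavioral = [(k, e) for k, e in samples if e > 0.7]
lo = min(k for k, _ in behavioral)
hi = max(k for k, _ in behavioral)
print(len(behavioral), round(lo, 3), round(hi, 3))
```

The spread of behavioral parameter values [lo, hi] is GLUE's expression of parameter uncertainty; prediction bounds (and the P-factor/R-factor diagnostics) follow from the behavioral simulations.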
Statistical Uncertainty Quantification of Physical Models during Reflood of LBLOCA
Energy Technology Data Exchange (ETDEWEB)
Oh, Deog Yeon; Seul, Kwang Won; Woo, Sweng Woong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)
2015-05-15
The use of best-estimate (BE) computer codes in safety analysis for loss-of-coolant accidents (LOCA) is the major trend in many countries to reduce the significant conservatism. A key feature of this BE evaluation requires the licensee to quantify the uncertainty of the calculations. It is therefore very important how the uncertainty distributions are determined before conducting the uncertainty evaluation. Uncertainty includes that of physical models and correlations, plant operational parameters, and so forth. The quantification process is often performed mainly by subjective expert judgment or obtained from reference documents of the computer code. In this respect, more mathematical methods are needed to reasonably determine the uncertainty ranges. The first uncertainty quantification is performed with various increments for two influential uncertainty parameters to obtain the calculated responses and their derivatives. The different data sets with two influential uncertainty parameters for the FEBA tests are chosen by applying stricter criteria for selecting responses and their derivatives, which may be considered as the user's effect in the CIRCÉ applications. Finally, three influential uncertainty parameters are considered to study the effect of the number of uncertainty parameters, given the limitations of the CIRCÉ method. With the determined uncertainty ranges, uncertainty evaluations for the FEBA tests are performed to check whether the experimental responses such as the cladding temperature or pressure drop are inside the limits of the calculated uncertainty bounds. A confirmation step will be performed to evaluate the quality of the information in the case of the different reflooding PERICLES experiments. The uncertainty ranges of physical models in the MARS-KS thermal-hydraulic code during reflooding were quantified by the CIRCÉ method using the FEBA experiment tests, instead of expert judgment. Also, through the uncertainty evaluation for the FEBA and PERICLES tests, it was confirmed
Uncertainty analysis on probabilistic fracture mechanics assessment methodology
International Nuclear Information System (INIS)
Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.
1999-01-01
Fracture Mechanics has found profound usage in the design of components and in assessing fitness for purpose and residual life of operating components. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo simulation is one such procedure for the analysis of fracture probability. This paper deals with uncertainty analysis using Monte Carlo simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
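A hedged sketch of the Monte Carlo procedure for fracture probability: defect size and toughness are sampled from their distributions and the failure fraction is counted. The distributions, stress level, and surface-crack stress-intensity solution below are illustrative assumptions, not taken from the paper (which uses the full R6 assessment):

```python
import math
import random

def fracture_probability(n=100000, stress=400.0, seed=42):
    """Monte Carlo estimate of fracture probability.

    Hypothetical inputs: lognormal crack depth, normal fracture toughness,
    and a simple surface-crack stress-intensity factor K = 1.12*s*sqrt(pi*a).
    """
    random.seed(seed)
    failures = 0
    for _ in range(n):
        a = random.lognormvariate(math.log(5e-3), 0.5)      # crack depth [m]
        kic = random.gauss(90.0, 10.0)                      # toughness [MPa*sqrt(m)]
        k_applied = 1.12 * stress * math.sqrt(math.pi * a)  # applied SIF
        if k_applied > kic:                                 # brittle-fracture check
            failures += 1
    return failures / n

print(fracture_probability())
```

The estimate converges as 1/sqrt(n); for the small probabilities typical of structural integrity, variance-reduction techniques are usually layered on top of this plain scheme.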
Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model
International Nuclear Information System (INIS)
Otis, M.D.
1983-01-01
Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
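The propagation-plus-correlation procedure described above can be sketched with a toy food-chain transfer model standing in for PATHWAY; all distributions and the transfer relation are hypothetical:

```python
import math
import random

def milk_concentration(deposition, transfer, delay):
    """Toy stand-in for a food-chain transfer model (not PATHWAY itself)."""
    return deposition * transfer * 2.0 ** (-delay / 8.0)  # ~8 d I-131 half-life

def pearson(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

random.seed(3)
runs = []
for _ in range(5000):
    params = (random.lognormvariate(0.0, 0.5),   # deposition (widest spread)
              random.lognormvariate(-2.0, 0.3),  # feed-to-milk transfer
              random.uniform(0.0, 10.0))         # delay before consumption [d]
    runs.append(params + (milk_concentration(*params),))

# Correlation of each input with the output as the sensitivity screen;
# the report used partial correlation coefficients, but for independently
# sampled inputs the ranking is typically the same
sens = [pearson([r[i] for r in runs], [r[3] for r in runs]) for i in range(3)]
print([round(s, 2) for s in sens])
```

The same Monte Carlo sample serves both purposes, as in the abstract: the spread of the output column is the uncertainty estimate, and the correlations rank parameter importance.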
A risk assessment methodology for incorporating uncertainties using fuzzy concepts
International Nuclear Information System (INIS)
Cho, Hyo-Nam; Choi, Hyun-Ho; Kim, Yoon-Bae
2002-01-01
This paper proposes a new methodology for incorporating uncertainties into conventional risk assessment frameworks using fuzzy concepts. This paper also introduces new forms of fuzzy membership curves, designed to consider the uncertainty range that represents the degree of uncertainty involved in both probabilistic parameter estimates and subjective judgments, since it is often difficult or even impossible to precisely estimate the occurrence rate of an event in terms of one single crisp probability. It is to be noted that simple linguistic variables such as 'High/Low' and 'Good/Bad' adequately represent subjective mental cognition but are limited in quantifying the various risks inherent in construction projects. Therefore, in this paper, statements that include some quantification with a specific value or scale, such as 'Close to any value' or 'Higher/Lower than analyzed value', are used in order to overcome these limitations. It may be stated that the proposed methodology will be very useful for the systematic and rational risk assessment of construction projects
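Membership curves for statements like 'Close to any value' or 'Higher than analyzed value' might be sketched as simple triangular and ramp functions; the shapes and numbers below are illustrative, not the authors' curves:

```python
def close_to(value, spread):
    """Triangular membership: 'close to `value`', vanishing beyond ±spread."""
    def mu(x):
        return max(0.0, 1.0 - abs(x - value) / spread)
    return mu

def higher_than(value, spread):
    """Ramp membership: 'higher than the analyzed value', saturating at +spread."""
    def mu(x):
        return min(1.0, max(0.0, (x - value) / spread))
    return mu

close = close_to(0.01, 0.005)      # occurrence rate 'close to 0.01'
higher = higher_than(0.01, 0.005)  # 'higher than the analyzed value 0.01'
print(close(0.01), close(0.0125), higher(0.02))
```

The spread parameter plays the role of the uncertainty range in the abstract: a wider spread encodes lower confidence in the crisp estimate.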
Uncertainty propagation in a 3-D thermal code for performance assessment of a nuclear waste disposal
International Nuclear Information System (INIS)
Dutfoy, A.; Ritz, J.B.
2001-01-01
Given the very large time scale involved, the performance assessment of a nuclear waste repository requires numerical modelling. Because we are uncertain of the exact values of the input parameters, we have to analyse the impact of these uncertainties on the outcome of the physical models. The EDF Research and Development Division has developed a reliability method to propagate these uncertainties or variability through models, which requires far fewer physical simulations than the usual simulation methods. We apply the reliability method MEFISTO to a base case modelling the heat transfers in a virtual disposal in the future site of the French underground research laboratory, in the East of France. This study is conducted in collaboration with ANDRA, the French Nuclear Waste Management Agency. With this exercise, we want to evaluate the thermal behaviour of a concept in relation to the variation of physical parameters and their uncertainty. (author)
Sensitivity, uncertainty, and importance analysis of a risk assessment
International Nuclear Information System (INIS)
Andsten, R.S.; Vaurio, J.K.
1992-01-01
In this paper a number of supplementary studies and applications associated with probabilistic safety assessment (PSA) are described, including sensitivity and importance evaluations of failures, errors, systems, and groups of components. The main purpose is to illustrate the usefulness of a PSA for making decisions about safety improvements, training, allowed outage times, and test intervals. A useful measure of uncertainty importance is presented, and it points out areas needing development, such as reactor vessel aging phenomena, for reducing overall uncertainty. A time-dependent core damage frequency is also presented, illustrating the impact of testing scenarios and intervals. The methods and applications presented are based on the Level 1 PSA carried out for the internal initiating events of the Loviisa 1 nuclear power station. Steam generator leakages and associated operator actions are major contributors to the current core-damage frequency estimate of 2 × 10⁻⁴/yr. The results are used to improve the plant and procedures and to guide future improvements
Ahmadalipour, Ali; Moradkhani, Hamid
2017-12-01
Hydrologic modeling is one of the primary tools utilized for drought monitoring and drought early warning systems. Several sources of uncertainty in hydrologic modeling have been addressed in the literature. However, few studies have assessed the uncertainty of gridded observation datasets from a drought monitoring perspective. This study provides a hydrologic modeling oriented analysis of the gridded observation data uncertainties over the Pacific Northwest (PNW) and its implications on drought assessment. We utilized a recently developed 100-member ensemble-based observed forcing data to simulate hydrologic fluxes at 1/8° spatial resolution using the Variable Infiltration Capacity (VIC) model, and compared the results with a deterministic observation. Meteorological and hydrological droughts are studied at multiple timescales over the basin, and seasonal long-term trends and variations of drought extent are investigated for each case. Results reveal large uncertainty of observed datasets at the monthly timescale, with systematic differences for temperature records, mainly due to different lapse rates. The uncertainty results in large disparities of drought characteristics. In general, an increasing trend is found for winter drought extent across the PNW. Furthermore, a ∼3% decrease per decade is detected for snow water equivalent (SWE) over the PNW, with the region being more susceptible to SWE variations of the northern Rockies than the western Cascades. The agricultural areas of southern Idaho demonstrate a decreasing trend of natural soil moisture as a result of precipitation decline, which implies a greater need for anthropogenic water storage and irrigation systems.
International Nuclear Information System (INIS)
Boardman, J.; Pearce, K.I.; Ponting, A.C.
2000-01-01
Probabilistic Consequence Assessment (PCA) models describe the dispersion of released radioactive materials and predict the resulting interaction with and influence on the environment and man. Increasing use is being made of PCA tools as an input to the evaluation and improvement of safety for nuclear installations. The nature and extent of the assessments performed vary considerably according to their intended purpose. Nevertheless, with the increasing use of such techniques, greater attention has been given to the reliability of the methods used and the inherent uncertainty associated with their predictions. Uncertainty analyses can provide the decision-maker with information to quantify how uncertain the answer is and what drives that uncertainty. They often force a review of the baseline assumptions for any PCA methodology and provide a benchmark against which the impact of further changes in models and recommendations can be compared. This process provides valuable management information to help prioritise further actions or research. (author)
International Nuclear Information System (INIS)
Bogen, K.T.
1993-01-01
A distinction between uncertainty (or the extent of lack of knowledge) and interindividual variability (or the extent of person-to-person heterogeneity) regarding the values of input variates must be maintained if a quantitative characterization of uncertainty in population risk or in individual risk is sought. Here, some practical methods are presented that should facilitate implementation of the analytic framework for uncertainty and variability proposed by Bogen and Spear (1,2). Two types of methodology are discussed: one that facilitates the distinction between uncertainty and variability per se, and another that may be used to simplify quantitative analysis of distributed inputs representing either uncertainty or variability. A simple and a complex form for modeled increased risk are presented and then used to illustrate methods facilitating the distinction between uncertainty and variability in reference to characterization of both population and individual risk. Finally, a simple form of discrete probability calculus is proposed as an easily implemented, practical alternative to Monte Carlo-based procedures for quantitative integration of uncertainty and variability in risk assessment
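The uncertainty/variability distinction is commonly operationalized as a two-dimensional (nested) Monte Carlo, with uncertain parameters sampled in an outer loop and person-to-person variability in an inner loop. A minimal sketch with hypothetical lognormal inputs (not the Bogen-Spear discrete calculus itself):

```python
import random
import statistics

random.seed(7)
pop_means = []
for _ in range(200):                                   # outer loop: uncertainty
    potency = random.lognormvariate(-6.0, 0.5)         # uncertain slope factor
    risks = [potency * random.lognormvariate(0.0, 0.8) # inner loop: variability
             for _ in range(500)]                      # person-to-person intake
    pop_means.append(statistics.mean(risks))           # population-average risk

pop_means.sort()
# The spread across outer-loop draws characterizes *uncertainty* about the
# population risk, kept distinct from the *variability* within each draw
print(pop_means[4], pop_means[-5])  # approximate 95% uncertainty interval
```

Collapsing both loops into one would mix the two dimensions and lose exactly the distinction the abstract insists on.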
Modeling theoretical uncertainties in phenomenological analyses for particle physics
Energy Technology Data Exchange (ETDEWEB)
Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)
2017-04-15
The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)
A sliding mode observer for hemodynamic characterization under modeling uncertainties
Zayane, Chadia; Laleg-Kirati, Taous-Meriem
2014-01-01
This paper addresses the case of physiological states reconstruction in a small region of the brain under modeling uncertainties. The misunderstood coupling between the cerebral blood volume and the oxygen extraction fraction has led to a partial
Uncertainty modelling of critical column buckling for reinforced ...
Indian Academy of Sciences (India)
for columns, having major importance to a building's safety, are considered stability limits. ... Various research works have been carried out for uncertainty analysis in ... need appropriate material models, advanced structural simulation tools.
Uncertainty of Modal Parameters Estimated by ARMA Models
DEFF Research Database (Denmark)
Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders
1990-01-01
In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However the uncertainty of the parameters...... by simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore......, it is shown that the model errors may also contribute significantly to the uncertainty....
Innovative supply chain optimization models with multiple uncertainty factors
DEFF Research Database (Denmark)
Choi, Tsan Ming; Govindan, Kannan; Li, Xiang
2017-01-01
Uncertainty is an inherent factor that affects all dimensions of supply chain activities. In today’s business environment, initiatives to deal with one specific type of uncertainty might not be effective since other types of uncertainty factors and disruptions may be present. These factors relate...... to supply chain competition and coordination. Thus, to achieve a more efficient and effective supply chain requires the deployment of innovative optimization models and novel methods. This preface provides a concise review of critical research issues regarding innovative supply chain optimization models...
Li, Wei Bo; Greiter, Matthias; Oeh, Uwe; Hoeschen, Christoph
2011-12-01
The reliability of biokinetic models is essential in internal dose assessments and radiation risk analysis for the public, occupational workers, and patients exposed to radionuclides. In this paper, a method for assessing the reliability of biokinetic models by means of uncertainty and sensitivity analysis was developed. The paper is divided into two parts. In the first part of the study published here, the uncertainty sources of the model parameters for zirconium (Zr), developed by the International Commission on Radiological Protection (ICRP), were identified and analyzed. Furthermore, the uncertainty of the biokinetic experimental measurement performed at the Helmholtz Zentrum München-German Research Center for Environmental Health (HMGU) for developing a new biokinetic model of Zr was analyzed according to the Guide to the Expression of Uncertainty in Measurement, published by the International Organization for Standardization. The confidence interval and distribution of model parameters of the ICRP and HMGU Zr biokinetic models were evaluated. From the computational biokinetic modeling, the mean, standard uncertainty, and confidence interval of the model prediction, calculated on the basis of the model parameter uncertainty, were presented and compared to the plasma clearance and urinary excretion measured after intravenous administration. It was shown that for the most important compartment, the plasma, the uncertainty evaluated for the HMGU model was much smaller than that for the ICRP model; that phenomenon was observed for other organs and tissues as well. The uncertainty of the integral of the radioactivity of Zr up to 50 y calculated by the HMGU model after ingestion by adult members of the public was shown to be smaller by a factor of two than that of the ICRP model. It was also shown that the distribution type of the model parameter strongly influences the model prediction, and the correlation of the model input parameters affects the model prediction to a
International Nuclear Information System (INIS)
Sig Drellack, Lance Prothro
2007-01-01
The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial extent of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. With multiple alternative flow models advanced, the sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The
Environmental impact and risk assessments and key factors contributing to the overall uncertainties
International Nuclear Information System (INIS)
Salbu, Brit
2016-01-01
There is a significant number of nuclear and radiological sources that have contributed, are still contributing, or have the potential to contribute to radioactive contamination of the environment in the future. To protect the environment from radioactive contamination, impact and risk assessments are performed prior to or during a release event, short or long term after deposition, or prior to and after implementation of countermeasures. When environmental impact and risks are assessed, however, a series of factors will contribute to the overall uncertainties. To provide environmental impact and risk assessments, information on processes, kinetics and a series of input variables is needed. Adding problems such as variability, questionable assumptions, gaps in knowledge, extrapolations and poor conceptual model structures, a series of factors are contributing to large and often unacceptable uncertainties in impact and risk assessments. Information on the source term and the release scenario is an essential starting point in impact and risk models; the source determines activity concentrations and atom ratios of radionuclides released, while the release scenario determines the physico-chemical forms of released radionuclides such as particle size distribution, structure and density. Releases will most often contain other contaminants such as metals, and due to interactions, contaminated sites should be assessed as a multiple stressor scenario. Following deposition, a series of stressors, interactions and processes will influence the ecosystem transfer of radionuclide species and thereby influence biological uptake (toxicokinetics) and responses (toxicodynamics) in exposed organisms. Due to the variety of biological species, extrapolation is frequently needed to fill gaps in knowledge, e.g., from effects to no effects, from effects in one organism to others, from one stressor to mixtures. Most toxicity tests are, however, performed as short term exposure of adult organisms
Debry, E.; Malherbe, L.; Schillinger, C.; Bessagnet, B.; Rouil, L.
2009-04-01
Evaluation of human exposure to atmospheric pollution usually requires the knowledge of pollutant concentrations in ambient air. In the framework of the PAISA project, which studies the influence of socio-economic status on relationships between air pollution and short term health effects, the concentrations of gas and particle pollutants are computed over Strasbourg with the ADMS-Urban model. As for any modeling result, simulated concentrations come with uncertainties which have to be characterized and quantified. There are several sources of uncertainties: those related to input data and parameters, i.e. fields used to execute the model like meteorological fields, boundary conditions and emissions; those related to the model formulation because of incomplete or inaccurate treatment of dynamical and chemical processes; and those inherent to the stochastic behavior of atmosphere and human activities [1]. Our aim here is to assess the uncertainties of the simulated concentrations with respect to input data and model parameters. To this end, the first step consisted in identifying the input data and model parameters that contribute most to the space and time variability of predicted concentrations. Concentrations of several pollutants were simulated for two months in winter 2004 and two months in summer 2004 over five areas of Strasbourg. The sensitivity analysis shows the dominating influence of boundary conditions and emissions. Among model parameters, the roughness and Monin-Obukhov lengths appear to have non-negligible local effects. Dry deposition is also an important dynamic process. The second step of the characterization and quantification of uncertainties consists in attributing a probability distribution to each input datum and model parameter and in propagating the joint distribution of all data and parameters through the model so as to associate a probability distribution with the modeled concentrations. Several analytical and numerical methods exist to perform an
Uncertainty studies and risk assessment for CO2 storage in geological formations
Energy Technology Data Exchange (ETDEWEB)
Walter, Lena Sophie
2013-07-01
Carbon capture and storage (CCS) in deep geological formations is one possible option to mitigate the greenhouse gas effect by reducing CO2 emissions into the atmosphere. The assessment of the risks related to CO2 storage is an important task. Events such as CO2 leakage and brine displacement could result in hazards for human health and the environment. In this thesis, a systematic and comprehensive risk assessment concept is presented to investigate various levels of uncertainties and to assess risks using numerical simulations. Depending on the risk and the processes to be assessed, very complex models, large model domains, large time scales, and many simulation runs for estimating probabilities are required. To reduce the resulting high computational costs, a model reduction technique (the arbitrary polynomial chaos expansion) and a method for model coupling in space are applied. The different levels of uncertainties are: statistical uncertainty in parameter distributions, scenario uncertainty, e.g. different geological features, and recognized ignorance due to assumptions in the conceptual model set-up. Recognized ignorance and scenario uncertainty are investigated by simulating well-defined model set-ups and scenarios. According to damage values, which are defined as a model output, the set-ups and scenarios can be compared and ranked. For statistical uncertainty, probabilities can be determined by running Monte Carlo simulations with the reduced model. The results are presented in various ways: e.g., mean damage, probability density function, cumulative distribution function, or an overall risk value obtained by multiplying the damage with the probability. If the model output (damage) cannot be compared to provided criteria (e.g. water quality criteria), analytical approximations are presented to translate the damage into comparable values. The overall concept is applied to the risks related to brine displacement and infiltration into
Uncertainty studies and risk assessment for CO2 storage in geological formations
International Nuclear Information System (INIS)
Walter, Lena Sophie
2013-01-01
Carbon capture and storage (CCS) in deep geological formations is one possible option to mitigate the greenhouse gas effect by reducing CO2 emissions into the atmosphere. The assessment of the risks related to CO2 storage is an important task. Events such as CO2 leakage and brine displacement could result in hazards for human health and the environment. In this thesis, a systematic and comprehensive risk assessment concept is presented to investigate various levels of uncertainties and to assess risks using numerical simulations. Depending on the risk and the processes to be assessed, very complex models, large model domains, large time scales, and many simulation runs for estimating probabilities are required. To reduce the resulting high computational costs, a model reduction technique (the arbitrary polynomial chaos expansion) and a method for model coupling in space are applied. The different levels of uncertainties are: statistical uncertainty in parameter distributions, scenario uncertainty, e.g. different geological features, and recognized ignorance due to assumptions in the conceptual model set-up. Recognized ignorance and scenario uncertainty are investigated by simulating well-defined model set-ups and scenarios. According to damage values, which are defined as a model output, the set-ups and scenarios can be compared and ranked. For statistical uncertainty, probabilities can be determined by running Monte Carlo simulations with the reduced model. The results are presented in various ways: e.g., mean damage, probability density function, cumulative distribution function, or an overall risk value obtained by multiplying the damage with the probability. If the model output (damage) cannot be compared to provided criteria (e.g. water quality criteria), analytical approximations are presented to translate the damage into comparable values. The overall concept is applied to the risks related to brine displacement and infiltration into drinking water
Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions
Jung, J. Y.; Niemann, J. D.; Greimann, B. P.
2016-12-01
Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
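The idea of treating input-data errors as extra uncertain parameters in a Bayesian calibration can be sketched with a toy random-walk Metropolis sampler. This is not the SRH-1D setup described above; the model, data, prior, and step size below are all hypothetical stand-ins:

```python
import math
import random

def transport_model(flow, k):
    """Hypothetical stand-in for a transport model: load = k * flowrate."""
    return k * flow

def log_post(theta, flows_obs, loads_obs, sigma_obs):
    """Log-posterior over (k, bias). The observed flowrates are assumed to
    carry a Gaussian input error; its mean (bias) is sampled alongside the
    physical parameter k instead of being fixed at zero."""
    k, bias = theta
    if k <= 0.0:
        return -math.inf
    lp = -0.5 * (bias / 0.5) ** 2  # prior: bias ~ N(0, 0.5)
    for f, y in zip(flows_obs, loads_obs):
        resid = y - transport_model(f + bias, k)
        lp += -0.5 * (resid / sigma_obs) ** 2
    return lp

def metropolis(log_target, start, steps, rng, scale=0.05, **data):
    """Plain random-walk Metropolis sampler over the joint parameter vector."""
    theta = list(start)
    lp = log_target(theta, **data)
    chain = []
    for _ in range(steps):
        prop = [t + rng.gauss(0.0, scale) for t in theta]
        lp_prop = log_target(prop, **data)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject
            theta, lp = prop, lp_prop
        chain.append(list(theta))
    return chain
```

Credible intervals for both k and the input bias then come from the tail of the chain; widening the bias prior shows directly how input uncertainty inflates the forecast uncertainty.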
Model-specification uncertainty in future forest pest outbreak.
Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis
2016-04-01
Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (= model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more pronounced northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreak duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks.
A statistical methodology for quantification of uncertainty in best estimate code physical models
International Nuclear Information System (INIS)
Vinai, Paolo; Macian-Juan, Rafael; Chawla, Rakesh
2007-01-01
A novel uncertainty assessment methodology, based on a statistical non-parametric approach, is presented in this paper. It achieves quantification of code physical model uncertainty by making use of model performance information obtained from studies of appropriate separate-effect tests. Uncertainties are quantified in the form of estimated probability density functions (pdfs), calculated with a newly developed non-parametric estimator. The new estimator objectively predicts the probability distribution of the model's 'error' (its uncertainty) from databases reflecting the model's accuracy on the basis of available experiments. The methodology is completed by applying a novel multi-dimensional clustering technique based on the comparison of model error samples with the Kruskal-Wallis test. This takes into account the fact that a model's uncertainty depends on system conditions, since a best estimate code can give predictions for which the accuracy is affected by the regions of the physical space in which the experiments occur. The final result is an objective, rigorous and accurate manner of assigning uncertainty to coded models, i.e. the input information needed by code uncertainty propagation methodologies used for assessing the accuracy of best estimate codes in nuclear systems analysis. The new methodology has been applied to the quantification of the uncertainty in the RETRAN-3D void model and then used in the analysis of an independent separate-effect experiment. This has clearly demonstrated the basic feasibility of the approach, as well as its advantages in yielding narrower uncertainty bands in quantifying the code's accuracy for void fraction predictions
Zell, Wesley O.; Culver, Teresa B.; Sanford, Ward E.
2018-06-01
Uncertainties about the age of base-flow discharge can have serious implications for the management of degraded environmental systems where subsurface pathways, and the ongoing release of pollutants that accumulated in the subsurface during past decades, dominate the water quality signal. Numerical groundwater models may be used to estimate groundwater return times and base-flow ages and thus predict the time required for stakeholders to see the results of improved agricultural management practices. However, the uncertainty inherent in the relationship between (i) the observations of atmospherically-derived tracers that are required to calibrate such models and (ii) the predictions of system age that the observations inform has not been investigated. For example, few, if any, studies have assessed the uncertainty of numerically-simulated system ages or evaluated the uncertainty reductions that may result from the expense of collecting additional subsurface tracer data. In this study we combine numerical flow and transport modeling of atmospherically-derived tracers with prediction uncertainty methods to accomplish four objectives. First, we show the relative importance of head, discharge, and tracer information for characterizing response times in a uniquely data-rich catchment that includes 266 age-tracer measurements (SF6, CFCs, and 3H) in addition to long-term monitoring of water levels and stream discharge. Second, we calculate uncertainty intervals for model-simulated base-flow ages using both linear and non-linear methods, and find that the prediction sensitivity vector used by linear first-order second-moment methods results in much larger uncertainties than non-linear Monte Carlo methods operating on the same parameter uncertainty. Third, by combining prediction uncertainty analysis with multiple models of the system, we show that data-worth calculations and monitoring network design are sensitive to variations in the amount of water leaving the system via
International Nuclear Information System (INIS)
Ahn, Kwang Il; Yang, Joon Eon
2003-01-01
In the risk and reliability analysis of complex technological systems, the primary concern of formal uncertainty analysis is to understand why uncertainties arise, and to evaluate how they impact the results of the analysis. In recent times, many of the uncertainty analyses have focused on parameters of the risk and reliability analysis models, whose values are uncertain in an aleatory or an epistemic way. As the field of parametric uncertainty analysis matures, however, more attention is being paid to the explicit treatment of uncertainties that are addressed in the predictive model itself as well as the accuracy of the predictive model. The essential steps for evaluating impacts of these model uncertainties in the presence of parameter uncertainties are to determine rigorously the various sources of uncertainty to be addressed in the underlying model itself and in turn in the model parameters, based on our state of knowledge and relevant evidence. Answering clearly the question of how to characterize and treat explicitly the foregoing different sources of uncertainty is particularly important for practical aspects such as risk and reliability optimization of systems as well as more transparent risk information and decision-making under various uncertainties. The main purpose of this paper is to provide practical guidance for quantitatively treating various model uncertainties that would often be encountered in the risk and reliability modeling process of complex technological systems
International Nuclear Information System (INIS)
Monte, Luigi; Hakanson, Lars; Bergstroem, Ulla; Brittain, John; Heling, Rudie
1996-01-01
The principles of Empirically Based Uncertainty Analysis (EBUA) are described. EBUA is based on the evaluation of 'performance indices' that express the level of agreement between the model and sets of empirical independent data collected in different experimental circumstances. Some of these indices may be used to evaluate the confidence limits of the model output. The method is based on the statistical analysis of the distribution of the index values and on the quantitative relationship of these values with the ratio 'experimental data/model output'. Some performance indices are described in the present paper. Among these, the so-called 'functional distance' (d) between the logarithm of the model output and the logarithm of the experimental data, defined as d^2 = (1/n) Σ_{i=1}^{n} (ln M_i - ln O_i)^2, where M_i is the i-th experimental value, O_i the corresponding model evaluation and n the number of couplets 'experimental value, predicted value', is an important tool for the EBUA method. From the statistical distribution of this performance index, it is possible to infer the characteristics of the distribution of the ratio 'experimental data/model output' and, consequently, to evaluate the confidence limits for the model predictions. This method was applied to calculate the uncertainty level of a model developed to predict the migration of radiocaesium in lacustrine systems. Unfortunately, performance indices are affected by the uncertainty of the experimental data used in validation. Indeed, measurement results of environmental levels of contamination are generally associated with large uncertainty due to the measurement and sampling techniques and to the large variability in space and time of the measured quantities. It is demonstrated that this undesired effect may, in some circumstances, be corrected by means of simple formulae
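As a concrete reading of the functional distance described in the abstract above, here is a minimal sketch; the data values are hypothetical and natural logarithms are used, as in the definition:

```python
import math

def functional_distance(experimental, predicted):
    """EBUA 'functional distance' d: the root of the mean squared difference
    between the logs of experimental values M_i and model outputs O_i."""
    n = len(experimental)
    d2 = sum((math.log(m) - math.log(o)) ** 2
             for m, o in zip(experimental, predicted)) / n
    return math.sqrt(d2)

# Hypothetical measured vs. modelled concentrations:
M = [1.2, 0.9, 1.1, 1.05]
O = [1.0, 1.0, 1.0, 1.0]
d = functional_distance(M, O)
# A small d means the ratios M_i/O_i cluster near 1; from the statistical
# distribution of d one can infer confidence limits for model predictions.
```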
Dependencies, human interactions and uncertainties in probabilistic safety assessment
International Nuclear Information System (INIS)
Hirschberg, S.
1990-01-01
In the context of Probabilistic Safety Assessment (PSA), three areas were investigated in a 4-year Nordic programme: dependencies with special emphasis on common cause failures, human interactions and uncertainty aspects. The approach was centered around comparative analyses in the form of Benchmark/Reference Studies and retrospective reviews. Weak points in available PSAs were identified and recommendations were made aiming at improving the consistency of the PSAs. The sensitivity of PSA results to basic assumptions was demonstrated, and the sensitivity to data assignment and to choices of methods for the analysis of selected topics was investigated. (author)
Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.
Soleimani, Hossein; Hensman, James; Saria, Suchi
2017-08-21
Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used for jointly modeling the longitudinal and event data and computing event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure including non-Gaussian noise while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
Quantile uncertainty and value-at-risk model risk.
Alexander, Carol; Sarabia, José María
2012-08-01
This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
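The two core objects in the abstract above, a quantile-based Value-at-Risk estimate and a benchmark-relative capital add-on, can be sketched as follows. This only illustrates the quantities involved, not the authors' adjustment methodology; the nearest-rank quantile and the data are placeholders:

```python
def empirical_quantile(xs, q):
    """Nearest-rank (floor) empirical q-quantile of a sample."""
    s = sorted(xs)
    k = int(q * (len(s) - 1))
    return s[k]

def value_at_risk(pnl, alpha=0.01):
    """VaR at confidence 1 - alpha: the alpha-quantile of daily P&L,
    sign-flipped so that it is reported as a positive loss."""
    return -empirical_quantile(pnl, alpha)

def model_risk_addon(var_model, var_benchmark):
    """Capital add-on: shortfall of the bank's VaR estimate relative to a
    benchmark model representing the authority's state of knowledge."""
    return max(0.0, var_benchmark - var_model)

# Hypothetical daily P&L series in monetary units:
pnl = list(range(-50, 50))
var_bank = value_at_risk(pnl, alpha=0.05)
addon = model_risk_addon(var_bank, var_benchmark=50.0)
```

In practice the two inputs would be the bank's reported daily VaR series and the benchmark quantile estimated from the daily profit-and-loss series, as in the paper's empirical example.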
Estimation of spatial uncertainties of tomographic velocity models
Energy Technology Data Exchange (ETDEWEB)
Jordan, M.; Du, Z.; Querendez, E. [SINTEF Petroleum Research, Trondheim (Norway)
2012-12-15
This research project aims to evaluate the possibility of assessing the spatial uncertainties in tomographic velocity model building in a quantitative way. The project is intended to serve as a test of whether accurate and specific uncertainty estimates (e.g., in meters) can be obtained. The project is based on Monte Carlo-type perturbations of the velocity model as obtained from the tomographic inversion, guided by diagonal and off-diagonal elements of the resolution and the covariance matrices. The implementation and testing of this method was based on the SINTEF in-house stereotomography code, using small synthetic 2D data sets. To test the method, the calculation and output of the covariance and resolution matrices were implemented, and software to perform the error estimation was created. The work included the creation of 2D synthetic data sets, the implementation and testing of the software to conduct the tests (output of the covariance and resolution matrices, which are not implicitly provided by stereotomography), application to synthetic data sets, analysis of the test results, and creating the final report. The results show that this method can be used to estimate the spatial errors in tomographic images quantitatively. The results agree with the known errors for our synthetic models. However, the method can only be applied to structures in the model where the change of seismic velocity is larger than the predicted error of the velocity parameter amplitudes. In addition, the analysis is dependent on the tomographic method, e.g., regularization and parameterization. The conducted tests were very successful and we believe that this method could be developed further to be applied to third party tomographic images.
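The Monte Carlo flavour of such an approach, perturbing model parameters according to their estimated variances and reading uncertainty off the ensemble spread, can be sketched for the simplified diagonal-covariance case; the velocity values and standard deviations below are invented:

```python
import random
import statistics

def ensemble_spread(velocity, stdevs, n_samples=2000, seed=0):
    """Perturb each velocity parameter with a Gaussian whose width comes
    from the (here diagonal) posterior covariance of the inversion, then
    return the per-parameter spread of the ensemble as a quantitative
    uncertainty estimate. Off-diagonal covariance terms are ignored in
    this sketch."""
    rng = random.Random(seed)
    ensemble = [[v + rng.gauss(0.0, s) for v, s in zip(velocity, stdevs)]
                for _ in range(n_samples)]
    return [statistics.stdev(col) for col in zip(*ensemble)]

# Hypothetical 1-D velocity profile (km/s) and parameter standard deviations:
spread = ensemble_spread([2.0, 3.5, 4.2], [0.05, 0.10, 0.30])
```

The abstract's caveat carries over: the estimate is only meaningful where velocity contrasts exceed the predicted parameter error.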
A GLUE uncertainty analysis of a drying model of pharmaceutical granules
DEFF Research Database (Denmark)
Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn
2013-01-01
unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level...... on the prediction uncertainty is assessed. Secondly, the paper focuses on the influence of the most sensitive parameters in the model. Finally, a combined analysis (particle level plus most sensitive parameters) is performed and discussed. To propagate the uncertainty originating from the parameter uncertainty...
Procedures for uncertainty and sensitivity analysis in repository performance assessment
International Nuclear Information System (INIS)
Poern, K.; Aakerlund, O.
1985-10-01
The objective of the project was mainly a literature study of available methods for the treatment of parameter uncertainty propagation and sensitivity aspects in complete models such as those concerning geologic disposal of radioactive waste. The study, which has run parallel with the development of a code package (PROPER) for computer-assisted analysis of function, also aims at the choice of accurate, cost-effective methods for uncertainty and sensitivity analysis. Such a choice depends on several factors like the number of input parameters, the capacity of the model and the computer resources required to use the model. Two basic approaches are addressed in the report. In one of these the model of interest is directly simulated by an efficient sampling technique to generate an output distribution. Applying the other basic method the model is replaced by an approximating analytical response surface, which is then used in the sampling phase or in moment matching to generate the output distribution. Both approaches are illustrated by simple examples in the report. (author)
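The two basic approaches named in the abstract above, direct simulation of the model with sampled inputs versus sampling a fitted analytical response surface, can be contrasted on a toy problem. Everything here (the model, the input distributions, and the linearized surrogate) is invented for illustration:

```python
import random
import statistics

def performance_model(x, y):
    """Stand-in for an expensive performance-assessment model."""
    return x ** 2 + 3.0 * y

def linear_response_surface(x, y):
    """Cheap surrogate: the model linearized around the input means
    (x0, y0) = (1.0, 2.0); in practice it would be fitted by regression."""
    return 7.0 + 2.0 * (x - 1.0) + 3.0 * (y - 2.0)

def propagate(model_fn, n, seed=0):
    """Sample the uncertain inputs and push them through model_fn to
    build an output distribution (direct sampling approach)."""
    rng = random.Random(seed)
    return [model_fn(rng.gauss(1.0, 0.2), rng.gauss(2.0, 0.5))
            for _ in range(n)]

direct = propagate(performance_model, 5000)
surrogate = propagate(linear_response_surface, 5000)
# The two output distributions have nearly the same mean; the linear
# surrogate misses only the small curvature term E[(x - 1)^2] = 0.04.
```

The trade-off sketched here is the one the report weighs: the surrogate is far cheaper per sample but introduces an approximation error that grows with the model's nonlinearity.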
Uncertainty and sensitivity analysis using probabilistic system assessment code. 1
International Nuclear Information System (INIS)
Honma, Toshimitsu; Sasahara, Takashi.
1993-10-01
This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded according to the specification of the exercise. Besides the results required for the exercise, further uncertainty and sensitivity analyses were performed, and their details are also included. (author)
Representing Uncertainty on Model Analysis Plots
Smith, Trevor I.
2016-01-01
Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…
Compilation of information on uncertainties involved in deposition modeling
International Nuclear Information System (INIS)
Lewellen, W.S.; Varma, A.K.; Sheng, Y.P.
1985-04-01
The current generation of dispersion models contains very simple parameterizations of deposition processes. The analysis here looks at the physical mechanisms governing these processes in an attempt to determine whether more valid parameterizations are available and what level of uncertainty is involved in either these simple parameterizations or any more advanced parameterization. The report is composed of three parts. The first, on dry deposition model sensitivity, provides an estimate of the uncertainty existing in current estimates of the deposition velocity due to uncertainties in independent variables such as meteorological stability, particle size, surface chemical reactivity and canopy structure. The range of uncertainty estimated for an appropriate dry deposition velocity for a plume generated by a nuclear power plant accident is three orders of magnitude. The second part discusses the uncertainties involved in precipitation scavenging rates for effluents resulting from a nuclear reactor accident. The conclusion is that major uncertainties are involved, both as a result of the natural variability of the atmospheric precipitation process and due to our incomplete understanding of the underlying process. The third part involves a review of the important problems associated with modeling the interaction between the atmosphere and a forest. It gives an indication of the magnitude of the problem involved in modeling dry deposition in such environments. Separate analyses have been performed for each part and are contained in the EDB
Sensitivities and uncertainties of modeled ground temperatures in mountain environments
Directory of Open Access Journals (Sweden)
S. Gubler
2013-08-01
Full Text Available Model evaluation is often performed at a few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently of ground truth measurements, these analyses are suitable to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for a combination of all factors, which allows for quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on different input factors such as topography or different soil types. The analysis shows that model evaluation performed at single locations may not be representative of the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and the ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several
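A minimal sketch of this kind of one-at-a-time sensitivity analysis; the toy MAGT model and every coefficient in it are invented for illustration and are not the paper's permafrost model:

```python
def magt(elevation_m, south_facing, albedo):
    """Toy mean annual ground temperature model (illustrative numbers only)."""
    air = 10.0 - 6.5 * elevation_m / 1000.0            # lapse-rate air temperature
    radiation = 1.8 if south_facing else 1.0           # relative solar input by aspect
    snow_free = max(0.0, 1.0 - elevation_m / 4000.0)   # longer snow cover higher up
    return air + 4.0 * radiation * (1.0 - albedo) * snow_free

def albedo_sensitivity(elevation_m, south_facing, d_albedo=0.1):
    # one-at-a-time sensitivity: change in MAGT per +0.1 ground albedo
    return (magt(elevation_m, south_facing, 0.2)
            - magt(elevation_m, south_facing, 0.2 + d_albedo))
```

Evaluating the sensitivity over a grid of elevations and aspects reproduces the qualitative pattern described: south-facing and low-elevation locations respond more strongly to albedo changes.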
Uncertainty calculation in transport models and forecasts
DEFF Research Database (Denmark)
Manzo, Stefano; Prato, Carlo Giacomo
Transport projects and policy evaluations are often based on transport model output, i.e. traffic flows and derived effects. However, literature has shown that there is often a considerable difference between forecasted and observed traffic flows. This difference causes misallocation of (public...... implemented by using an approach based on stochastic techniques (Monte Carlo simulation and Bootstrap re-sampling) or scenario analysis combined with model sensitivity tests. Two transport models are used as case studies: the Næstved model and the Danish National Transport Model. The first paper...... in a four-stage transport model related to different variable distributions (to be used in a Monte Carlo simulation procedure), assignment procedures and levels of congestion, at both the link and the network level. The analysis used as case study the Næstved model, referring to the Danish town of Næstved...
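Bootstrap re-sampling, one of the stochastic techniques mentioned, can be sketched as follows; the daily link-count data are hypothetical:

```python
import random
import statistics

def bootstrap_ci(data, n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap: resample observed link counts with replacement
    and read an uncertainty interval for the mean off the resampled means."""
    rng = random.Random(seed)
    reps = sorted(statistics.fmean(rng.choices(data, k=len(data)))
                  for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

counts = [980, 1020, 1100, 950, 1010, 1080, 990, 1030]  # hypothetical daily link counts
lo, hi = bootstrap_ci(counts)
```

The same resampling idea extends to model inputs (e.g. trip matrices), giving a distribution of forecasted flows rather than a single point estimate.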
Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties
Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.
2015-01-01
For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-II measurements will fill a critically important gap in the measurement database. The emergence of AMS-II measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.
UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS
Directory of Open Access Journals (Sweden)
Fabiana Lucena Oliveira
2014-05-01
Full Text Available This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching supply chains with the transportation modes best suited to them. From a detailed analysis of the uncertainty matrix, transportation modes are suggested that best support the management of these chains, so that transport preserves the gains proposed by the original model, particularly when supply chains are distant from their suppliers of raw materials and/or supplies. Agile supply chains, one outcome of the Uncertainty Supply Chain Model, are analyzed in detail, with special attention to the Manaus Industrial Center. The research was carried out at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which comprises different supply chains and strategies sharing the same infrastructure for transport, handling, storage and customs clearance, and which uses inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the uncertainty supply chain model, agile supply chains, the Manaus Industrial Center (MIC) and Brazilian legislation, presented as a business case with the concepts and features of each. The main goal is to present and discuss how transport can support the Uncertainty Supply Chain Model in order to complete the management model. The results confirm the hypothesis that integrated logistics processes can guarantee the attractiveness of industrial agglomerations, and open the discussion of logistics management when suppliers are far from the manufacturing center.
Feyen, Luc; Caers, Jef
2006-06-01
In this work, we address the problem of characterizing the heterogeneity and uncertainty of hydraulic properties for complex geological settings. Hereby, we distinguish between two scales of heterogeneity, namely the hydrofacies structure and the intrafacies variability of the hydraulic properties. We employ multiple-point geostatistics to characterize the hydrofacies architecture. The multiple-point statistics are borrowed from a training image that is designed to reflect the prior geological conceptualization. The intrafacies variability of the hydraulic properties is represented using conventional two-point correlation methods, more precisely, spatial covariance models under a multi-Gaussian spatial law. We address the different levels and sources of uncertainty in characterizing the subsurface heterogeneity, and explore their effect on groundwater flow and transport predictions. Typically, uncertainty is assessed by way of many images, termed realizations, of a fixed statistical model. However, in many cases, sampling from a fixed stochastic model does not adequately represent the space of uncertainty. It neglects the uncertainty related to the selection of the stochastic model and the estimation of its input parameters. We acknowledge the uncertainty inherent in the definition of the prior conceptual model of aquifer architecture and in the estimation of global statistics, anisotropy, and correlation scales. Spatial bootstrap is used to assess the uncertainty of the unknown statistical parameters. As an illustrative example, we employ a synthetic field that represents a fluvial setting consisting of an interconnected network of channel sands embedded within finer-grained floodplain material. For this highly non-stationary setting we quantify the groundwater flow and transport model prediction uncertainty for various levels of hydrogeological uncertainty. Results indicate the importance of accurately describing the facies geometry, especially for transport
Uncertainty quantification in Rothermel's Model using an efficient sampling method
Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick
2007-01-01
The purpose of the present work is to quantify parametric uncertainty in Rothermel's wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
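The abstract does not detail the paper's specific sampling scheme; Latin hypercube sampling is one standard efficient design for parametric uncertainty quantification and serves here as an illustrative stand-in:

```python
import random

def latin_hypercube(n, dims, seed=0):
    """Latin hypercube sample on [0, 1]^dims: exactly one point per
    equal-probability stratum in every dimension, so n samples cover
    each input range far more evenly than plain random sampling."""
    rng = random.Random(seed)
    columns = []
    for _ in range(dims):
        column = [(i + rng.random()) / n for i in range(n)]  # one draw per stratum
        rng.shuffle(column)                                  # random pairing across dims
        columns.append(column)
    return list(zip(*columns))

pts = latin_hypercube(100, 2)
```

Each unit-cube point would then be rescaled to the physical ranges of the fire-spread inputs (fuel load, moisture, wind, etc.) before being run through the model.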
Model Uncertainty and Robustness: A Computational Framework for Multimodel Analysis
Young, Cristobal; Holsteen, Katherine
2017-01-01
Model uncertainty is pervasive in social science. A key question is how robust empirical results are to sensible changes in model specification. We present a new approach and applied statistical software for computational multimodel analysis. Our approach proceeds in two steps: First, we estimate the modeling distribution of estimates across all…
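The core idea, re-estimating a focal coefficient under every subset of control variables and inspecting the resulting modeling distribution, can be sketched as follows (synthetic data; this is not the authors' software):

```python
import random
from itertools import combinations

def ols(y, X):
    """Least-squares coefficients via the normal equations (small problems only)."""
    n, k = len(y), len(X[0])
    A = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    v = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    for c in range(k):                      # Gaussian elimination, partial pivoting
        p = max(range(c, k), key=lambda r: abs(A[r][c]))
        A[c], A[p] = A[p], A[c]
        v[c], v[p] = v[p], v[c]
        for r in range(c + 1, k):
            f = A[r][c] / A[c][c]
            for cc in range(c, k):
                A[r][cc] -= f * A[c][cc]
            v[r] -= f * v[c]
    beta = [0.0] * k
    for c in reversed(range(k)):            # back substitution
        beta[c] = (v[c] - sum(A[c][cc] * beta[cc] for cc in range(c + 1, k))) / A[c][c]
    return beta

def modeling_distribution(y, focal, controls):
    """Focal-variable coefficient re-estimated under every subset of controls."""
    estimates = []
    for r in range(len(controls) + 1):
        for subset in combinations(range(len(controls)), r):
            X = [[1.0, focal[i]] + [controls[c][i] for c in subset]
                 for i in range(len(y))]
            estimates.append(ols(y, X)[1])
    return estimates

rng = random.Random(3)
n = 200
x = [rng.random() for _ in range(n)]
z1 = [rng.random() for _ in range(n)]
z2 = [rng.random() for _ in range(n)]
y = [2.0 * x[i] + 0.5 * z1[i] + rng.gauss(0, 0.1) for i in range(n)]
dist = modeling_distribution(y, x, [z1, z2])  # 2^2 = 4 specifications
```

A tight modeling distribution (as here, where the true coefficient is 2) indicates robustness; wide dispersion across specifications signals model-dependent results.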
Impact of inherent meteorology uncertainty on air quality model predictions
It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...
Directory of Open Access Journals (Sweden)
M. Migliavacca
2012-06-01
Full Text Available Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere.
Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species, with 12 leaf bud-burst models that varied in complexity.
Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements.
We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high vs. low CO_{2} emissions). Parameter uncertainty was the smallest (average 95% confidence interval, CI: 2.4 days century^{−1} for scenario B1 and 4.5 days century^{−1} for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century^{−1}) in the simulated trends. The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 days century^{−1} for A1fi, ±3.6 days century^{−1} for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per
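The attribution of forecast spread to parameter, model and driver uncertainty can be illustrated with a toy decomposition in which one source is varied while the others are held fixed; every number below is invented:

```python
from itertools import product

# hypothetical bud-burst trend predictions (days per century), decomposed into
# a model-structure term, a climate-driver (scenario) term, and a parameter draw
model_terms = {"spring_warming": 0.0, "chilling": 2.5, "photoperiod": -2.0}
scenario_terms = {"B1": -4.0, "A1fi": -9.0}
parameter_draws = [-1.0, 0.0, 1.0]

def trend(m, s, p):
    return model_terms[m] + scenario_terms[s] + p

def spread(vals):
    return max(vals) - min(vals)

def source_spread(source):
    """Average spread of the forecast when only one uncertainty source varies."""
    spreads = []
    if source == "driver":
        for m, p in product(model_terms, parameter_draws):
            spreads.append(spread([trend(m, s, p) for s in scenario_terms]))
    elif source == "model":
        for s, p in product(scenario_terms, parameter_draws):
            spreads.append(spread([trend(m, s, p) for m in model_terms]))
    else:  # "parameter"
        for m, s in product(model_terms, scenario_terms):
            spreads.append(spread([trend(m, s, p) for p in parameter_draws]))
    return sum(spreads) / len(spreads)
```

With the illustrative numbers above, the ranking matches the abstract's finding: driver uncertainty dominates, model-structure uncertainty is next, and parameter uncertainty is smallest.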
The French biofuels mandates under cost uncertainty - an assessment based on robust optimization
International Nuclear Information System (INIS)
Lorne, Daphne; Tchung-Ming, Stephane
2012-01-01
This paper investigates the impact of primary energy and technology cost uncertainty on the achievement of renewable and especially biofuel policies - mandates and norms - in France by 2030. A robust optimization technique that can handle uncertainty sets of high dimensionality is implemented in a TIMES-based long-term planning model of the French energy, transport and electricity sectors. The energy system costs and potential benefits (GHG emissions abatements, diversification) of the French renewable mandates are assessed within this framework. The results of this systemic analysis highlight how setting norms and mandates reduces the variability of CO₂ emissions reductions and supply-mix diversification when the costs of technological progress and prices are uncertain. Beyond that, we discuss the usefulness of robust optimization as a complement to other techniques for integrating uncertainty in large-scale energy models. (authors)
Energy Technology Data Exchange (ETDEWEB)
Srinivasan, Sanjay [Univ. of Texas, Austin, TX (United States)
2014-09-30
In-depth understanding of the long-term fate of CO₂ in the subsurface requires study and analysis of the reservoir formation, the overlaying caprock formation, and adjacent faults. Because there is significant uncertainty in predicting the location and extent of geologic heterogeneity that can impact the future migration of CO₂ in the subsurface, there is a need to develop algorithms that can reliably quantify this uncertainty in plume migration. This project is focused on the development of a model selection algorithm that refines an initial suite of subsurface models representing the prior uncertainty to create a posterior set of subsurface models that reflect injection performance consistent with that observed. Such posterior models can be used to represent uncertainty in the future migration of the CO₂ plume. Because only injection data is required, the method provides a very inexpensive method to map the migration of the plume and the associated uncertainty in migration paths. The model selection method developed as part of this project mainly consists of assessing the connectivity/dynamic characteristics of a large prior ensemble of models, grouping the models on the basis of their expected dynamic response, selecting the subgroup of models that most closely yield dynamic response closest to the observed dynamic data, and finally quantifying the uncertainty in plume migration using the selected subset of models. The main accomplishment of the project is the development of a software module within the SGEMS earth modeling software package that implements the model selection methodology. This software module was subsequently applied to analyze CO₂ plume migration in two field projects – the In Salah CO₂ Injection project in Algeria and CO₂ injection into the Utsira formation in Norway. These applications of the software revealed that the proxies developed in this project for quickly assessing the dynamic characteristics of the reservoir were
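The selection step, ranking a prior ensemble by the misfit between simulated and observed dynamic response and keeping the closest subset, can be sketched as follows (hypothetical model names and injection-pressure responses; this is not the SGEMS module itself):

```python
def select_models(prior_responses, observed, keep=3):
    """Model selection sketch: rank an ensemble of prior subsurface models by
    the misfit between their simulated dynamic response and the observed data,
    and keep the closest subset to represent posterior uncertainty."""
    def misfit(resp):
        return sum((a - b) ** 2 for a, b in zip(resp, observed))
    ranked = sorted(prior_responses, key=lambda item: misfit(item[1]))
    return [name for name, _ in ranked[:keep]]

# hypothetical injection pressure responses for a small prior ensemble
prior = [
    ("model_a", [1.0, 1.9, 3.2]),
    ("model_b", [1.1, 2.1, 2.9]),
    ("model_c", [0.4, 0.9, 1.5]),
    ("model_d", [1.0, 2.0, 3.0]),
]
observed = [1.0, 2.0, 3.1]
posterior = select_models(prior, observed, keep=2)
```

Plume-migration forecasts are then run (or re-weighted) only over the selected subset, which is what makes the method inexpensive: it needs injection data alone, not full history matching.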
Uncertainty and Complexity in Mathematical Modeling
Cannon, Susan O.; Sanders, Mark
2017-01-01
Modeling is an effective tool to help students access mathematical concepts. Finding a math teacher who has not drawn a fraction bar or pie chart on the board would be difficult, as would finding students who have not been asked to draw models and represent numbers in different ways. In this article, the authors will discuss: (1) the properties of…
Model Uncertainty and Exchange Rate Forecasting
Kouwenberg, R.; Markiewicz, A.; Verhoeks, R.; Zwinkels, R.C.J.
2017-01-01
Exchange rate models with uncertain and incomplete information predict that investors focus on a small set of fundamentals that changes frequently over time. We design a model selection rule that captures the current set of fundamentals that best predicts the exchange rate. Out-of-sample tests show
Immersive Data Comprehension: Visualizing Uncertainty in Measurable Models
Directory of Open Access Journals (Sweden)
Pere eBrunet
2015-09-01
Full Text Available Recent advances in 3D scanning technologies have opened new possibilities in a broad range of applications including cultural heritage, medicine, civil engineering and urban planning. Virtual Reality systems can provide new tools to professionals who want to understand acquired 3D models. In this paper, we review the concept of data comprehension with an emphasis on visualization and inspection tools on immersive setups. We claim that in most application fields, data comprehension requires model measurements which in turn should be based on the explicit visualization of uncertainty. As 3D digital representations are not faithful, information on their fidelity at local level should be included in the model itself as uncertainty bounds. We propose the concept of Measurable 3D Models as digital models that explicitly encode local uncertainty bounds related to their quality. We claim that professionals and experts can strongly benefit from immersive interaction through new specific, fidelity-aware measurement tools which can facilitate 3D data comprehension. Since noise and processing errors are ubiquitous in acquired datasets, we discuss the estimation, representation and visualization of data uncertainty. We show that, based on typical user requirements in Cultural Heritage and other domains, application-oriented measuring tools in 3D models must consider uncertainty and local error bounds. We also discuss the requirements of immersive interaction tools for the comprehension of huge 3D and nD datasets acquired from real objects.
International Nuclear Information System (INIS)
Juang, Kai-Wei; Chen, Yue-Shin; Lee, Dar-Yuan
2004-01-01
Mapping the spatial distribution of soil pollutants is essential for delineating contaminated areas. Currently, geostatistical interpolation, kriging, is increasingly used to estimate pollutant concentrations in soils. The kriging-based approach, indicator kriging (IK), may be used to model the uncertainty of mapping. However, a smoothing effect is usually produced when using kriging in pollutant mapping. The detailed spatial patterns of pollutants could, therefore, be lost. The local uncertainty of mapping pollutants derived by the IK technique is referred to as the conditional cumulative distribution function (ccdf) for one specific location (i.e. single-location uncertainty). The local uncertainty information obtained by IK is not sufficient as the uncertainty of mapping at several locations simultaneously (i.e. multi-location uncertainty or spatial uncertainty) is required to assess the reliability of the delineation of contaminated areas. The simulation approach, sequential indicator simulation (SIS), which has the ability to model not only single, but also multi-location uncertainties, was used, in this study, to assess the uncertainty of the delineation of heavy metal contaminated soils. To illustrate this, a data set of Cu concentrations in soil from Taiwan was used. The results show that contour maps of Cu concentrations generated by the SIS realizations exhausted all the spatial patterns of Cu concentrations without the smoothing effect found when using the kriging method. Based on the SIS realizations, the local uncertainty of Cu concentrations at a specific location x' refers to the probability of the Cu concentration z(x') being higher than the defined threshold level of contamination (z_c). This can be written as Prob_SIS[z(x') > z_c], representing the probability of contamination. The probability map of Prob_SIS[z(x') > z_c] can then be used for delineating contaminated areas. In addition, the multi-location uncertainty of an area A
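The probability-of-contamination map derived from SIS realizations is simply the per-location fraction of realizations exceeding the threshold z_c; a sketch with hypothetical realizations:

```python
def exceedance_probability(realizations, z_c):
    """Per-location probability of contamination, estimated as the fraction
    of simulated realizations in which the concentration exceeds z_c."""
    n = len(realizations)
    n_cells = len(realizations[0])
    return [sum(r[c] > z_c for r in realizations) / n for c in range(n_cells)]

# three hypothetical SIS realizations of Cu concentration (mg/kg) on a 4-cell transect
realizations = [
    [120.0, 80.0, 150.0, 60.0],
    [130.0, 95.0, 140.0, 70.0],
    [110.0, 105.0, 160.0, 65.0],
]
p_map = exceedance_probability(realizations, z_c=100.0)
```

Thresholding this probability map (e.g. declaring cells with p > 0.5 contaminated) is one simple way to delineate contaminated areas, and in practice many more realizations than three would be used.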
Using measurement uncertainty in decision-making and conformity assessment
Pendrill, L. R.
2014-08-01
Measurements often provide an objective basis for making decisions, perhaps when assessing whether a product conforms to requirements or whether one set of measurements differs significantly from another. There is increasing appreciation of the need to account for the role of measurement uncertainty when making decisions, so that a ‘fit-for-purpose’ level of measurement effort can be set prior to performing a given task. Better mutual understanding between the metrologist and those ordering such tasks about the significance and limitations of the measurements when making decisions of conformance will be especially useful. Decisions of conformity are, however, currently made in many important application areas, such as when addressing the grand challenges (energy, health, etc), without a clear and harmonized basis for sharing the risks that arise from measurement uncertainty between the consumer, supplier and third parties. In reviewing, in this paper, the state of the art of the use of uncertainty evaluation in conformity assessment and decision-making, two aspects in particular—the handling of qualitative observations and of impact—are considered key to bringing more order to the present diverse rules of thumb of more or less arbitrary limits on measurement uncertainty and percentage risk in the field. (i) Decisions of conformity can be made on a more or less quantitative basis—referred to in statistical acceptance sampling as ‘by variable’ or ‘by attribute’ (i.e. go/no-go decisions)—depending on the resources available or indeed whether a full quantitative judgment is needed or not. There is, therefore, an intimate relation between decision-making, relating objects to each other in terms of comparative or merely qualitative concepts, and nominal and ordinal properties. (ii) Adding measures of impact, such as the costs of incorrect decisions, can give more objective and more readily appreciated bases for decisions for all parties concerned. Such
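A common way to share measurement-uncertainty risk between the parties is a guard-banded decision rule; the sketch below assumes an upper tolerance limit and a coverage factor k = 2, both illustrative:

```python
def conforms(measured, upper_limit, u, k=2.0):
    """Guarded acceptance: declare conformity only when the measured value
    plus its expanded uncertainty U = k*u stays below the upper limit.
    Shrinking the acceptance zone this way shifts the risk of falsely
    accepting a non-conforming item onto the supplier."""
    return measured + k * u <= upper_limit
```

For example, with an upper limit of 10.0 and a standard uncertainty of 0.2, a reading of 9.5 is accepted (9.5 + 0.4 ≤ 10.0) while a reading of 9.8 is rejected (9.8 + 0.4 > 10.0), even though both are nominally below the limit.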
A review of occupational dose assessment uncertainties and approaches
International Nuclear Information System (INIS)
Anderson, R. W.
2004-01-01
The Radiological Protection Practitioner (RPP) will spend a considerable proportion of his time predicting or retrospectively assessing radiation exposures to occupational personnel. The assessments can serve a variety of purposes: predicting doses for occupational dose control or project design, making retrospective estimates for the dose record, or accounting for dosemeters which have been lost or damaged. There are other, less frequent occasions when dose assessment will be required, such as to support legal cases and compensation claims or to provide detailed dose information for epidemiological studies. It is important that the level of detail, justification and supporting evidence in the dose assessment is suitable for the requirements. So, for instance, day-to-day operational dose assessments often rely mainly on the knowledge of the RPP in discussion with operators, whilst at the other end of the spectrum a historical dose assessment for a legal case will require substantial research and supporting evidence for the estimate to withstand forensic challenge. The robustness of the assessment will depend on many factors, including a knowledge of the work activities, the radiation dose uptake and field characteristics, all of which are affected by factors such as the time elapsed, the memory of operators and the dosemeters employed. This paper reviews the various options and uncertainties in dose assessments, ranging from use of personal dosimetry results to the development of upper bound assessments. The level of assessment, the extent of research and the evidence adduced should then be appropriate to the end use of the estimate. (Author)
Geological-structural models used in SR 97. Uncertainty analysis
Energy Technology Data Exchange (ETDEWEB)
Saksa, P.; Nummela, J. [FINTACT Oy (Finland)
1998-10-01
The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. This means that
Geological-structural models used in SR 97. Uncertainty analysis
International Nuclear Information System (INIS)
Saksa, P.; Nummela, J.
1998-10-01
The uncertainty of geological-structural models was studied for the three sites in SR 97, called Aberg, Beberg and Ceberg. The evaluation covered both regional and site scale models, the emphasis being placed on fracture zones in the site scale. Uncertainty is a natural feature of all geoscientific investigations. It originates from measurements (errors in data, sampling limitations, scale variation) and conceptualisation (structural geometries and properties, ambiguous geometric or parametric solutions) to name the major ones. The structures of A-, B- and Ceberg are fracture zones of varying types. No major differences in the conceptualisation between the sites were noted. One source of uncertainty in the site models is the non-existence of fracture and zone information in the scale from 10 to 300 - 1000 m. At Aberg the development of the regional model has been performed very thoroughly. At the site scale, one major source of uncertainty is that a clear definition of the target area is missing. Structures encountered in the boreholes are well explained, and an interdisciplinary approach to interpretation has been taken. Beberg and Ceberg regional models contain relatively large uncertainties due to the investigation methodology and experience available at that time. At the site scale, six additional structures were proposed for both Beberg and Ceberg for variant analysis of these sites. Both sites include uncertainty in the form of many non-interpreted fractured sections along the boreholes. Statistical analysis gives high occurrences of structures for all three sites: typically 20 - 30 structures/km³. Aberg has the highest structural frequency, Beberg comes next and Ceberg has the lowest. The borehole configuration, orientations and surveying goals were inspected to find whether preferences or factors causing bias were present. Data from Aberg supports the conclusion that Aespoe sub volume would be an anomalously fractured, tectonised unit of its own. This means that the
Quantifying and Visualizing Uncertainties in Molecular Models
Rasheed, Muhibur; Clement, Nathan; Bhowmick, Abhishek; Bajaj, Chandrajit
2015-01-01
Computational molecular modeling and visualization has seen significant progress in recent years with several molecular modeling and visualization software systems in use today. Nevertheless the molecular biology community lacks techniques and tools for the rigorous analysis, quantification and visualization of the associated errors in molecular structure and its associated properties. This paper attempts to fill this vacuum with the introduction of a systematic statistical framework whe...
Chen, Zhuowei; Shi, Liangsheng; Ye, Ming; Zhu, Yan; Yang, Jinzhong
2018-06-01
Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. By using a new variance-based global sensitivity analysis method, this paper identifies important parameters for nitrogen reactive transport with simultaneous consideration of these three uncertainties. A combination of three scenarios of soil temperature and two scenarios of soil moisture creates a total of six scenarios. Four alternative models describing the effect of soil temperature and moisture content are used to evaluate the reduction functions used for calculating actual reaction rates. The results show that for the nitrogen reactive transport problem, parameter importance varies substantially among different models and scenarios. The denitrification and nitrification processes are sensitive to soil moisture status rather than to the moisture function parameter. The nitrification process becomes more important at low moisture content and low temperature. However, the changing importance of nitrification activity with respect to temperature change depends strongly on the selected model. Model averaging is suggested to assess the nitrification (or denitrification) contribution by reducing the possible model error. Whether or not biochemical heterogeneity is introduced, a fairly consistent parameter importance ranking is obtained in this study: the optimal denitrification rate (Kden) is the most important parameter; the reference temperature (Tr) is more important than the temperature coefficient (Q10); the empirical constant in the moisture response function (m) is the least important. The vertical distribution of soil moisture, but not of temperature, plays the predominant role in controlling nitrogen reactions. This study provides insight into nitrogen reactive transport modeling and demonstrates an effective strategy for selecting the important parameters when future temperature and soil moisture carry uncertainties or when modelers face multiple ways of establishing nitrogen
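A minimal sketch of the kind of variance-based (Sobol-type) sensitivity analysis the abstract describes, applied to a toy reduction-function model. The model form, parameter ranges, and the function names are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 20000

# Hypothetical reduction-function model (illustrative only):
# actual rate = Kden * f(T; Tr, Q10) * g(theta; m)
def actual_rate(Kden, Q10, m, T=15.0, Tr=20.0, theta=0.3):
    f_temp = Q10 ** ((T - Tr) / 10.0)   # Q10-type temperature reduction function
    g_moist = theta ** m                # empirical moisture reduction function
    return Kden * f_temp * g_moist

def sample(n):
    # Independent uniform priors over assumed feasible ranges.
    return {"Kden": rng.uniform(0.1, 1.0, n),
            "Q10": rng.uniform(1.5, 3.0, n),
            "m": rng.uniform(0.5, 2.0, n)}

A, B = sample(N), sample(N)
yA, yB = actual_rate(**A), actual_rate(**B)
var_y = yA.var()

def first_order(name):
    # Saltelli-type estimator: keep `name` from sample A, resample the rest from B.
    mixed = {k: (A[k] if k == name else B[k]) for k in A}
    return float(np.mean(yA * (actual_rate(**mixed) - yB)) / var_y)

indices = {p: first_order(p) for p in A}
```

A first-order index near 1 means that parameter alone explains most of the output variance; the same machinery can be run per model and per scenario to reproduce the comparison described above.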
Wanders, N.; Karssenberg, D.; Bierkens, M. F. P.; Van Dam, J. C.; De Jong, S. M.
Soil moisture is a key variable in the hydrological cycle and important in hydrological modelling. When assimilating soil moisture into flood forecasting models, the improvement in forecasting skill depends on the ability to accurately estimate the spatial and temporal patterns of soil moisture
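The assimilation step mentioned above can be illustrated with a minimal scalar Kalman-style update, blending a modelled soil moisture state with a (e.g. satellite-derived) observation according to their variances. The function and values are purely hypothetical:

```python
# Scalar Kalman-style update of a modelled soil moisture state with one observation.
def assimilate(model_sm, model_var, obs_sm, obs_var):
    k = model_var / (model_var + obs_var)          # gain: trust obs more when model is uncertain
    updated = model_sm + k * (obs_sm - model_sm)   # pull state toward the observation
    updated_var = (1.0 - k) * model_var            # analysis variance always shrinks
    return updated, updated_var

# Example: model says 0.20 m3/m3, observation says 0.30, equal variances.
u, v = assimilate(0.20, 0.004, 0.30, 0.004)
```

With equal variances the gain is 0.5, so the updated state lands midway between model and observation, and the analysis variance is halved.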
Jacquin, A. P.
2012-04-01
This study analyses the effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model's discharge estimates. Prediction uncertainty bounds are derived using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment glaciers. Precipitation amounts at each elevation zone i are estimated as the product of observed precipitation (at a single station within the catchment) and a precipitation factor FPi. Thus, these factors provide a simplified representation of the spatial variation of precipitation, specifically the shape of the functional relationship between precipitation and height. In the absence of information about appropriate values of the precipitation factors FPi, these are estimated through standard calibration procedures. The catchment case study is the Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. Monte Carlo samples of the model output are obtained by randomly varying the model parameters within their feasible ranges. In the first experiment, the precipitation factors FPi are considered unknown and thus included in the sampling process. The total number of unknown parameters in this case is 16. In the second experiment, the precipitation factors FPi are estimated a priori, by means of a long-term water balance between observed discharge at the catchment outlet, evapotranspiration estimates and observed precipitation. In this case, the number of unknown parameters reduces to 11. The feasible ranges assigned to the precipitation factors in the first experiment are slightly wider than the range of fixed precipitation factors used in the second experiment. The mean squared error of the Box-Cox transformed discharge during the calibration period is used for the evaluation of the
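The GLUE workflow described above (Monte Carlo sampling of feasible ranges, an informal likelihood from squared error, and likelihood-weighted prediction bounds from the behavioural runs) can be sketched on a toy runoff model. The model, parameter ranges, and behavioural threshold here are illustrative assumptions, not those of the Aconcagua study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy monthly runoff model (illustrative): discharge = FP * precipitation * runoff coeff.
def simulate(precip, fp, c):
    return fp * precip * c

precip = rng.gamma(2.0, 50.0, 120)                                  # synthetic monthly precipitation
q_obs = simulate(precip, 1.3, 0.6) * rng.lognormal(0.0, 0.1, 120)   # synthetic "observed" discharge

# 1) Monte Carlo sampling of parameters within feasible ranges.
n = 5000
fp = rng.uniform(0.8, 2.0, n)      # precipitation factor FP
c = rng.uniform(0.2, 0.9, n)       # runoff coefficient

# 2) Informal likelihood from mean squared error; keep the best runs as "behavioural".
mse = np.array([np.mean((simulate(precip, fp[i], c[i]) - q_obs) ** 2) for i in range(n)])
like = 1.0 / mse
behavioural = like > np.quantile(like, 0.95)   # top 5% of runs

# 3) Likelihood-weighted 5%/95% prediction bounds at each time step.
w = like[behavioural] / like[behavioural].sum()
sims = np.array([simulate(precip, f, cc) for f, cc in zip(fp[behavioural], c[behavioural])])
bounds = []
for t in range(sims.shape[1]):
    order = np.argsort(sims[:, t])
    s, cw = sims[:, t][order], np.cumsum(w[order])
    bounds.append((s[np.searchsorted(cw, 0.05)], s[np.searchsorted(cw, 0.95)]))
lower, upper = np.array(bounds).T
```

Widening the feasible ranges of the sampled factors, as in the first experiment above, widens these bounds; fixing the factors a priori, as in the second, narrows them.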
Enhancing uncertainty tolerance in the modelling creep of ligaments
International Nuclear Information System (INIS)
Taha, M M Reda; Lucero, J
2006-01-01
The difficulty in performing biomechanical tests and the scarcity of biomechanical experimental databases necessitate extending the current knowledge base to allow efficient modelling using limited data sets. This study suggests a framework to reduce uncertainties in biomechanical systems using limited data sets. The study also shows how sparse data and epistemic input can be exploited using fuzzy logic to represent biomechanical relations. An example application to model collagen fibre recruitment in the medial collateral ligaments during time-dependent deformation under cyclic loading (creep) is presented. The study suggests a quality metric that can be employed to observe and enhance uncertainty tolerance in the modelling process.
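A minimal sketch of how fuzzy logic can encode a sparse, epistemic relation of this kind, here a hypothetical mapping from normalized loading time to normalized creep strain using triangular membership functions, Mamdani min-max inference, and centroid defuzzification. The sets and rules are invented for illustration and have no connection to the paper's actual rule base:

```python
import numpy as np

# Triangular membership function with corners (a, b, c): left foot, peak, right foot.
def tri(x, a, b, c):
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Fuzzy sets on normalized loading time (input) and normalized creep strain (output).
time_sets = {"short": (-0.4, 0.0, 0.4), "medium": (0.2, 0.5, 0.8), "long": (0.6, 1.0, 1.4)}
strain_sets = {"low": (-0.4, 0.0, 0.4), "moderate": (0.2, 0.5, 0.8), "high": (0.6, 1.0, 1.4)}
rules = [("short", "low"), ("medium", "moderate"), ("long", "high")]

def creep_strain(t_norm):
    z = np.linspace(0.0, 1.0, 201)          # discretized output universe
    agg = np.zeros_like(z)
    for t_set, s_set in rules:
        fire = tri(t_norm, *time_sets[t_set])                               # rule firing strength
        agg = np.maximum(agg, np.minimum(fire, tri(z, *strain_sets[s_set])))  # Mamdani min-max
    return float(np.sum(z * agg) / np.sum(agg))   # centroid defuzzification
```

Each rule needs only a qualitative expert statement ("long loading gives high strain"), which is what makes this attractive when experimental data are too sparse to fit a parametric creep curve.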
Parameter sensitivity and uncertainty of the forest carbon flux model FORUG : a Monte Carlo analysis