WorldWideScience

Sample records for model uncertainties caused

  1. The magnitude and causes of uncertainty in global model simulations of cloud condensation nuclei

    Directory of Open Access Journals (Sweden)

    L. A. Lee

    2013-09-01

    Full Text Available Aerosol–cloud interaction effects are a major source of uncertainty in climate models, so it is important to quantify the sources of uncertainty and thereby direct research efforts. However, the computational expense of global aerosol models has prevented a full statistical analysis of their outputs. Here we perform a variance-based analysis of a global 3-D aerosol microphysics model to quantify the magnitude and leading causes of parametric uncertainty in model-estimated present-day concentrations of cloud condensation nuclei (CCN). Twenty-eight model parameters covering essentially all important aerosol processes, emissions and representation of aerosol size distributions were defined based on expert elicitation. An uncertainty analysis was then performed based on a Monte Carlo-type sampling of an emulator built for each model grid cell. The standard deviation around the mean CCN varies globally between about ±30% over some marine regions and ±40–100% over most land areas and high latitudes, implying that aerosol processes and emissions are likely to be a significant source of uncertainty in model simulations of aerosol–cloud effects on climate. Among the most important contributors to CCN uncertainty are the sizes of emitted primary particles, including carbonaceous combustion particles from wildfires, biomass burning and fossil fuel use, as well as sulfate particles formed on sub-grid scales. Emissions of carbonaceous combustion particles affect CCN uncertainty more than sulfur emissions. Aerosol emission-related parameters dominate the uncertainty close to sources, while uncertainty in aerosol microphysical processes becomes increasingly important in remote regions, being dominated by deposition and aerosol sulfate formation during cloud processing. The results lead to several recommendations for research that would result in improved modelling of cloud-active aerosol on a global scale.
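
    The workflow this abstract describes (build an emulator per grid cell, sample it with Monte Carlo, summarize the spread) can be illustrated in a few lines. Below is a minimal sketch, assuming a toy quadratic emulator and three hypothetical parameter ranges in place of the 28 expert-elicited ones:

        import numpy as np

        rng = np.random.default_rng(42)

        # Hypothetical elicited ranges for three stand-in parameters
        # (the real study elicited 28 ranges from experts).
        ranges = {
            "primary_particle_size": (30.0, 120.0),  # nm
            "so2_emission_scale": (0.6, 1.4),        # multiplier
            "dry_dep_velocity": (0.5, 2.0),          # multiplier
        }

        def emulator(size, so2, dep):
            """Toy stand-in for a per-grid-cell emulator of CCN (cm^-3)."""
            return 400.0 - 2.0 * size + 150.0 * so2 - 80.0 * dep + 0.5 * size * dep

        # Monte Carlo sampling of the emulator, uniform over the elicited ranges
        n = 100_000
        size, so2, dep = (rng.uniform(lo, hi, n) for lo, hi in ranges.values())
        ccn = emulator(size, so2, dep)

        mean, sd = ccn.mean(), ccn.std()
        print(f"CCN mean = {mean:.0f} cm^-3, std = {sd:.0f} ({100 * sd / mean:.0f}% of mean)")

    The study computes such statistics independently for every grid cell; this scalar version shows only the mechanics of the emulator-based sampling.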

  2. Uncertainty analysis of Multiple Greek Letter parameters in common cause failure model

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Vrbanic, I.

    2003-01-01

    As a rule, common cause failures have a high influence on the results of Probabilistic Safety Assessments. In this paper, an uncertainty analysis for the parameters of the Multiple Greek Letter common-cause model, due to the stochastic nature of events, is presented. Results of Bayesian analysis and maximum likelihood analysis are compared and interpreted. Special emphasis is given to the assessment of the Bayesian inclusion of generic knowledge, since it may bias the results conservatively. (author)
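
    The contrast between maximum likelihood and Bayesian estimation with generic prior knowledge can be made concrete on a single beta-factor-like common-cause parameter (a simplification of the full Multiple Greek Letter parameter set; the counts and prior pseudo-counts below are hypothetical):

        from scipy import stats

        # Failure record: N component failures, n of which involved a common cause.
        N, n = 50, 3

        # Maximum likelihood estimate of the (simplified, beta-factor-like) parameter
        beta_mle = n / N

        # Bayesian estimate: Beta prior encoding generic industry knowledge
        # (a, b are hypothetical generic-data pseudo-counts, prior mean 0.05).
        a, b = 2.0, 38.0
        post = stats.beta(a + n, b + N - n)

        print(f"MLE: {beta_mle:.3f}")
        print(f"Bayes posterior mean: {post.mean():.3f}, 90% interval: "
              f"({post.ppf(0.05):.3f}, {post.ppf(0.95):.3f})")

    With sparse plant-specific data, a strong generic prior pulls the estimate toward the generic value, which is exactly the conservative-bias concern the abstract raises.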

  3. Using expert knowledge to incorporate uncertainty in cause-of-death assignments for modeling of cause-specific mortality

    Science.gov (United States)

    Walsh, Daniel P.; Norton, Andrew S.; Storm, Daniel J.; Van Deelen, Timothy R.; Heisy, Dennis M.

    2018-01-01

    Implicit and explicit use of expert knowledge to inform ecological analyses is becoming increasingly common because it often represents the sole source of information in many circumstances. Thus, there is a need to develop statistical methods that explicitly incorporate expert knowledge, and can successfully leverage this information while properly accounting for associated uncertainty during analysis. Studies of cause-specific mortality provide an example of implicit use of expert knowledge when causes-of-death are uncertain and assigned based on the observer's knowledge of the most likely cause. To explicitly incorporate this use of expert knowledge and the associated uncertainty, we developed a statistical model for estimating cause-specific mortality using a data augmentation approach within a Bayesian hierarchical framework. Specifically, for each mortality event, we elicited the observer's belief of cause-of-death by having them specify the probability that the death was due to each potential cause. These probabilities were then used as prior predictive values within our framework. This hierarchical framework permitted a simple and rigorous estimation method that was easily modified to include covariate effects and regularizing terms. Although applied to survival analysis, this method can be extended to any event-time analysis with multiple event types, for which there is uncertainty regarding the true outcome. We conducted simulations to determine how our framework compared to traditional approaches that use expert knowledge implicitly and assume that cause-of-death is specified accurately. Simulation results supported the inclusion of observer uncertainty in cause-of-death assignment in modeling of cause-specific mortality to improve model performance and inference. Finally, we applied the statistical model we developed and a traditional method to cause-specific survival data for white-tailed deer, and compared results. We demonstrate that model selection
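
    The data-augmentation step can be reduced to a toy version: treat each observer's elicited probability vector as information about the latent cause, sample the cause inside a Gibbs loop, and update the cause fractions. A minimal sketch, far simpler than the paper's hierarchical event-time model; the three causes and all probabilities are invented:

        import numpy as np

        rng = np.random.default_rng(1)

        # Elicited observer beliefs: row i gives P(death i was due to cause k)
        # for K = 3 hypothetical causes (e.g., harvest, predation, other).
        P = np.array([
            [0.9, 0.1, 0.0],
            [0.5, 0.4, 0.1],
            [0.1, 0.8, 0.1],
            [0.3, 0.3, 0.4],
            [1.0, 0.0, 0.0],   # certain assignment, as in a traditional analysis
        ])
        n, K = P.shape
        alpha = np.ones(K)     # flat Dirichlet prior on cause fractions

        theta = np.full(K, 1.0 / K)
        draws = []
        for it in range(5000):
            # Data augmentation: sample each latent cause given beliefs and theta
            w = P * theta
            w /= w.sum(axis=1, keepdims=True)
            z = np.array([rng.choice(K, p=wi) for wi in w])
            # Conjugate update of the cause-specific fractions
            counts = np.bincount(z, minlength=K)
            theta = rng.dirichlet(alpha + counts)
            if it >= 1000:
                draws.append(theta)

        print("posterior mean cause fractions:", np.mean(draws, axis=0).round(3))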

  4. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty as well as a means of describing random processes has caused some confusion, even though the two uses are representing different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  5. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising

  6. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time, uncertainty was considered synonymous with the random, the stochastic, the statistical, or the probabilistic. Since the early sixties, views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool or method used to model uncertainty in a specific context should be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes its uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  7. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given the uncertainties in dating, calibration, and modeling.

  8. Uncertainties in repository modeling

    International Nuclear Information System (INIS)

    Wilson, J.R.

    1996-01-01

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given the uncertainties in dating, calibration, and modeling

  9. Flood modelling : Parameterisation and inflow uncertainty

    NARCIS (Netherlands)

    Mukolwe, M.M.; Di Baldassarre, G.; Werner, M.; Solomatine, D.P.

    2014-01-01

    This paper presents an analysis of uncertainty in hydraulic modelling of floods, focusing on the inaccuracy caused by inflow errors and parameter uncertainty. In particular, the study develops a method to propagate the uncertainty induced by, firstly, application of a stage–discharge rating curve

  10. Reusable launch vehicle model uncertainties impact analysis

    Science.gov (United States)

    Chen, Jiaye; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng

    2018-03-01

    Reusable launch vehicles (RLVs) typically combine a complex aerodynamic shape with strong propulsion-system coupling, and their flight environment is highly complicated and intensely changeable. The model therefore carries large uncertainty, which makes the nominal system quite different from the real system. Studying how these uncertainties influence the stability of the control system is thus of great significance for controller design. In order to improve the performance of RLVs, this paper proposes an approach for analyzing the influence of the model uncertainties. Coupled dynamic and kinematic models are built for a typical RLV. The different factors that introduce uncertainty during modelling are then analyzed and summarized, and the model uncertainties are expressed according to an additive uncertainty model. The maximum singular value of the uncertainty matrix is chosen as the boundary model, and the norm of the uncertainty matrix is used to quantify how much the uncertainty factors influence the stability of the control system. The simulation results illustrate that the inertial factors have the largest influence on the stability of the system, and that it is necessary to take the model uncertainties into consideration before designing the controller for this kind of aircraft (such as an RLV).
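
    The boundary computation mentioned here, taking the maximum singular value of an additive uncertainty block as the size of the model set, is a one-liner with numpy. A sketch with invented nominal and uncertainty matrices:

        import numpy as np

        # Additive uncertainty: true plant G_true = G_nom + Delta. The maximum
        # singular value of Delta bounds how far the real system can deviate
        # from the nominal model (both matrices below are hypothetical).
        G_nom = np.array([[1.0, 0.2], [0.0, 0.8]])
        Delta = np.array([[0.05, 0.01], [0.02, 0.06]])

        sigma_max = np.linalg.svd(Delta, compute_uv=False)[0]  # boundary of the model set
        print(f"max singular value (uncertainty bound): {sigma_max:.4f}")
        print(f"spectral-norm influence measure:        {np.linalg.norm(Delta, 2):.4f}")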

  11. A commentary on model uncertainty

    International Nuclear Information System (INIS)

    Apostolakis, G.

    1994-01-01

    A framework is proposed for the identification of model and parameter uncertainties in risk assessment models. Two cases are distinguished; in the first case, a set of mutually exclusive and exhaustive hypotheses (models) can be formulated, while, in the second, only one reference model is available. The relevance of this formulation to decision making and the communication of uncertainties is discussed

  12. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  13. Uncertainty analysis of environmental models

    International Nuclear Information System (INIS)

    Monte, L.

    1990-01-01

    In the present paper an evaluation of the output uncertainty of an environmental model for assessing the transfer of 137Cs and 131I in the human food chain is carried out on the basis of a statistical analysis of data reported in the literature. The uncertainty analysis offers the opportunity of obtaining some remarkable information about the uncertainty of models predicting the migration of non-radioactive substances in the environment, mainly in relation to dry and wet deposition

  14. Model uncertainty in safety assessment

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Huovinen, T.

    1996-01-01

    Uncertainty analyses are an essential part of any risk assessment. Usually, the uncertainties of reliability model parameter values are described by probability distributions and the uncertainty is propagated through the whole risk model. In addition to the parameter uncertainties, the assumptions behind the risk models may be based on insufficient experimental observations, and the models themselves may not be exact descriptions of the phenomena under analysis. The description and quantification of this type of uncertainty, model uncertainty, is the topic of this report. The model uncertainty is characterized and some approaches to model and quantify it are discussed. The emphasis is on so-called mixture models, which have been applied in PSAs. Some of the possible disadvantages of the mixture model are addressed. In addition to quantitative analyses, qualitative analysis is also discussed briefly. To illustrate the models, two simple case studies on failure intensity and human error modeling are described. In both examples, the analysis is based on simple mixture models, which are observed to apply in PSA analyses. (orig.) (36 refs., 6 figs., 2 tabs.)
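
    The mixture idea is compact: the analyst's probability weights over candidate failure-intensity models define a mixture predictive distribution. A sketch with hypothetical lognormal components and weights:

        from scipy import stats

        # Two candidate models for a failure intensity lambda (per year), each
        # with its own uncertainty distribution; all numbers are hypothetical.
        model_a = stats.lognorm(s=0.5, scale=1e-3)   # e.g. plant-specific fit
        model_b = stats.lognorm(s=1.0, scale=5e-4)   # e.g. generic-data model
        w_a, w_b = 0.7, 0.3                          # analyst's model probabilities

        # The mixture predictive distribution combines the candidates; its
        # moments and exceedance probabilities follow from the components.
        mean = w_a * model_a.mean() + w_b * model_b.mean()
        p_exceed = w_a * model_a.sf(1e-2) + w_b * model_b.sf(1e-2)
        print(f"mixture mean intensity: {mean:.2e} per year")
        print(f"P(lambda > 1e-2 per year): {p_exceed:.1e}")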

  15. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons...
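
    As an illustration of the interval-analysis flavour of these methods, a net-present-value bound under interval cash flows and an interval discount rate can be computed from monotonicity; all figures below are invented:

        # Interval estimate of a net present value: each yearly cash flow is
        # known only as [low, high]; the discount rate is also an interval.
        cash = [(-110.0, -90.0), (30.0, 50.0), (40.0, 60.0), (45.0, 70.0)]
        r_lo, r_hi = 0.04, 0.08

        def npv(flows, r):
            return sum(c / (1 + r) ** t for t, c in enumerate(flows))

        # Lower bound: pessimistic flows at the highest rate; upper: the reverse.
        # Valid here because NPV is monotone in each flow and, with the positive
        # flows all in later years, decreasing in r.
        npv_lo = npv([lo for lo, hi in cash], r_hi)
        npv_hi = npv([hi for lo, hi in cash], r_lo)
        print(f"NPV interval: [{npv_lo:.1f}, {npv_hi:.1f}]")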

  16. Impact of model defect and experimental uncertainties on evaluated output

    International Nuclear Information System (INIS)

    Neudecker, D.; Capote, R.; Leeb, H.

    2013-01-01

    One of the current major problems in nuclear data evaluation is the unreasonably small evaluated uncertainties often obtained. These small uncertainties are partly attributed to missing correlations of experimental uncertainties as well as to deficiencies of the model employed for the prior information. In this article, both uncertainty sources are included in an evaluation of 55Mn cross-sections for incident neutrons. Their impact on the evaluated output is studied using a prior obtained by the Full Bayesian Evaluation Technique and a prior obtained by the nuclear model program EMPIRE. It is shown analytically and by means of an evaluation that unreasonably small evaluated uncertainties can be obtained not only if correlated systematic uncertainties of the experiment are neglected but also if prior uncertainties are smaller than or about the same magnitude as the experimental ones. Furthermore, it is shown that including model defect uncertainties in the evaluation of 55Mn leads to larger evaluated uncertainties for channels where the model is deficient. It is concluded that including correlated experimental uncertainties is equally important as model defect uncertainties, if the model calculations deviate significantly from the measurements. -- Highlights: • We study possible causes of unreasonably small evaluated nuclear data uncertainties. • Two different formulations of model defect uncertainties are presented and compared. • Smaller prior than experimental uncertainties cause too small evaluated ones. • Neglected correlations of experimental uncertainties cause too small evaluated ones. • Including model defect uncertainties in the prior improves the evaluated output

  17. Current status of uncertainty analysis methods for computer models

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu

    1989-11-01

    This report surveys several existing uncertainty analysis methods for estimating computer output uncertainty caused by input uncertainties, illustrating application examples of those methods to three computer models: MARCH/CORRAL II, TERFOC and SPARC. Merits and limitations of the methods are assessed in the application, and a recommendation for selecting among the uncertainty analysis methods is provided. (author)
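
    The simplest of the surveyed approaches, straightforward Monte Carlo propagation of input uncertainty through a computer model, looks like this in outline (the model function and input distributions are invented stand-ins, not the codes named above):

        import numpy as np

        rng = np.random.default_rng(0)

        def computer_model(x1, x2, x3):
            """Toy stand-in for a code such as MARCH/CORRAL II, TERFOC or SPARC."""
            return x1 * np.exp(-x2) + 0.1 * x3 ** 2

        # Input uncertainties (hypothetical distributions)
        n = 50_000
        x1 = rng.normal(1.0, 0.2, n)
        x2 = rng.uniform(0.5, 1.5, n)
        x3 = rng.lognormal(0.0, 0.3, n)

        y = computer_model(x1, x2, x3)
        lo, hi = np.percentile(y, [5, 95])
        print(f"output mean = {y.mean():.3f}, 90% range = [{lo:.3f}, {hi:.3f}]")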

  18. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and the subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...

  19. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and the subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution based scaling methods (DBS) were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to overall uncertainty in the hydrological change modelling...
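
    The difference between the delta change approach and a distribution-based correction can be sketched with empirical quantile mapping standing in for the parametric DBS methods (all series below are synthetic):

        import numpy as np

        rng = np.random.default_rng(9)

        # Hypothetical daily precipitation (mm/day): observations plus a climate
        # model's control and future periods.
        obs      = rng.gamma(0.6, 4.0, 3650)
        rcm_ctrl = rng.gamma(0.5, 6.0, 3650)
        rcm_fut  = rng.gamma(0.5, 7.0, 3650)

        # Delta change (DC): scale observations by the model's mean change
        dc = obs * rcm_fut.mean() / rcm_ctrl.mean()

        # Quantile mapping (DBS-like): corrected = F_obs^-1(F_ctrl(future)),
        # which corrects the whole distribution, not just the mean
        ranks = np.searchsorted(np.sort(rcm_ctrl), rcm_fut) / len(rcm_ctrl)
        dbs = np.quantile(obs, np.clip(ranks, 0.0, 1.0))

        print(f"DC mean:  {dc.mean():.2f} mm/day")
        print(f"DBS mean: {dbs.mean():.2f} mm/day (variance and extremes also corrected)")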

  20. A Bayesian approach to model uncertainty

    International Nuclear Information System (INIS)

    Buslik, A.

    1994-01-01

    A Bayesian approach to model uncertainty is taken. For the case of a finite number of alternative models, the model uncertainty is equivalent to parameter uncertainty. A derivation based on Savage's partition problem is given
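
    For a finite set of alternative models the bookkeeping is ordinary Bayesian updating: prior model probabilities times marginal likelihoods, renormalized. A minimal sketch with invented numbers:

        import numpy as np

        # Finite set of alternative models: prior probabilities and the marginal
        # likelihood of the observed data under each (hypothetical values).
        priors = np.array([0.5, 0.3, 0.2])
        marginal_lik = np.array([2.1e-4, 8.7e-4, 1.2e-5])

        posterior = priors * marginal_lik
        posterior /= posterior.sum()
        print("posterior model probabilities:", posterior.round(3))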

  1. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  2. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The objective of this paper is to give an overall description of the author's PhD study concerning uncertainties in numerical urban storm water drainage models. Initially, an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  3. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high

  4. Uncertainty modeling and decision support

    International Nuclear Information System (INIS)

    Yager, Ronald R.

    2004-01-01

    We first formulate the problem of decision making under uncertainty. The importance of the representation of our knowledge about the uncertainty in formulating a decision process is pointed out. We begin with a brief discussion of the case of probabilistic uncertainty. Next, in considerable detail, we discuss the case of decision making under ignorance. For this case the fundamental role of the attitude of the decision maker is noted and its subjective nature is emphasized. Next, the case in which a Dempster-Shafer belief structure is used to model our knowledge of the uncertainty is considered. Here we also emphasize the subjective choices the decision maker must make in formulating a decision function. The case in which the uncertainty is represented by a fuzzy measure (monotonic set function) is then investigated. We then return to the Dempster-Shafer belief structure and show its relationship to the fuzzy measure. This relationship allows us to get a deeper understanding of the formulation of the decision function used in the Dempster-Shafer framework. We discuss how this deeper understanding allows a decision analyst to better make the subjective choices needed in the formulation of the decision function
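
    For the decision-under-ignorance case, the decision maker's attitude can be encoded as an optimism coefficient, as in the Hurwicz-style valuation below (a special case of the more general ordered weighted averaging treatment; the payoffs are invented):

        import numpy as np

        # Payoff matrix: rows = alternatives, columns = states of nature.
        payoffs = np.array([
            [10,  -5,  2],
            [ 4,   3,  3],
            [20, -15,  0],
        ])

        def hurwicz(payoffs, alpha):
            """Valuation under ignorance; alpha is the decision maker's optimism."""
            return alpha * payoffs.max(axis=1) + (1 - alpha) * payoffs.min(axis=1)

        for alpha in (0.0, 0.5, 1.0):   # pessimist, neutral, optimist
            v = hurwicz(payoffs, alpha)
            print(f"alpha={alpha}: values={v}, best alternative={int(np.argmax(v))}")

    The printout makes the abstract's point concrete: the preferred alternative changes with the subjective attitude alone, with no change in the payoffs.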

  5. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10

  6. Model Uncertainty for Bilinear Hysteric Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables with predictive density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model...

  7. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reaction detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.

  8. Uncertainty in multispectral lidar signals caused by incidence angle effects.

    Science.gov (United States)

    Kaasalainen, Sanna; Åkerblom, Markku; Nevalainen, Olli; Hakala, Teemu; Kaasalainen, Mikko

    2018-04-06

    Multispectral terrestrial laser scanning (TLS) is an emerging technology. Several manufacturers already offer commercial dual or three wavelength airborne laser scanners, while multispectral TLS is still carried out mainly with research instruments. Many of these research efforts have focused on the study of vegetation. The aim of this paper is to study the uncertainty of the measurement of spectral indices of vegetation with multispectral lidar. Using two spectral indices as examples, we find that the uncertainty is due to systematic errors caused by the wavelength dependency of laser incidence angle effects. This finding is empirical, and the error cannot be removed by modelling or instrument modification. The discovery and study of these effects has been enabled by hyperspectral and multispectral TLS, and it has become a subject of active research within the past few years. We summarize the most recent studies on multi-wavelength incidence angle effects and present new results on the effect of specular reflection from the leaf surface, and the surface structure, which have been suggested to play a key role. We also discuss the consequences to the measurement of spectral indices with multispectral TLS, and a possible correction scheme using a synthetic laser footprint.

  9. Uncertainty, God, and scrupulosity: Uncertainty salience and priming God concepts interact to cause greater fears of sin.

    Science.gov (United States)

    Fergus, Thomas A; Rowatt, Wade C

    2015-03-01

    Difficulties tolerating uncertainty are considered central to scrupulosity, a moral/religious presentation of obsessive-compulsive disorder (OCD). We examined whether uncertainty salience (i.e., exposure to a state of uncertainty) caused fears of sin and fears of God, as well as whether priming God concepts affected the impact of uncertainty salience on those fears. An internet sample of community adults (N = 120) who endorsed holding a belief in God or a higher power were randomly assigned to an experimental manipulation of (1) salience (uncertainty or insecurity) and (2) prime (God concepts or neutral). As predicted, participants who received the uncertainty salience and God concept priming reported the greatest fears of sin. There were no mean-level differences in the other conditions. The effect was not attributable to religiosity and the manipulations did not cause negative affect. We used a nonclinical sample recruited from the internet. These results support cognitive-behavioral models suggesting that religious uncertainty is important to scrupulosity. Implications of these results for future research are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Some illustrative examples of model uncertainty

    International Nuclear Information System (INIS)

    Bier, V.M.

    1994-01-01

    In this paper, we first discuss the view of model uncertainty proposed by Apostolakis. We then present several illustrative examples related to model uncertainty, some of which are not well handled by this formalism. Thus, Apostolakis' approach seems to be well suited to describing some types of model uncertainty, but not all. Since a comprehensive approach for characterizing and quantifying model uncertainty is not yet available, it is hoped that the examples presented here will serve as a springboard for further discussion

  11. Uncertainty shocks in a model of effective demand

    OpenAIRE

    Bundick, Brent; Basu, Susanto

    2014-01-01

    Can increased uncertainty about the future cause a contraction in output and its components? An identified uncertainty shock in the data causes significant declines in output, consumption, investment, and hours worked. Standard general-equilibrium models with flexible prices cannot reproduce this comovement. However, uncertainty shocks can easily generate comovement with countercyclical markups through sticky prices. Monetary policy plays a key role in offsetting the negative impact of uncert...

  12. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing innovation survey data for France, Germany and the UK, we conduct a ‘large-scale’ replication using the Bayesian averaging approach of classical estimators. Our method tests a wide range of determinants of innovation suggested in the prior literature, and establishes a robust set of findings on the variables which shape the introduction of new to the firm and new to the world innovations. We provide some implications for innovation research, and explore the potential application of our approach to other domains of research in strategic management.
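
    The averaging approach can be sketched with BIC-weighted least squares over all subsets of candidate regressors, a common classical approximation to Bayesian model averaging (synthetic data; the real study uses innovation survey variables):

        import itertools
        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic stand-in for survey data: y = outcome, X = candidate determinants
        n, X = 200, rng.normal(size=(200, 4))
        y = 1.0 + 0.8 * X[:, 0] - 0.5 * X[:, 2] + rng.normal(0, 1, 200)

        def bic_of_subset(cols):
            Z = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
            beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
            rss = np.sum((y - Z @ beta) ** 2)
            return n * np.log(rss / n) + Z.shape[1] * np.log(n)

        subsets = [c for r in range(5) for c in itertools.combinations(range(4), r)]
        bics = np.array([bic_of_subset(c) for c in subsets])
        w = np.exp(-0.5 * (bics - bics.min()))
        w /= w.sum()   # approximate posterior model probabilities

        # Posterior inclusion probability of each candidate determinant
        incl = [sum(wi for wi, c in zip(w, subsets) if j in c) for j in range(4)]
        print("inclusion probabilities:", np.round(incl, 2))

    Variables with high inclusion probability across the model space are the "robust" determinants in the sense used by the abstract.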

  13. Uncertainty and its propagation in dynamics models

    International Nuclear Information System (INIS)

    Devooght, J.

    1994-01-01

    The purpose of this paper is to bring together some characteristics due to uncertainty when we deal with dynamic models, and therefore with the propagation of uncertainty. The respective roles of uncertainty and inaccuracy are examined. A mathematical formalism based on the Chapman-Kolmogorov equation allows us to define a "subdynamics" where the evolution equation takes the uncertainty into account. The problem of choosing or combining models is examined through a loss function associated with a decision

  14. Impact of geological model uncertainty on integrated catchment hydrological modeling

    Science.gov (United States)

    He, Xin; Jørgensen, Flemming; Refsgaard, Jens Christian

    2014-05-01

    Various types of uncertainty can influence hydrological model performance. Among them, uncertainty originated from the geological model may play an important role in process-based integrated hydrological modeling, if the model is used outside the calibration base. In the present study, we try to assess the hydrological model predictive uncertainty caused by uncertainty of the geology using an ensemble of geological models with equal plausibility. The study is carried out in the 101 km2 Norsminde catchment in western Denmark. Geostatistical software TProGS is used to generate 20 stochastic geological realizations for the west side of the study area. This process is done while incorporating the borehole log data from 108 wells and high resolution airborne transient electromagnetic (AEM) data for conditioning. As a result, 10 geological models are generated based solely on borehole data, and another 10 geological models are based on both borehole and AEM data. Distributed surface water - groundwater models are developed using MIKE SHE code for each of the 20 geological models. The models are then calibrated using field data collected from stream discharge and groundwater head observations. The model simulation results are evaluated based on the same two types of field data. The results show that the differences between simulated discharge flows caused by using different geological models are relatively small. The model calibration is shown to be able to account for the systematic bias in different geological realizations and hence varies the calibrated model parameters. This results in an increase in the variance between the hydrological realizations compared to the uncalibrated models that use the same parameter values in all 20 models. Furthermore, borehole based hydrological models in general show more variance between simulations than the AEM based models; however, the combined total uncertainty, bias plus variance, is not necessarily higher.

  15. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...

  16. Uncertainty propagation within the UNEDF models

    Science.gov (United States)

    Haverinen, T.; Kortelainen, M.

    2017-04-01

    The parameters of the nuclear energy density functional have to be adjusted to experimental data. As a result, they carry a certain uncertainty which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radius for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of the UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
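
    Propagating fit uncertainties to an observable is, to first order, the sandwich rule var(O) = J C J^T, with J the vector of parameter sensitivities and C the parameter covariance from the fit. A minimal numpy sketch with hypothetical numbers:

        import numpy as np

        # Linearized error propagation: var(O) = J C J^T, where J holds the
        # sensitivities dO/dp of one observable with respect to the model
        # parameters and C is the parameter covariance (all values invented).
        J = np.array([0.8, -1.2, 0.3])
        C = np.array([[0.04,  0.01,  0.00],
                      [0.01,  0.09, -0.02],
                      [0.00, -0.02,  0.01]])

        var_obs = J @ C @ J
        print(f"propagated statistical uncertainty: {np.sqrt(var_obs):.3f}")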

  17. Bioprocess optimization under uncertainty using ensemble modeling

    OpenAIRE

    Liu, Yang; Gunawan, Rudiyanto

    2017-01-01

    The performance of model-based bioprocess optimizations depends on the accuracy of the mathematical model. However, models of bioprocesses often have large uncertainty due to the lack of model identifiability. In the presence of such uncertainty, process optimizations that rely on the predictions of a single “best fit” model, e.g. the model resulting from a maximum likelihood parameter estimation using the available process data, may perform poorly in real life. In this study, we employed ens...

  18. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating ... -based graphs which function as risk-related decision support for the appraised transport infrastructure project.

  19. Study on Uncertainty and Contextual Modelling

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Roč. 1, č. 1 (2007), s. 12-15 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords: Knowledge * contextual modelling * temporal modelling * uncertainty * knowledge management. Subject RIV: BD - Theory of Information

  20. Uncertainties

    Indian Academy of Sciences (India)

    The imperfect understanding of some of the processes and physics in the carbon cycle and chemistry models generates uncertainties in the conversion of emissions to concentration. To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the ...

  1. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating ... -based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk.

  2. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    2016-08-26

    Keywords: Uncertainty; polynomial chaos expansion; fuzzy set theory; cumulative distribution function; uniform distribution; membership function. Abstract: The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often, it is seen that ...

  3. Model uncertainties in top-quark physics

    CERN Document Server

    Seidel, Markus

    2014-01-01

    The ATLAS and CMS collaborations at the Large Hadron Collider (LHC) are studying the top quark in pp collisions at 7 and 8 TeV. Due to the large integrated luminosity, precision measurements of production cross-sections and properties are often limited by systematic uncertainties. An overview of the modeling uncertainties for simulated events is given in this report.

  4. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...

  5. Assessing Groundwater Model Uncertainty for the Central Nevada Test Area

    International Nuclear Information System (INIS)

    Pohll, Greg; Pohlmann, Karl; Hassan, Ahmed; Chapman, Jenny; Mihevc, Todd

    2002-01-01

    The purpose of this study is to quantify the flow and transport model uncertainty for the Central Nevada Test Area (CNTA). Six parameters were identified as uncertain, including the specified head boundary conditions used in the flow model, the spatial distribution of the underlying welded tuff unit, effective porosity, sorption coefficients, matrix diffusion coefficient, and the geochemical release function which describes nuclear glass dissolution. The parameter uncertainty was described by assigning prior statistical distributions for each of these parameters. Standard Monte Carlo techniques were used to sample from the parameter distributions to determine the full prediction uncertainty. Additional analysis is performed to determine the most cost-beneficial characterization activities. The maximum radius of the tritium and strontium-90 contaminant boundary was used as the output metric for evaluation of prediction uncertainty. The results indicate that combining all of the uncertainty in the parameters listed above propagates to a prediction uncertainty in the maximum radius of the contaminant boundary of 234 to 308 m and 234 to 302 m, for tritium and strontium-90, respectively. Although the uncertainty in the input parameters is large, the prediction uncertainty in the contaminant boundary is relatively small. The relatively small prediction uncertainty is primarily due to the small transport velocities, such that large changes in the uncertain input parameters cause small changes in the contaminant boundary. This suggests that the model is suitable in terms of predictive capability for the contaminant boundary delineation

  6. Declarative representation of uncertainty in mathematical models.

    Science.gov (United States)

    Miller, Andrew K; Britten, Randall D; Nielsen, Poul M F

    2012-01-01

    An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML have provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.

  7. Declarative representation of uncertainty in mathematical models.

    Directory of Open Access Journals (Sweden)

    Andrew K Miller

    Full Text Available An important aspect of multi-scale modelling is the ability to represent mathematical models in forms that can be exchanged between modellers and tools. While the development of languages like CellML and SBML have provided standardised declarative exchange formats for mathematical models, independent of the algorithm to be applied to the model, to date these standards have not provided a clear mechanism for describing parameter uncertainty. Parameter uncertainty is an inherent feature of many real systems. This uncertainty can result from a number of situations, such as: when measurements include inherent error; when parameters have unknown values and so are replaced by a probability distribution by the modeller; when a model is of an individual from a population, and parameters have unknown values for the individual, but the distribution for the population is known. We present and demonstrate an approach by which uncertainty can be described declaratively in CellML models, by utilising the extension mechanisms provided in CellML. Parameter uncertainty can be described declaratively in terms of either a univariate continuous probability density function or multiple realisations of one variable or several (typically non-independent) variables. We additionally present an extension to SED-ML (the Simulation Experiment Description Markup Language) to describe sampling sensitivity analysis simulation experiments. We demonstrate the usability of the approach by encoding a sample model in the uncertainty markup language, and by developing a software implementation of the uncertainty specification (including the SED-ML extension for sampling sensitivity analyses) in an existing CellML software library, the CellML API implementation. We used the software implementation to run sampling sensitivity analyses over the model to demonstrate that it is possible to run useful simulations on models with uncertainty encoded in this form.

  8. Uncertainty and error in complex plasma chemistry models

    Science.gov (United States)

    Turner, Miles M.

    2015-06-01

    Chemistry models that include dozens of species and hundreds to thousands of reactions are common in low-temperature plasma physics. The rate constants used in such models are uncertain, because they are obtained from some combination of experiments and approximate theories. Since the predictions of these models are a function of the rate constants, these predictions must also be uncertain. However, systematic investigations of the influence of uncertain rate constants on model predictions are rare to non-existent. In this work we examine a particular chemistry model, for helium-oxygen plasmas. This chemistry is of topical interest because of its relevance to biomedical applications of atmospheric pressure plasmas. We trace the primary sources for every rate constant in the model, and hence associate an error bar (or equivalently, an uncertainty) with each. We then use a Monte Carlo procedure to quantify the uncertainty in predicted plasma species densities caused by the uncertainty in the rate constants. Under the conditions investigated, the range of uncertainty in most species densities is a factor of two to five. However, the uncertainty can vary strongly for different species, over time, and with other plasma conditions. There are extreme (pathological) cases where the uncertainty is more than a factor of ten. One should therefore be cautious in drawing any conclusion from plasma chemistry modelling, without first ensuring that the conclusion in question survives an examination of the related uncertainty.
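
    The Monte Carlo procedure described here can be miniaturized to a two-reaction scheme: sample each rate constant from a log-normal defined by its uncertainty factor and inspect the spread of a predicted density. A sketch, with the uncertainty factor interpreted as a 2-sigma multiplicative bound (an assumption) and all rates invented:

        import numpy as np

        rng = np.random.default_rng(7)

        # Toy steady-state balance: species density n = k1 * S / k2, standing in
        # for a full He-O2 chemistry solve. Rate constants carry multiplicative
        # error bars (uncertainty factor f => log-normal with sigma = ln(f)/2).
        k1_nom, f1 = 1e-16, 2.0   # production rate constant, factor-of-2 uncertainty
        k2_nom, f2 = 1e-13, 3.0   # loss rate constant, factor-of-3 uncertainty
        S = 1e14                  # fixed source-species density (hypothetical)

        n_mc = 100_000
        k1 = k1_nom * np.exp(rng.normal(0, np.log(f1) / 2, n_mc))
        k2 = k2_nom * np.exp(rng.normal(0, np.log(f2) / 2, n_mc))
        density = k1 * S / k2

        lo, hi = np.percentile(density, [2.5, 97.5])
        print(f"predicted density spans a factor of {hi / lo:.1f} at 95% confidence")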

  9. Methodology for the treatment of model uncertainty

    Science.gov (United States)

    Droguett, Enrique Lopez

    The development of a conceptual, unified framework and methodology for treating model and parameter uncertainties is the subject of this work. First, a discussion on the philosophical grounds of notions such as reality, modeling, models, and their relation is presented, and on this basis the modeling process is characterized. The concept of uncertainty is then investigated, addressing controversial topics such as the types and sources of uncertainty; it is argued that uncertainty is fundamentally a characterization of lack of knowledge and that, as such, all uncertainties are of the same type. A discussion about the roles of model structure and model parameters is presented, in which it is argued that the distinction between them is one of convenience and a function of the stage in the modeling process. From the foregoing discussion, a Bayesian framework for an integrated assessment of model and parameter uncertainties is developed. The methodology has as its central point the treatment of the model as a source of information regarding the unknown of interest. It allows for the assessment of the model characteristics affecting its performance, such as bias and precision. It also permits the assessment of possible dependencies among multiple models. Furthermore, the proposed framework makes possible the use of not only information from models (e.g., point estimates, qualitative assessments), but also evidence about the models themselves (performance data, confidence in the model, applicability of the model). The methodology is then applied in the context of fire risk models, where several examples with real data are studied. These examples demonstrate how the framework and specific techniques developed in this study can address cases involving multiple models, the use of performance data to update the predictive capabilities of a model, and the case where a model is applied in a context other than the one for which it was designed.

  10. Parameter and Uncertainty Estimation in Groundwater Modelling

    DEFF Research Database (Denmark)

    Jensen, Jacob Birk

    The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly decisions and if these are to be made on solid grounds, the uncertainty attached to model results must be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration ... was applied. Capture zone modelling was conducted on a synthetic stationary 3-dimensional flow problem involving river, surface and groundwater flow. Simulated capture zones were illustrated as likelihood maps and compared with deterministic capture zones derived from a reference model. The results showed

  11. Empirical Bayesian inference and model uncertainty

    International Nuclear Information System (INIS)

    Poern, K.

    1994-01-01

    This paper presents a hierarchical or multistage empirical Bayesian approach for the estimation of uncertainty concerning the intensity of a homogeneous Poisson process. A class of contaminated gamma distributions is considered to describe the uncertainty concerning the intensity. These distributions in turn are defined through a set of secondary parameters, the knowledge of which is also described and updated via Bayes formula. This two-stage Bayesian approach is an example where the modeling uncertainty is treated in a comprehensive way. Each contaminated gamma distribution, represented by a point in the 3D space of secondary parameters, can be considered as a specific model of the uncertainty about the Poisson intensity. Then, by the empirical Bayesian method, each individual model is assigned a posterior probability

  12. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
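
    The propagation scheme, sampling each sub-model's empirical residual distribution in sequence, can be sketched as a residual bootstrap through a toy model chain (all functions and residual samples below are invented stand-ins for the study's models and validation data):

        import numpy as np

        rng = np.random.default_rng(11)

        # Empirical residuals for each stage, as would come from validation data
        resid_poa  = rng.normal(0, 15, 500)    # W/m^2
        resid_temp = rng.normal(0, 1.5, 500)   # deg C
        resid_pac  = rng.normal(0, 2.0, 500)   # kW

        def poa(ghi):  return 1.1 * ghi                          # plane-of-array irradiance
        def tcell(p):  return 25 + 0.03 * p                      # cell temperature
        def pac(p, t): return 0.2 * p * (1 - 0.004 * (t - 25))   # AC power, kW

        # Propagate by resampling each stage's residuals (bootstrap)
        n = 20_000
        ghi = 800.0                                  # measured input, W/m^2
        p = poa(ghi) + rng.choice(resid_poa, n)
        t = tcell(p) + rng.choice(resid_temp, n)
        out = pac(p, t) + rng.choice(resid_pac, n)

        print(f"AC power: {out.mean():.1f} kW +/- {out.std():.1f} kW "
              f"({100 * out.std() / out.mean():.1f}%)")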

  13. Model Uncertainty Quantification Methods In Data Assimilation

    Science.gov (United States)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exists many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.

  14. Comment on ‘Empirical versus modelling approaches to the estimation of measurement uncertainty caused by primary sampling’ by J. A. Lyn, M. H. Ramsey, A. P. Damant and R. Wood

    NARCIS (Netherlands)

    Geelhoed, B.

    2009-01-01

    Recently, Lyn et al. (Analyst, 2007, 132, 1231) compared two ways of estimating the standard uncertainty of sampling pistachio nuts for aflatoxins – a modelling method and an empirical method. Their case study used robust analysis of variance (RANOVA) to derive the uncertainty estimates,

  15. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST), a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  16. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for

  17. Modeling Uncertainty in Climate Change: A Multi-Model Comparison

    Energy Technology Data Exchange (ETDEWEB)

    Gillingham, Kenneth; Nordhaus, William; Anthoff, David; Blanford, Geoffrey J.; Bosetti, Valentina; Christensen, Peter; McJeon, Haewon C.; Reilly, J. M.; Sztorc, Paul

    2015-10-01

    The economics of climate change involves a vast array of uncertainties, complicating both the analysis and development of climate policy. This study presents the results of the first comprehensive study of uncertainty in climate change using multiple integrated assessment models. The study looks at model and parametric uncertainties for population, total factor productivity, and climate sensitivity and estimates the pdfs of key output variables, including CO2 concentrations, temperature, damages, and the social cost of carbon (SCC). One key finding is that parametric uncertainty is more important than uncertainty in model structure. Our resulting pdfs also provide insight on tail events.

  18. Paradoxical effects of compulsive perseveration: Sentence repetition causes semantic uncertainty

    NARCIS (Netherlands)

    Giele, Catharina L.; van den Hout, Marcel A.; Engelhard, Iris M.; Dek, Eliane C P

    2014-01-01

    Many patients with obsessive compulsive disorder (OCD) perform perseverative checking behavior to reduce uncertainty, but studies have shown that this ironically increases uncertainty. Some patients also tend to perseveratively repeat sentences. The aim of this study was to examine whether sentence

  19. Spatial Uncertainty Analysis of Ecological Models

    Energy Technology Data Exchange (ETDEWEB)

    Jager, H.I.; Ashwood, T.L.; Jackson, B.L.; King, A.W.

    2000-09-02

    The authors evaluated the sensitivity of a habitat model and a source-sink population model to spatial uncertainty in landscapes with different statistical properties and for hypothetical species with different habitat requirements. Sequential indicator simulation generated alternative landscapes from a source map. Their results showed that spatial uncertainty was highest for landscapes in which suitable habitat was rare and spatially uncorrelated. Although they were able to exert some control over the degree of spatial uncertainty by varying the sampling density drawn from the source map, intrinsic spatial properties (i.e., average frequency and degree of spatial autocorrelation) played a dominant role in determining variation among realized maps. To evaluate the ecological significance of landscape variation, they compared the variation in predictions from a simple habitat model to variation among landscapes for three species types. Spatial uncertainty in predictions of the amount of source habitat depended on both the spatial life history characteristics of the species and the statistical attributes of the synthetic landscapes. Species differences were greatest when the landscape contained a high proportion of suitable habitat. The predicted amount of source habitat was greater for edge-dependent (interior) species in landscapes with spatially uncorrelated (correlated) suitable habitat. A source-sink model demonstrated that, although variation among landscapes resulted in relatively little variation in overall population growth rate, this spatial uncertainty was sufficient in some situations, to produce qualitatively different predictions about population viability (i.e., population decline vs. increase).

  20. An evaluation of uncertainties in radioecological models

    International Nuclear Information System (INIS)

    Hoffmann, F.O.; Little, C.A.; Miller, C.W.; Dunning, D.E. Jr.; Rupp, E.M.; Shor, R.W.; Schaeffer, D.L.; Baes, C.F. III

    1978-01-01

    The paper presents results of analyses for seven selected parameters commonly used in environmental radiological assessment models, assuming that the available data are representative of the true distribution of parameter values and that their respective distributions are lognormal. Estimates of the most probable, median, mean, and 99th percentile for each parameter are given and compared to U.S. NRC default values. The regulatory default values are generally greater than the median values for the selected parameters, but some are associated with percentiles significantly less than the 50th. The largest uncertainties appear to be associated with aquatic bioaccumulation factors for fresh water fish. Approximately one order of magnitude separates median values and values of the 99th percentile. The uncertainty is also estimated for the annual dose rate predicted by a multiplicative chain model for the transport of molecular iodine-131 via the air-pasture-cow-milk-child's thyroid pathway. The value for the 99th percentile is ten times larger than the median value of the predicted dose normalized for a given air concentration of 131I2. About 72% of the uncertainty in this model is contributed by the dose conversion factor and the milk transfer coefficient. Considering the difficulties in obtaining a reliable quantification of the true uncertainties in model predictions, methods for taking these uncertainties into account when determining compliance with regulatory statutes are discussed. (orig./HP) [de
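
    The reported factor of ten between the median and the 99th percentile has a simple closed form under the lognormal assumption; the sketch below (hypothetical, with the median normalized to one) back-solves the implied log-standard-deviation.

        import numpy as np
        from scipy.stats import norm

        # If X is lognormal, p99/median = exp(z_0.99 * sigma), where sigma is
        # the standard deviation of ln(X). A factor-of-ten spread therefore
        # pins down sigma.
        z99 = norm.ppf(0.99)
        sigma = np.log(10.0) / z99
        print(f"sigma of ln(X) = {sigma:.2f}, "
              f"99th percentile / median = {np.exp(z99 * sigma):.1f}")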

  1. Estimating Coastal Digital Elevation Model (DEM) Uncertainty

    Science.gov (United States)

    Amante, C.; Mesick, S.

    2017-12-01

    Integrated bathymetric-topographic digital elevation models (DEMs) are representations of the Earth's solid surface and are fundamental to the modeling of coastal processes, including tsunami, storm surge, and sea-level rise inundation. Deviations in elevation values from the actual seabed or land surface constitute errors in DEMs, which originate from numerous sources, including: (i) the source elevation measurements (e.g., multibeam sonar, lidar), (ii) the interpolative gridding technique (e.g., spline, kriging) used to estimate elevations in areas unconstrained by source measurements, and (iii) the datum transformation used to convert bathymetric and topographic data to common vertical reference systems. The magnitude and spatial distribution of the errors from these sources are typically unknown, and the lack of knowledge regarding these errors represents the vertical uncertainty in the DEM. The National Oceanic and Atmospheric Administration (NOAA) National Centers for Environmental Information (NCEI) has developed DEMs for more than 200 coastal communities. This study presents a methodology developed at NOAA NCEI to derive accompanying uncertainty surfaces that estimate DEM errors at the individual cell-level. The development of high-resolution (1/9th arc-second), integrated bathymetric-topographic DEMs along the southwest coast of Florida serves as the case study for deriving uncertainty surfaces. The estimated uncertainty can then be propagated into the modeling of coastal processes that utilize DEMs. Incorporating the uncertainty produces more reliable modeling results, and in turn, better-informed coastal management decisions.

  2. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1984-01-01

    is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  3. Uncertainty quantification in wind farm flow models

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo

    uncertainties through a model chain are presented and applied to several wind energy related problems such as: annual energy production estimation, wind turbine power curve estimation, wake model calibration and validation, and estimation of lifetime equivalent fatigue loads on a wind turbine. Statistical...

  4. Uncertainty quantification of squeal instability via surrogate modelling

    Science.gov (United States)

    Nobari, Amir; Ouyang, Huajiang; Bannister, Paul

    2015-08-01

    One of the major issues that car manufacturers are facing is the noise and vibration of brake systems. Of the different sorts of noise and vibration which a brake system may generate, squeal, as an irritating high-frequency noise, costs the manufacturers significantly. Despite considerable research that has been conducted on brake squeal, the root cause of squeal is still not fully understood. The most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of the major drawbacks of this technique, nevertheless, is that the effects of variability and uncertainty are not included in the results. Apparently, uncertainty and variability are two inseparable parts of any brake system. Uncertainty is mainly caused by friction, contact, wear and thermal effects while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time. Most uncertainty analysis techniques rely on the results of many deterministic analyses. A full finite element model of a brake system typically consists of millions of degrees-of-freedom and many load cases. Running time of such models is so long that the automotive industry is reluctant to do many deterministic analyses. This paper, instead, proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
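
    A minimal sketch of the surrogate-modelling idea follows; it is not the paper's implementation. A cheap Gaussian-process surrogate is fitted to a handful of stand-in "finite element" outputs (the two input parameters and the response function here are hypothetical) and then sampled in Monte Carlo fashion to estimate the probability of an unstable mode.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, ConstantKernel

        rng = np.random.default_rng(1)

        # Hypothetical training set: 30 expensive "FE" runs mapping (friction
        # coefficient, contact stiffness) to the real part of the least stable
        # eigenvalue (positive = squeal-prone).
        X_train = rng.uniform([0.3, 0.8], [0.6, 1.2], size=(30, 2))
        y_train = (50 * (X_train[:, 0] - 0.45) + 20 * (X_train[:, 1] - 1.0)
                   + rng.normal(0, 0.5, 30))  # stand-in for FE output

        surrogate = GaussianProcessRegressor(ConstantKernel() * RBF(),
                                             normalize_y=True)
        surrogate.fit(X_train, y_train)

        # Monte Carlo on the cheap surrogate instead of the full FE model.
        X_mc = rng.uniform([0.3, 0.8], [0.6, 1.2], size=(100_000, 2))
        print("P(unstable mode) ~", np.mean(surrogate.predict(X_mc) > 0))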

  5. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in the current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling allows researchers to reduce, refine and replace animal experimentation as well as to translate findings obtained in these experiments to the human background. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book aims to address four main issues related to the building and validation of computational models of biomedical processes: Modeling establishment under uncertainty Model selection and parameter fitting Sensitivity analysis and model adaptation Model predictions under uncertainty In each of the abovementioned areas, the book discusses a number of key-techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  6. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...... find that confidence sets are very wide, change significantly with the predictor variables, and frequently include expected utilities for which the investor prefers not to invest. The latter motivates a robust investment strategy maximizing the minimal element of the confidence set. The robust investor...... allocates a much lower share of wealth to stocks compared to a standard investor....

  7. Uncertainty calculation in transport models and forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Prato, Carlo Giacomo

    . Forthcoming: European Journal of Transport and Infrastructure Research, 15-3, 64-72. The last paper examined uncertainty in the spatial composition of residence and workplace locations in the Danish National Transport Model. Despite the evidence that spatial structure influences travel behaviour...... to increase the quality of the decision process and to develop robust or adaptive plans. In fact, project evaluation processes that do not take into account model uncertainty produce not fully informative and potentially misleading results so increasing the risk inherent to the decision to be taken...

  8. Quantifying Registration Uncertainty With Sparse Bayesian Modelling.

    Science.gov (United States)

    Le Folgoc, Loic; Delingette, Herve; Criminisi, Antonio; Ayache, Nicholas

    2017-02-01

    We investigate uncertainty quantification under a sparse Bayesian model of medical image registration. Bayesian modelling has proven powerful to automate the tuning of registration hyperparameters, such as the trade-off between the data and regularization functionals. Sparsity-inducing priors have recently been used to render the parametrization itself adaptive and data-driven. The sparse prior on transformation parameters effectively favors the use of coarse basis functions to capture the global trends in the visible motion while finer, highly localized bases are introduced only in the presence of coherent image information and motion. In earlier work, approximate inference under the sparse Bayesian model was tackled in an efficient Variational Bayes (VB) framework. In this paper we are interested in the theoretical and empirical quality of uncertainty estimates derived under this approximate scheme vs. under the exact model. We implement an (asymptotically) exact inference scheme based on reversible jump Markov Chain Monte Carlo (MCMC) sampling to characterize the posterior distribution of the transformation and compare the predictions of the VB and MCMC based methods. The true posterior distribution under the sparse Bayesian model is found to be meaningful: orders of magnitude for the estimated uncertainty are quantitatively reasonable, the uncertainty is higher in textureless regions and lower in the direction of strong intensity gradients.
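
    As a toy illustration of the (asymptotically) exact sampling idea, the sketch below runs a random-walk Metropolis chain on a hypothetical one-dimensional, non-Gaussian log-posterior; the registration model itself is not reproduced, and the chain's spread stands in for the MCMC-based uncertainty estimate compared against a variational (Gaussian) fit.

        import numpy as np

        rng = np.random.default_rng(0)

        def log_post(theta):
            # Hypothetical unnormalized log-posterior of a transformation
            # parameter; deliberately non-Gaussian (quartic term).
            return -0.5 * theta ** 2 - 0.1 * theta ** 4

        # Random-walk Metropolis: accept a proposal with probability
        # min(1, posterior ratio); the resulting samples characterize the
        # posterior distribution.
        theta, samples = 0.0, []
        for _ in range(50_000):
            proposal = theta + rng.normal(0.0, 0.8)
            if np.log(rng.uniform()) < log_post(proposal) - log_post(theta):
                theta = proposal
            samples.append(theta)

        print("MCMC posterior std:", np.std(samples[5_000:]))  # drop burn-in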

  9. Bioprocess optimization under uncertainty using ensemble modeling.

    Science.gov (United States)

    Liu, Yang; Gunawan, Rudiyanto

    2017-02-20

    The performance of model-based bioprocess optimizations depends on the accuracy of the mathematical model. However, models of bioprocesses often have large uncertainty due to the lack of model identifiability. In the presence of such uncertainty, process optimizations that rely on the predictions of a single "best fit" model, e.g. the model resulting from a maximum likelihood parameter estimation using the available process data, may perform poorly in real life. In this study, we employed ensemble modeling to account for model uncertainty in bioprocess optimization. More specifically, we adopted a Bayesian approach to define the posterior distribution of the model parameters, based on which we generated an ensemble of model parameters using a uniformly distributed sampling of the parameter confidence region. The ensemble-based process optimization involved maximizing the lower confidence bound of the desired bioprocess objective (e.g. yield or product titer), using a mean-standard deviation utility function. We demonstrated the performance and robustness of the proposed strategy in an application to a monoclonal antibody batch production by mammalian hybridoma cell culture. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
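
    The following sketch illustrates the ensemble-based, lower-confidence-bound optimization under stated assumptions: the two-parameter "bioprocess" model, the parameter ensemble and the mean-minus-k-standard-deviations utility are hypothetical stand-ins, not the authors' hybridoma model.

        import numpy as np
        from scipy.optimize import minimize_scalar

        rng = np.random.default_rng(3)

        # Hypothetical ensemble of parameter sets sampled from the posterior
        # confidence region (here: toy growth and decay rates).
        ensemble = rng.normal([0.05, 0.01], [0.01, 0.003], size=(200, 2))

        def titer(feed_rate, params):
            # Toy stand-in for the bioprocess model output (product titer).
            mu, kd = params
            return feed_rate * mu / (kd + 0.002 * feed_rate ** 2)

        def neg_lcb(feed_rate, k=1.0):
            # Mean - k*std utility over the ensemble: a robust objective that
            # maximizes the lower confidence bound of the predicted titer.
            preds = np.array([titer(feed_rate, p) for p in ensemble])
            return -(preds.mean() - k * preds.std())

        res = minimize_scalar(neg_lcb, bounds=(0.1, 10.0), method="bounded")
        print("robust feed rate:", res.x)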

  10. Uncertainties

    Indian Academy of Sciences (India)

    To reflect this uncertainty in the climate scenarios, the use of AOGCMs that explicitly simulate the carbon cycle and chemistry of all the substances is needed. The Hadley Centre has developed a version of the climate model that allows the effect of climate change on the carbon cycle and its feedback into climate, to be ...

  11. Evidential Model Validation under Epistemic Uncertainty

    Directory of Open Access Journals (Sweden)

    Wei Deng

    2018-01-01

    Full Text Available This paper proposes evidence theory based methods to both quantify epistemic uncertainty and validate a computational model. Three types of epistemic uncertainty concerning input model data, that is, sparse points, intervals, and probability distributions with uncertain parameters, are considered. Through the proposed methods, the given data will be described as corresponding probability distributions for uncertainty propagation in the computational model, and thus for the model validation. The proposed evidential model validation method is inspired by the idea of Bayesian hypothesis testing and the Bayes factor, which compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and help the decision making of model acceptance. Building on the idea of the Bayes factor, the frame of discernment of Dempster-Shafer evidence theory is constituted and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the most evidence-supported hypothesis about the model testing will be favored by the BPA. The validity of the proposed methods is illustrated through a numerical example.

  12. Uncertainty in reactive transport geochemical modelling

    International Nuclear Information System (INIS)

    Oedegaard-Jensen, A.; Ekberg, C.

    2005-01-01

    Full text of publication follows: Geochemical modelling is one way of predicting the transport of, e.g., radionuclides in a rock formation. In a rock formation there will be fractures in which water and dissolved species can be transported. The composition of the water and the rock can either increase or decrease the mobility of the transported entities. When doing simulations on the mobility or transport of different species one has to know the exact water composition, the exact flow rates in the fracture and in the surrounding rock, the porosity and which minerals the rock is composed of. The problem with simulations on rocks is that the rock itself is not uniform, i.e. larger fractures in some areas and smaller in other areas, which can give different water flows. The rock composition can be different in different areas. In addition to this variance in the rock there are also problems with measuring the physical parameters used in a simulation. All measurements will perturb the rock and this perturbation will result in more or less correct values of the interesting parameters. The analytical methods used are also encumbered with uncertainties which in this case are added to the uncertainty from the perturbation of the analysed parameters. When doing simulations the effect of the uncertainties must be taken into account. As computers are getting faster and faster the complexity of simulated systems is increased, which also increases the uncertainty in the results from the simulations. In this paper we will show how the uncertainty in the different parameters will affect the solubility and mobility of different species. Small uncertainties in the input parameters can result in large uncertainties in the end. (authors)

  13. Accept & Reject Statement-Based Uncertainty Models

    NARCIS (Netherlands)

    E. Quaeghebeur (Erik); G. de Cooman; F. Hermans (Felienne)

    2015-01-01

    We develop a framework for modelling and reasoning with uncertainty based on accept and reject statements about gambles. It generalises the frameworks found in the literature based on statements of acceptability, desirability, or favourability and clarifies their relative position. Next

  14. Parametric uncertainty in optical image modeling

    Science.gov (United States)

    Potzick, James; Marx, Egon; Davidson, Mark

    2006-10-01

    Optical photomask feature metrology and wafer exposure process simulation both rely on optical image modeling for accurate results. While it is fair to question the accuracies of the available models, model results also depend on several input parameters describing the object and imaging system. Errors in these parameter values can lead to significant errors in the modeled image. These parameters include wavelength, illumination and objective NA's, magnification, focus, etc. for the optical system, and topography, complex index of refraction n and k, etc. for the object. In this paper each input parameter is varied over a range about its nominal value and the corresponding images simulated. Second order parameter interactions are not explored. Using the scenario of the optical measurement of photomask features, these parametric sensitivities are quantified by calculating the apparent change of the measured linewidth for a small change in the relevant parameter. Then, using reasonable values for the estimated uncertainties of these parameters, the parametric linewidth uncertainties can be calculated and combined to give a lower limit to the linewidth measurement uncertainty for those parameter uncertainties.
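
    A one-at-a-time sensitivity loop of the kind described can be sketched as follows; the linearized "image model", the nominal values and the 1-sigma parameter uncertainties are hypothetical, and the parametric contributions are combined in quadrature since second-order interactions are not explored.

        import numpy as np

        def modeled_linewidth(params):
            # Hypothetical stand-in for the optical image model: apparent
            # linewidth (nm) as a function of imaging parameters.
            wl, na, focus = params["wavelength"], params["na"], params["focus"]
            return 100.0 + 0.2 * (wl - 365.0) - 40.0 * (na - 0.6) + 5.0 * focus

        nominal = {"wavelength": 365.0, "na": 0.6, "focus": 0.0}
        param_sigma = {"wavelength": 0.5, "na": 0.005, "focus": 0.1}  # 1-sigma

        # Vary one parameter at a time about its nominal value to estimate the
        # sensitivity dL/dp, then combine the contributions in quadrature.
        total_var = 0.0
        for name, sigma in param_sigma.items():
            hi = dict(nominal); hi[name] += sigma
            lo = dict(nominal); lo[name] -= sigma
            dL = (modeled_linewidth(hi) - modeled_linewidth(lo)) / 2.0
            total_var += dL ** 2
        print(f"parametric linewidth uncertainty: {np.sqrt(total_var):.3f} nm")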

  15. Modeling transport phenomena and uncertainty quantification in solidification processes

    Science.gov (United States)

    Fezi, Kyle S.

    Direct chill (DC) casting is the primary processing route for wrought aluminum alloys. This semicontinuous process consists of primary cooling as the metal is pulled through a water cooled mold followed by secondary cooling with a water jet spray and free falling water. To gain insight into this complex solidification process, a fully transient model of DC casting was developed to predict the transport phenomena of aluminum alloys for various conditions. This model is capable of solving mixture mass, momentum, energy, and species conservation equations during multicomponent solidification. Various DC casting process parameters were examined for their effect on transport phenomena predictions in an alloy of commercial interest (aluminum alloy 7050). The practice of placing a wiper to divert cooling water from the ingot surface was studied and the results showed that placement closer to the mold causes remelting at the surface and increases susceptibility to bleed outs. Numerical models of metal alloy solidification, like the one previously mentioned, are used to gain insight into physical phenomena that cannot be observed experimentally. However, uncertainty in model inputs cause uncertainty in results and those insights. The analysis of model assumptions and probable input variability on the level of uncertainty in model predictions has not been calculated in solidification modeling as yet. As a step towards understanding the effect of uncertain inputs on solidification modeling, uncertainty quantification (UQ) and sensitivity analysis were first performed on a transient solidification model of a simple binary alloy (Al-4.5wt.%Cu) in a rectangular cavity with both columnar and equiaxed solid growth models. This analysis was followed by quantifying the uncertainty in predictions from the recently developed transient DC casting model. The PRISM Uncertainty Quantification (PUQ) framework quantified the uncertainty and sensitivity in macrosegregation, solidification

  16. Optical Model and Cross Section Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.

    2009-10-05

    Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV, and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.

  17. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  18. Representing uncertainty on model analysis plots

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2016-09-01

    Full Text Available Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao’s original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.
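
    A minimal plotting sketch of the idea, assuming hypothetical class-level probabilities and uncertainty estimates (the axis conventions follow the description above, not Bao's original figures):

        import matplotlib.pyplot as plt

        # Hypothetical model-plot point for one class: probability of answers
        # consistent with the correct model vs. with the documented incorrect
        # model, with error bars from the estimated measurement uncertainty.
        x, y = 0.55, 0.35
        xerr, yerr = 0.06, 0.05

        fig, ax = plt.subplots()
        ax.errorbar(x, y, xerr=xerr, yerr=yerr, fmt="o", capsize=4)
        ax.set_xlim(0, 1); ax.set_ylim(0, 1)
        ax.set_xlabel("P(correct model)")
        ax.set_ylabel("P(documented incorrect model)")
        plt.show()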

  19. Uncertainty Quantification in Geomagnetic Field Modeling

    Science.gov (United States)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  20. Incorporating model parameter uncertainty into inverse treatment planning

    International Nuclear Information System (INIS)

    Lian Jun; Xing Lei

    2004-01-01

    Radiobiological treatment planning depends not only on the accuracy of the models describing the dose-response relation of different tumors and normal tissues but also on the accuracy of tissue specific radiobiological parameters in these models. Whereas the general formalism remains the same, different sets of model parameters lead to different solutions and thus critically determine the final plan. Here we describe an inverse planning formalism with inclusion of model parameter uncertainties. This is made possible by using a statistical analysis-based frameset developed by our group. In this formalism, the uncertainties of model parameters, such as the parameter a that describes the tissue-specific effect in the equivalent uniform dose (EUD) model, are expressed by probability density functions and are included in the dose optimization process. We found that the final solution strongly depends on the distribution functions of the model parameters. Considering that currently available models for computing biological effects of radiation are simplistic, and the clinical data used to derive the models are sparse and of questionable quality, the proposed technique provides us with an effective tool to minimize the effect caused by the uncertainties in a statistical sense. With the incorporation of the uncertainties, the technique has potential for us to maximally utilize the available radiobiology knowledge for better IMRT treatment
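
    As a sketch of how a parameter probability density can be carried through the EUD formula EUD = (mean_i d_i^a)^(1/a), the snippet below samples a hypothetical distribution for the parameter a and a made-up voxel dose array; it illustrates the statistical idea only, not the authors' inverse-planning code.

        import numpy as np

        rng = np.random.default_rng(7)

        def eud(doses, a):
            # Equivalent uniform dose: EUD = (mean(d_i^a))^(1/a).
            return np.mean(doses ** a) ** (1.0 / a)

        doses = np.array([60.0, 58.0, 45.0, 30.0])  # hypothetical doses (Gy)

        # Represent the tissue parameter a by a probability density rather
        # than a point value, and propagate it through the EUD calculation.
        a_samples = rng.normal(-10.0, 2.0, size=10_000)  # hypothetical tumor a
        euds = np.array([eud(doses, a) for a in a_samples])
        print(f"EUD: median {np.median(euds):.1f} Gy, "
              f"5-95%: {np.percentile(euds, 5):.1f}"
              f"-{np.percentile(euds, 95):.1f} Gy")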

  1. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical...

  2. Impact of inherent meteorology uncertainty on air quality model predictions

    Science.gov (United States)

    Gilliam, Robert C.; Hogrefe, Christian; Godowitch, James M.; Napelenok, Sergey; Mathur, Rohit; Rao, S. Trivikrama

    2015-12-01

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is important to understand how uncertainties in these inputs affect the simulated concentrations. Ensembles are one method to explore how uncertainty in meteorology affects air pollution concentrations. Most studies explore this uncertainty by running different meteorological models or the same model with different physics options and in some cases combinations of different meteorological and air quality models. While these have been shown to be useful techniques in some cases, we present a technique that leverages the initial condition perturbations of a weather forecast ensemble, namely, the Short-Range Ensemble Forecast system to drive the four-dimensional data assimilation in the Weather Research and Forecasting (WRF)-Community Multiscale Air Quality (CMAQ) model with a key focus being the response of ozone chemistry and transport. Results confirm that a sizable spread in WRF solutions, including common weather variables of temperature, wind, boundary layer depth, clouds, and radiation, can cause a relatively large range of ozone-mixing ratios. Pollutant transport can be altered by hundreds of kilometers over several days. Ozone-mixing ratios of the ensemble can vary as much as 10-20 ppb or 20-30% in areas that typically have higher pollution levels.

  3. Economic policy uncertainty index and economic activity: what causes what?

    Directory of Open Access Journals (Sweden)

    Ivana Lolić

    2017-01-01

    Full Text Available This paper is a follow-up on the Economic Policy Uncertainty (EPU) index, developed in 2011 by Baker, Bloom, and Davis. The principal idea of the EPU index is to quantify the level of uncertainty in an economic system, based on three separate pillars: news media, the number of federal tax code provisions expiring in the following years, and disagreement amongst professional forecasters on future tendencies of relevant macroeconomic variables. Although the original EPU index was designed and published for the US economy, it instantly caught the attention of numerous academics and was rapidly introduced in 15 countries worldwide. Extensive academic debate has been triggered on the importance of economic uncertainty relating to the intensity and persistence of the recent crisis. Despite the intensive (mostly politically motivated) debate, formal scientific confirmation of causality running from the EPU index to economic activity has not followed. Moreover, the empirical literature has completely failed to conduct formal econometric testing of the Granger causality between the two mentioned phenomena. This paper provides an estimation of the Toda-Yamamoto causality test between the EPU index and economic activity in the USA and several European countries. The results do not provide a general conclusion: causality seems to run in both directions only for the USA, while only in one direction for France and Germany. Having taken into account the Great Recession of 2008, the main result does not change, therefore casting doubt on the index methodology and overall media bias.
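
    A compact sketch of the Toda-Yamamoto procedure is given below, under stated assumptions: a bivariate case estimated equation-by-equation with OLS, p lags chosen beforehand, d_max extra lags added for the maximum integration order, and a Wald test restricted to the first p lags of the candidate cause. The EPU and activity series are synthetic stand-ins, not the paper's data.

        import numpy as np
        import statsmodels.api as sm

        def toda_yamamoto_wald(y, x, p, d_max):
            # Wald test that lags 1..p of x do not Granger-cause y, estimated
            # in levels with k = p + d_max lags (Toda-Yamamoto augmentation).
            k, n = p + d_max, len(y)
            X = np.column_stack(
                [np.ones(n - k)]
                + [y[k - i:n - i] for i in range(1, k + 1)]   # lags of y
                + [x[k - i:n - i] for i in range(1, k + 1)])  # lags of x
            res = sm.OLS(y[k:], X).fit()
            R = np.zeros((p, X.shape[1]))
            for j in range(p):
                R[j, 1 + k + j] = 1.0  # restrict only the first p lags of x
            return res.wald_test(R, use_f=False)

        rng = np.random.default_rng(0)
        epu = rng.normal(size=300).cumsum()  # synthetic EPU index (I(1))
        act = (0.3 * np.concatenate(([0.0], epu[:-1]))
               + rng.normal(size=300).cumsum())  # activity driven by lagged EPU
        print(toda_yamamoto_wald(act, epu, p=2, d_max=1))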

  4. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A
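
    A stripped-down sketch of such a probabilistic PA run follows; the input distributions, the one-line "dose model" and the performance objective are all notional, in the spirit of the fictitious example described, not GoldSim output.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 10_000

        # Hypothetical input distributions for a generic LLW performance
        # assessment: inventory (Ci), sorption coefficient Kd (mL/g), and
        # infiltration rate (m/yr).
        inventory = rng.lognormal(np.log(100.0), 0.5, n)
        kd = rng.lognormal(np.log(10.0), 1.0, n)
        infiltration = rng.triangular(0.001, 0.005, 0.02, n)

        # Toy dose model: each realization is one possible combination of
        # input parameter values (Monte Carlo propagation).
        dose = 0.05 * inventory * infiltration / (1.0 + kd)  # mrem/yr, notional

        objective = 25.0  # notional regulatory performance objective, mrem/yr
        print(f"mean dose {dose.mean():.3f} mrem/yr, "
              f"95th pct {np.percentile(dose, 95):.3f}, "
              f"P(dose > objective) = {np.mean(dose > objective):.4f}")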

  5. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  6. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    gates the material uncertainties on column design and proposes an uncertainty model for critical ... ances the accuracy of the structural models by using experimental results and design codes. (Baalbaki et al ..... Elishakoff I 1999 Whys and hows in uncertainty modeling, probability, fuzziness and anti-optimization. New York: ...

  7. Model uncertainty from a regulatory point of view

    International Nuclear Information System (INIS)

    Abramson, L.R.

    1994-01-01

    This paper discusses model uncertainty in the larger context of knowledge and random uncertainty. It explores some regulatory implications of model uncertainty and argues that, from a regulator's perspective, a conservative approach must be taken. As a consequence of this perspective, averaging over model results is ruled out

  8. Uncertainty associated with selected environmental transport models

    International Nuclear Information System (INIS)

    Little, C.A.; Miller, C.W.

    1979-11-01

    A description is given of the capabilities of several models to predict accurately either pollutant concentrations in environmental media or radiological dose to human organs. The models are discussed in three sections: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations. This procedure is infeasible for food chain models and, therefore, the uncertainty embodied in the model's input parameters, rather than the model output, is estimated. Aquatic transport models are divided into one-dimensional, longitudinal-vertical, and longitudinal-horizontal models. Several conclusions were made about the ability of the Gaussian plume atmospheric dispersion model to predict accurately downwind air concentrations from releases under several sets of conditions. It is concluded that no validation study has been conducted to test the predictions of either aquatic or terrestrial food chain models. Using the aquatic pathway from water to fish to an adult for 137Cs as an example, a 95% one-tailed confidence limit interval for the predicted exposure is calculated by examining the distributions of the input parameters. Such an interval is found to be 16 times the value of the median exposure. A similar one-tailed limit for the air-grass-cow-milk-thyroid pathway for 131I and infants was 5.6 times the median dose. Of the three model types discussed in this report, the aquatic transport models appear to do the best job of predicting observed concentrations. However, this conclusion is based on many fewer aquatic validation data than were available for atmospheric model validation
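
    For a multiplicative chain model with independent lognormal inputs, the one-tailed confidence multiplier can be computed directly, since the log-variances add; the sketch below uses hypothetical log-standard-deviations for three transfer factors, not values from the report.

        import numpy as np
        from scipy.stats import norm

        # ln(exposure) is the sum of the ln's of the transfer factors, so the
        # output is lognormal with variance = sum of the input log-variances.
        sigmas_ln = {"bioaccumulation": 1.2, "intake": 0.4, "dose_factor": 0.5}
        sigma_total = np.sqrt(sum(s ** 2 for s in sigmas_ln.values()))

        # One-tailed 95% multiplier relative to the median prediction.
        multiplier = np.exp(norm.ppf(0.95) * sigma_total)
        print(f"95th percentile = {multiplier:.1f} x median")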

  9. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN, KENNETH F.; OBERKAMPF, WILLIAM L.; RUTHERFORD, BRIAN M.; DIEGERT, KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  10. Implications of model uncertainty for the practice of risk assessment

    International Nuclear Information System (INIS)

    Laskey, K.B.

    1994-01-01

    A model is a representation of a system that can be used to answer questions about the system's behavior. The term model uncertainty refers to problems in which there is no generally agreed upon, validated model that can be used as a surrogate for the system itself. Model uncertainty affects both the methodology appropriate for building models and how models should be used. This paper discusses representations of model uncertainty, methodologies for exercising and interpreting models in the presence of model uncertainty, and the appropriate use of fallible models for policy making

  11. Climate change decision-making: Model & parameter uncertainties explored

    Energy Technology Data Exchange (ETDEWEB)

    Dowlatabadi, H.; Kandlikar, M.; Linville, C.

    1995-12-31

    A critical aspect of climate change decision-making is the uncertainty in current understanding of the socioeconomic, climatic and biogeochemical processes involved. Decision-making processes are much better informed if these uncertainties are characterized and their implications understood. Quantitative analysis of these uncertainties serves to inform decision makers about the likely outcome of policy initiatives, and helps set priorities for research so that outcome ambiguities faced by the decision-makers are reduced. A family of integrated assessment models of climate change has been developed at Carnegie Mellon. These models are distinguished from other integrated assessment efforts in that they were designed from the outset to characterize and propagate parameter, model, value, and decision-rule uncertainties. The most recent of these models is ICAM 2.1. This model includes representation of the processes of demographics, economic activity, emissions, atmospheric chemistry, climate and sea level change and impacts from these changes and policies for emissions mitigation, and adaptation to change. The model has over 800 objects of which about one half are used to represent uncertainty. In this paper we show that, when considering parameter uncertainties, climatic uncertainties are the most important contributors, followed by uncertainties in damage calculations, economic uncertainties and direct aerosol forcing uncertainties. When considering model structure uncertainties we find that the choice of policy is often dominated by the model structure choice, rather than by parameter uncertainties.

  12. How to: understanding SWAT model uncertainty relative to measured results

    Science.gov (United States)

    Watershed models are being relied upon to contribute to most policy-making decisions of watershed management, and the demand for an accurate accounting of complete model uncertainty is rising. Generalized likelihood uncertainty estimation (GLUE) is a widely used method for quantifying uncertainty i...
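
    A bare-bones GLUE sketch under stated assumptions (a toy linear "watershed model", Nash-Sutcliffe efficiency as the likelihood measure, and a 0.7 behavioural threshold) looks like this; it illustrates the method itself, not SWAT.

        import numpy as np

        rng = np.random.default_rng(5)

        def model(params, forcing):
            # Toy stand-in for one watershed-model (e.g. SWAT) simulation.
            a, b = params
            return a * forcing + b

        forcing = rng.uniform(0, 10, 50)
        observed = 2.0 * forcing + 1.0 + rng.normal(0, 1.0, 50)

        # GLUE: sample many parameter sets, keep the "behavioural" ones whose
        # likelihood measure passes a threshold, and report prediction bounds
        # from the retained simulations.
        params = rng.uniform([0.5, -2.0], [4.0, 4.0], size=(5000, 2))
        sims = np.array([model(p, forcing) for p in params])
        nse = 1 - (((sims - observed) ** 2).sum(axis=1)
                   / ((observed - observed.mean()) ** 2).sum())
        behavioural = sims[nse > 0.7]
        lower = np.percentile(behavioural, 5, axis=0)
        upper = np.percentile(behavioural, 95, axis=0)
        print(f"{len(behavioural)} behavioural runs; bounds at t=0: "
              f"[{lower[0]:.1f}, {upper[0]:.1f}] vs obs {observed[0]:.1f}")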

  13. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
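
    From the description, the QFD at a given quantile can be read as the spread of that flow quantile across modelling scenarios; the sketch below is an interpretation of that definition on synthetic flow series, not the authors' code.

        import numpy as np

        rng = np.random.default_rng(9)

        # Hypothetical daily-flow series simulated under several scenarios
        # (e.g. different model structures) for the same catchment.
        scenarios = [rng.lognormal(1.0 + 0.1 * k, 0.6, size=3650)
                     for k in range(4)]

        def qfd(flows_by_scenario, q):
            # Quantile Flow Deviation (as described above): spread of the
            # q-th flow quantile across scenarios.
            quantiles = np.array([np.quantile(f, q) for f in flows_by_scenario])
            return quantiles.max() - quantiles.min()

        for q in (0.1, 0.5, 0.9):
            print(f"QFD at q={q}: {qfd(scenarios, q):.2f}")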

  14. Physical and Model Uncertainty for Fatigue Design of Composite Material

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...... for linear damage accumulation. Test data analyzed are taken from the Optimat database [1], which is publicly available. The composite material tested within the Optimat project is normally used for wind turbine blades.

  15. Modelling of data uncertainties on hybrid computers

    International Nuclear Information System (INIS)

    Schneider, Anke

    2016-06-01

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as for computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined to one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the help of Monte Carlo simulations will not be

  16. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d³f and r³t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, for modelling salt- and heat transport as well as a free groundwater surface. Development of the basic framework of d³f and r³t had begun more than 20 years ago. Since that time significant advancements took place in the requirements for safety assessment as well as for computer hardware development. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures which requires basically the set-up of new data structures and a corresponding code revision but offers a potential speed-up by several orders of magnitude. The original codes d³f and r³t were applications of the software platform UG /BAS 94/ whose development had begun in the early nineteen-nineties. However, UG had recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d³f and r³t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d³f and r³t were combined to one conjoint code d³f++. A direct estimation of uncertainties for complex groundwater flow models with the

  17. A market model: uncertainty and reachable sets

    Directory of Open Access Journals (Sweden)

    Raczynski Stanislaw

    2015-01-01

    Full Text Available Uncertain parameters are always present in models that include the human factor. In marketing the uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different. The dynamic market with uncertain parameters is treated using differential inclusions, which makes it possible to determine the corresponding reachable sets. This is not a statistical analysis. We are looking for solutions to the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits for the important marketing variables. The modeling method consists in defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As the result we obtain images of the reachable sets where the main control parameter is the share of investment, being a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.
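
    The reachable-set idea can be approximated numerically by sampling admissible controls and integrating the resulting ODEs; the sketch below does this for a hypothetical one-state market model (the dynamics, rates and investment-share interval are invented, and the author's dedicated differential inclusion solver is not reproduced).

        import numpy as np
        from scipy.integrate import solve_ivp

        rng = np.random.default_rng(13)

        # Toy market model: revenue x grows with investment share u, where u
        # is only known to lie in an interval, giving the inclusion
        # dx/dt in { (a*u - d) * x : u in [u_min, u_max] }.
        a, d = 0.8, 0.3
        u_min, u_max = 0.2, 0.6

        # Approximate the reachable set by sampling piecewise-constant
        # controls (one value per year) and integrating each trajectory.
        finals = []
        for _ in range(2000):
            u = rng.uniform(u_min, u_max, size=10)
            rhs = lambda t, x: (a * u[min(int(t), 9)] - d) * x
            sol = solve_ivp(rhs, (0.0, 10.0), [1.0], max_step=0.1)
            finals.append(sol.y[0, -1])
        print(f"reachable revenue after 10 yr: "
              f"[{min(finals):.2f}, {max(finals):.2f}]")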

  18. Uncertainty "escalation" and use of machine learning to forecast residual and data model uncertainties

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    When speaking about model uncertainty, many authors implicitly assume the data uncertainty (mainly in parameters or inputs), which is probabilistically described by distributions. Often, however, it is useful to look into the residual uncertainty as well. It is hence reasonable to classify the main approaches to uncertainty analysis with respect to the two main types of model uncertainty that can be distinguished: A. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e., its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on this data. The following methods can be mentioned: (a) the quantile regression (QR) method by Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) a more recent approach that takes into account the input variables influencing such uncertainty and uses more advanced machine learning (non-linear) methods (neural networks, model trees etc.) - the UNEEC method [2,3,7]; (c) and the even more recent DUBRAUE method (Dynamic Uncertainty Model By Regression on Absolute Error), an autoregressive model of model residuals (it corrects the model residual first and then carries out the uncertainty prediction by an autoregressive statistical model) [5]. B. The data uncertainty (parametric and/or input) - in this case we study the propagation of uncertainty (presented typically probabilistically) from parameters or inputs to the model outputs. In the case of simple functions representing models, analytical approaches can be used, or approximation methods (e.g., first-order second moment method). However, for real complex non-linear models implemented in software there is no other choice except using
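
    A minimal example of approach A(a), building a quantile model of residual uncertainty from past errors: the rainfall inputs and heteroscedastic error data below are synthetic, and the regression is the plain linear quantile regression of Koenker and Bassett rather than the non-linear UNEEC or DUBRAUE variants.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)

        # Hypothetical history: model inputs (rainfall) and the absolute
        # errors a calibrated deterministic model made on past predictions.
        rainfall = rng.uniform(0, 50, 500)
        abs_error = 0.1 * rainfall + rng.gamma(2.0, 1.0, 500)  # heteroscedastic

        # Quantile regression: a predictive model of the 90% quantile of the
        # residual uncertainty, conditioned on the input.
        X = sm.add_constant(rainfall)
        q90 = sm.QuantReg(abs_error, X).fit(q=0.9)
        print("90% error bound for 40 mm rainfall:",
              q90.predict([[1.0, 40.0]])[0])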

  19. Uncertainty and Preference Modelling for Multiple Criteria Vehicle Evaluation

    Directory of Open Access Journals (Sweden)

    Qiuping Yang

    2010-12-01

    Full Text Available A general framework for vehicle assessment is proposed based on both mass survey information and the evidential reasoning (ER) approach. Several methods for uncertainty and preference modeling are developed within the framework, including the measurement of uncertainty caused by missing information, the estimation of missing information in original surveys, the use of nonlinear functions for data mapping, and the use of nonlinear functions as utility functions to combine distributed assessments into a single index. The results of the investigation show that various measures can be used to represent the different preferences of decision makers towards the same feedback from respondents. Based on the ER approach, credible and informative analysis can be conducted through a complete understanding of the assessment problem in question and full exploration of the available information.

  20. Characterization uncertainty and its effects on models and performance

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Treadway, A.H.

    1991-01-01

    Geostatistical simulation is being used to develop multiple geologic models of rock properties at the proposed Yucca Mountain repository site. Because each replicate model contains the same known information, and is thus essentially indistinguishable statistically from others, the differences between models may be thought of as representing the uncertainty in the site description. The variability among performance measures, such as ground water travel time, calculated using these replicate models therefore quantifies the uncertainty in performance that arises from uncertainty in site characterization.

  1. Uncertainty in a spatial evacuation model

    Science.gov (United States)

    Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De

    2017-08-01

    Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk-seeking, risk-averse and risk-neutral behaviors of such agents via certain game theory notions. We found that risk-averse agents tend to achieve faster evacuation times whenever the time delay in conflicts is longer. The results of our simulations also comply with previous work and conform to the fact that the evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study scientifically shows that the more impatient the agents are, the slower the egress time.

  2. Identification and communication of uncertainties of phenomenological models in PSA

    International Nuclear Information System (INIS)

    Pulkkinen, U.; Simola, K.

    2001-11-01

    This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis, it is essential to include and document all unclear issues in order to obtain maximal coverage of unresolved issues. This holds independently of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem, and it helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  3. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and allows model selection, calculation of model posterior probabilities and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo method (AARJ). It uses the Markov chain Monte Carlo (MCMC) technique and its extension called Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide area of applications since the late 1990s. The novel feature of our algorithm is the fact that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval for the GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used, and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
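
    The AARJ algorithm itself is beyond a short example; the sketch below substitutes a much simpler BIC-weighted Bayesian model averaging over three hypothetical wavelength-dependence models, only to show how model-choice uncertainty can enter the averaged prediction.

        import numpy as np

        # Sketch: BIC-weighted model averaging over three candidate
        # wavelength-dependence models (all hypothetical).
        rng = np.random.default_rng(8)
        wl = np.linspace(350.0, 700.0, 40)                         # wavelength [nm]
        y = 1.2e3 * wl**-1.3 * (1 + rng.normal(0, 0.02, wl.size))  # synthetic data

        def log_score(X, y):
            beta, *_ = np.linalg.lstsq(X, y, rcond=None)
            rss = np.sum((y - X @ beta) ** 2)
            n, k = X.shape
            return -0.5 * (n * np.log(rss / n) + k * np.log(n)), X @ beta

        designs = {"constant":  np.ones((wl.size, 1)),
                   "linear":    np.column_stack([np.ones_like(wl), wl]),
                   "quadratic": np.column_stack([np.ones_like(wl), wl, wl**2])}
        scores, preds = zip(*(log_score(X, y) for X in designs.values()))
        w = np.exp(np.array(scores) - max(scores)); w /= w.sum()
        averaged = sum(wi * p for wi, p in zip(w, preds))          # BMA prediction
        print("posterior model weights:", dict(zip(designs, np.round(w, 3))))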

  4. Uncertainty modelling and analysis of volume calculations based on a regular grid digital elevation model (DEM)

    Science.gov (United States)

    Li, Chang; Wang, Qing; Shi, Wenzhong; Zhao, Sisi

    2018-05-01

    The accuracy of earthwork calculations that compute terrain volume is critical to digital terrain analysis (DTA). The uncertainties in volume calculations (VCs) based on a DEM are primarily related to three factors: 1) model error (ME), which is caused by the algorithm adopted for the VC model, 2) discrete error (DE), which is usually caused by DEM resolution and terrain complexity, and 3) propagation error (PE), which is caused by the errors of the variables. Based on these factors, the uncertainty modelling and analysis of VCs based on a regular grid DEM are investigated in this paper. In particular, a method to quantify the uncertainty of VCs by a confidence interval based on truncation error (TE) is proposed. In the experiments, the trapezoidal double rule (TDR) and Simpson's double rule (SDR) were used to calculate volume, where the TE is the major ME, and six simulated regular grid DEMs with different terrain complexity and resolution (i.e. DE) were generated by a Gauss synthetic surface to easily obtain the theoretical true value and eliminate the interference of data errors. For PE, Monte-Carlo simulation techniques and spatial autocorrelation were used to represent DEM uncertainty. This study can enrich uncertainty modelling and analysis-related theories of geographic information science.
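
    A small sketch of the TDR/SDR part of the experiment, assuming a Gauss synthetic surface whose analytic volume is known: the difference between each double rule and the true volume exposes the truncation-error component of the model error. Grid size and extent are illustrative.

        import math
        import numpy as np

        # Sketch: TDR vs SDR volume of a Gauss surface on a regular grid;
        # the analytic volume over [-3,3]^2 is (sqrt(pi)*erf(3))^2.
        n = 65                                   # odd node count for Simpson
        x = np.linspace(-3.0, 3.0, n)
        h = x[1] - x[0]
        X, Y = np.meshgrid(x, x)
        Z = np.exp(-(X**2 + Y**2))
        true_vol = (math.sqrt(math.pi) * math.erf(3.0)) ** 2

        def trapz2d(Z, h):                       # trapezoidal double rule
            w = np.ones(Z.shape[0]); w[0] = w[-1] = 0.5
            return h * h * np.sum(np.outer(w, w) * Z)

        def simpson2d(Z, h):                     # Simpson's double rule
            w = np.ones(Z.shape[0]); w[1:-1:2] = 4.0; w[2:-1:2] = 2.0
            return (h / 3.0) ** 2 * np.sum(np.outer(w, w) * Z)

        print("TDR truncation error:", trapz2d(Z, h) - true_vol)
        print("SDR truncation error:", simpson2d(Z, h) - true_vol)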

  5. Imprecision and Uncertainty in the UFO Database Model.

    Science.gov (United States)

    Van Gyseghem, Nancy; De Caluwe, Rita

    1998-01-01

    Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects,…

  6. Incorporating model uncertainty into optimal insurance contract design

    OpenAIRE

    Pflug, G.; Timonina-Farkas, A.; Hochrainer-Stigler, S.

    2017-01-01

    In stochastic optimization models, the optimal solution heavily depends on the selected probability model for the scenarios. However, the scenario models are typically chosen on the basis of statistical estimates and are therefore subject to model error. We demonstrate here how the model uncertainty can be incorporated into the decision making process. We use a nonparametric approach for quantifying the model uncertainty and a minimax setup to find model-robust solutions. The method is illust...

  7. Multi-scenario modelling of uncertainty in stochastic chemical systems

    International Nuclear Information System (INIS)

    Evans, R. David; Ricardez-Sandoval, Luis A.

    2014-01-01

    Uncertainty analysis has not been well studied at the molecular scale, despite extensive knowledge of uncertainty in macroscale systems. The ability to predict the effect of uncertainty allows for robust control of small scale systems such as nanoreactors, surface reactions, and gene toggle switches. However, it is difficult to model uncertainty in such chemical systems as they are stochastic in nature, and require a large computational cost. To address this issue, a new model of uncertainty propagation in stochastic chemical systems, based on the Chemical Master Equation, is proposed in the present study. The uncertain solution is approximated by a composite state comprised of the averaged effect of samples from the uncertain parameter distributions. This model is then used to study the effect of uncertainty on an isomerization system and a two gene regulation network called a repressilator. The results of this model show that uncertainty in stochastic systems is dependent on both the uncertain distribution, and the system under investigation. -- Highlights: •A method to model uncertainty on stochastic systems was developed. •The method is based on the Chemical Master Equation. •Uncertainty in an isomerization reaction and a gene regulation network was modelled. •Effects were significant and dependent on the uncertain input and reaction system. •The model was computationally more efficient than Kinetic Monte Carlo
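
    A minimal sketch of the composite-state idea, assuming a toy A <-> B isomerization: the Gillespie stochastic simulation algorithm is run for rate constants sampled from an uncertain distribution, and the results are averaged. All rates and counts are illustrative, not taken from the paper.

        import numpy as np

        # Sketch: Gillespie SSA for A <-> B with an uncertain forward rate;
        # the "composite state" is the average over parameter samples.
        rng = np.random.default_rng(2)

        def ssa_final_A(k_f, k_r, nA=100, nB=0, t_end=5.0):
            t = 0.0
            while True:
                a1, a2 = k_f * nA, k_r * nB      # reaction propensities
                a0 = a1 + a2
                if a0 == 0.0:
                    return nA
                t += rng.exponential(1.0 / a0)
                if t > t_end:
                    return nA
                if rng.random() < a1 / a0:
                    nA -= 1; nB += 1
                else:
                    nA += 1; nB -= 1

        k_fs = rng.normal(1.0, 0.2, 50).clip(min=0.01)   # uncertain rate samples
        means = [np.mean([ssa_final_A(kf, 0.5) for _ in range(20)]) for kf in k_fs]
        print("composite mean A:", np.mean(means), " sd across samples:", np.std(means))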

  8. Analytic uncertainty and sensitivity analysis of models with input correlations

    Science.gov (United States)

    Zhu, Yueying; Wang, Qiuping A.; Li, Wei; Cai, Xu

    2018-03-01

    Probabilistic uncertainty analysis is a common means of evaluating mathematical models. In mathematical modeling, the uncertainty in input variables is specified through distribution laws. Its contribution to the uncertainty in the model response is usually analyzed by assuming that input variables are independent of each other. However, correlated parameters often occur in practical applications. In the present paper, an analytic method is built for the uncertainty and sensitivity analysis of models in the presence of input correlations. With the method, it is straightforward to identify the importance of the independence and correlations of input variables in determining the model response. This allows one to decide whether or not the input correlations should be considered in practice. Numerical examples demonstrate the effectiveness and validity of our analytic method in the analysis of general models. A practical application of the method to the uncertainty and sensitivity analysis of a deterministic HIV model is also presented.
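
    A compact sketch of the underlying first-order (delta-method) calculation, Var(y) ~ g' Sigma g with Sigma the input covariance: comparing the result with and without the off-diagonal terms shows the contribution of input correlations. The model and numbers are hypothetical, not those of the paper.

        import numpy as np

        # Sketch: first-order propagation Var(y) ~ g' Sigma g for a
        # hypothetical two-input model, with and without correlation.
        def model(p):
            x1, x2 = p
            return x1 * np.exp(-0.5 * x2)

        mu = np.array([2.0, 1.0])
        sd = np.array([0.2, 0.3])
        rho = 0.6
        Sigma = np.array([[sd[0]**2, rho * sd[0] * sd[1]],
                          [rho * sd[0] * sd[1], sd[1]**2]])

        eps = 1e-6                               # central finite differences
        g = np.array([(model(mu + eps * np.eye(2)[i]) -
                       model(mu - eps * np.eye(2)[i])) / (2 * eps)
                      for i in range(2)])

        print("Var(y), correlated inputs  :", g @ Sigma @ g)
        print("Var(y), independence assumed:", g @ np.diag(sd**2) @ g)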

  9. Uncertainty models applied to the substation planning

    Energy Technology Data Exchange (ETDEWEB)

    Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)

    1994-12-31

    The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as endogenous and exogenous. The first is associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainty is associated with external aspects, such as the financial resources, the time spent to build the installations, the equipment prices and the load level. The load uncertainty is extremely sensitive to the behaviour of the economic conditions. Although the uncertainty cannot be removed completely, the endogenous part can be conveniently treated and the exogenous part can be compensated. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT company, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with the load increase is considered by using technical analysis of scenarios and choice criteria based on Decision Theory. In this paper the Savage Method and the Fuzzy Set Method were used in order to select the best middle-term reinforcements plan. (author) 7 refs., 4 figs., 6 tabs.

  10. Aspects of uncertainty analysis in accident consequence modeling

    International Nuclear Information System (INIS)

    Travis, C.C.; Hoffman, F.O.

    1981-01-01

    Mathematical models are frequently used to determine probable dose to man from an accidental release of radionuclides by a nuclear facility. With increased emphasis on the accuracy of these models, the incorporation of uncertainty analysis has become one of the most crucial and sensitive components in evaluating the significance of model predictions. In the present paper, we address three aspects of uncertainty in models used to assess the radiological impact to humans: uncertainties resulting from the natural variability in human biological parameters; the propagation of parameter variability by mathematical models; and comparison of model predictions to observational data
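
    A minimal sketch of the propagation step: a toy multiplicative dose model with lognormally distributed parameters is sampled by Monte Carlo, and percentiles of the output summarize the prediction uncertainty. All parameter values are illustrative.

        import numpy as np

        # Sketch: Monte Carlo propagation through a toy multiplicative
        # dose model with lognormal parameters (values illustrative).
        rng = np.random.default_rng(42)
        n = 100_000
        intake = rng.lognormal(np.log(1.0), 0.4, n)         # e.g. Bq/day
        transfer = rng.lognormal(np.log(0.01), 0.6, n)      # transfer factor
        dose_coeff = rng.lognormal(np.log(1.3e-8), 0.3, n)  # e.g. Sv/Bq

        dose = intake * transfer * dose_coeff
        print("median dose:", np.median(dose))
        print("95% interval:", np.percentile(dose, [2.5, 97.5]))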

  11. Appropriate spatial scales to achieve model output uncertainty goals

    NARCIS (Netherlands)

    Booij, Martijn J.; Melching, Charles S.; Chen, Xiaohong; Chen, Yongqin; Xia, Jun; Zhang, Hailun

    2008-01-01

    Appropriate spatial scales of hydrological variables were determined using an existing methodology based on a balance in uncertainties from model inputs and parameters extended with a criterion based on a maximum model output uncertainty. The original methodology uses different relationships between

  12. Estimated Frequency Domain Model Uncertainties used in Robust Controller Design

    DEFF Research Database (Denmark)

    Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob

    1994-01-01

    This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are…

  13. Modeling of uncertainty in atmospheric transport system using hybrid method

    International Nuclear Information System (INIS)

    Pandey, M.; Ranade, Ashok; Brij Kumar; Datta, D.

    2012-01-01

    Atmospheric dispersion models are routinely used at nuclear and chemical plants to estimate the exposure of members of the public and occupational workers due to releases of hazardous contaminants into the atmosphere. Atmospheric dispersion is a stochastic phenomenon and, in general, the concentration of the contaminant estimated at a given time and at a predetermined location downwind of a source cannot be predicted precisely. Uncertainty in atmospheric dispersion model predictions is associated with: 'data' or 'parameter' uncertainty, resulting from errors in the data used to execute and evaluate the model, uncertainties in empirical model parameters, and initial and boundary conditions; 'model' or 'structural' uncertainty, arising from inaccurate treatment of dynamical and chemical processes, approximate numerical solutions, and internal model errors; and 'stochastic' uncertainty, which results from the turbulent nature of the atmosphere as well as from the unpredictability of human activities related to emissions. The possibility theory based on fuzzy measures has been proposed in recent years as an alternative approach to address the knowledge uncertainty of a model in situations where the available information is too vague to represent the parameters statistically. The paper presents a novel approach (called the Hybrid Method) to model knowledge uncertainty in a physical system by a combination of probabilistic and possibilistic representations of parametric uncertainties. As a case study, the proposed approach is applied to estimating the ground level concentration of a hazardous contaminant in air due to atmospheric releases through the stack (chimney) of a nuclear plant. The application illustrates the potential of the proposed approach. (author)

  14. Uncertainty analysis of impacts of climate change on snow processes: Case study of interactions of GCM uncertainty and an impact model

    Science.gov (United States)

    Kudo, Ryoji; Yoshida, Takeo; Masumoto, Takao

    2017-05-01

    The impact of climate change on snow water equivalent (SWE) and its uncertainty were investigated in snowy areas of subarctic and temperate climate zones in Japan by using a snow process model and climate projections derived from general circulation models (GCMs). In particular, we examined how the uncertainty due to GCMs propagated through the snow model, which contained nonlinear processes defined by thresholds, as an example of the uncertainty caused by interactions among multiple sources of uncertainty. An assessment based on the climate projections in Coupled Model Intercomparison Project Phase 5 indicated that heavy-snowfall areas in the temperate zone (especially in low-elevation areas) were markedly vulnerable to temperature change, showing a large SWE reduction even under slight changes in winter temperature. The uncertainty analysis demonstrated that the uncertainty associated with snow processes (1) can be accounted for mainly by the interactions between GCM uncertainty (in particular, the differences of projected temperature changes between GCMs) and the nonlinear responses of the snow model and (2) depends on the balance between the magnitude of projected temperature changes and present climates dominated largely by climate zones and elevation. Specifically, when the peaks of the distributions of daily mean temperature projected by GCMs cross the key thresholds set in the model, the GCM uncertainty, even if tiny, can be amplified by the nonlinear propagation through the snow process model. This amplification results in large uncertainty in projections of CC impact on snow processes.

  15. Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling

    Science.gov (United States)

    Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.

    2016-12-01

    The North-East Corridor project aims to use a top-down inversion method to quantify sources of Greenhouse Gas (GHG) emissions in the urban areas of Washington DC and Baltimore at approximately 1 km2 resolution. The aim of this project is to help establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting Model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used 4 planetary boundary layer parameterizations (YSU, MYNN2, BOULAC, QNSE), 2 sources of initial and boundary conditions (NARR and HRRR) and 1 configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler and radiosondes for a month (February) in 2016 to account for the uncertainties and the ensemble spread for wind speed, direction and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.

  16. Spatial Uncertainty Model for Visual Features Using a Kinect™ Sensor

    Directory of Open Access Journals (Sweden)

    Jae-Han Park

    2012-06-01

    Full Text Available This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
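
    A short sketch of the propagation relationship described above, assuming the usual stereo back-projection x = uz/f, y = vz/f, z = fb/d: the image-space covariance of (u, v, d) is mapped to Cartesian space by the Jacobian, Sigma_xyz = J Sigma_uvd J'. Focal length, baseline and covariance values are illustrative, not calibrated Kinect parameters.

        import numpy as np

        # Sketch: propagate (u, v, d) measurement covariance to Cartesian
        # space with the Jacobian of the stereo back-projection.
        f, b = 580.0, 0.075                  # focal length [px], baseline [m]

        def jacobian(u, v, d):
            z = f * b / d
            dz_dd = -f * b / d**2
            return np.array([[z / f, 0.0, u * dz_dd / f],
                             [0.0, z / f, v * dz_dd / f],
                             [0.0, 0.0, dz_dd]])

        Sigma_uvd = np.diag([0.5**2, 0.5**2, 0.3**2])  # image-space covariance
        u, v, d = 40.0, -25.0, 30.0
        J = jacobian(u, v, d)
        Sigma_xyz = J @ Sigma_uvd @ J.T
        print("spatial covariance:\n", Sigma_xyz)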

  17. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.

  18. Dimensionality reduction for uncertainty quantification of nuclear engineering models.

    Energy Technology Data Exchange (ETDEWEB)

    Roderick, O.; Wang, Z.; Anitescu, M. (Mathematics and Computer Science)

    2011-01-01

    The task of uncertainty quantification consists of relating the available information on uncertainties in the model setup to the resulting variation in the outputs of the model. Uncertainty quantification plays an important role in complex simulation models of nuclear engineering, where better understanding of uncertainty results in greater confidence in the model and in the improved safety and efficiency of engineering projects. In our previous work, we have shown that the effect of uncertainty can be approximated by polynomial regression with derivatives (PRD): a hybrid regression method that uses first-order derivatives of the model output as additional fitting conditions for a polynomial expansion. Numerical experiments have demonstrated the advantage of this approach over classical methods of uncertainty analysis: in precision, computational efficiency, or both. To obtain derivatives, we used automatic differentiation (AD) on the simulation code; hand-coded derivatives are acceptable for simpler models. We now present improvements on the method. We use a tuned version of the method of snapshots, a technique based on proper orthogonal decomposition (POD), to set up the reduced order representation of essential information on uncertainty in the model inputs. The automatically obtained sensitivity information is required to set up the method. Dimensionality reduction in combination with PRD allows analysis on a larger dimension of the uncertainty space (>100), at modest computational cost.
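
    A minimal sketch of the PRD idea on a one-dimensional toy model: both function values and derivatives enter one least-squares system for the polynomial surrogate, so a few expensive runs constrain many coefficients. The functions here stand in for a simulation code with AD-supplied derivatives.

        import numpy as np

        # Sketch: fit a cubic surrogate using values AND derivatives of a
        # toy "simulation" as least-squares conditions (the PRD idea).
        def f(x):  return np.sin(x) + 0.1 * x**2    # model output
        def df(x): return np.cos(x) + 0.2 * x       # derivative (as from AD)

        xs = np.linspace(0.0, 3.0, 5)               # few expensive runs
        V = np.vander(xs, 4, increasing=True)       # rows [1, x, x^2, x^3]
        D = np.column_stack([np.zeros_like(xs), np.ones_like(xs),
                             2 * xs, 3 * xs**2])    # derivative of the basis
        A = np.vstack([V, D])
        rhs = np.concatenate([f(xs), df(xs)])

        coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)
        x_t = 1.7
        print("surrogate:", coef @ [1.0, x_t, x_t**2, x_t**3], " true:", f(x_t))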

  19. Precipitation forecasts and their uncertainty as input into hydrological models

    Directory of Open Access Journals (Sweden)

    M. Kobold

    2005-01-01

    Full Text Available Torrential streams and fast runoff are characteristic of most Slovenian rivers and extensive damage is caused almost every year by rainstorms affecting different regions of Slovenia. Rainfall-runoff models, which are tools for runoff calculation, can be used for flood forecasting. In Slovenia, the lag time between rainfall and runoff is only a few hours, and on-line data are used only for now-casting. Predicted precipitation is necessary for flood forecasting some days ahead. The ECMWF (European Centre for Medium-Range Weather Forecasts) model gives general forecasts several days ahead, while more detailed precipitation data from the ALADIN/SI model are available two days ahead. Combining the weather forecasts with information on catchment conditions and a hydrological forecasting model can give advance warning of potential flooding, notwithstanding a certain degree of uncertainty in using precipitation forecasts based on meteorological models. Analysis of the sensitivity of the hydrological model to rainfall error has shown that the deviation in runoff is much larger than the rainfall deviation. Therefore, verification of predicted precipitation for large precipitation events was performed with the ECMWF model. Measured precipitation data were interpolated on a regular grid and compared with the results from the ECMWF model. The deviation of predicted precipitation from interpolated measurements is expressed as a model bias, resulting from the inability of the model to predict the precipitation correctly, and a bias due to the horizontal resolution of the model and the natural variability of precipitation.

  20. Comparison of evidence theory and Bayesian theory for uncertainty modeling

    International Nuclear Information System (INIS)

    Soundappan, Prabhu; Nikolaidis, Efstratios; Haftka, Raphael T.; Grandhi, Ramana; Canfield, Robert

    2004-01-01

    This paper compares Evidence Theory (ET) and Bayesian Theory (BT) for uncertainty modeling and decision under uncertainty, when the evidence about uncertainty is imprecise. The basic concepts of ET and BT are introduced and the ways these theories model uncertainties, propagate them through systems and assess the safety of these systems are presented. ET and BT approaches are demonstrated and compared on challenge problems involving an algebraic function whose input variables are uncertain. The evidence about the input variables consists of intervals provided by experts. It is recommended that a decision-maker compute both the Bayesian probabilities of the outcomes of alternative actions and their plausibility and belief measures when evidence about uncertainty is imprecise, because this helps assess the importance of imprecision and the value of additional information. Finally, the paper presents and demonstrates a method for testing approaches for decision under uncertainty in terms of their effectiveness in making decisions

  1. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    … sensitivity and uncertainty of atmospheric dispersion using fuzzy set theory can be found in Chutia et al (2013). … The uncertainties presented will facilitate the decision makers in the said field to take a decision on the quality of the air … Annals of Fuzzy Mathematics and Informatics 5(1): 213–22. Chutia R, Mahanta S …

  2. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support the elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  3. [Application of an uncertainty model for fibromyalgia].

    Science.gov (United States)

    Triviño Martínez, Ángeles; Solano Ruiz, M Carmen; Siles González, José

    2016-04-01

    To explore the experiences of women diagnosed with fibromyalgia, applying the Theory of Uncertainty proposed by M. Mishel. A qualitative study was conducted, using a phenomenological approach, at a patients' association in the province of Alicante from June 2012 to November 2013. A total of 14 women diagnosed with fibromyalgia, aged between 45 and 65 years, participated in the study as volunteers. Information was generated through structured interviews with recording and transcription, after a confidentiality pledge and informed consent. Content analysis was performed by extracting different categories according to the proposed theory. The patients studied perceive a high level of uncertainty related to the difficulty of dealing with symptoms, uncertainty about the diagnosis, and the complexity of treatment. Moreover, the ability to cope with the disease is influenced by social support, relationships with health professionals, and the help and information gained by attending patient associations. Health professionals must provide clear information on the pathology to fibromyalgia sufferers: the greater the patients' knowledge of their disease and the better the quality of the information provided, the less the anxiety and uncertainty in the experience of the disease. Likewise, patient associations should have health professionals available, in order to avoid bias in the information and to provide advice based on scientific evidence. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  4. Uncertainty modelling of atmospheric dispersion by stochastic ...

    Indian Academy of Sciences (India)

    … discharges and related regulated pollution criteria for the marine environment. An Integrated Simulation-Assessment Approach (ISAA) (Yang et al 2010) is developed to systematically tackle multiple uncertainties associated with hydrocarbon contaminant transport in the subsurface and assessment of carcinogenic health risk …

  5. Reservoir management under geological uncertainty using fast model update

    NARCIS (Netherlands)

    Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.

    2015-01-01

    Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU

  6. Dynamic modeling of predictive uncertainty by regression on absolute errors

    NARCIS (Netherlands)

    Pianosi, F.; Raso, L.

    2012-01-01

    Uncertainty of hydrological forecasts represents valuable information for water managers and hydrologists. This explains the popularity of probabilistic models, which provide the entire distribution of the hydrological forecast. Nevertheless, many existing hydrological models are deterministic and

  7. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    … that can also provide estimates of uncertainties in predictions of properties and their effects on process design becomes necessary. For instance, the accuracy of the design of a distillation column to achieve a given product purity is dependent on many pure compound properties such as critical pressure … of formation, standard enthalpy of fusion, standard enthalpy of vaporization at 298 K and at the normal boiling point, entropy of vaporization at the normal boiling point, surface tension at 298 K, viscosity at 300 K, flash point, auto-ignition temperature, Hansen solubility parameters, Hildebrand solubility … The comparison of model prediction uncertainties with the reported range of measurement uncertainties is presented for the properties with related data available. The application of the developed methodology to quantify the effect of these uncertainties on the design of different unit operations (distillation column …

  8. Sustainable infrastructure system modeling under uncertainties and dynamics

    Science.gov (United States)

    Huang, Yongxi

    potential risks caused by feedstock seasonality and demand uncertainty. Facility spatiality, time variation of feedstock yields, and demand uncertainty are integrated into a two-stage stochastic programming (SP) framework. In the study of Transitional Energy System Modeling under Uncertainty, a multistage stochastic dynamic programming model is established to optimize the process of building and operating fuel production facilities during the transition. Dynamics due to evolving technologies and societal changes, and uncertainty due to demand fluctuations, are the major issues to be addressed.

  9. Quantification of Model Uncertainty in Modeling Mechanisms of Soil Microbial Respiration Pulses to Simulate Birch Effect

    Science.gov (United States)

    Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.

    2014-12-01

    A Bayesian framework is developed to quantify predictive uncertainty in environmental modeling caused by uncertainty in modeling scenarios, model structures, model parameters, and data. An example of using the framework to quantify model uncertainty is presented to simulate soil microbial respiration pulses in response to episodic rainfall pulses (the "Birch effect"). A total of five models are developed; they evolve from an existing four-carbon (C) pool model to models with additional C pools and recently developed models with explicit representations of soil moisture controls on C degradation and microbial uptake rates. Markov chain Monte Carlo (MCMC) methods with generalized likelihood function (not Gaussian) are used to estimate posterior parameter distributions of the models, and the posterior parameter samples are used to evaluate probabilities of the models. The models with explicit representations of soil moisture controls outperform the other models. The models with additional C pools for accumulation of degraded C in the dry zone of the soil pore space result in a higher probability of reproducing the observed Birch pulses. A cross-validation is conducted to explore predictive performance of model averaging and of individual models. The Bayesian framework is mathematically general and can be applied to a wide range of environmental problems.

  10. Incorporating parametric uncertainty into population viability analysis models

    Science.gov (United States)

    McGowan, Conor P.; Runge, Michael C.; Larson, Michael A.

    2011-01-01

    Uncertainty in parameter estimates from sampling variation or expert judgment can introduce substantial uncertainty into ecological predictions based on those estimates. However, in standard population viability analyses, one of the most widely used tools for managing plant, fish and wildlife populations, parametric uncertainty is often ignored in or discarded from model projections. We present a method for explicitly incorporating this source of uncertainty into population models to fully account for risk in management and decision contexts. Our method involves a two-step simulation process where parametric uncertainty is incorporated into the replication loop of the model and temporal variance is incorporated into the loop for time steps in the model. Using the piping plover, a federally threatened shorebird in the USA and Canada, as an example, we compare abundance projections and extinction probabilities from simulations that exclude and include parametric uncertainty. Although final abundance was very low for all sets of simulations, estimated extinction risk was much greater for the simulation that incorporated parametric uncertainty in the replication loop. Decisions about species conservation (e.g., listing, delisting, and jeopardy) might differ greatly depending on the treatment of parametric uncertainty in population models.
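
    A small sketch of the two-loop scheme, assuming a simple exponential-growth population model: parametric uncertainty is drawn once per replicate in the outer loop, temporal variance every year in the inner loop. The growth-rate numbers are illustrative, not piping plover estimates.

        import numpy as np

        # Sketch: outer loop draws a parameter (mean growth rate), inner
        # loop draws yearly environmental variation (numbers illustrative).
        rng = np.random.default_rng(7)
        n_reps, n_years, n0, quasi_ext = 5000, 50, 200.0, 20.0

        extinct = 0
        for _ in range(n_reps):
            mean_r = rng.normal(-0.01, 0.03)           # parametric uncertainty
            n = n0
            for _ in range(n_years):
                n *= np.exp(rng.normal(mean_r, 0.10))  # temporal variance
                if n < quasi_ext:
                    extinct += 1
                    break
        print("quasi-extinction probability:", extinct / n_reps)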

  11. Bayesian analysis for uncertainty estimation of a canopy transpiration model

    Science.gov (United States)

    Samanta, S.; Mackay, D. S.; Clayton, M. K.; Kruger, E. L.; Ewers, B. E.

    2007-04-01

    A Bayesian approach was used to fit a conceptual transpiration model to half-hourly transpiration rates for a sugar maple (Acer saccharum) stand collected over a 5-month period and probabilistically estimate its parameter and prediction uncertainties. The model used the Penman-Monteith equation with the Jarvis model for canopy conductance. This deterministic model was extended by adding a normally distributed error term. This extension enabled using Markov chain Monte Carlo simulations to sample the posterior parameter distributions. The residuals revealed approximate conformance to the assumption of normally distributed errors. However, minor systematic structures in the residuals at fine timescales suggested model changes that would potentially improve the modeling of transpiration. Results also indicated considerable uncertainties in the parameter and transpiration estimates. This simple methodology of uncertainty analysis would facilitate the deductive step during the development cycle of deterministic conceptual models by accounting for these uncertainties while drawing inferences from data.

  12. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form means that among the choices to be made during a design process within an analysis, there are different forms of the analysis process, which each give different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structures analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions, so as to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations in the KB, and possible workarounds are explained.

  13. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code known as the New ESC-based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared to the three-dimensional stochastic SCALE module KENO-VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  14. Improved Wave-vessel Transfer Functions by Uncertainty Modelling

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Fønss Bach, Kasper; Iseki, Toshio

    2016-01-01

    This paper deals with uncertainty modelling of wave-vessel transfer functions used to calculate or predict wave-induced responses of a ship in a seaway. Although transfer functions, in theory, can be calculated to exactly reflect the behaviour of the ship when exposed to waves, uncertainty in input...

  15. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a m...

  16. A Model-Free Definition of Increasing Uncertainty

    NARCIS (Netherlands)

    Grant, S.; Quiggin, J.

    2001-01-01

    We present a definition of increasing uncertainty, independent of any notion of subjective probabilities, or of any particular model of preferences. Our notion of an elementary increase in the uncertainty of any act corresponds to the addition of an 'elementary bet' which increases consumption by a

  17. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    Buckling is a critical issue for structural stability in structural design. … This study investigates the effect of material uncertainties on column design and proposes an uncertainty model for critical column buckling in reinforced concrete buildings. … Civil Engineering Department, Suleyman Demirel University, Isparta 32260, Turkey …

  18. Uncertainty in a monthly water balance model using the generalized ...

    Indian Academy of Sciences (India)

    Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology. Diego Rivera, Yessica Rivas and Alex Godoy. Laboratory of Comparative Policy in Water Resources Management, University of Concepcion, CONICYT/FONDAP 15130015, Concepcion, Chile.

  19. Uncertainty in Discount Models and Environmental Accounting

    Directory of Open Access Journals (Sweden)

    Donald Ludwig

    2005-12-01

    Full Text Available Cost-benefit analysis (CBA) is controversial for environmental issues, but is nevertheless employed by many governments and private organizations for making environmental decisions. Controversy centers on the practice of economic discounting in CBA for decisions that have substantial long-term consequences, as do most environmental decisions. Customarily, economic discounting has been calculated at a constant exponential rate, a practice that weights the present heavily in comparison with the future. Recent analyses of economic data show that the assumption of constant exponential discounting should be modified to take into account large uncertainties in long-term discount rates. A proper treatment of this uncertainty requires that we consider returns over a plausible range of assumptions about future discounting rates. When returns are averaged in this way, the schemes with the most severe discounting have a negligible effect on the average after a long period of time has elapsed. This re-examination of economic uncertainty provides support for policies that prevent or mitigate environmental damage. We examine these effects for three examples: a stylized renewable resource, management of a long-lived species (Atlantic right whales), and lake eutrophication.
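
    A few lines suffice to reproduce the key effect with hypothetical rates: averaging discount factors (not rates) over uncertain long-run rates yields a certainty-equivalent rate that declines with the horizon, so far-future outcomes are weighted less severely than any fixed high rate would imply.

        import numpy as np

        # Sketch: average discount *factors* over uncertain rates; the
        # implied certainty-equivalent rate declines with the horizon.
        rates = np.array([0.01, 0.02, 0.04, 0.07])   # hypothetical long-run rates
        t = np.array([1, 10, 50, 100, 200])
        avg_factor = np.exp(-np.outer(t, rates)).mean(axis=1)
        eff_rate = -np.log(avg_factor) / t
        for ti, ri in zip(t, eff_rate):
            print(f"t = {ti:3d} years  effective rate = {ri:.4f}")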

  20. Uncertainties in environmental radiological assessment models and their implications

    International Nuclear Information System (INIS)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible

  1. Bayesian models for comparative analysis integrating phylogenetic uncertainty

    Science.gov (United States)

    2012-01-01

    Background Uncertainty in comparative analyses can come from at least two sources: a) phylogenetic uncertainty in the tree topology or branch lengths, and b) uncertainty due to intraspecific variation in trait values, either due to measurement error or natural individual variation. Most phylogenetic comparative methods do not account for such uncertainties. Not accounting for these sources of uncertainty leads to false perceptions of precision (confidence intervals will be too narrow) and inflated significance in hypothesis testing (e.g. p-values will be too small). Although there is some application-specific software for fitting Bayesian models accounting for phylogenetic error, more general and flexible software is desirable. Methods We developed models to directly incorporate phylogenetic uncertainty into a range of analyses that biologists commonly perform, using a Bayesian framework and Markov Chain Monte Carlo analyses. Results We demonstrate applications in linear regression, quantification of phylogenetic signal, and measurement error models. Phylogenetic uncertainty was incorporated by applying a prior distribution for the phylogeny, where this distribution consisted of the posterior tree sets from Bayesian phylogenetic tree estimation programs. The models were analysed using simulated data sets, and applied to a real data set on plant traits, from rainforest plant species in Northern Australia. Analyses were performed using the free and open source software OpenBUGS and JAGS. Conclusions Incorporating phylogenetic uncertainty through an empirical prior distribution of trees leads to more precise estimation of regression model parameters than using a single consensus tree and enables a more realistic estimation of confidence intervals. In addition, models incorporating measurement errors and/or individual variation, in one or both variables, are easily formulated in the Bayesian framework. We show that BUGS is a useful, flexible general purpose tool for

  2. Fuzzy techniques for subjective workload-score modeling under uncertainties.

    Science.gov (United States)

    Kumar, Mohit; Arndt, Dagmar; Kreuzfeld, Steffi; Thurow, Kerstin; Stoll, Norbert; Stoll, Regina

    2008-12-01

    This paper deals with the development of a computer model to estimate the subjective workload score of individuals by evaluating their heart-rate (HR) signals. The identification of a model to estimate the subjective workload score of individuals under different workload situations is too ambitious a task because different individuals (due to different body conditions, emotional states, age, gender, etc.) show different physiological responses (assessed by evaluating the HR signal) under different workload situations. This is equivalent to saying that the mathematical mappings between physiological parameters and the workload score are uncertain. Our approach to dealing with the uncertainties in a workload-modeling problem consists of the following steps: 1) the uncertainties arising due to individual variations in identifying a common model valid for all individuals are filtered out using a fuzzy filter; 2) stochastic modeling of the uncertainties (provided by the fuzzy filter) uses finite-mixture models and utilizes this information regarding uncertainties for identifying the structure and initial parameters of a workload model; and 3) finally, the workload model parameters for an individual are identified in an online scenario using machine learning algorithms. The contribution of this paper is to propose, with a mathematical analysis, a fuzzy-based modeling technique that first filters out the uncertainties from the modeling problem, analyzes the uncertainties statistically using finite-mixture modeling, and, finally, utilizes the information about uncertainties for adapting the workload model to an individual's physiological conditions. The approach of this paper, demonstrated with the real-world medical data of 11 subjects, provides a fuzzy-based tool useful for modeling in the presence of uncertainties.

  3. Generalized martingale model of the uncertainty evolution of streamflow forecasts

    Science.gov (United States)

    Zhao, Tongtiegang; Zhao, Jianshi; Yang, Dawen; Wang, Hao

    2013-07-01

    Streamflow forecasts are dynamically updated in real-time, thus facilitating a process of forecast uncertainty evolution. Forecast uncertainty generally decreases over time and as more hydrologic information becomes available. The process of forecasting and uncertainty updating can be described by the martingale model of forecast evolution (MMFE), which formulates the total forecast uncertainty of a streamflow in one future period as the sum of forecast improvements in the intermediate periods. This study tests the assumptions, i.e., unbiasedness, Gaussianity, temporal independence, and stationarity, of MMFE using real-world streamflow forecast data. The results show that (1) real-world forecasts can be biased and tend to underestimate the actual streamflow, and (2) real-world forecast uncertainty is non-Gaussian and heavy-tailed. Based on these statistical tests, this study proposes a generalized martingale model GMMFE for the simulation of biased and non-Gaussian forecast uncertainties. The new model combines the normal quantile transform (NQT) with MMFE to formulate the uncertainty evolution of real-world streamflow forecasts. Reservoir operations based on a synthetic forecast by GMMFE illustrate that applications of streamflow forecasting facilitate utility improvements and that special attention should be focused on the statistical distribution of forecast uncertainty.
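
    A minimal sketch of the MMFE mechanics, with illustrative improvement variances: each period adds an independent zero-mean improvement, so the standard deviation of the forecast error shrinks as the lead time decreases. GMMFE would additionally pass such Gaussian increments through a normal quantile transform to obtain biased, heavy-tailed errors.

        import numpy as np

        # Sketch: MMFE - the error of a forecast with k periods to go is
        # the sum of the k independent improvements still to arrive.
        rng = np.random.default_rng(3)
        n_paths, lead = 10_000, 6
        improve_sd = np.array([0.5, 0.4, 0.3, 0.25, 0.2, 0.15])

        improvements = rng.normal(0.0, improve_sd, size=(n_paths, lead))
        truth = 10.0
        for k in range(lead, 0, -1):
            forecast = truth - improvements[:, :k].sum(axis=1)
            print(f"lead {k}: sd = {forecast.std():.3f}"
                  f" (theory {np.sqrt((improve_sd[:k] ** 2).sum()):.3f})")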

  4. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced…

  5. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble…

  6. Sensitivity and uncertainty analysis of the PATHWAY radionuclide transport model

    International Nuclear Information System (INIS)

    Otis, M.D.

    1983-01-01

    Procedures were developed for the uncertainty and sensitivity analysis of a dynamic model of radionuclide transport through human food chains. Uncertainty in model predictions was estimated by propagation of parameter uncertainties using a Monte Carlo simulation technique. Sensitivity of model predictions to individual parameters was investigated using the partial correlation coefficient of each parameter with model output. Random values produced for the uncertainty analysis were used in the correlation analysis for sensitivity. These procedures were applied to the PATHWAY model which predicts concentrations of radionuclides in foods grown in Nevada and Utah and exposed to fallout during the period of atmospheric nuclear weapons testing in Nevada. Concentrations and time-integrated concentrations of iodine-131, cesium-136, and cesium-137 in milk and other foods were investigated. 9 figs., 13 tabs
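
    The two procedures described here, Monte Carlo propagation of parameter uncertainty and sensitivity via partial correlation, can be sketched in a few lines of Python. The two-parameter "food chain" below is purely hypothetical and stands in for the PATHWAY equations:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5000

        # Hypothetical two-parameter transfer model: deposition -> forage -> milk.
        dep_rate   = rng.lognormal(mean=0.0,  sigma=0.5, size=n)   # deposition velocity
        trans_coef = rng.lognormal(mean=-1.0, sigma=0.3, size=n)   # feed-to-milk transfer
        output = dep_rate * trans_coef                             # milk concentration

        # Uncertainty: summarize the Monte Carlo output distribution.
        print("median:", np.median(output),
              "95% interval:", np.percentile(output, [2.5, 97.5]))

        # Sensitivity: partial correlation of each parameter with the output,
        # i.e., correlation after regressing out the other parameter.
        def partial_corr(x, y, z):
            rx = x - np.polyval(np.polyfit(z, x, 1), z)   # residual of x given z
            ry = y - np.polyval(np.polyfit(z, y, 1), z)   # residual of y given z
            return np.corrcoef(rx, ry)[0, 1]

        print("PCC deposition:", partial_corr(dep_rate, output, trans_coef))
        print("PCC transfer  :", partial_corr(trans_coef, output, dep_rate))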

  7. A new process sensitivity index to identify important system processes under process model and parametric uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Heng [Pacific Northwest National Laboratory, Richland Washington USA; Ye, Ming [Department of Scientific Computing, Florida State University, Tallahassee Florida USA; Walker, Anthony P. [Environmental Sciences Division and Climate Change Science Institute, Oak Ridge National Laboratory, Oak Ridge Tennessee USA; Chen, Xingyuan [Pacific Northwest National Laboratory, Richland Washington USA

    2017-04-01

    Hydrological models are always composed of multiple components that represent processes key to intended model applications. When a process can be simulated by multiple conceptual-mathematical models (process models), model uncertainty in representing the process arises. While global sensitivity analysis methods have been widely used for identifying important processes in hydrologic modeling, the existing methods consider only parametric uncertainty and ignore the model uncertainty in process representation. To address this problem, this study develops a new method to probe multimodel process sensitivity by integrating model averaging methods into the framework of variance-based global sensitivity analysis, given that model averaging quantifies both parametric and model uncertainty. A new process sensitivity index is derived as a metric of relative process importance; the index includes variance in model outputs caused by uncertainty in both process models and model parameters. For demonstration, the new index is used to evaluate the processes of recharge and geology in a synthetic study of groundwater reactive transport modeling. The recharge process is simulated by two models that convert precipitation to recharge, and the geology process is simulated by two models with different parameterizations of hydraulic conductivity; each process model has its own random parameters. The new process sensitivity index is mathematically general and can be applied to a wide range of problems in hydrology and beyond.
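
    A simplified version of this idea can be sketched numerically: condition the output on each (recharge model, geology model) pair, then ask how much of the total output variance the choice of model for each process explains. All functional forms and parameter distributions below are invented stand-ins, and only the between-model component of the paper's full index is computed:

        import numpy as np

        rng = np.random.default_rng(1)
        n = 20_000

        def recharge(model, size):                # precipitation -> recharge
            p = rng.normal(1000.0, 100.0, size)   # random parameter of the process
            return 0.20 * p if model == 0 else 0.10 * p + 120.0

        def conductivity(model, size):            # two parameterizations of K
            k = rng.lognormal(0.0, 0.3, size)
            return k if model == 0 else 2.0 * k

        def head(r, k):                           # toy model output
            return r / k

        # Mean output conditional on each model pair, equal prior model weights;
        # the random parameters are averaged out by Monte Carlo.
        cond_mean = np.array([[head(recharge(i, n), conductivity(j, n)).mean()
                               for j in range(2)] for i in range(2)])
        total_var = np.concatenate([head(recharge(i, n), conductivity(j, n))
                                    for i in range(2) for j in range(2)]).var()

        # Share of total output variance explained by each process's model choice
        # (the full index also folds in each process's parametric variance).
        ps_recharge = cond_mean.mean(axis=1).var() / total_var
        ps_geology  = cond_mean.mean(axis=0).var() / total_var
        print(f"PS(recharge) = {ps_recharge:.3f}, PS(geology) = {ps_geology:.3f}")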

  8. Modeling theoretical uncertainties in phenomenological analyses for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)

    2017-04-15

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)
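
    The nuisance approach described here (a theory bias of unknown sign, bounded in a chosen range) can be sketched with SciPy; the measurement, uncertainty and range values below are hypothetical, and only the basic supremum-over-bias p value plus the effect of widening the range is shown:

        import numpy as np
        from scipy.stats import norm

        x_obs, sigma = 1.2, 0.3        # measured value and statistical uncertainty
        r = 0.2                        # assumed half-range of the theory bias

        def p_value(mu, r):
            # The bias delta is fixed but unknown within [-r, r]; quote the most
            # conservative (largest) p value over that range.
            delta = np.clip(x_obs - mu, -r, r)   # bias that best fits the data
            z = abs(x_obs - mu - delta) / sigma
            return 2.0 * norm.sf(z)

        # Widening the bias range weakens the exclusion; the adaptive-p-value idea
        # is to tie this range to the significance level being probed.
        for mu in (1.0, 1.8, 2.4):
            print(f"mu={mu}: p={p_value(mu, r):.4f} (range r), "
                  f"p={p_value(mu, 2 * r):.4f} (range 2r)")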

  9. Error and Uncertainty Analysis for Ecological Modeling and Simulation

    National Research Council Canada - National Science Library

    Gertner, George

    1998-01-01

    The main objectives of this project are a) to develop a general methodology for conducting sensitivity and uncertainty analysis and building error budgets in simulation modeling over space and time; and b...

  10. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pohl, Andrew Phillip [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jordan, Dirk [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
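
    The residual-resampling scheme described here is easy to sketch: fit each model step, store its empirical residuals, then bootstrap those residuals through the chain. The model coefficients and residual spreads below are hypothetical, not the Sandia/NREL models:

        import numpy as np

        rng = np.random.default_rng(7)

        # Empirical residuals of each model step (hypothetical values; in the
        # study these come from comparing each model against measurements).
        resid_poa  = rng.normal(0.0, 20.0, 500)   # plane-of-array irradiance, W/m^2
        resid_tc   = rng.normal(0.0, 1.5,  500)   # cell temperature, deg C
        resid_pdc  = rng.normal(0.0, 5.0,  500)   # DC power, W

        def chain(ghi, n=10_000):
            # Step 1: POA irradiance model plus a resampled empirical residual.
            poa = 1.1 * ghi + rng.choice(resid_poa, n)
            # Step 2: cell temperature model plus its residual.
            tcell = 25.0 + 0.03 * poa + rng.choice(resid_tc, n)
            # Step 3: DC power model plus its residual.
            pdc = 0.2 * poa * (1 - 0.004 * (tcell - 25.0)) + rng.choice(resid_pdc, n)
            return pdc

        samples = chain(ghi=800.0)
        print("mean:", samples.mean(),
              "95% interval:", np.percentile(samples, [2.5, 97.5]))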

  11. Impact of dose-distribution uncertainties on rectal ntcp modeling I: Uncertainty estimates

    International Nuclear Information System (INIS)

    Fenwick, John D.; Nahum, Alan E.

    2001-01-01

    A trial of nonescalated conformal versus conventional radiotherapy treatment of prostate cancer has been carried out at the Royal Marsden NHS Trust (RMH) and Institute of Cancer Research (ICR), demonstrating a significant reduction in the rate of rectal bleeding reported for patients treated using the conformal technique. The relationship between planned rectal dose-distributions and incidences of bleeding has been analyzed, showing that the rate of bleeding falls significantly as the extent of the rectal wall receiving a planned dose-level of more than 57 Gy is reduced. Dose-distributions delivered to the rectal wall over the course of radiotherapy treatment inevitably differ from planned distributions, due to sources of uncertainty such as patient setup error, rectal wall movement and variation in the absolute rectal wall surface area. In this paper estimates of the differences between planned and treated rectal dose-distribution parameters are obtained for the RMH/ICR nonescalated conformal technique, working from a distribution of setup errors observed during the RMH/ICR trial, movement data supplied by Lebesque and colleagues derived from repeat CT scans, and estimates of rectal circumference variations extracted from the literature. Setup errors and wall movement are found to cause only limited systematic differences between mean treated and planned rectal dose-distribution parameter values, but introduce considerable uncertainties into the treated values of some dose-distribution parameters: setup errors lead to 22% and 9% relative uncertainties in the highly dosed fraction of the rectal wall and the wall average dose, respectively, with wall movement leading to 21% and 9% relative uncertainties. Estimates obtained from the literature of the uncertainty in the absolute surface area of the distensible rectal wall are of the order of 13%-18%. In a subsequent paper the impact of these uncertainties on analyses of the relationship between incidences of bleeding

  12. Data-driven Modelling for decision making under uncertainty

    Science.gov (United States)

    Angria S, Layla; Dwi Sari, Yunita; Zarlis, Muhammad; Tulus

    2018-01-01

    Uncertainty in decision making has become a widely discussed issue in operations research. Many models have been presented, one of which is data-driven modelling (DDM). The purpose of this paper is to extract and recognize patterns in data and to find the best model for a decision-making problem under uncertainty, using a data-driven modelling approach with linear programming, linear and nonlinear differential equations, and a Bayesian approach. Model selection criteria are tested to determine the smallest error, and the model with the smallest error is taken as the best model to use.

  13. Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.

    2012-12-01

    Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with a particular emphasis on how crop phenology and agricultural management practices influence the turbulent fluxes exchanged with the atmosphere and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Reunion and Brazil. First, the main sources of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) are determined through a screening of the main parameters of the model on a multi-site basis, leading to the selection of a subset of the most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the selected parameters at a regional scale, using a Monte Carlo sampling method associated with the calculation of Partial Rank Correlation Coefficients. We first quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil, and then quantify the overall uncertainty in the simulation outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root

  14. An Iterative Uncertainty Assessment Technique for Environmental Modeling

    International Nuclear Information System (INIS)

    Engel, David W.; Liebetrau, Albert M.; Jarman, Kenneth D.; Ferryman, Thomas A.; Scheibe, Timothy D.; Didier, Brett T.

    2004-01-01

    The reliability of and confidence in predictions from model simulations are crucial--these predictions can significantly affect risk assessment decisions. For example, the fate of contaminants at the U.S. Department of Energy's Hanford Site has critical impacts on long-term waste management strategies. In the uncertainty estimation efforts for the Hanford Site-Wide Groundwater Modeling program, computational issues severely constrain both the number of uncertain parameters that can be considered and the degree of realism that can be included in the models. Substantial improvements in the overall efficiency of uncertainty analysis are needed to fully explore and quantify significant sources of uncertainty. We have combined state-of-the-art statistical and mathematical techniques in a unique iterative, limited sampling approach to efficiently quantify both local and global prediction uncertainties resulting from model input uncertainties. The approach is designed for application to widely diverse problems across multiple scientific domains. Results are presented for both an analytical model where the response surface is "known" and a simplified contaminant fate transport and groundwater flow model. The results show that our iterative method for approximating a response surface (for subsequent calculation of uncertainty estimates) of specified precision requires less computing time than traditional approaches based upon noniterative sampling methods.
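
    The general idea of iterative, limited-sampling response-surface approximation can be sketched as follows (a generic illustration under invented assumptions, not the PNNL algorithm): fit a cheap surrogate, use it to estimate output uncertainty, and add expensive runs only where the surrogate fits worst, until the estimate stabilizes.

        import numpy as np

        rng = np.random.default_rng(3)

        def simulator(x):                 # stand-in for an expensive flow/transport run
            return np.sin(3.0 * x) + 0.5 * x**2

        x_train = list(np.linspace(-1.0, 1.0, 5))
        prev_std = np.inf
        for iteration in range(30):
            coef = np.polyfit(x_train, simulator(np.array(x_train)), deg=4)
            xs = rng.uniform(-1.0, 1.0, 100_000)      # uncertain model input
            out_std = np.polyval(coef, xs).std()      # surface-based uncertainty
            if abs(out_std - prev_std) < 1e-3 * out_std:
                break                                 # uncertainty estimate converged
            prev_std = out_std
            # Add the next run where the surface disagrees most with a spot check
            # (a real implementation would use cross-validation, not extra runs).
            cand = np.linspace(-1.0, 1.0, 41)
            err = np.abs(np.polyval(coef, cand) - simulator(cand))
            x_train.append(cand[err.argmax()])

        print(f"converged with {len(x_train)} runs; output std ~ {out_std:.4f}")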

  15. Model-specification uncertainty in future forest pest outbreak.

    Science.gov (United States)

    Boulanger, Yan; Gray, David R; Cooke, Barry J; De Grandpré, Louis

    2016-04-01

    Climate change will modify forest pest outbreak characteristics, although there are disagreements regarding the specifics of these changes. A large part of this variability may be attributed to model specifications. As a case study, we developed a consensus model predicting spruce budworm (SBW, Choristoneura fumiferana [Clem.]) outbreak duration using two different predictor data sets and six different correlative methods. The model was used to project outbreak duration and the uncertainty associated with using different data sets and correlative methods (=model-specification uncertainty) for 2011-2040, 2041-2070 and 2071-2100, according to three forcing scenarios (RCP 2.6, RCP 4.5 and RCP 8.5). The consensus model showed very high explanatory power and low bias. The model projected a more important northward shift and decrease in outbreak duration under the RCP 8.5 scenario. However, variation in single-model projections increases with time, making future projections highly uncertain. Notably, the magnitude of the shifts in northward expansion, overall outbreak duration and the patterns of outbreaks duration at the southern edge were highly variable according to the predictor data set and correlative method used. We also demonstrated that variation in forcing scenarios contributed only slightly to the uncertainty of model projections compared with the two sources of model-specification uncertainty. Our approach helped to quantify model-specification uncertainty in future forest pest outbreak characteristics. It may contribute to sounder decision-making by acknowledging the limits of the projections and help to identify areas where model-specification uncertainty is high. As such, we further stress that this uncertainty should be strongly considered when making forest management plans, notably by adopting adaptive management strategies so as to reduce future risks.

  16. Parameter uncertainty analysis of a biokinetic model of caesium

    International Nuclear Information System (INIS)

    Li, W.B.; Oeh, U.; Klein, W.; Blanchardon, E.; Puncher, M.; Leggett, R.W.; Breustedt, B.; Nosske, D.; Lopez, M.A.

    2015-01-01

    Parameter uncertainties for the biokinetic model of caesium (Cs) developed by Leggett et al. were inventoried and evaluated. Parameter uncertainty analysis methods were used to assess the uncertainty of model predictions given assumed parameter uncertainties and distributions. Furthermore, the importance of individual model parameters was assessed by means of sensitivity analysis. The calculated uncertainties of model predictions were compared with human data of Cs measured in blood and in the whole body. It was found that propagating the derived uncertainties in model parameter values reproduced the range of bioassay data observed in human subjects at different times after intake. The maximum ranges, expressed as uncertainty factors (UFs) (defined as the square root of the ratio between the 97.5th and 2.5th percentiles), of blood clearance, whole-body retention and urinary excretion of Cs predicted at early times after intake were, respectively: 1.5, 1.0 and 2.5 on the first day; 1.8, 1.1 and 2.4 at Day 10; and 1.8, 2.0 and 1.8 at Day 100. For late times (1000 d) after intake, the UFs increased to 43, 24 and 31, respectively. The model parameters of transfer rates between kidneys and blood, muscle and blood, and the rate of transfer from kidneys to urinary bladder content are the most influential for the blood clearance and whole-body retention of Cs. For the urinary excretion, the transfer rates from urinary bladder content to urine and from kidneys to urinary bladder content have the greatest impact. The implication and effect on the estimated equivalent and effective doses of the larger uncertainty of 43 in whole-body retention at later times (say, after Day 500) will be explored in subsequent work in the framework of EURADOS. (authors)
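
    The uncertainty factor defined in this abstract is straightforward to compute from Monte Carlo output; the lognormal spreads below are invented stand-ins for the biokinetic model runs:

        import numpy as np

        rng = np.random.default_rng(5)

        # Monte Carlo predictions of whole-body retention at two times after intake.
        retention_day10   = rng.lognormal(mean=np.log(0.9),  sigma=0.05, size=10_000)
        retention_day1000 = rng.lognormal(mean=np.log(0.05), sigma=1.6,  size=10_000)

        def uncertainty_factor(samples):
            # UF as defined above: sqrt of the 97.5th / 2.5th percentile ratio.
            lo, hi = np.percentile(samples, [2.5, 97.5])
            return np.sqrt(hi / lo)

        print("UF day 10:  ", uncertainty_factor(retention_day10))
        print("UF day 1000:", uncertainty_factor(retention_day1000))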

  17. Metrics for evaluating performance and uncertainty of Bayesian network models

    Science.gov (United States)

    Bruce G. Marcot

    2012-01-01

    This paper presents a selected set of existing and new metrics for gauging Bayesian network model performance and uncertainty. Selected existing and new metrics are discussed for conducting model sensitivity analysis (variance reduction, entropy reduction, case file simulation); evaluating scenarios (influence analysis); depicting model complexity (numbers of model...

  18. Assessment of parametric uncertainty for groundwater reactive transport modeling

    Science.gov (United States)

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and the Bayesian method with a Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions from Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with a Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood

  19. Graphical models and their (un)certainties

    NARCIS (Netherlands)

    Leisink, M.A.R.

    2004-01-01

    A graphical model is a powerful tool for dealing with complex probability models. Although in principle any set of probabilistic relationships can be modelled, the calculation of the actual numbers can be very hard. Every graphical model suffers from a phenomenon known as exponential scaling. To

  20. Meteorological uncertainty of atmospheric dispersion model results (MUD)

    International Nuclear Information System (INIS)

    Havskov Soerensen, J.; Amstrup, B.; Feddersen, H.

    2013-08-01

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble of meteorological forecasts is produced from which uncertainties in the various meteorological parameters are estimated, e.g. probabilities for rain. Corresponding ensembles of atmospheric dispersion can now be computed from which uncertainties of predicted radionuclide concentration and deposition patterns can be derived. (Author)

  1. Partitioning uncertainty in streamflow projections under nonstationary model conditions

    Science.gov (United States)

    Chawla, Ila; Mujumdar, P. P.

    2018-02-01

    Assessing the impacts of Land Use (LU) and climate change on future streamflow projections is necessary for efficient management of water resources. However, model projections are burdened with significant uncertainty arising from various sources. Most of the previous studies have considered climate models and scenarios as major sources of uncertainty, but uncertainties introduced by land use change and hydrologic model assumptions are rarely investigated. In this paper an attempt is made to segregate the contributions from (i) general circulation models (GCMs), (ii) emission scenarios, (iii) land use scenarios, (iv) the stationarity assumption of the hydrologic model, and (v) internal variability of the processes, to the overall uncertainty in streamflow projections using an analysis of variance (ANOVA) approach. Most impact assessment studies are carried out assuming hydrologic model parameters that do not change in the future. It is, however, necessary to address the nonstationarity in model parameters with changing land use and climate. In this paper, a regression-based methodology is presented to obtain the hydrologic model parameters under changing land use and climate scenarios in the future. The Upper Ganga Basin (UGB) in India is used as a case study to demonstrate the methodology. The semi-distributed Variable Infiltration Capacity (VIC) model is set up over the basin under nonstationary conditions. Results indicate that model parameters vary with time, thereby invalidating the often-used assumption of model stationarity. The streamflow in UGB under the nonstationary model condition is found to reduce in future. The flows are also found to be sensitive to changes in land use. Segregation results suggest that the model stationarity assumption and the GCMs, along with their interactions with emission scenarios, act as the dominant sources of uncertainty. This paper provides a generalized framework for hydrologists to examine the stationarity assumption of models before considering them
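
    The ANOVA segregation described here can be sketched for two factors; the projection numbers below are invented, and the paper's five factors reduce to GCM and scenario purely for illustration:

        import numpy as np

        rng = np.random.default_rng(11)

        # Hypothetical streamflow projections indexed by GCM and emission scenario.
        n_gcm, n_scen = 4, 3
        gcm_effect  = rng.normal(0, 30, n_gcm)
        scen_effect = rng.normal(0, 10, n_scen)
        flow = (500 + gcm_effect[:, None] + scen_effect[None, :]
                + rng.normal(0, 5, (n_gcm, n_scen)))    # interaction/residual

        grand = flow.mean()
        ss_gcm   = n_scen * ((flow.mean(axis=1) - grand) ** 2).sum()
        ss_scen  = n_gcm  * ((flow.mean(axis=0) - grand) ** 2).sum()
        ss_total = ((flow - grand) ** 2).sum()
        ss_inter = ss_total - ss_gcm - ss_scen           # interaction + residual

        for name, ss in [("GCM", ss_gcm), ("scenario", ss_scen),
                         ("interaction", ss_inter)]:
            print(f"{name:12s} {100 * ss / ss_total:5.1f}% of variance")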

  2. Uncertainty Assessment in Long Term Urban Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    ... on the rainfall inputs. In order to handle the uncertainties, three different stochastic approaches are investigated, applying a case catchment in the town of Frejlev: (1) a reliability approach in which a parameterization of the rainfall input is conducted in order to generate synthetic rainfall events and find ... return periods, and even within the return periods specified in the design criteria. If urban drainage models are based on standard parameters and hence not calibrated, the uncertainties are even larger. The greatest uncertainties are shown to be the rainfall input and the assessment of the contributing...

  3. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    Full Text Available Model evaluation is often performed at only a few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently from ground truth measurements, these analyses are suitable to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for a combination of all factors, which allows for quantification of the variability of model sensitivities and uncertainties within the whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on input factors such as topography or soil type. The analysis shows that model evaluation performed at single locations may not be representative for the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to the shorter duration of the snow cover. The sensitivity to the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time has an impact on modeled mean annual ground temperature (MAGT that cannot be neglected (more than 1 °C for several

  4. Representing Uncertainty on Model Analysis Plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-01-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model.…

  5. UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS

    Directory of Open Access Journals (Sweden)

    Fabiana Lucena Oliveira

    2014-05-01

    Full Text Available This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching transportation modes to the chains they best serve. From a detailed analysis of the uncertainty matrix, transportation modes best suited to the management of each chain are suggested, so that transport supports the gains proposed by the original model, particularly when supply chains are distant from their suppliers of raw materials and/or supplies. Agile supply chains, one outcome of the Uncertainty Supply Chain Model, are analyzed in detail, with special attention to the Manaus Industrial Center. The research was carried out at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which comprises different supply chains and strategies sharing the same transport, handling, storage and clearance infrastructure, and which uses inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the uncertainty supply chain model, agile supply chains, the Manaus Industrial Center (MIC) and Brazilian legislation as a business case, presenting the concepts and features of each. The main goal is to present and discuss how transport can support the Uncertainty Supply Chain Model in order to complete the management model. The results confirm the hypothesis that integrated logistics processes can guarantee the attractiveness of industrial agglomerations, and they open a discussion of logistics management when suppliers are far from the manufacturing center.

  6. Bayesian tsunami fragility modeling considering input data uncertainty

    OpenAIRE

    De Risi, Raffaele; Goda, Katsu; Mori, Nobuhito; Yasuda, Tomohiro

    2017-01-01

    Empirical tsunami fragility curves are developed based on a Bayesian framework by accounting for uncertainty of input tsunami hazard data in a systematic and comprehensive manner. Three fragility modeling approaches, i.e. lognormal method, binomial logistic method, and multinomial logistic method, are considered, and are applied to extensive tsunami damage data for the 2011 Tohoku earthquake. A unique aspect of this study is that uncertainty of tsunami inundation data (i.e. input hazard data ...

  7. Evaluation of uncertainties in selected environmental dispersion models

    International Nuclear Information System (INIS)

    Little, C.A.; Miller, C.W.

    1979-01-01

    Compliance with standards of radiation dose to the general public has necessitated the use of dispersion models to predict radionuclide concentrations in the environment due to releases from nuclear facilities. Because these models are only approximations of reality and because of inherent variations in the input parameters used in these models, their predictions are subject to uncertainty. Quantification of this uncertainty is necessary to assess the adequacy of these models for use in determining compliance with protection standards. This paper characterizes the capabilities of several dispersion models to predict accurately pollutant concentrations in environmental media. Three types of models are discussed: aquatic or surface water transport models, atmospheric transport models, and terrestrial and aquatic food chain models. Using data published primarily by model users, model predictions are compared to observations

  8. Alchemy and uncertainty: What good are models?

    Science.gov (United States)

    F.L. Bunnell

    1989-01-01

    Wildlife-habitat models are increasing in abundance, diversity, and use, but symptoms of failure are evident in their application, including misuse, disuse, failure to test, and litigation. Reasons for failure often relate to the different purposes managers and researchers have for using the models to predict and to aid understanding. This paper examines these two...

  9. Uncertainty and Complexity in Mathematical Modeling

    Science.gov (United States)

    Cannon, Susan O.; Sanders, Mark

    2017-01-01

    Modeling is an effective tool to help students access mathematical concepts. Finding a math teacher who has not drawn a fraction bar or pie chart on the board would be difficult, as would finding students who have not been asked to draw models and represent numbers in different ways. In this article, the authors will discuss: (1) the properties of…

  10. Model Uncertainty and Exchange Rate Forecasting

    NARCIS (Netherlands)

    Kouwenberg, R.; Markiewicz, A.; Verhoeks, R.; Zwinkels, R.C.J.

    2017-01-01

    Exchange rate models with uncertain and incomplete information predict that investors focus on a small set of fundamentals that changes frequently over time. We design a model selection rule that captures the current set of fundamentals that best predicts the exchange rate. Out-of-sample tests show

  11. "Wrong, but useful": negotiating uncertainty in infectious disease modelling.

    Directory of Open Access Journals (Sweden)

    Robert M Christley

    Full Text Available For infectious disease dynamical models to inform policy for containment of infectious diseases the models must be able to predict; however, it is well recognised that such prediction will never be perfect. Nevertheless, the consensus is that although models are uncertain, some may yet inform effective action. This assumes that the quality of a model can be ascertained in order to evaluate sufficiently model uncertainties, and to decide whether or not, or in what ways or under what conditions, the model should be 'used'. We examined uncertainty in modelling, utilising a range of data: interviews with scientists, policy-makers and advisors, and analysis of policy documents, scientific publications and reports of major inquiries into key livestock epidemics. We show that the discourse of uncertainty in infectious disease models is multi-layered, flexible, contingent, embedded in context and plays a critical role in negotiating model credibility. We argue that usability and stability of a model is an outcome of the negotiation that occurs within the networks and discourses surrounding it. This negotiation employs a range of discursive devices that renders uncertainty in infectious disease modelling a plastic quality that is amenable to 'interpretive flexibility'. The utility of models in the face of uncertainty is a function of this flexibility, the negotiation this allows, and the contexts in which model outputs are framed and interpreted in the decision making process. We contend that rather than being based predominantly on beliefs about quality, the usefulness and authority of a model may at times be primarily based on its functional status within the broad social and political environment in which it acts.

  12. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  13. Enhancing uncertainty tolerance in the modelling creep of ligaments

    International Nuclear Information System (INIS)

    Taha, M M Reda; Lucero, J

    2006-01-01

    The difficulty in performing biomechanical tests and the scarcity of biomechanical experimental databases necessitate extending the current knowledge base to allow efficient modelling using limited data sets. This study suggests a framework to reduce uncertainties in biomechanical systems using limited data sets. The study also shows how sparse data and epistemic input can be exploited using fuzzy logic to represent biomechanical relations. An example application to model collagen fibre recruitment in the medial collateral ligaments during time-dependent deformation under cyclic loading (creep) is presented. The study suggests a quality metric that can be employed to observe and enhance uncertainty tolerance in the modelling process

  14. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data, is presented. This automatic procedure is valid for both "base case" model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  15. Eigenspace perturbations for structural uncertainty estimation of turbulence closure models

    Science.gov (United States)

    Jofre, Lluis; Mishra, Aashwin; Iaccarino, Gianluca

    2017-11-01

    With the present state of computational resources, a purely numerical resolution of turbulent flows encountered in engineering applications is not viable. Consequently, investigations into turbulence rely on various degrees of modeling. Archetypal amongst these variable-resolution approaches would be RANS models with two-equation closures, and subgrid-scale models in LES. However, owing to the simplifications introduced during model formulation, the fidelity of all such models is limited, and therefore the explicit quantification of the predictive uncertainty is essential. In such a scenario, the ideal uncertainty estimation procedure must be agnostic to modeling resolution, methodology, and the nature or level of the model filter. The procedure should be able to give reliable prediction intervals for different Quantities of Interest, over varied flows and flow conditions, and at diametric levels of modeling resolution. In this talk, we present and substantiate the Eigenspace perturbation framework as an uncertainty estimation paradigm that meets these criteria. Commencing from a broad overview, we outline the details of this framework at different modeling resolutions. Then, using benchmark flows along with engineering problems, the efficacy of this procedure is established. This research was partially supported by NNSA under the Predictive Science Academic Alliance Program (PSAAP) II, and by DARPA under the Enabling Quantification of Uncertainty in Physical Systems (EQUiPS) project (technical monitor: Dr Fariba Fahroo).
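
    The core recipe of Eigenspace perturbation can be sketched for a single Reynolds-stress tensor: decompose the anisotropy tensor, shift its eigenvalues toward the limiting states of turbulence, and reconstruct. The tensor values, perturbation magnitude and corner ordering below are illustrative assumptions, not the authors' implementation:

        import numpy as np

        k = 1.0                                   # turbulent kinetic energy
        R = np.array([[0.8, 0.1, 0.0],            # modeled Reynolds stress tensor
                      [0.1, 0.7, 0.0],
                      [0.0, 0.0, 0.5]])

        a = R / (2 * k) - np.eye(3) / 3           # anisotropy tensor (trace-free)
        eigval, eigvec = np.linalg.eigh(a)        # eigenvalues in ascending order

        # Limiting states (corners of the barycentric triangle) as ordered
        # eigenvalue triples: 1-, 2- and 3-component turbulence.
        corners = {"1C": np.array([-1/3, -1/3, 2/3]),
                   "2C": np.array([-1/3, 1/6, 1/6]),
                   "3C": np.array([0.0, 0.0, 0.0])}

        delta = 0.3                               # perturbation magnitude in [0, 1]
        for name, corner in corners.items():
            lam = (1 - delta) * eigval + delta * corner   # shift toward the corner
            a_pert = eigvec @ np.diag(lam) @ eigvec.T     # reconstruct anisotropy
            R_pert = 2 * k * (a_pert + np.eye(3) / 3)     # perturbed Reynolds stress
            print(name, np.round(np.diag(R_pert), 3))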

  16. Improving uncertainty estimation in urban hydrological modeling by statistically describing bias

    Directory of Open Access Journals (Sweden)

    D. Del Giudice

    2013-10-01

    Full Text Available Hydrodynamic models are useful tools for urban water management. Unfortunately, it is still challenging to obtain accurate results and plausible uncertainty estimates when using these models. In particular, with the currently applied statistical techniques, flow predictions are usually overconfident and biased. In this study, we present a flexible and relatively efficient methodology (i) to obtain more reliable hydrological simulations in terms of coverage of validation data by the uncertainty bands and (ii) to separate prediction uncertainty into its components. Our approach acknowledges that urban drainage predictions are biased. This is mostly due to input errors and structural deficits of the model. We address this issue by describing model bias in a Bayesian framework. The bias becomes an autoregressive term additional to white measurement noise, the only error type accounted for in traditional uncertainty analysis. To allow for bigger discrepancies during wet weather, we make the variance of the bias dependent on the input (rainfall) and/or output (runoff) of the system. Specifically, we present a structured approach to select, among five variants, the optimal bias description for a given urban or natural case study. We tested the methodology in a small monitored stormwater system described with a parsimonious model. Our results clearly show that flow simulations are much more reliable when bias is accounted for than when it is neglected. Furthermore, our probabilistic predictions can discriminate between three uncertainty contributions: parametric uncertainty, bias, and measurement errors. In our case study, the best performing bias description is the output-dependent bias using a log-sinh transformation of data and model results. The limitations of the framework presented are some ambiguity due to the subjective choice of priors for bias parameters and its inability to address the causes of model discrepancies. Further research should focus on
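
    The error model described here (model output plus an autoregressive bias with input-dependent variance plus white measurement noise) is easy to simulate; the rainfall-runoff relationship and all coefficients below are invented for illustration, not the study's stormwater system:

        import numpy as np

        rng = np.random.default_rng(13)
        T = 200

        rain = rng.gamma(0.3, 4.0, T)                  # mostly dry, occasional storms
        flow_model = 5.0 + 1.7 * rain                  # structurally deficient model

        phi = 0.8                                      # bias autocorrelation
        sigma_b = 0.2 + 0.1 * rain                     # input-dependent bias st. dev.
        bias = np.zeros(T)
        for t in range(1, T):
            bias[t] = phi * bias[t - 1] + rng.normal(
                0.0, sigma_b[t] * np.sqrt(1 - phi**2))
        noise = rng.normal(0.0, 0.3, T)                # white measurement noise
        obs = flow_model + bias + noise

        # With the components generated separately, their contributions to the
        # total predictive uncertainty can also be reported separately.
        print("bias variance:", bias.var(), " noise variance:", noise.var())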

  17. Uncertainty Analysis of Resistance Tests in Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University

    Directory of Open Access Journals (Sweden)

    Cihad DELEN

    2015-12-01

    Full Text Available In this study, systematic resistance tests performed in the Ata Nutku Ship Model Testing Laboratory of Istanbul Technical University (ITU) are analyzed in order to determine their uncertainties. Experiments conducted within the framework of mathematical and physical rules for the solution of engineering problems involve uncertainty in their measurements and calculations. To judge the reliability of the obtained values, the existing uncertainties should be expressed quantitatively; results without a quantified uncertainty do not carry universal value. Resistance, in turn, is one of the most important parameters to be considered in the ship design process, yet it cannot be determined precisely and reliably during the design phase because of the uncertainty sources involved in its determination, which may make it harder to meet the required specifications in later design steps. The uncertainty arising from resistance tests has been estimated and compared for a displacement-type ship and for high-speed marine vehicles according to the ITTC 2002 and ITTC 2014 uncertainty analysis procedures, and the advantages and disadvantages of both ITTC methods are discussed.
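
    The abstract contrasts two ITTC uncertainty-analysis procedures; the snippet below shows only the generic GUM-style root-sum-square combination underlying such analyses, with made-up component values, and is not a reproduction of the actual ITTC worksheets:

        import numpy as np

        components = {                 # standard uncertainties, % of resistance
            "speed calibration":   0.20,
            "model geometry":      0.15,
            "dynamometer":         0.25,
            "water temperature":   0.10,
            "repeat-test scatter": 0.30,   # Type A, from repeated runs
        }

        u_combined = np.sqrt(sum(u**2 for u in components.values()))
        U_expanded = 2.0 * u_combined      # coverage factor k = 2 (about 95 %)
        print(f"combined standard uncertainty: {u_combined:.2f} % of R")
        print(f"expanded uncertainty (k=2):    {U_expanded:.2f} % of R")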

  18. Estimation and uncertainty of reversible Markov models.

    Science.gov (United States)

    Trendelkamp-Schroer, Benjamin; Wu, Hao; Paul, Fabian; Noé, Frank

    2015-11-07

    Reversibility is a key concept in Markov models and master-equation models of molecular kinetics. The analysis and interpretation of the transition matrix encoding the kinetic properties of the model rely heavily on the reversibility property. The estimation of a reversible transition matrix from simulation data is, therefore, crucial to the successful application of the previously developed theory. In this work, we discuss methods for the maximum likelihood estimation of transition matrices from finite simulation data and present a new algorithm for the estimation when reversibility with respect to a given stationary vector is desired. We also develop new methods for the Bayesian posterior inference of reversible transition matrices, with and without a given stationary vector, taking into account the need for a suitable prior distribution that preserves the metastable features of the observed process during posterior inference. All algorithms here are implemented in the PyEMMA software--http://pyemma.org--as of version 2.0.
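
    For reversible maximum likelihood estimation without a fixed stationary vector, a classical fixed-point iteration exists; the sketch below (with an invented count matrix) illustrates that iteration only, while PyEMMA provides production implementations:

        import numpy as np

        C = np.array([[90, 10,  0],
                      [ 8, 80, 12],
                      [ 0, 15, 85]], dtype=float)    # observed transition counts

        X = C + C.T                                  # initial symmetric guess
        c_i = C.sum(axis=1)                          # row counts
        for _ in range(1000):
            x_i = X.sum(axis=1)
            # Update enforcing detailed balance:
            # x_ij = (c_ij + c_ji) / (c_i / x_i + c_j / x_j)
            denom = (c_i / x_i)[:, None] + (c_i / x_i)[None, :]
            X_new = (C + C.T) / denom
            if np.abs(X_new - X).max() < 1e-12:
                X = X_new
                break
            X = X_new

        P = X / X.sum(axis=1, keepdims=True)         # reversible transition matrix
        pi = X.sum(axis=1) / X.sum()                 # its stationary distribution
        assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)  # detailed balance
        print(np.round(P, 4))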

  19. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  20. River meander modeling and confronting uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Posner, Ari J. (University of Arizona Tucson, AZ)

    2011-05-01

    This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, along with theories of Coriolis forces and random walks proposed to explain the meandering phenomenon, found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms. The minimum energy principle might explain how these forces combine to limit the sinuosity to depth and width ratios that are common throughout various media. The study then compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). Ikeda et al.'s linear bank erosion model was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms. Then, the modeling results were analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotations, several measures are formulated in order to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolutions. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model.

  1. Uncertainty propagation through dynamic models of assemblies of mechanical structures

    International Nuclear Information System (INIS)

    Daouk, Sami

    2016-01-01

    When studying the behaviour of mechanical systems, mathematical models and structural parameters are usually considered deterministic. Experience shows, however, that these elements are uncertain in most cases, due to natural variability or lack of knowledge. Therefore, quantifying the quality and reliability of the numerical model of an industrial assembly remains a major question in low-frequency dynamics. The purpose of this thesis is to improve the vibratory design of bolted assemblies by setting up a dynamic connector model that accounts for different types and sources of uncertainty in stiffness parameters, in a way that is simple, efficient and exploitable in an industrial context. This work has been carried out in the framework of the SICODYN project, led by EDF R and D, which aims to characterise and quantify, numerically and experimentally, the uncertainties in the dynamic behaviour of bolted industrial assemblies. Comparative studies of several numerical methods of uncertainty propagation demonstrate the advantage of using the Lack-Of-Knowledge theory. An experimental characterisation of uncertainties in bolted structures is performed on a dynamic test rig and on an industrial assembly. The propagation of many small and large uncertainties through different dynamic models of mechanical assemblies leads to an assessment of the efficiency of the Lack-Of-Knowledge theory and its applicability in an industrial environment. (author)

  2. Modeling multibody systems with uncertainties. Part II: Numerical applications

    Energy Technology Data Exchange (ETDEWEB)

    Sandu, Corina, E-mail: csandu@vt.edu; Sandu, Adrian; Ahmadian, Mehdi [Virginia Polytechnic Institute and State University, Mechanical Engineering Department (United States)

    2006-04-15

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte-Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.

  3. Modeling multibody systems with uncertainties. Part II: Numerical applications

    International Nuclear Information System (INIS)

    Sandu, Corina; Sandu, Adrian; Ahmadian, Mehdi

    2006-01-01

    This study applies generalized polynomial chaos theory to model complex nonlinear multibody dynamic systems operating in the presence of parametric and external uncertainty. Theoretical and computational aspects of this methodology are discussed in the companion paper 'Modeling Multibody Dynamic Systems With Uncertainties. Part I: Theoretical and Computational Aspects'. In this paper we illustrate the methodology on selected test cases. The combined effects of parametric and forcing uncertainties are studied for a quarter car model. The uncertainty distributions in the system response in both time and frequency domains are validated against Monte-Carlo simulations. Results indicate that polynomial chaos is more efficient than Monte Carlo and more accurate than statistical linearization. The results of the direct collocation approach are similar to the ones obtained with the Galerkin approach. A stochastic terrain model is constructed using a truncated Karhunen-Loeve expansion. The application of polynomial chaos to differential-algebraic systems is illustrated using the constrained pendulum problem. Limitations of the polynomial chaos approach are studied on two different test problems, one with multiple attractor points, and the second with a chaotic evolution and a nonlinear attractor set. The overall conclusion is that, despite its limitations, generalized polynomial chaos is a powerful approach for the simulation of multibody dynamic systems with uncertainties.
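
    Both abstracts above describe non-intrusive polynomial chaos. The sketch below applies the same idea to a scalar toy response with one Gaussian uncertain parameter (the functional form and all numbers are invented for illustration, not the quarter-car system) and compares the chaos estimate of the mean and variance with plain Monte Carlo:

        import math
        import numpy as np
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def response(c):                          # e.g., a load as a function of damping
            return 1.0 / (0.5 + c * c)

        mu, sigma, order = 1.0, 0.2, 6
        nodes, weights = hermegauss(order + 1)    # probabilists' Hermite quadrature
        weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0,1) measure

        vals = response(mu + sigma * nodes)       # model runs at the quadrature nodes
        coeffs = []
        for k in range(order + 1):
            basis = hermeval(nodes, [0.0] * k + [1.0])        # He_k at the nodes
            # Projection: c_k = E[f He_k] / E[He_k^2], with E[He_k^2] = k!
            coeffs.append((weights * vals * basis).sum() / math.factorial(k))

        mean = coeffs[0]
        var = sum(c * c * math.factorial(k) for k, c in enumerate(coeffs[1:], start=1))

        mc = response(np.random.default_rng(2).normal(mu, sigma, 200_000))
        print(f"chaos: mean={mean:.5f} var={var:.6f}")
        print(f"MC   : mean={mc.mean():.5f} var={mc.var():.6f}")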

  4. Uncertainties in spatially aggregated predictions from a logistic regression model

    NARCIS (Netherlands)

    Horssen, P.W. van; Pebesma, E.J.; Schot, P.P.

    2002-01-01

    This paper presents a method to assess the uncertainty of an ecological spatial prediction model which is based on logistic regression models, using data from the interpolation of explanatory predictor variables. The spatial predictions are presented as approximate 95% prediction intervals. The

  5. Can agent based models effectively reduce fisheries management implementation uncertainty?

    Science.gov (United States)

    Drexler, M.

    2016-02-01

    Uncertainty is an inherent feature of fisheries management. Implementation uncertainty remains a challenge to quantify, often due to unintended responses of users to management interventions. This problem will continue to plague both single-species and ecosystem-based fisheries management advice unless the mechanisms driving these behaviors are properly understood. Equilibrium models, where each actor in the system is treated as uniform and predictable, are not well suited to forecast the unintended behaviors of individual fishers. Alternatively, agent-based models (ABMs) can simulate the behaviors of each individual actor driven by differing incentives and constraints. This study evaluated the feasibility of using ABMs to capture macro-scale behaviors of the US West Coast Groundfish fleet. Agent behavior was specified at the vessel level. Agents made daily fishing decisions using knowledge of their own cost structure, catch history, and the histories of catch and quota markets. By adding only a relatively small number of incentives, the model was able to reproduce highly realistic macro patterns of expected outcomes in response to management policies (catch restrictions, MPAs, ITQs) while preserving vessel heterogeneity. These simulations indicate that agent-based modeling approaches hold much promise for simulating fisher behaviors and reducing implementation uncertainty. Additional processes affecting behavior, informed by surveys, are continually being added to the fisher behavior model. Further coupling of the fisher behavior model to a spatial ecosystem model will provide a fully integrated social, ecological, and economic model capable of performing management strategy evaluations to properly consider implementation uncertainty in fisheries management.

  6. Model uncertainty and systematic risk in US banking

    NARCIS (Netherlands)

    Baele, L.T.M.; De Bruyckere, Valerie; De Jonghe, O.G.; Vander Vennet, Rudi

    This paper uses Bayesian Model Averaging to examine the driving factors of equity returns of US Bank Holding Companies. BMA has as an advantage over OLS that it accounts for the considerable uncertainty about the correct set (model) of bank risk factors. We find that out of a broad set of 12 risk

  7. Identifying influences on model uncertainty: an application using a forest carbon budget model

    Science.gov (United States)

    James E. Smith; Linda S. Heath

    2001-01-01

    Uncertainty is an important consideration for both developers and users of environmental simulation models. Establishing quantitative estimates of uncertainty for deterministic models can be difficult when the underlying bases for such information are scarce. We demonstrate an application of probabilistic uncertainty analysis that provides for refinements in...

  8. Evaluation of Spatial Uncertainties In Modeling of Cadastral Systems

    Science.gov (United States)

    Fathi, Morteza; Teymurian, Farideh

    2013-04-01

    Cadastre plays an essential role in sustainable development, especially in developing countries like Iran. A well-developed cadastre results in transparency of the estate tax system, transparency of estate data, a reduction of actions before the courts, and effective management of estates, natural resources and the environment. A multipurpose cadastre, through the gathering of other related data, has a vital role in civil, economic and social programs and projects. Iran has been implementing a cadastre for many years, but success in this program depends on correct geometric and descriptive data of estates. Since there are various sources of data with different accuracy and precision in Iran, difficulties and uncertainties exist in the modeling of the geometric part of the cadastre, such as inconsistency between data in deeds and the cadastral map. These conflicts cause trouble in the execution of the cadastre and can result in the loss of national and natural resources and of citizens' rights. At present there is no uniform and effective technical method for resolving such conflicts. This article describes various aspects of such conflicts in the geometric part of the cadastre and suggests a solution through modeling tools of GIS.

  9. Model-based uncertainty in species range prediction

    DEFF Research Database (Denmark)

    Pearson, R. G.; Thuiller, Wilfried; Bastos Araujo, Miguel

    2006-01-01

    Aim Many attempts to predict the potential range of species rely on environmental niche (or 'bioclimate envelope') modelling, yet the effects of using different niche-based methodologies require further investigation. Here we investigate the impact that the choice of model can have on predictions, identify key reasons why model output may differ and discuss the implications that model uncertainty has for policy-guiding applications. Location The Western Cape of South Africa. Methods We applied nine of the most widely used modelling techniques to model potential distributions under current ... algorithm when extrapolating beyond the range of data used to build the model. The effects of these factors should be carefully considered when using this modelling approach to predict species ranges. Main conclusions We highlight an important source of uncertainty in assessments of the impacts of climate ...

  10. Model for predicting mountain wave field uncertainties

    Science.gov (United States)

    Damiens, Florentin; Lott, François; Millet, Christophe; Plougonven, Riwal

    2017-04-01

    Studying the propagation of acoustic waves throughout the troposphere requires knowledge of wind speed and temperature gradients from the ground up to about 10-20 km. Typical planetary boundary layer flows are known to present vertical low-level shears that can interact with mountain waves, thereby triggering small-scale disturbances. Resolving these fluctuations for long-range propagation problems is, however, not feasible because of computer memory/time restrictions, and thus they need to be parameterized. When the disturbances are small enough, these fluctuations can be described by linear equations. Previous work by the co-authors has shown that the critical-layer dynamics that occur near the ground produce large horizontal flows and buoyancy disturbances that result in intense downslope winds and gravity wave breaking. While these phenomena manifest almost systematically for high Richardson numbers and when the boundary layer depth is relatively small compared to the mountain height, the process by which static stability affects downslope winds remains unclear. In the present work, new linear mountain gravity wave solutions are tested against numerical predictions obtained with the Weather Research and Forecasting (WRF) model. For Richardson numbers typically larger than unity, the mesoscale model is used to quantify the effect of neglected nonlinear terms on downslope winds and mountain wave patterns. At these regimes, the large downslope winds transport warm air, a so-called "Foehn" effect that can impact sound propagation properties. The sensitivity of small-scale disturbances to the Richardson number is quantified using two-dimensional spectral analysis. It is shown through a pilot study of subgrid-scale fluctuations of boundary layer flows over realistic mountains that the cross-spectrum of the mountain wave field is made up of the same components found in WRF simulations. The impact of each individual component on acoustic wave propagation is discussed in terms of

  11. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

    The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1, when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are focused on as model outputs. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty while increasing both their economic cost and variability as a trade-off. Finally, the maximum specific autotrophic growth rate (micro(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. eta(g) (anoxic growth rate correction factor) and eta(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
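
    The Monte Carlo plus standardized-regression-coefficients workflow is compact enough to sketch on a stand-in model (not BSM1 itself): sample the uncertain inputs, run the model, then regress the standardized output on the standardized inputs so the coefficients rank each input's share of the output uncertainty. The parameter names echo the abstract; the ranges and the toy model are assumptions.

```python
# Monte Carlo sampling followed by Standardized Regression Coefficients (SRC)
# on a toy effluent-quality model. Ranges and the model are illustrative.
import numpy as np

rng = np.random.default_rng(42)
n = 500
mu_A = rng.uniform(0.3, 0.9, n)        # autotrophic growth rate (assumed range)
eta_g = rng.uniform(0.6, 1.0, n)       # anoxic growth correction (assumed)
eta_h = rng.uniform(0.3, 0.7, n)       # anoxic hydrolysis correction (assumed)

def toy_effluent_index(mu_A, eta_g, eta_h):
    # Stand-in for a BSM1 run; deliberately nonlinear in mu_A.
    return 100.0 / mu_A + 5.0 * eta_g - 2.0 * eta_h + rng.normal(0, 1, n)

y = toy_effluent_index(mu_A, eta_g, eta_h)
X = np.column_stack([mu_A, eta_g, eta_h])

# SRC_i = b_i * std(x_i) / std(y), i.e. OLS on standardized variables
Xc = (X - X.mean(0)) / X.std(0)
yc = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
for name, s in zip(["mu_A", "eta_g", "eta_h"], src):
    print(f"SRC({name}) = {s:+.2f}")
```

    Squared SRC values approximate each input's variance share when the regression fits well, which is why the abstract can attribute most of the effluent variance to the growth-rate parameter.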

  12. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.

  13. Flight Dynamics and Control of Elastic Hypersonic Vehicles Uncertainty Modeling

    Science.gov (United States)

    Chavez, Frank R.; Schmidt, David K.

    1994-01-01

    It has been shown previously that hypersonic air-breathing aircraft exhibit strong aeroelastic/aeropropulsive dynamic interactions. To investigate these, especially from the perspective of the vehicle dynamics and control, analytical expressions for key stability derivatives were derived, and an analysis of the dynamics was performed. In this paper, the important issue of model uncertainty, and the appropriate forms for representing this uncertainty, is addressed. It is shown that the methods suggested in the literature for analyzing the robustness of multivariable feedback systems, which as a prerequisite to their application assume particular forms of model uncertainty, can be difficult to apply on real atmospheric flight vehicles. Also, the extent to which available methods are conservative is demonstrated for this class of vehicle dynamics.

  14. Sensitivity of wildlife habitat models to uncertainties in GIS data

    Science.gov (United States)

    Stoms, David M.; Davis, Frank W.; Cogan, Christopher B.

    1992-01-01

    Decision makers need to know the reliability of output products from GIS analysis. For many GIS applications, it is not possible to compare these products to an independent measure of 'truth'. Sensitivity analysis offers an alternative means of estimating reliability. In this paper, we present a GIS-based statistical procedure for estimating the sensitivity of wildlife habitat models to uncertainties in input data and model assumptions. The approach is demonstrated in an analysis of habitat associations derived from a GIS database for the endangered California condor. Alternative data sets were generated to compare results over a reasonable range of assumptions about several sources of uncertainty. Sensitivity analysis indicated that condor habitat associations are relatively robust, and the results have increased our confidence in our initial findings. Uncertainties and methods described in the paper have general relevance for many GIS applications.

  15. Linear models in the mathematics of uncertainty

    CERN Document Server

    Mordeson, John N; Clark, Terry D; Pham, Alex; Redmond, Michael A

    2013-01-01

    The purpose of this book is to present new mathematical techniques for modeling global issues. These mathematical techniques are used to determine linear equations between a dependent variable and one or more independent variables in cases where standard techniques such as linear regression are not suitable. In this book, we examine cases where the number of data points is small (effects of nuclear warfare), where the experiment is not repeatable (the breakup of the former Soviet Union), and where the data is derived from expert opinion (how conservative is a political party). In all these cases the data is difficult to measure and an assumption of randomness and/or statistical validity is questionable. We apply our methods to real world issues in international relations such as nuclear deterrence, smart power, and cooperative threat reduction. We next apply our methods to issues in comparative politics such as successful democratization, quality of life, economic freedom, political stability, and fail...

  16. Solar model uncertainties, MSW analysis, and future solar neutrino experiments

    International Nuclear Information System (INIS)

    Hata, N.; Langacker, P.

    1994-01-01

    Various theoretical uncertainties in the standard solar model and in the Mikheyev-Smirnov-Wolfenstein (MSW) analysis are discussed. It is shown that two methods give consistent estimates of the solar neutrino flux uncertainties: (a) a simple parametrization of the uncertainties using the core temperature and the nuclear production cross sections; (b) the Monte Carlo method of Bahcall and Ulrich. In the MSW analysis, we emphasize proper treatments of correlations of theoretical uncertainties between flux components and between different detectors, the Earth effect, and multiple solutions in a combined χ² procedure. In particular, the large-angle solution of the combined observation is allowed at 95% C.L. only when the theoretical uncertainties are included. If their correlations were ignored, the region would be overestimated. The MSW solutions for various standard and nonstandard solar models are also shown. The MSW predictions of the global solutions for the future solar neutrino experiments are given, emphasizing the measurement of the energy spectrum and the day-night effect in the Sudbury Neutrino Observatory and Super-Kamiokande to distinguish the two solutions.

  17. Uncertainty assessment in building energy performance with a simplified model

    Directory of Open Access Journals (Sweden)

    Titikpina Fally

    2015-01-01

    Full Text Available To assess a building's energy performance, the consumption predicted or estimated during the design stage is compared to the measured consumption when the building is operational. When evaluating this performance, many buildings show significant differences between the calculated and measured consumption. In order to assess the performance accurately and ensure the thermal efficiency of the building, it is necessary to evaluate the uncertainties involved not only in measurement but also those induced by the propagation of the dynamic and static input data in the model being used. The evaluation of measurement uncertainty is based on both the knowledge about the measurement process and the input quantities which influence the result of measurement. Measurement uncertainty can be evaluated within the framework of conventional statistics presented in the Guide to the Expression of Uncertainty in Measurement (GUM) as well as by Bayesian Statistical Theory (BST). Another choice is the use of numerical methods like Monte Carlo Simulation (MCS). In this paper, we propose to evaluate the uncertainty associated with the use of a simplified model for the estimation of the energy consumption of a given building. A detailed review and discussion of these three approaches (GUM, MCS and BST) is given. To this end, an office building has been monitored and multiple temperature sensors have been mounted on candidate locations to get the required data. The monitored zone is composed of six offices and has an overall surface of 102 m².
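
    The GUM-versus-MCS comparison the paper performs can be miniaturized as follows, for a deliberately simple heat-loss model rather than the building model actually studied; all values and uncertainties are illustrative assumptions.

```python
# GUM first-order propagation vs Monte Carlo simulation (MCS) for a
# deliberately simplified heat-loss model Q = U*A*dT. The model and all
# values/uncertainties are assumptions for illustration, not the paper's.
import numpy as np

U, uU = 1.2, 0.1      # W/(m2 K), value and standard uncertainty (assumed)
A, uA = 102.0, 1.0    # m2, the monitored zone's overall surface
dT, udT = 12.0, 0.5   # K, indoor-outdoor temperature difference (assumed)

# GUM: combine first-order sensitivity terms in quadrature
Q = U * A * dT
uQ_gum = np.sqrt((A * dT * uU)**2 + (U * dT * uA)**2 + (U * A * udT)**2)

# MCS: sample inputs, push them through the model, read off the spread
rng = np.random.default_rng(0)
n = 100_000
Qs = rng.normal(U, uU, n) * rng.normal(A, uA, n) * rng.normal(dT, udT, n)
print(f"GUM: {Q:.0f} +/- {uQ_gum:.0f} W   MCS: {Qs.mean():.0f} +/- {Qs.std():.0f} W")
```

    For this nearly linear model the two agree closely; MCS earns its extra cost when the model is strongly nonlinear or the input distributions are skewed.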

  18. Improving the precision of lake ecosystem metabolism estimates by identifying predictors of model uncertainty

    Science.gov (United States)

    Rose, Kevin C.; Winslow, Luke A.; Read, Jordan S.; Read, Emily K.; Solomon, Christopher T.; Adrian, Rita; Hanson, Paul C.

    2014-01-01

    Diel changes in dissolved oxygen are often used to estimate gross primary production (GPP) and ecosystem respiration (ER) in aquatic ecosystems. Despite the widespread use of this approach to understand ecosystem metabolism, we are only beginning to understand the degree and underlying causes of uncertainty for metabolism model parameter estimates. Here, we present a novel approach to improve the precision and accuracy of ecosystem metabolism estimates by identifying physical metrics that indicate when metabolism estimates are highly uncertain. Using datasets from seventeen instrumented GLEON (Global Lake Ecological Observatory Network) lakes, we discovered that many physical characteristics correlated with uncertainty, including PAR (photosynthetically active radiation, 400-700 nm), daily variance in Schmidt stability, and wind speed. Low PAR was a consistent predictor of high variance in GPP model parameters, but also corresponded with low ER model parameter variance. We identified a threshold (30% of clear-sky PAR) below which GPP parameter variance increased rapidly and was significantly greater in nearly all lakes compared with variance on days with PAR levels above this threshold. The relationship between daily variance in Schmidt stability and GPP model parameter variance depended on trophic status, whereas daily variance in Schmidt stability was consistently positively related to ER model parameter variance. Wind speeds in the range of ~0.8-3 m s⁻¹ were consistent predictors of high variance for both GPP and ER model parameters, with greater uncertainty in eutrophic lakes. Our findings can be used to reduce ecosystem metabolism model parameter uncertainty and identify potential sources of that uncertainty.

  19. Uncertainties in model-based outcome predictions for treatment planning

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Chao, K.S. Clifford; Markman, Jerry

    2001-01-01

    Purpose: Model-based treatment-plan-specific outcome predictions (such as normal tissue complication probability [NTCP] or the relative reduction in salivary function) are typically presented without reference to underlying uncertainties. We provide a method to assess the reliability of treatment-plan-specific dose-volume outcome model predictions. Methods and Materials: A practical method is proposed for evaluating model prediction based on the original input data together with bootstrap-based estimates of parameter uncertainties. The general framework is applicable to continuous variable predictions (e.g., prediction of long-term salivary function) and dichotomous variable predictions (e.g., tumor control probability [TCP] or NTCP). Using bootstrap resampling, a histogram of the likelihood of alternative parameter values is generated. For a given patient and treatment plan we generate a histogram of alternative model results by computing the model predicted outcome for each parameter set in the bootstrap list. Residual uncertainty ('noise') is accounted for by adding a random component to the computed outcome values. The residual noise distribution is estimated from the original fit between model predictions and patient data. Results: The method is demonstrated using a continuous-endpoint model to predict long-term salivary function for head-and-neck cancer patients. Histograms represent the probabilities for the level of posttreatment salivary function based on the input clinical data, the salivary function model, and the three-dimensional dose distribution. For some patients there is significant uncertainty in the prediction of xerostomia, whereas for other patients the predictions are expected to be more reliable. In contrast, TCP and NTCP endpoints are dichotomous, and parameter uncertainties should be folded directly into the estimated probabilities, thereby improving the accuracy of the estimates. Using bootstrap parameter estimates, competing treatment
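
    A sketch of the bootstrap scheme described above: refit the outcome model on resampled patients, keep each refit's parameters, then evaluate all of them for a new plan to obtain a histogram of plausible outcomes. The logistic dose-response form, the crude gradient-ascent fitter, and every number are illustrative stand-ins, not the paper's salivary-function model.

```python
# Bootstrap parameter uncertainty for a dichotomous outcome model:
# resample patients, refit, then turn one plan's prediction into a spread.
import numpy as np

rng = np.random.default_rng(7)
n = 80
dose = rng.uniform(10, 70, n)                         # mean organ dose [Gy]
true_p = 1.0 / (1.0 + np.exp(-(dose - 40.0) / 8.0))   # hidden "truth"
outcome = rng.random(n) < true_p                      # observed complication

def fit_logistic(d, y, iters=2000, lr=0.1):
    """Crude gradient-ascent ML fit of p = sigmoid(a + b*z), z standardized."""
    mu, sd = d.mean(), d.std()
    z = (d - mu) / sd
    a = b = 0.0
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-(a + b * z)))
        a += lr * np.mean(y - p)
        b += lr * np.mean((y - p) * z)
    return a - b * mu / sd, b / sd                    # back to the dose scale

boots = []
for _ in range(300):
    idx = rng.integers(0, n, n)                       # resample with replacement
    boots.append(fit_logistic(dose[idx], outcome[idx]))

# Plan-specific prediction with uncertainty: evaluate every bootstrap fit
d_plan = 45.0
preds = [1.0 / (1.0 + np.exp(-(a + b * d_plan))) for a, b in boots]
print(f"NTCP at {d_plan} Gy: median {np.median(preds):.2f}, "
      f"90% interval [{np.percentile(preds, 5):.2f}, {np.percentile(preds, 95):.2f}]")
```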

  20. Uncertainty Quantification in Control Problems for Flocking Models

    Directory of Open Access Journals (Sweden)

    Giacomo Albi

    2015-01-01

    Full Text Available The optimal control of flocking models with random inputs is investigated from a numerical point of view. The effect of uncertainty in the interaction parameters is studied for a Cucker-Smale type model using a generalized polynomial chaos (gPC approach. Numerical evidence of threshold effects in the alignment dynamic due to the random parameters is given. The use of a selective model predictive control permits steering of the system towards the desired state even in unstable regimes.

  1. Where is positional uncertainty a problem for species distribution modelling?

    NARCIS (Netherlands)

    Naimi, N.; Hamm, N.A.S.; Groen, T.A.; Skidmore, A.K.; Toxopeus, A.G.

    2014-01-01

    Species data held in museum and herbaria, survey data and opportunistically observed data are a substantial information resource. A key challenge in using these data is the uncertainty about where an observation is located. This is important when the data are used for species distribution modelling

  2. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    A weakness of global sensitivity and uncertainty analysis methodologies is the often subjective definition of prior parameter probability distributions, especially ... The reservoir representing the central part of the wetland, where flood waters separate into several independent distributaries, is a keystone area within the model.

  3. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    ... enhances the accuracy of the structural models by using experimental results and design codes (Baalbaki et al 1991; ...) in the calculation of the column buckling load as defined in the following section. ... Fuzzy logic ... material uncertainty; using the value becomes a critical solution and is a more accurate and safe method compared ...

  4. System convergence in transport models: algorithms efficiency and output uncertainty

    DEFF Research Database (Denmark)

    Rich, Jeppe; Nielsen, Otto Anker

    2015-01-01

    The aim of this paper is to analyse convergence performance for the external loop and to illustrate how an improper linkage between the converging parts can lead to substantial uncertainty in the final output. Although this loop is crucial for the performance of large-scale transport models it has not been analysed ... scale in the Danish National Transport Model (DNTM). It is revealed that system convergence requires that either demand or supply is without random noise, but not both. In that case, if the method of successive averages (MSA) is applied to the model output with random noise, it will converge effectively as the random effects are gradually dampened ... in the MSA process. In connection with the DNTM it is shown that MSA works well when applied to travel-time averaging, whereas trip averaging is generally infected by random noise resulting from the assignment model. The latter implies that the minimum uncertainty in the final model output is dictated ...

  5. Geostatistical modeling of groundwater properties and assessment of their uncertainties

    International Nuclear Information System (INIS)

    Honda, Makoto; Yamamoto, Shinya; Sakurai, Hideyuki; Suzuki, Makoto; Sanada, Hiroyuki; Matsui, Hiroya; Sugita, Yutaka

    2010-01-01

    The distribution of groundwater properties is important for understanding deep underground hydrogeological environments. This paper proposes a geostatistical system for modeling groundwater properties which have a correlation with the ground resistivity data obtained from widespread and exhaustive surveys. That is, a methodology for the integration of resistivity data measured by various methods and a methodology for modeling the groundwater properties using the integrated resistivity data have been developed. The proposed system has also been validated using the data obtained in the Horonobe Underground Research Laboratory project. Additionally, the quantification of uncertainties in the estimated model has been attempted by numerical simulations based on the data. As a result, the uncertainties of the proposed model have been estimated to be lower than those of other traditional models. (author)

  6. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four

  7. Robust nonlinear control of nuclear reactors under model uncertainty

    International Nuclear Information System (INIS)

    Park, Moon Ghu

    1993-02-01

    A nonlinear model-based control method is developed for the robust control of a nuclear reactor. The nonlinear plant model is used to design a unique control law which covers a wide operating range. Robustness is a crucial factor for the fully automatic control of reactor power due to time-varying uncertain parameters, state estimation errors, and unmodeled dynamics. A variable structure control (VSC) method is introduced which consists of an adaptive performance specification (fine control) after the tracking error reaches the narrow boundary layer by a time-optimal control (coarse control). Variable structure control is a powerful method for nonlinear system controller design which has inherent robustness to parameter variations or external disturbances using the known uncertainty bounds, and it requires very low computational effort. In spite of its desirable properties, conventional VSC presents several important drawbacks that limit its practical applicability. One of the most undesirable phenomena is chattering, which implies extremely high control activity and may excite high-frequency unmodeled dynamics. This problem is due to the neglected actuator time-delay or sampling effects. The problem was partially remedied by replacing chattering control with a smooth control interpolation in a boundary layer neighboring a time-varying sliding surface. But, for nuclear reactor systems, which have a very fast dynamic response, the sampling effect may destroy the narrow boundary layer when a large uncertainty bound is used. Due to the very short neutron lifetime, a large uncertainty bound leads to high gain in feedback control. To resolve this problem, a derivative feedback is introduced that gives excellent performance by reducing the uncertainty bound. The stability of the tracking error dynamics is guaranteed by the second method of Lyapunov using the two-level uncertainty bounds that are obtained from the knowledge of uncertainty bound and the estimated
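
    The boundary-layer remedy for chattering is easy to show on a generic double-integrator error model (not a reactor model): the discontinuous switching term is replaced by a saturation inside a thin layer around the sliding surface. The gains, layer width, and disturbance bound below are illustrative.

```python
# Sliding-mode control with a boundary layer on e'' = u + d, |d| <= 0.5.
# Replacing sign(s) with sat(s/phi) trades a small steady error for no chatter.
import numpy as np

def sat(s, phi):
    """Smooth control interpolation inside a boundary layer of width phi."""
    return np.clip(s / phi, -1.0, 1.0)

dt, T = 1e-3, 5.0
lam, eta, phi = 4.0, 2.0, 0.05   # surface slope, reaching gain, layer width
x, v = 1.0, 0.0                  # tracking error and its rate
for k in range(int(T / dt)):
    s = v + lam * x                   # sliding surface s = e' + lam*e
    u = -lam * v - eta * sat(s, phi)  # equivalent + saturated switching term
    d = 0.5 * np.sin(50 * k * dt)     # bounded unmodeled disturbance, |d| < eta
    v += (u + d) * dt                 # double-integrator plant
    x += v * dt

print(f"final |e| = {abs(x):.4f}, |s| = {abs(v + lam * x):.4f}")
```

    Because eta exceeds the disturbance bound, s is driven into the layer and stays there; inside it, the error decays at the rate set by lam instead of chattering across the surface.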

  8. Bayesian uncertainty analysis with applications to turbulence modeling

    International Nuclear Information System (INIS)

    Cheung, Sai Hung; Oliver, Todd A.; Prudencio, Ernesto E.; Prudhomme, Serge; Moser, Robert D.

    2011-01-01

    In this paper, we apply Bayesian uncertainty quantification techniques to the processes of calibrating complex mathematical models and predicting quantities of interest (QoI's) with such models. These techniques also enable the systematic comparison of competing model classes. The processes of calibration and comparison constitute the building blocks of a larger validation process, the goal of which is to accept or reject a given mathematical model for the prediction of a particular QoI for a particular scenario. In this work, we take the first step in this process by applying the methodology to the analysis of the Spalart-Allmaras turbulence model in the context of incompressible, boundary layer flows. Three competing model classes based on the Spalart-Allmaras model are formulated, calibrated against experimental data, and used to issue a prediction with quantified uncertainty. The model classes are compared in terms of their posterior probabilities and their prediction of QoI's. The model posterior probability represents the relative plausibility of a model class given the data. Thus, it incorporates the model's ability to fit experimental observations. Alternatively, comparing models using the predicted QoI connects the process to the needs of decision makers that use the results of the model. We show that by using both the model plausibility and predicted QoI, one has the opportunity to reject some model classes after calibration, before subjecting the remaining classes to additional validation challenges.

  9. A simplified model of choice behavior under uncertainty

    Directory of Open Access Journals (Sweden)

    Ching-Hung Lin

    2016-08-01

    Full Text Available The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal due to some incompatible results between our behavioral and modeling data. This study aims to modify the Ahn et al. (2008) PU model into a simplified model, and we collected 145 subjects' IGT performance as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A has a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted the strategy of gain-stay-loss-shift rather than foreseeing the long-term outcome. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not yet have been found. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.
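
    A sketch of a PVL-style prospect-utility trial loop, showing where α (outcome sensitivity), λ (loss aversion), and A (recency/updating) enter; the payoff scheme, softmax choice rule, and parameter values are illustrative assumptions, not the authors' fitted specification. Note how α → 0 flattens utilities toward ±1 and ±λ, which is the gain-stay-loss-shift regime the abstract describes.

```python
# Simplified prospect-utility (PU) simulation of an IGT-like task.
import numpy as np

def pu_utility(x, alpha, lam):
    """Prospect utility: diminishing sensitivity, losses weighted by lambda."""
    return np.where(x >= 0, np.abs(x)**alpha, -lam * np.abs(x)**alpha)

def simulate_igt(alpha=0.1, lam=1.5, A=0.6, c=1.0, trials=100, seed=0):
    rng = np.random.default_rng(seed)
    payoff_mean = np.array([-25.0, -25.0, 25.0, 25.0])  # bad decks 0/1, good 2/3
    payoff_sd = 100.0
    Ev = np.zeros(4)                                    # deck expectancies
    choices = []
    for _ in range(trials):
        p = np.exp(c * Ev - np.max(c * Ev))
        deck = rng.choice(4, p=p / p.sum())             # softmax choice rule
        x = rng.normal(payoff_mean[deck], payoff_sd)
        # delta-rule update with recency A; small alpha flattens utilities
        Ev[deck] += A * (pu_utility(x, alpha, lam) - Ev[deck])
        choices.append(deck)
    return np.array(choices)

good = (simulate_igt() >= 2).mean()
print(f"proportion of good-deck choices: {good:.2f}")
```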

  10. Uncertainty modelling and code calibration for composite materials

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Branner, Kim; Mishnaevsky, Leon, Jr

    2013-01-01

    Uncertainties related to the material properties of a composite material can be determined from the micro-, meso- or macro-scales. These three starting points for a stochastic modelling of the material properties are investigated. The uncertainties are divided into physical, model, statistical ... between risk of failure and cost of the structure. Considerations related to the calibration of partial safety factors for composite materials are described, including the probability of failure, the format for the partial safety factor method and weight factors for different load cases. In a numerical example, it is demonstrated how probabilistic models for the material properties formulated on the micro-scale can be calibrated using tests on the meso- and macro-scales. The results are compared to probabilistic models estimated directly from tests on the macro-scale. In another example, partial safety factors for application...

  11. Exploring uncertainty and model predictive performance concepts via a modular snowmelt-runoff modeling framework

    Science.gov (United States)

    Tyler Jon Smith; Lucy Amanda. Marshall

    2010-01-01

    Model selection is an extremely important aspect of many hydrologic modeling studies because of the complexity, variability, and uncertainty that surrounds the current understanding of watershed-scale systems. However, development and implementation of a complete precipitation-runoff modeling framework, from model selection to calibration and uncertainty analysis, are...

  12. Approximating prediction uncertainty for random forest regression models

    Science.gov (United States)

    John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne

    2016-01-01

    Machine learning approaches such as random forest have increased for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
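
    One common way to approximate the missing prediction error for a random forest, plausibly in the spirit of the approach above though the paper's estimator may differ in detail, is to read uncertainty off the spread of the individual trees' predictions. A minimal sketch on synthetic data:

```python
# Per-tree spread as a rough prediction-uncertainty proxy for a random forest.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(4)
X = rng.uniform(0, 10, (500, 3))
y = np.sin(X[:, 0]) + 0.1 * X[:, 1] + rng.normal(0, 0.2, 500)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

X_new = rng.uniform(0, 10, (5, 3))
per_tree = np.array([t.predict(X_new) for t in rf.estimators_])  # (200, 5)
mean, spread = per_tree.mean(axis=0), per_tree.std(axis=0)
for m_, s_ in zip(mean, spread):
    print(f"prediction {m_:+.2f} +/- {s_:.2f}")
```

    The tree-to-tree spread reflects ensemble variability rather than full predictive uncertainty, so in practice it is usually calibrated against held-out residuals.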

  13. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of Amchitka underground nuclear tests conducted in 2002 is verified and uncertainty in model input parameters, as well as predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure of the subsurface, and bathymetric surveys to determine the bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output verification. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions. Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
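
    The backward-then-forward logic of the Bayesian MCMC conditioning is captured by a few lines of Metropolis-Hastings on a stand-in one-parameter forward model (nothing like the actual Amchitka groundwater model); the observations, prior, and proposal scale are all assumed for illustration.

```python
# Metropolis-Hastings sketch: data condition the parameter (backward), and
# the posterior is pushed through the model into a prediction (forward).
import numpy as np

rng = np.random.default_rng(3)
obs = np.array([4.8, 5.2, 5.1, 4.9])         # assumed observations
sigma = 0.3                                  # assumed observation error

def model(k):
    return 2.0 + 0.5 * k                     # stand-in forward model

def log_post(k):
    log_prior = -0.5 * ((k - 5.0) / 2.0)**2  # prior: k ~ N(5, 2^2)
    log_like = -0.5 * np.sum((obs - model(k))**2) / sigma**2
    return log_prior + log_like

chain, k = [], 5.0
for _ in range(20_000):
    k_new = k + rng.normal(0, 0.3)           # random-walk proposal
    if np.log(rng.random()) < log_post(k_new) - log_post(k):
        k = k_new                            # accept; otherwise keep k
    chain.append(k)

post = np.array(chain[5_000:])               # discard burn-in
pred = model(post)                           # propagate posterior forward
print(f"posterior k: {post.mean():.2f} +/- {post.std():.2f}; "
      f"prediction: {pred.mean():.2f} +/- {pred.std():.2f}")
```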

  14. Evaluation of Uncertainties in hydrogeological modeling and groundwater flow analyses. Model calibration

    International Nuclear Information System (INIS)

    Ijiri, Yuji; Ono, Makoto; Sugihara, Yutaka; Shimo, Michito; Yamamoto, Hajime; Fumimura, Kenichi

    2003-03-01

    This study involves the evaluation of uncertainty in hydrogeological modeling and groundwater flow analysis. Three-dimensional groundwater flow at the Shobasama site in Tono was analyzed using two continuum models and one discontinuum model. The domain of this study covered an area of four kilometers in the east-west direction and six kilometers in the north-south direction. Moreover, for the purpose of evaluating how uncertainties included in the modeling of hydrogeological structure and in the results of groundwater simulation decrease with the progress of investigation, the models were updated and calibrated for several hydrogeological modeling techniques and groundwater flow analysis techniques, based on newly acquired information and knowledge. The acquired knowledge is as follows. As a result of setting parameters and structures when updating the models according to the previous year's circumstances, there is no significant difference in handling between the modeling methods. Model calibration is performed by matching numerical simulation with observation of the pressure response caused by opening and closing of a packer in the MIU-2 borehole. Each analysis technique reduces the residual sum of squares between observations and numerical simulation results by adjusting hydrogeological parameters. However, each model adjusts different parameters, such as hydraulic conductivity, effective porosity, specific storage, and anisotropy. When calibrating models, it is sometimes impossible to explain the phenomena only by adjusting parameters. In such cases, further investigation may be required to clarify details of the hydrogeological structure. As a result of comparing the research from its beginning to this year, the following conclusions are obtained about the investigation. (1) Transient hydraulic data are effective in reducing the uncertainty of hydrogeological structure. (2) Effective porosity for calculating pore water velocity of

  15. Effects of input uncertainty on cross-scale crop modeling

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options, or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiological processes and the choice of model parameters, is among the key factors for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ±0.2°C, ±2% and ±3% respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7% in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input

  16. Uncertainty Analysis of the Water Scarcity Footprint Based on the AWARE Model Considering Temporal Variations

    Directory of Open Access Journals (Sweden)

    Jong Seok Lee

    2018-03-01

    Full Text Available The purpose of this paper is to compare the degree of uncertainty of the water scarcity footprint using the Monte Carlo statistical method and the block bootstrap method. Using the hydrological data of a water drainage basin in Korea, characterization factors based on the available water remaining (AWARE) model were obtained. The uncertainties of the water scarcity footprint considering temporal variations in paddy rice production in Korea were estimated. The block bootstrap method gave percentage uncertainty values of the model output five times smaller than those of the two Monte Carlo statistical method scenarios. Incorrect estimation of the probability distribution of the AWARE characterization factor model is what causes the higher uncertainty in the water scarcity footprint value calculated by the Monte Carlo statistical method in this study. This is because the AWARE characterization factor values partly follow a discrete distribution with extreme values on one side. Therefore, this study suggests that the block bootstrap method is a better choice for analyzing uncertainty than the Monte Carlo statistical method when using the AWARE model to quantify the water scarcity footprint.
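
    The two resampling procedures being compared can be sketched side by side. The synthetic series, block length, and the (deliberately mis-specified) normal assumption for the Monte Carlo branch are illustrative; the sketch shows the mechanics only and will not reproduce the paper's five-fold ratio.

```python
# Parametric Monte Carlo vs moving-block bootstrap on a monthly series with
# occasional extreme values, loosely mimicking AWARE-like factors.
import numpy as np

rng = np.random.default_rng(11)
cf = np.clip(rng.normal(20.0, 8.0, 180), 0.1, None)   # 15 years of monthly factors
cf[rng.random(180) < 0.1] = 100.0                     # occasional extreme months

def pct_uncertainty(samples):
    return 100.0 * samples.std() / samples.mean()

# Parametric Monte Carlo: assumes normality (mis-specified for this series)
mc = rng.normal(cf.mean(), cf.std(), (2000, cf.size)).mean(axis=1)

# Moving-block bootstrap: resample blocks of 12 consecutive months,
# keeping within-year temporal structure without a distributional assumption
block = 12
starts = rng.integers(0, cf.size - block, (2000, cf.size // block))
bb = np.array([[cf[s:s + block] for s in row] for row in starts])
bb = bb.reshape(2000, -1).mean(axis=1)

print(f"Monte Carlo: {pct_uncertainty(mc):.1f}%   block bootstrap: {pct_uncertainty(bb):.1f}%")
```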

  17. A Multi-objective Model for Transmission Planning Under Uncertainties

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Wang, Qi; Ding, Yi

    2014-01-01

    The significant growth of distributed energy resources (DERs) associated with smart grid technologies has prompted excessive uncertainties in the transmission system. The most representative is the novel notion of the commercial aggregator, which has lighted a bright way for DERs to participate in power trading and regulation at the transmission level. In this paper, the aggregator-caused uncertainty is analyzed first, considering the DERs' correlation. For the transmission planning, a scenario-based multi-objective transmission planning (MOTP) framework is proposed to simultaneously optimize two objectives, i.e. the cost of power purchase and network expansion, and the revenue of power delivery. A two-phase multi-objective PSO (MOPSO) algorithm is employed as the solver. The feasibility of the proposed multi-objective planning approach has been verified on the 77-bus system linked with a 38-bus distribution

  18. Uncertainty propagation through an aeroelastic wind turbine model using polynomial surrogates

    DEFF Research Database (Denmark)

    Murcia Leon, Juan Pablo; Réthoré, Pierre-Elouan; Dimitrov, Nikolay Krasimirov

    2018-01-01

    Polynomial surrogates are used to characterize the energy production and lifetime equivalent fatigue loads for different components of the DTU 10 MW reference wind turbine under realistic atmospheric conditions. The variability caused by different turbulent inflow fields is captured by creating ... -alignment. The methodology presented extends the deterministic power and thrust coefficient curves to uncertainty models and adds new variables like damage equivalent fatigue loads in different components of the turbine. These surrogate models can then be implemented inside other work-flows such as: estimation ... of the uncertainty in annual energy production due to wind resource variability and/or robust wind power plant layout optimization. It can be concluded that it is possible to capture the global behavior of a modern wind turbine and its uncertainty under realistic inflow conditions using polynomial response surfaces...
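
    A bare-bones polynomial response surface illustrates the surrogate idea: fit a low-order polynomial to a handful of "expensive" runs, then evaluate it cheaply inside an uncertainty-propagation loop. The stand-in response function, input ranges, and quadratic basis below are assumptions, not the DTU 10 MW surrogate itself.

```python
# Quadratic response surface fitted to a few "expensive" runs, then used
# for cheap propagation of inflow uncertainty. All quantities illustrative.
import numpy as np

rng = np.random.default_rng(5)
u = rng.uniform(4.0, 25.0, 60)                 # mean wind speed samples [m/s]
ti = rng.uniform(0.05, 0.30, 60)               # turbulence intensity samples

def expensive_model(u, ti):                    # stand-in for the aeroelastic code
    return 0.4 * u**2 / (1 + np.exp(-(u - 11))) * (1 + 2.0 * ti)

y = expensive_model(u, ti)

# Design matrix for a full quadratic surface in (u, ti)
def basis(u, ti):
    return np.column_stack([np.ones_like(u), u, ti, u * ti, u**2, ti**2])

beta, *_ = np.linalg.lstsq(basis(u, ti), y, rcond=None)

# Cheap propagation: 10^5 surrogate calls instead of 10^5 aeroelastic runs;
# inputs are clipped to the fitted range to avoid extrapolating the surface.
U = np.clip(rng.weibull(2.0, 100_000) * 10.0, 4.0, 25.0)
TI = rng.uniform(0.05, 0.30, 100_000)
print(f"surrogate mean load proxy: {(basis(U, TI) @ beta).mean():.1f}")
```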

  19. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  20. Model Uncertainties for Valencia RPA Effect for MINERvA

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Richard [Univ. of Minnesota, Duluth, MN (United States)

    2017-05-08

    This technical note describes the application of the Valencia RPA multi-nucleon effect and its uncertainty to QE reactions from the GENIE neutrino event generator. The analysis of MINERvA neutrino data in the Rodrigues et al., PRL 116, 071802 (2016) paper makes clear the need for an RPA suppression, especially at very low momentum and energy transfer. That published analysis does not constrain the magnitude of the effect; it only tests models with and without the effect against the data. Other MINERvA analyses need an expression of the model uncertainty in the RPA effect. A well-described uncertainty can be used for systematics for unfolding, for model errors in the analysis of non-QE samples, and as input for fitting exercises for model testing or constraining backgrounds. This prescription takes uncertainties on the parameters in the Valencia RPA model and adds a (not-as-tight) constraint from muon capture data. For MINERvA we apply it as a 2D ($q_0$,$q_3$) weight to GENIE events, in lieu of generating full beyond-Fermi-gas quasielastic events. Because it is a weight, it can be applied to the generated and fully Geant4-simulated events used in analysis without a special GENIE sample. For some limited uses, it could be cast as a 1D $Q^2$ weight without much trouble. This procedure is a suitable starting point for NOvA and DUNE where the energy dependence is modest, but probably not adequate for T2K or MicroBooNE.

  1. Sensitivity of modeled ozone concentrations to uncertainties in biogenic emissions

    International Nuclear Information System (INIS)

    Roselle, S.J.

    1992-06-01

    The study examines the sensitivity of regional ozone (O3) modeling to uncertainties in biogenic emissions estimates. The United States Environmental Protection Agency's (EPA) Regional Oxidant Model (ROM) was used to simulate the photochemistry of the northeastern United States for the period July 2-17, 1988. An operational model evaluation showed that ROM had a tendency to underpredict O3 when observed concentrations were above 70-80 ppb and to overpredict O3 when observed values were below this level. On average, the model underpredicted daily maximum O3 by 14 ppb. Spatial patterns of O3, however, were reproduced favorably by the model. Several simulations were performed to analyze the effects of uncertainties in biogenic emissions on predicted O3 and to study the effectiveness of two strategies of controlling anthropogenic emissions for reducing high O3 concentrations. Biogenic hydrocarbon emissions were adjusted by a factor of 3 to account for the existing range of uncertainty in these emissions. The impact of biogenic emission uncertainties on O3 predictions depended upon the availability of NOx. In some extremely NOx-limited areas, increasing the amount of biogenic emissions decreased O3 concentrations. Two control strategies were compared in the simulations: (1) reduced anthropogenic hydrocarbon emissions, and (2) reduced anthropogenic hydrocarbon and NOx emissions. The simulations showed that hydrocarbon emission controls were more beneficial to the New York City area, but that combined NOx and hydrocarbon controls were more beneficial to other areas of the Northeast. Hydrocarbon controls were more effective as biogenic hydrocarbon emissions were reduced, whereas combined NOx and hydrocarbon controls were more effective as biogenic hydrocarbon emissions were increased.

  2. Uncertainty propagation in a multiscale model of nanocrystalline plasticity

    International Nuclear Information System (INIS)

    Koslowski, M.; Strachan, Alejandro

    2011-01-01

    We characterize how uncertainties propagate across spatial and temporal scales in a physics-based model of nanocrystalline plasticity of fcc metals. Our model combines molecular dynamics (MD) simulations to characterize atomic-level processes that govern dislocation-based plastic deformation with a phase field approach to dislocation dynamics (PFDD) that describes how an ensemble of dislocations evolve and interact to determine the mechanical response of the material. We apply this approach to a nanocrystalline Ni specimen of interest in micro-electromechanical (MEMS) switches. Our approach enables us to quantify how internal stresses that result from the fabrication process affect the properties of dislocations (using MD) and how these properties, in turn, affect the yield stress of the metallic membrane (using the PFDD model). Our predictions show that, for a nanocrystalline sample with small grain size (4 nm), a variation in residual stress of 20 MPa (typical in today's microfabrication techniques) would result in a variation on the critical resolved shear yield stress of approximately 15 MPa, a very small fraction of the nominal value of approximately 9 GPa. - Highlights: → Quantify how fabrication uncertainties affect yield stress in a microswitch component. → Propagate uncertainties in a multiscale model of single crystal plasticity. → Molecular dynamics quantifies how fabrication variations affect dislocations. → Dislocation dynamics relate variations in dislocation properties to yield stress.

  3. Uncertainty Model for Total Solar Irradiance Estimation on Australian Rooftops

    Science.gov (United States)

    Al-Saadi, Hassan; Zivanovic, Rastko; Al-Sarawi, Said

    2017-11-01

    The installation of solar panels on Australian rooftops has been on the rise for the last few years, especially in urban areas. This motivates academic researchers, distribution network operators and engineers to accurately address the level of uncertainty resulting from grid-connected solar panels. The main source of uncertainty is the intermittent nature of radiation; therefore, this paper presents a new model to estimate the total radiation incident on a tilted solar panel. The model is driven by the clearness index, for which a probability distribution is factorized, with special attention paid to Australia through the use of a best-fit correlation for the diffuse fraction. The validity of the model is assessed with the adoption of four goodness-of-fit techniques. In addition, the Quasi Monte Carlo and sparse grid methods are used as sampling and uncertainty computation tools, respectively. High-resolution solar irradiation data for the city of Adelaide were used for this assessment, with the outcome indicating a satisfactory agreement between actual data variation and the model.

  4. Modeling Multiple Causes of Carcinogenesis

    Energy Technology Data Exchange (ETDEWEB)

    Jones, T D

    1999-01-24

    An array of epidemiological results and databases on test animals indicates that risk of cancer and atherosclerosis can be up- or down-regulated by diet through a range of 200%. Other factors contribute incrementally and include the natural terrestrial environment and various human activities that jointly produce complex exposures to endotoxin-producing microorganisms, ionizing radiations, and chemicals. Ordinary personal habits and simple physical irritants have been demonstrated to affect the immune response and risk of disease. There tends to be poor statistical correlation of long-term risk with single-agent exposures incurred throughout working careers. However, Agency recommendations for control of hazardous exposures to humans have been substance-specific instead of contextually realistic, even though there is consistent evidence for common mechanisms of toxicological and carcinogenic action. That behavior seems to be best explained by molecular stresses from cellular oxygen metabolism and phagocytosis of antigenic invasion as well as breakdown of normal metabolic compounds associated with homeostatic- and injury-related renewal of cells. There is continually mounting evidence that marrow stroma, comprised largely of monocyte-macrophages and fibroblasts, is important to phagocytic and cytokinetic response, but the complex action of the immune process is difficult to infer from first-principle logic or biomarkers of toxic injury. The many diverse database studies all seem to implicate two important processes, i.e., the univalent reduction of molecular oxygen and the breakdown of arginine, an amino acid, by hydrolysis or digestion of protein, which is attendant to normal antigen-antibody action. This behavior indicates that protection guidelines and risk coefficients should be context-dependent to include reference considerations of the composite action of parameters that mediate oxygen metabolism. A logic of this type permits the realistic common-scale modeling of

  5. Uncertainty and sensitivity analysis of environmental transport models

    International Nuclear Information System (INIS)

    Margulies, T.S.; Lancaster, L.E.

    1985-01-01

    An uncertainty and sensitivity analysis has been made of the CRAC-2 (Calculations of Reactor Accident Consequences) atmospheric transport and deposition models. Robustness and uncertainty aspects of air and ground deposited material and the relative contribution of input and model parameters were systematically studied. The underlying data structures were investigated using a multiway layout of factors over specified ranges generated via a Latin hypercube sampling scheme. The variables selected in our analysis include: weather bin, dry deposition velocity, rain washout coefficient/rain intensity, duration of release, heat content, sigma-z (vertical) plume dispersion parameter, sigma-y (crosswind) plume dispersion parameter, and mixing height. To determine the contributors to the output variability (versus distance from the site) step-wise regression analyses were performed on transformations of the spatial concentration patterns simulated. 27 references, 2 figures, 3 tables
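
    The Latin hypercube scheme used to generate the multiway factor layout can be written in a few lines: each input's range is cut into n strata, one sample is drawn per stratum, and the columns are shuffled independently. The parameter names and bounds below merely echo the abstract and are illustrative.

```python
# Compact Latin hypercube sampler: stratified coverage of every input range.
import numpy as np

def latin_hypercube(n, bounds, rng):
    """n stratified samples for each (low, high) pair in `bounds`."""
    d = len(bounds)
    u = (rng.random((n, d)) + np.arange(n)[:, None]) / n   # one point per stratum
    for j in range(d):
        rng.shuffle(u[:, j])                               # decouple the columns
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

rng = np.random.default_rng(9)
bounds = [(0.001, 0.03),   # dry deposition velocity [m/s] (assumed range)
          (1e-5, 1e-4),    # rain washout coefficient (assumed range)
          (100.0, 2000.0)] # mixing height [m] (assumed range)
samples = latin_hypercube(50, bounds, rng)
print(samples.min(axis=0), samples.max(axis=0))
```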

  6. Review of strategies for handling geological uncertainty in groundwater flow and transport modeling

    DEFF Research Database (Denmark)

    Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.

    2012-01-01

    ... be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model ... parameters; and (c) model parameters including local-scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and the moment equation approach, are briefly described with emphasis ...

  7. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    Science.gov (United States)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data

  8. Economic-mathematical methods and models under uncertainty

    CERN Document Server

    Aliyev, A G

    2013-01-01

    Contents: Brief Information on Finite-Dimensional Vector Space and its Application in Economics; Bases of Piecewise-Linear Economic-Mathematical Models with Regard to Influence of Unaccounted Factors in Finite-Dimensional Vector Space; Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence in Three-Dimensional Vector Space; Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence on a Plane; Bases of Software for Computer Simulation and Multivariant Prediction of Economic Even at Uncertainty Conditions on the Base of N-Comp

  9. Incorporation of Satellite Data and Uncertainty in a Nationwide Groundwater Recharge Model in New Zealand

    Directory of Open Access Journals (Sweden)

    Rogier Westerhoff

    2018-01-01

    A nationwide model of groundwater recharge for New Zealand (NGRM), as described in this paper, demonstrated the benefits of satellite data and global models to improve the spatial definition of recharge and the estimation of recharge uncertainty. NGRM was inspired by the global-scale WaterGAP model but with the key development of rainfall recharge calculation on scales relevant to national- and catchment-scale studies (i.e., a 1 km × 1 km cell size and a monthly timestep in the period 2000–2014), provided by satellite data (i.e., MODIS-derived evapotranspiration, AET, and vegetation) in combination with national datasets of rainfall, elevation, soil and geology. The resulting nationwide model calculates groundwater recharge estimates, including their uncertainty, consistent across the country, which makes the model unique compared to all other New Zealand estimates targeted towards groundwater recharge. At the national scale, NGRM estimated an average recharge of 2500 m³/s, or 298 mm/year, with a model uncertainty of 17%. Those results were similar to the WaterGAP model, but the improved input data resulted in better spatial characteristics of the recharge estimates. Multiple uncertainty analyses led to these main conclusions: the NGRM model could give valuable initial estimates in data-sparse areas, since it compared well to most ground-observed lysimeter data and local recharge models; and the nationwide input data of rainfall and geology caused the largest uncertainty in the model equation, which revealed that the satellite data could improve spatial characteristics without significantly increasing the uncertainty. Clearly the increasing volume and availability of large-scale satellite data is creating more opportunities for the application of national-scale models at the catchment and smaller scales. This should result in improved utility of these models, including provision of initial estimates in data-sparse areas. Topics for future

  10. Stochastic reduced order models for inverse problems under uncertainty.

    Science.gov (United States)

    Warner, James E; Aquino, Wilkins; Grigoriu, Mircea D

    2015-03-01

    This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using a SROM - a low dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well.
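
    The SROM idea is compact enough to sketch: pick m support points and probabilities that minimize the mismatch to the target's moments and CDF. A toy illustration (our own construction, not the authors' formulation; the lognormal target, m, and the error weighting are arbitrary choices):

        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import lognorm

        target = lognorm(s=0.4, scale=1.0)   # continuous random element to reduce
        m = 5                                # SROM size (discrete support points)
        grid = np.linspace(0.3, 3.0, 50)     # points where the CDF is matched

        def srom_error(z):
            x = z[:m]
            p = np.exp(z[m:]); p /= p.sum()          # probabilities via softmax
            # mismatch in the first two moments plus mismatch in the CDF
            e_mom = (p @ x - target.mean())**2 + (p @ x**2 - target.moment(2))**2
            cdf = (p[None, :] * (x[None, :] <= grid[:, None])).sum(1)
            e_cdf = np.mean((cdf - target.cdf(grid))**2)
            return e_mom + e_cdf

        z0 = np.concatenate([target.ppf(np.linspace(0.1, 0.9, m)), np.zeros(m)])
        res = minimize(srom_error, z0, method="Nelder-Mead",
                       options={"maxiter": 20000, "xatol": 1e-8})
        x = res.x[:m]; p = np.exp(res.x[m:]); p /= p.sum()
        print("SROM points:", np.round(x, 3), "probabilities:", np.round(p, 3))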

  11. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    Science.gov (United States)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

    Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
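
    The record stops short of naming the final scheme, so the sketch below uses a generic informal-likelihood (GLUE-style) calibration as one concrete instance of calibration without a formal likelihood; the toy model, priors, and weighting rule are all illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)

        def epidemic_model(beta, gamma, t):      # toy stand-in transmission model
            return np.exp(-(gamma - beta) * t)   # "prevalence" time series

        t = np.linspace(0.0, 10.0, 30)
        observed = epidemic_model(0.9, 1.2, t) + rng.normal(0, 0.02, t.size)

        # Sample candidate parameters and score them with an informal
        # goodness-of-fit measure instead of a formal likelihood.
        beta = rng.uniform(0.5, 1.5, 5000)
        gamma = rng.uniform(0.8, 2.0, 5000)
        sse = np.array([np.sum((epidemic_model(b, g, t) - observed) ** 2)
                        for b, g in zip(beta, gamma)])
        w = np.exp(-sse / sse.min())             # informal weights, not a likelihood
        w /= w.sum()

        # The weighted sample approximates parameter (and forecast) uncertainty.
        resample = rng.choice(beta, 20000, p=w)
        print("beta:  mean %.2f, 90%% interval %.2f-%.2f" % (
            np.sum(w * beta), *np.percentile(resample, [5, 95])))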

  12. The effects of geometric uncertainties on computational modelling of knee biomechanics

    Science.gov (United States)

    Meng, Qingen; Fisher, John; Wilcox, Ruth

    2017-08-01

    The geometry of the articular components of the knee is an important factor in predicting joint mechanics in computational models. There are a number of uncertainties in the definition of the geometry of cartilage and meniscus, and evaluating the effects of these uncertainties is fundamental to understanding the level of reliability of the models. In this study, the sensitivity of knee mechanics to geometric uncertainties was investigated by comparing polynomial-based and image-based knee models and varying the size of the meniscus. The results suggested that the geometric uncertainties in cartilage and meniscus resulting from the resolution of MRI and the accuracy of segmentation had considerable effects on the predicted knee mechanics. Moreover, even if the mathematical geometric descriptors can be very close to the image-based articular surfaces, the detailed contact pressure distribution produced by the mathematical geometric descriptors was not the same as that of the image-based model. However, the trends predicted by the models based on mathematical geometric descriptors were similar to those of the image-based models.

  13. A Multi-objective Model for Transmission Planning Under Uncertainties

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Wang, Qi; Ding, Yi

    2014-01-01

    trading and regulating in transmission level. In this paper, the aggregator-caused uncertainty is analyzed first considering DERs’ correlation. During the transmission planning, a scenario-based multi-objective transmission planning (MOTP) framework is proposed to simultaneously optimize two objectives, i.e. the cost of power purchase and network expansion, and the revenue of power delivery. A two-phase multi-objective PSO (MOPSO) algorithm is employed as the solver. The feasibility of the proposed multi-objective planning approach has been verified by the 77-bus system linked with 38-bus distribution...

  14. Handling Uncertainty in Palaeo-Climate Models and Data

    Science.gov (United States)

    Voss, J.; Haywood, A. M.; Dolan, A. M.; Domingo, D.

    2017-12-01

    The study of palaeoclimates can provide data on the behaviour of the Earth system with boundary conditions different from the ones we observe in the present. One of the main challenges in this approach is that data on past climates come with large uncertainties, since quantities of interest cannot be observed directly, but must be derived from proxies instead. We consider proxy-derived data from the Pliocene (around 3 million years ago; the last interval in Earth history when CO2 was at modern or near-future levels) and contrast these data with the output of complex climate models. In order to perform a meaningful data-model comparison, uncertainties must be taken into account. In this context, we discuss two examples of complex data-model comparison problems. Both examples have in common that they involve fitting a statistical model to describe how the output of the climate simulations depends on various model parameters, including atmospheric CO2 concentration and orbital parameters (obliquity, eccentricity, and precession). This introduces additional uncertainties, but allows us to explore a much larger range of model parameters than would be feasible by relying on simulation runs alone. The first example shows how Gaussian process emulators can be used to perform data-model comparison when simulation runs differ only in the choice of orbital parameters, but temperature data are given in the (somewhat inconvenient) form of "warm peak averages". The second example shows how a simpler approach, based on linear regression, can be used to analyse a more complex problem, where we use a larger and more varied ensemble of climate simulations with the aim of estimating Earth System Sensitivity.
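
    A minimal sketch of the emulation step in the first example, with a made-up stand-in for the climate simulation and hypothetical orbital-parameter ranges (scikit-learn chosen for illustration):

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(42)

        # Hypothetical design: obliquity [deg], eccentricity, precession [rad],
        # CO2 [ppm] for each of 40 simulation runs.
        X = rng.uniform([22.0, 0.00, 0.00, 280.0],
                        [24.5, 0.06, 6.28, 450.0], size=(40, 4))

        def climate_model(x):                 # stand-in for a full simulation
            return 0.5 * x[:, 0] + 30.0 * x[:, 1] + 0.01 * (x[:, 3] - 280.0)

        y = climate_model(X)                  # e.g. regional mean temperature

        # Emulate the simulator; the GP also returns predictive uncertainty,
        # which must be carried into the data-model comparison.
        gp = GaussianProcessRegressor(kernel=RBF([1.0, 0.01, 1.0, 50.0]) +
                                      WhiteKernel(1e-4), normalize_y=True)
        gp.fit(X, y)
        x_new = np.array([[23.5, 0.02, 1.0, 400.0]])
        mu, sd = gp.predict(x_new, return_std=True)
        print(f"emulated T = {mu[0]:.2f} +/- {sd[0]:.2f}")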

  15. Spatial variability and parametric uncertainty in performance assessment models

    International Nuclear Information System (INIS)

    Pensado, Osvaldo; Mancillas, James; Painter, Scott; Tomishima, Yasuo

    2011-01-01

    The problem of defining an appropriate treatment of distribution functions (which could represent spatial variability or parametric uncertainty) is examined based on a generic performance assessment model for a high-level waste repository. The generic model incorporated source term models available in GoldSim®, the TDRW code for contaminant transport in sparse fracture networks with a complex fracture-matrix interaction process, and a biosphere dose model known as BDOSE™. Using the GoldSim framework, several Monte Carlo sampling approaches and transport conceptualizations were evaluated to explore the effect of various treatments of spatial variability and parametric uncertainty on dose estimates. Results from a model employing a representative source and ensemble-averaged pathway properties were compared to results from a model allowing for stochastic variation of transport properties along streamline segments (i.e., explicit representation of spatial variability within a Monte Carlo realization). We concluded that the sampling approach and the definition of an ensemble representative do influence consequence estimates. In the examples analyzed in this paper, approaches considering limited variability of a transport resistance parameter along a streamline increased the frequency of fast pathways resulting in relatively high dose estimates, while those allowing for broad variability along streamlines increased the frequency of 'bottlenecks' reducing dose estimates. On this basis, simplified approaches with limited consideration of variability may suffice for intended uses of the performance assessment model, such as evaluation of site safety. (author)

  16. Uncertainties in modelling the climate impact of irrigation

    Science.gov (United States)

    de Vrese, Philipp; Hagemann, Stefan

    2017-11-01

    Irrigation-based agriculture constitutes an essential factor for food security as well as fresh water resources and has a distinct impact on regional and global climate. Many issues related to irrigation's climate impact are addressed in studies that apply a wide range of models. These involve substantial uncertainties related to differences in the model's structure and its parametrizations on the one hand and the need for simplifying assumptions for the representation of irrigation on the other hand. To address these uncertainties, we used the Max Planck Institute for Meteorology's Earth System model, into which a simple irrigation scheme was implemented. In order to estimate possible uncertainties with regard to the model's more general structure, we compared the climate impact of irrigation between three simulations that use different schemes for the land-surface-atmosphere coupling. Here, it can be shown that the choice of coupling scheme affects not only the magnitude of possible impacts but even their direction. For example, when using a scheme that does not explicitly resolve spatial subgrid scale heterogeneity at the surface, irrigation reduces the atmospheric water content, even in heavily irrigated regions. In contrast, in simulations that use a coupling scheme that resolves heterogeneity at the surface or even within the lowest layers of the atmosphere, irrigation increases the average atmospheric specific humidity. A second experiment targeted possible uncertainties related to the representation of irrigation characteristics. Here, in four simulations, the irrigation effectiveness (controlled by the target soil moisture and the non-vegetated fraction of the grid box that receives irrigation) and the timing of delivery were varied. The second experiment shows that uncertainties related to the modelled irrigation characteristics, especially the irrigation effectiveness, are also substantial. In general the impact of irrigation on the state of the land

  17. Sensitivity and uncertainty analysis of a polyurethane foam decomposition model

    Energy Technology Data Exchange (ETDEWEB)

    HOBBS,MICHAEL L.; ROBINSON,DAVID G.

    2000-03-14

    Sensitivity/uncertainty analyses are not commonly performed on complex, finite-element engineering models because the analyses are time consuming, CPU intensive, nontrivial exercises that can lead to deceptive results. To illustrate these ideas, an analytical sensitivity/uncertainty analysis is used to determine the standard deviation and the primary factors affecting the burn velocity of polyurethane foam exposed to firelike radiative boundary conditions. The complex, finite element model has 25 input parameters that include chemistry, polymer structure, and thermophysical properties. The response variable was selected as the steady-state burn velocity calculated as the derivative of the burn front location versus time. The standard deviation of the burn velocity was determined by taking numerical derivatives of the response variable with respect to each of the 25 input parameters. Since the response variable is also a derivative, the standard deviation is essentially determined from a second derivative that is extremely sensitive to numerical noise. To minimize the numerical noise, 50-micron elements and approximately 1-msec time steps were required to obtain stable uncertainty results. The primary effect variable was shown to be the emissivity of the foam.
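
    The propagation described here is first-order: the response variance is assembled from numerical derivatives with respect to each input. A generic sketch with a toy response function and hypothetical parameter values (central differences used to damp the numerical noise the abstract warns about):

        import numpy as np

        def burn_velocity(p):                     # toy stand-in response function
            k, rho, emis = p                      # conductivity, density, emissivity
            return 1e-3 * emis * np.sqrt(k) / rho

        p0 = np.array([0.2, 60.0, 0.9])           # nominal parameter values
        sigma = np.array([0.02, 5.0, 0.05])       # parameter standard deviations

        # First-order propagation: var(y) ~= sum_i (dy/dp_i)^2 * var(p_i).
        grad = np.empty_like(p0)
        for i in range(p0.size):
            h = 1e-4 * p0[i]
            ei = np.zeros_like(p0); ei[i] = h
            grad[i] = (burn_velocity(p0 + ei) - burn_velocity(p0 - ei)) / (2 * h)

        var_y = np.sum((grad * sigma) ** 2)
        print("std(burn velocity) =", np.sqrt(var_y))
        print("variance contributions:", np.round((grad * sigma) ** 2 / var_y, 3))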

  18. Uncertainty quantification for quantum chemical models of complex reaction networks.

    Science.gov (United States)

    Proppe, Jonny; Husch, Tamara; Simm, Gregor N; Reiher, Markus

    2016-12-22

    For the quantitative understanding of complex chemical reaction mechanisms, it is, in general, necessary to accurately determine the corresponding free energy surface and to solve the resulting continuous-time reaction rate equations for a continuous state space. For a general (complex) reaction network, it is computationally hard to fulfill these two requirements. However, it is possible to approximately address these challenges in a physically consistent way. On the one hand, it may be sufficient to consider approximate free energies if a reliable uncertainty measure can be provided. On the other hand, a highly resolved time evolution may not be necessary to still determine quantitative fluxes in a reaction network if one is interested in specific time scales. In this paper, we present discrete-time kinetic simulations in discrete state space taking free energy uncertainties into account. The method builds upon thermo-chemical data obtained from electronic structure calculations in a condensed-phase model. Our kinetic approach supports the analysis of general reaction networks spanning multiple time scales, which is here demonstrated for the example of the formose reaction. An important application of our approach is the detection of regions in a reaction network which require further investigation, given the uncertainties introduced by both approximate electronic structure methods and kinetic models. Such cases can then be studied in greater detail with more sophisticated first-principles calculations and kinetic simulations.

  19. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks implemented using Jupyter that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
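
    pyEMU wraps analyses of this type; the core FOSM algebra itself is compact enough to sketch in plain numpy (a generic illustration, not the pyEMU API). As the abstract notes, it needs only a Jacobian and covariances, so it can be run before any parameter estimation:

        import numpy as np

        # FOSM: given a Jacobian J of observations w.r.t. parameters, a prior
        # parameter covariance C_p, and an observation noise covariance C_e,
        # the linearized posterior parameter covariance is
        #     C_post = (J^T C_e^-1 J + C_p^-1)^-1
        # and a forecast y = s^T p has variance s^T C_post s.
        rng = np.random.default_rng(3)
        J = rng.normal(size=(20, 6))               # 20 observations, 6 parameters
        C_p = np.diag(np.full(6, 2.0))             # prior parameter variances
        C_e = np.diag(np.full(20, 0.1))            # observation error variances

        C_post = np.linalg.inv(J.T @ np.linalg.inv(C_e) @ J + np.linalg.inv(C_p))

        s = rng.normal(size=6)                     # forecast sensitivity vector
        print("prior forecast variance    :", s @ C_p @ s)
        # The variance reduction below is the essence of a data worth analysis.
        print("posterior forecast variance:", s @ C_post @ s)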

  20. Uncertainty Quantification for Combined Polynomial Chaos Kriging Surrogate Models

    Science.gov (United States)

    Weinmeister, Justin; Gao, Xinfeng; Krishna Prasad, Aditi; Roy, Sourajeet

    2017-11-01

    Surrogate modeling techniques are currently used to perform uncertainty quantification on computational fluid dynamics (CFD) models for their ability to identify the most impactful parameters on CFD simulations and help reduce computational cost in the engineering design process. The accuracy of these surrogate models depends on a number of factors, such as the training data created from the CFD simulations, the target functions, the surrogate model framework, and so on. Recently, we have combined polynomial chaos expansions (PCE) and Kriging to produce a more accurate surrogate model, polynomial chaos Kriging (PCK). In this talk, we analyze the error convergence rates of the Kriging, PCE, and PCK models on a convection-diffusion-reaction problem, and validate the statistical measures and performance of the PCK method for its application to practical CFD simulations.

  1. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    2012-01-01

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one...... "preferred" GIA model has been used, without any consideration of the possible errors involved. Lacking a rigorous assessment of systematic errors in GIA modeling, the reliability of the results is uncertain. GIA sensitivity and uncertainties associated with the viscosity models have been explored...... in the literature. However, at least two major sources of errors remain. The first is associated with the ice models, spatial distribution of ice and history of melting (this is especially the case of Antarctica), the second with the numerical implementation of model features relevant to sea level modeling...

  2. A sliding mode observer for hemodynamic characterization under modeling uncertainties

    KAUST Repository

    Zayane, Chadia

    2014-06-01

    This paper addresses the case of physiological state reconstruction in a small region of the brain under modeling uncertainties. The misunderstood coupling between the cerebral blood volume and the oxygen extraction fraction has led to a partial knowledge of the so-called balloon model describing the hemodynamic behavior of the brain. To overcome this difficulty, a High Order Sliding Mode observer is applied to the balloon system, where the unknown coupling is considered as an internal perturbation. The effectiveness of the proposed method is illustrated through a set of synthetic data that mimic fMRI experiments.

  3. Selected examples of practical approaches for the assessment of model reliability - parameter uncertainty analysis

    International Nuclear Information System (INIS)

    Hofer, E.; Hoffman, F.O.

    1987-02-01

    The uncertainty analysis of model predictions has to discriminate between two fundamentally different types of uncertainty. The presence of stochastic variability (Type 1 uncertainty) necessitates the use of a probabilistic model instead of the much simpler deterministic one. Lack of knowledge (Type 2 uncertainty), however, applies to deterministic as well as to probabilistic model predictions and often dominates over uncertainties of Type 1. The term "probability" is interpreted differently in the probabilistic analysis of either type of uncertainty. After these discriminations have been explained, the discussion centers on the propagation of parameter uncertainties through the model, the derivation of quantitative uncertainty statements for model predictions and the presentation and interpretation of the results of a Type 2 uncertainty analysis. Various alternative approaches are compared for a very simple deterministic model.

  4. A Bayesian Framework of Uncertainties Integration in 3D Geological Model

    Science.gov (United States)

    Liang, D.; Liu, X.

    2017-12-01

    A 3D geological model can describe complicated geological phenomena in an intuitive way, but its application may be limited by uncertain factors. Although great progress has been made over the years, many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework of uncertainty integration. With this framework, we integrated data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes the uncertainty updating in the modeling process. The maximum entropy principle works well for estimating the prior probability distribution, ensuring that the prior is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtained a posterior distribution that evaluates the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and an approach to studying the uncertainty propagation mechanism in geological modeling.
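
    A toy one-dimensional sketch of the gradual Bayesian integration described here (all distributions hypothetical): a maximum-entropy prior, given only bounds, is updated first with a data-error likelihood and then with soft cognitive information:

        import numpy as np

        # Uncertain quantity: depth of a geological interface [m], on a grid.
        z = np.linspace(90.0, 110.0, 400)

        # Maximum-entropy prior given only bounds: uniform.
        prior = np.ones_like(z); prior /= np.trapz(prior, z)

        def update(p, like):
            post = p * like                  # Bayes: posterior ~ prior x likelihood
            return post / np.trapz(post, z)

        # Stage 1: borehole measurement with known Gaussian data error.
        like_data = np.exp(-0.5 * ((z - 101.0) / 1.5) ** 2)
        post = update(prior, like_data)

        # Stage 2: cognitive information, e.g. "interface very likely below
        # 100 m", encoded as a soft sigmoid likelihood.
        like_cog = 1.0 / (1.0 + np.exp(-(z - 100.0) / 0.5))
        post = update(post, like_cog)

        mean = np.trapz(z * post, z)
        sd = np.sqrt(np.trapz((z - mean) ** 2 * post, z))
        print(f"synthetical estimate: {mean:.2f} m +/- {sd:.2f} m")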

  5. Making Invasion models useful for decision makers; incorporating uncertainty, knowledge gaps, and decision-making preferences

    Science.gov (United States)

    Denys Yemshanov; Frank H Koch; Mark Ducey

    2015-01-01

    Uncertainty is inherent in model-based forecasts of ecological invasions. In this chapter, we explore how the perceptions of that uncertainty can be incorporated into the pest risk assessment process. Uncertainty changes a decision maker’s perceptions of risk; therefore, the direct incorporation of uncertainty may provide a more appropriate depiction of risk. Our...

  6. Application of Probability Methods to Assess Crash Modeling Uncertainty

    Science.gov (United States)

    Lyle, Karen H.; Stockwell, Alan E.; Hardy, Robin C.

    2007-01-01

    Full-scale aircraft crash simulations performed with nonlinear, transient dynamic, finite element codes can incorporate structural complexities such as: geometrically accurate models; human occupant models; and advanced material models to include nonlinear stress-strain behaviors, and material failure. Validation of these crash simulations is difficult due to a lack of sufficient information to adequately determine the uncertainty in the experimental data and the appropriateness of modeling assumptions. This paper evaluates probabilistic approaches to quantify the effects of finite element modeling assumptions on the predicted responses. The vertical drop test of a Fokker F28 fuselage section will be the focus of this paper. The results of a probabilistic analysis using finite element simulations will be compared with experimental data.

  7. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (i) propose a novel approach for coupling mesoscale and macroscale models, (ii) devise efficient numerical methods for simulating the coupled system, and (iii) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  8. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Background: In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results: The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions: While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate performance and accuracy of the approach and its limitations.
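
    For a scalar ODE dx/dt = f(x), the density along a characteristic obeys dp/dt = -p ∂f/∂x (the Liouville equation), so a single extra state suffices. A sketch with a logistic ODE chosen purely for illustration:

        import numpy as np
        from scipy.integrate import solve_ivp

        r, K = 1.0, 10.0
        f = lambda x: r * x * (1 - x / K)          # logistic ODE dx/dt = f(x)
        dfdx = lambda x: r * (1 - 2 * x / K)

        def augmented(t, y):
            x, p = y                               # state and density value
            return [f(x), -p * dfdx(x)]            # Liouville along characteristic

        # Uncertain initial condition x0 ~ N(2, 0.3^2); propagate a fan of
        # characteristics, each carrying its density value.
        x0 = np.linspace(1.0, 3.0, 9)
        p0 = np.exp(-0.5 * ((x0 - 2.0) / 0.3) ** 2) / (0.3 * np.sqrt(2 * np.pi))
        for xi, pi in zip(x0, p0):
            sol = solve_ivp(augmented, (0.0, 2.0), [xi, pi], rtol=1e-8)
            print(f"x0={xi:.2f}: x(2)={sol.y[0, -1]:6.3f}, "
                  f"p(x(2))={sol.y[1, -1]:.4f}")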

  9. Quantifying uncertainty in Bayesian calibrated animal-to-human PBPK models with informative prior distributions

    Science.gov (United States)

    Understanding and quantifying the uncertainty of model parameters and predictions has gained more interest in recent years with the increased use of computational models in chemical risk assessment. Fully characterizing the uncertainty in risk metrics derived from linked quantita...

  10. Reliability of Coulomb stress changes inferred from correlated uncertainties of finite-fault source models

    KAUST Repository

    Woessner, J.

    2012-07-14

    Static stress transfer is one physical mechanism to explain triggered seismicity. Coseismic stress-change calculations strongly depend on the parameterization of the causative finite-fault source model. These models are uncertain due to uncertainties in input data, model assumptions, and modeling procedures. However, fault model uncertainties have usually been ignored in stress-triggering studies and have not been propagated to assess the reliability of Coulomb failure stress change (ΔCFS) calculations. We show how these uncertainties can be used to provide confidence intervals for co-seismic ΔCFS-values. We demonstrate this for the MW = 5.9 June 2000 Kleifarvatn earthquake in southwest Iceland and systematically map these uncertainties. A set of 2500 candidate source models from the full posterior fault-parameter distribution was used to compute 2500 ΔCFS maps. We assess the reliability of the ΔCFS-values from the coefficient of variation (CV) and deem ΔCFS-values to be reliable where they are at least twice as large as the standard deviation (CV ≤ 0.5). Unreliable ΔCFS-values are found near the causative fault and between lobes of positive and negative stress change, where a small change in fault strike causes ΔCFS-values to change sign. The most reliable ΔCFS-values are found away from the source fault in the middle of positive and negative ΔCFS-lobes, a likely general pattern. Using the reliability criterion, our results support the static stress-triggering hypothesis. Nevertheless, our analysis also suggests that results from previous stress-triggering studies not considering source model uncertainties may have led to a biased interpretation of the importance of static stress-triggering.
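
    Once the ensemble of ΔCFS maps is in hand, the reliability screening itself is a few lines; a sketch with a random stand-in ensemble (the threshold mimics the paper's CV ≤ 0.5 rule, everything else is synthetic):

        import numpy as np

        rng = np.random.default_rng(7)
        # Stand-in ensemble: 2500 candidate source models, each giving a
        # ΔCFS map on a 50 x 50 grid (random fields here, for illustration).
        dcfs = (0.05 * rng.normal(size=(2500, 50, 50))
                + np.linspace(-0.1, 0.1, 50)[None, None, :])

        mean = dcfs.mean(axis=0)
        std = dcfs.std(axis=0)
        cv = std / np.abs(mean)          # near-zero means give huge CV, as intended

        # Deem a cell reliable where |mean| is at least twice the std (CV <= 0.5).
        reliable = cv <= 0.5
        print(f"reliable fraction of the map: {reliable.mean():.2%}")
        positive_reliable = reliable & (mean > 0)   # robust positive-stress lobes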

  11. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that must be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  12. Uncertainties in storm surge and coastal inundation modeling

    Science.gov (United States)

    Dukhovskoy, D. S.; Morey, S. L.

    2011-12-01

    Storm surge modeling has developed noticeably over the past two decades, marching from relatively simple two-dimensional models with simplified physics and coarse computational grids to complex three-dimensional modeling systems with wetting and drying capabilities and high-resolution grids. Although the physics of a storm surge is conceptually straightforward, it is still a challenge to provide an accurate numerical forecast of a storm surge. The presentation will focus on sources of uncertainty in storm surge modeling based on several real-case simulations of storm surge in the Gulf of Mexico. An ongoing study on estimating the likelihood of coastal inundation along the U.S. Gulf coast will be presented.

  13. Workshop on Model Uncertainty and its Statistical Implications

    CERN Document Server

    1988-01-01

    In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.

  14. All wind farm uncertainty is not the same: The economics of common versus independent causes

    International Nuclear Information System (INIS)

    Veers, P.S.

    1994-01-01

    There is uncertainty in the performance of wind energy installations due to unknowns in the local wind environment, machine response to the environment, and the durability of materials. Some of the unknowns are inherently independent from machine to machine while other uncertainties are common to the entire fleet equally. The FAROW computer software for fatigue and reliability of wind turbines is used to calculate the probability of component failure due to a combination of all sources of uncertainty. Although the total probability of component failure due to all effects is sometimes interpreted as the percentage of components likely to fail, this perception is often far from correct. Different amounts of common versus independent uncertainty are reflected in economic risk due to either high probabilities that a small percentage of the fleet will experience problems or low probabilities that the entire fleet will have problems. The average, or expected cost is the same as would be calculated by combining all sources of uncertainty, but the risk to the fleet may be quite different in nature. Present values of replacement costs are compared for two examples reflecting different stages in the design and development process. Results emphasize that an engineering effort to test and evaluate the design assumptions is necessary to advance a design from the high uncertainty of the conceptual stages to the lower uncertainty of a well engineered and tested machine
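
    A sketch of the economics of the distinction (all numbers hypothetical): splitting the same total uncertainty into common and independent parts leaves the expected failure count unchanged but moves probability mass onto fleet-wide outcomes:

        import numpy as np

        rng = np.random.default_rng(11)
        n_turbines, n_trials = 100, 20000

        def fleet_failures(common_frac):
            # Total log-load uncertainty split into a fleet-common part and an
            # independent per-machine part; a machine fails if its load > 1.5.
            s_tot = 0.25
            s_com = np.sqrt(common_frac) * s_tot
            s_ind = np.sqrt(1 - common_frac) * s_tot
            common = rng.normal(0, s_com, (n_trials, 1))
            indep = rng.normal(0, s_ind, (n_trials, n_turbines))
            load = np.exp(common + indep)
            return (load > 1.5).mean(axis=1)   # fraction of the fleet failing

        for frac in (0.0, 0.9):
            f = fleet_failures(frac)
            print(f"common share {frac:.0%}: mean failures {f.mean():.1%}, "
                  f"P(>20% of fleet fails) = {(f > 0.20).mean():.1%}")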

  15. Uncertainty quantification and sensitivity analysis of an arterial wall mechanics model for evaluation of vascular drug therapies.

    Science.gov (United States)

    Heusinkveld, Maarten H G; Quicken, Sjeng; Holtackers, Robert J; Huberts, Wouter; Reesink, Koen D; Delhaas, Tammo; Spronck, Bart

    2018-02-01

    Quantification of the uncertainty in constitutive model predictions describing arterial wall mechanics is vital for non-invasive assessment of vascular drug therapies. Therefore, we perform uncertainty quantification to determine uncertainty in mechanical characteristics describing the vessel wall response upon loading. Furthermore, a global variance-based sensitivity analysis is performed to pinpoint measurements that are most rewarding to be measured more precisely. We used previously published carotid diameter-pressure and intima-media thickness (IMT) data (measured in triplicate), and Holzapfel-Gasser-Ogden models. A virtual data set containing 5000 diastolic and systolic diameter-pressure points, and IMT values was generated by adding measurement error to the average of the measured data. The model was fitted to single-exponential curves calculated from the data, obtaining distributions of constitutive parameters and constituent load bearing parameters. Additionally, we (1) simulated vascular drug treatment to assess the relevance of model uncertainty and (2) evaluated how increasing the number of measurement repetitions influences model uncertainty. We found substantial uncertainty in constitutive parameters. Simulating vascular drug treatment predicted a 6-percentage-point reduction in collagen load bearing ([Formula: see text]), approximately 50% of its uncertainty. Sensitivity analysis indicated that the uncertainty in [Formula: see text] was primarily caused by noise in distension and IMT measurements. Spread in [Formula: see text] could be decreased by 50% when increasing the number of measurement repetitions from 3 to 10. Model uncertainty, notably that in [Formula: see text], could conceal effects of vascular drug therapy. However, this uncertainty could be reduced by increasing the number of measurement repetitions of the distension and wall thickness measurements used for model parameterisation.

  16. Nuclear reaction rate uncertainties and astrophysical modeling: Carbon yields from low-mass giants

    International Nuclear Information System (INIS)

    Herwig, Falk; Austin, Sam M.; Lattanzio, John C.

    2006-01-01

    Calculations that demonstrate the influence of three key nuclear reaction rates on the evolution of asymptotic giant branch stars have been carried out. We study the case of a star with an initial mass of 2 M⊙ and a metallicity of Z=0.01, somewhat less than the solar metallicity. The dredge-up of nuclear processed material from the interior of the star and the yield predictions for carbon are sensitive to the rates of the ¹⁴N(p,γ)¹⁵O and triple-α reactions. These reactions dominate the H- and He-burning shells of stars in this late evolutionary phase. Published uncertainty estimates for each of these two rates, propagated through stellar evolution calculations, cause uncertainties in carbon enrichment and yield predictions of about a factor of 2. The other important He-burning reaction, ¹²C(α,γ)¹⁶O, although associated with the largest uncertainty in our study, does not have a significant influence on the abundance evolution compared with other modeling uncertainties. This finding remains valid when the entire evolution from the main sequence to the tip of the asymptotic giant branch is considered. We discuss the experimental sources of the rate uncertainties addressed here and give an outlook on future work.

  17. The uncertainty of modeled soil carbon stock change for Finland

    Science.gov (United States)

    Lehtonen, Aleksi; Heikkinen, Juha

    2013-04-01

    Countries should report the soil carbon stock changes of forests under the Kyoto Protocol. Under the Protocol one can omit reporting of a carbon pool by verifying that the pool is not a source of carbon, which is especially tempting for the soil pool. However, verifying that the soils of a nation are not a source of carbon in a given year seems to be nearly impossible. The Yasso07 model was parametrized against various decomposition data using an MCMC method. Soil carbon changes in Finland between 1972 and 2011 were simulated with the Yasso07 model using litter input data derived from the National Forest Inventory (NFI) and fellings time series. The uncertainties of biomass models, litter turnover rates, NFI sampling and the Yasso07 model were propagated with Monte Carlo simulations. Due to the biomass estimation methods, the uncertainties of the various litter input sources (e.g. living trees, natural mortality and fellings) correlate strongly with each other. We show how the original covariance matrices can be combined analytically, which greatly reduces the number of simulated components. While doing the simulations we found that proper handling of correlations may be even more essential than accurate estimates of standard errors. As a preliminary result, we found that both southern and northern Finland were soil carbon sinks, with coefficients of variation (CV) varying from 10% to 25% when the model was driven with long-term constant weather data. When we applied annual weather data, soils were both sinks and sources of carbon and CVs varied from 10% to 90%. This implies that the success of soil carbon sink verification depends on the weather data applied with the models. Due to this fact the IPCC should provide clear guidance on the weather data to be applied with soil carbon models and also on soil carbon sink verification. In UNFCCC reporting, carbon sinks of forest biomass have typically been averaged over five years - a similar period for soil model weather data would be logical.

  18. Uncertainties in modeling hazardous gas releases for emergency response

    Directory of Open Access Journals (Sweden)

    Kathrin Baumann-Stanzer

    2011-02-01

    In case of an accidental release of toxic gases the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stabilities and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s⁻¹ in wind speed, on the order of 50 degrees in wind direction, up to 4°C in air temperature and up to 10% in relative humidity. The observed air temperature and humidity are well reproduced by INCA with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders.

  19. Uncertainties in modeling hazardous gas releases for emergency response

    Energy Technology Data Exchange (ETDEWEB)

    Baumann-Stanzer, Kathrin; Stenzel, Sirma [Zentralanstalt fuer Meteorologie und Geodynamik, Vienna (Austria)

    2011-02-15

    In case of an accidental release of toxic gases the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stabilities and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary by up to a factor of 4 due to different input requirements as well as different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s⁻¹ in wind speed, on the order of 50 degrees in wind direction, up to 4°C in air temperature and up to 10% in relative humidity. The observed air temperature and humidity are well reproduced by INCA with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders. (orig.)

  20. Improved design of constrained model predictive tracking control for batch processes against unknown uncertainties.

    Science.gov (United States)

    Wu, Sheng; Jin, Qibing; Zhang, Ridong; Zhang, Junfeng; Gao, Furong

    2017-07-01

    In this paper, an improved constrained tracking control design is proposed for batch processes under uncertainties. A new process model that facilitates process state and tracking error augmentation with further additional tuning is first proposed. Then the subsequent controller design is formulated using robust stable constrained MPC optimization. Unlike conventional robust model predictive control (MPC), the proposed method gives the controller design more degrees of freedom for tuning, so that improved tracking control can be achieved, which is important since uncertainties inevitably exist in practice and cause model/plant mismatches. An injection molding process is introduced to illustrate the effectiveness of the proposed MPC approach in comparison with conventional robust MPC. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any pair of forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of the dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach, based on transforming observed probability distributions of discharges and forecasts into normal distributions, is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data are converted into normally distributed variables.
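
    Under joint normality, the conditional mean of the observed discharge given two forecasts reduces to a multiple linear regression, which is the equivalence the abstract reports; a minimal sketch with synthetic forecasts:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 2000
        truth = rng.gamma(4.0, 2500.0, n)                    # "observed" discharge

        # Two structurally different forecast models (synthetic stand-ins):
        f_reg = 0.8 * truth + rng.normal(0, 1500, n)         # regression model
        f_rr = 1.1 * truth + rng.normal(0, 2500, n)          # rainfall-runoff model

        # Conditional forecast E[Q | f1, f2]: a multiple linear regression,
        # equivalent to the Bayesian cpdf mean for normal prior and likelihood.
        A = np.column_stack([np.ones(n), f_reg, f_rr])
        coef, *_ = np.linalg.lstsq(A, truth, rcond=None)
        combined = A @ coef

        for name, f in [("regression only", f_reg),
                        ("rainfall-runoff only", f_rr),
                        ("combined", combined)]:
            rmse = np.sqrt(np.mean((f - truth) ** 2))
            print(f"{name:>22s}: RMSE = {rmse:8.1f}")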

  2. Modeling a Hybrid Microgrid Using Probabilistic Reconfiguration under System Uncertainties

    Directory of Open Access Journals (Sweden)

    Hadis Moradi

    2017-09-01

    A novel method for the day-ahead optimal operation of a hybrid microgrid system including fuel cells, photovoltaic arrays, a microturbine, and battery energy storage, in order to fulfill the required load demand, is presented in this paper. In the proposed system, the microgrid has access to the main utility grid in order to exchange power when required. Available municipal waste is utilized to produce the hydrogen required for running the fuel cells, and natural gas will be used as the backup source. In the proposed method, an energy schedule is introduced to optimize the generating unit power outputs for the next day, as well as the power flow with the main grid, in order to minimize the operational costs and the produced greenhouse gas emissions. Renewable generation and electric power consumption are both intermittent and unpredictable, and the uncertainty related to PV array power generation and power consumption has been considered in the next-day energy scheduling. In order to model the uncertainties, scenarios are produced via Monte Carlo (MC) simulation, and the microgrid's optimal energy scheduling is analyzed under the generated scenarios. In addition, the various scenarios created by the MC simulations are applied in order to solve unit commitment (UC) problems. The microgrid's day-ahead operation and emission costs are considered as the objective functions, and the particle swarm optimization algorithm is employed to solve the optimization problem. Overall, the proposed model is capable of minimizing the system costs, as well as the unfavorable influence of uncertainties on the microgrid's profit, by generating different scenarios.

  3. Numerical solution of dynamic equilibrium models under Poisson uncertainty

    DEFF Research Database (Denmark)

    Posch, Olaf; Trimborn, Timo

    2013-01-01

    We propose a simple and powerful numerical algorithm to compute the transition process in continuous-time dynamic equilibrium models with rare events. In this paper we transform the dynamic system of stochastic differential equations into a system of functional differential equations...... of the retarded type. We apply the Waveform Relaxation algorithm, i.e., we provide a guess of the policy function and solve the resulting system of (deterministic) ordinary differential equations by standard techniques. For parametric restrictions, analytical solutions to the stochastic growth model and a novel...... solution to Lucas' endogenous growth model under Poisson uncertainty are used to compute the exact numerical error. We show how (potential) catastrophic events such as rare natural disasters substantially affect the economic decisions of households....

  4. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments, using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and show how these UQ techniques can be used in model selection and in assessing the quality of calibrated physical parameters.

  5. How well can we forecast future model error and uncertainty by mining past model performance data

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    Consider a hydrological model Y(t) = M(X(t), P), where X = vector of inputs; P = vector of parameters; Y = model output (typically flow); t = time. In cases where there is enough past data on the performance of model M, it is possible to use these data to build a (data-driven) model EC of the error of model M. This model EC will be able to forecast the error E when a new input X is fed into model M; then, by subtracting E from the model prediction Y, a better estimate of Y can be obtained. Model EC is usually called an error corrector (in meteorology, a bias corrector). However, we may go further in characterizing model deficiencies: instead of using the error (a real value) we may consider a more sophisticated, probabilistic characterization. So instead of a model EC of the model M error it is also possible to build a model U of model M uncertainty; if uncertainty is described as the model error distribution D, this model will calculate its properties - mean, variance, other moments, and quantiles. The general form of this model could be: D = U(RV), where RV = vector of relevant variables having influence on model uncertainty (to be identified e.g. by mutual information analysis); D = vector of variables characterizing the error distribution (typically, two or more quantiles). There is one aspect which is not always explicitly mentioned in uncertainty analysis work. In our view it is important to distinguish the following main types of model uncertainty: 1. The residual uncertainty of models. In this case the model parameters and/or model inputs are considered to be fixed (deterministic), i.e. the model is considered to be optimal (calibrated) and deterministic. Model error is considered as the manifestation of uncertainty. If there is enough past data about the model errors (i.e. its uncertainty), it is possible to build a statistical or machine learning model of uncertainty trained on these data. Here the following methods can be mentioned: (a) quantile regression (QR
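
    A compact sketch of such a model U using quantile regression (type (a) above), with toy heteroscedastic data; the learner and the choice of quantiles are illustrative assumptions:

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(9)
        n = 3000

        # Relevant variables RV (e.g. recent rainfall, season) and the error E
        # of a hydrological model M, with heteroscedastic spread (toy data).
        RV = rng.uniform(0, 1, (n, 2))
        E = 0.5 * RV[:, 0] + (0.2 + RV[:, 1]) * rng.normal(size=n)

        # Model U: predict quantiles of the error distribution D from RV.
        quantiles = {}
        for q in (0.05, 0.50, 0.95):
            m = GradientBoostingRegressor(loss="quantile", alpha=q,
                                          n_estimators=200, max_depth=3)
            quantiles[q] = m.fit(RV, E)

        rv_new = np.array([[0.8, 0.9]])
        lo, med, hi = (quantiles[q].predict(rv_new)[0] for q in (0.05, 0.50, 0.95))
        print(f"error: median {med:+.2f}, 90% interval [{lo:+.2f}, {hi:+.2f}]")
        # Subtracting the median error from M's output corrects the forecast;
        # the interval characterizes its residual uncertainty.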

  6. Multi-Fidelity Uncertainty Propagation for Cardiovascular Modeling

    Science.gov (United States)

    Fleeter, Casey; Geraci, Gianluca; Schiavazzi, Daniele; Kahn, Andrew; Marsden, Alison

    2017-11-01

    Hemodynamic models are successfully employed in the diagnosis and treatment of cardiovascular disease with increasing frequency. However, their widespread adoption is hindered by our inability to account for uncertainty stemming from multiple sources, including boundary conditions, vessel material properties, and model geometry. In this study, we propose a stochastic framework which leverages three cardiovascular model fidelities: 3D, 1D and 0D models. 3D models are generated from patient-specific medical imaging (CT and MRI) of aortic and coronary anatomies using the SimVascular open-source platform, with fluid structure interaction simulations and Windkessel boundary conditions. 1D models consist of a simplified geometry automatically extracted from the 3D model, while 0D models are obtained from equivalent circuit representations of blood flow in deformable vessels. Multi-level and multi-fidelity estimators from Sandia's open-source DAKOTA toolkit are leveraged to reduce the variance in our estimated output quantities of interest while maintaining a reasonable computational cost. The performance of these estimators in terms of computational cost reductions is investigated for a variety of output quantities of interest, including global and local hemodynamic indicators. Sandia National Labs is a multimission laboratory managed and operated by NTESS, LLC, for the U.S. DOE under contract DE-NA0003525. Funding for this project provided by NIH-NIBIB R01 EB018302.
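
    In its simplest two-fidelity form, the multilevel/multi-fidelity machinery referenced above reduces to a control-variate identity. The sketch below, with invented surrogate functions standing in for the 3D and 0D models (it does not call SimVascular or DAKOTA), estimates a mean output by combining a few expensive evaluations with many cheap ones:

        import numpy as np

        rng = np.random.default_rng(1)

        # Invented stand-ins for a high-fidelity (3D) and a correlated
        # low-fidelity (0D) hemodynamic quantity of interest.
        def q_high(z):
            return np.sin(z) + 0.1 * z**2

        def q_low(z):
            return np.sin(z)                   # cheaper, correlated surrogate

        n_hf, n_lf = 50, 5000                  # few costly runs, many cheap
        z_hf = rng.normal(size=n_hf)
        z_lf = rng.normal(size=n_lf)

        qh = q_high(z_hf)
        ql_paired = q_low(z_hf)                # low fidelity at shared inputs
        ql = q_low(z_lf)

        # Control-variate weight from the sample covariance.
        alpha = np.cov(qh, ql_paired)[0, 1] / np.var(ql_paired, ddof=1)

        # Two-fidelity estimate of E[q_high], with reduced variance.
        q_mf = qh.mean() + alpha * (ql.mean() - ql_paired.mean())
        print(q_mf)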

  7. A model for optimization of process integration investments under uncertainty

    International Nuclear Information System (INIS)

    Svensson, Elin; Stroemberg, Ann-Brith; Patriksson, Michael

    2011-01-01

    The long-term economic outcome of energy-related industrial investment projects is difficult to evaluate because of uncertain energy market conditions. In this article, a general, multistage, stochastic programming model for the optimization of investments in process integration and industrial energy technologies is proposed. The problem is formulated as a mixed-binary linear programming model where uncertainties are modelled using a scenario-based approach. The objective is to maximize the expected net present value of the investments, which enable heat savings and decreased energy imports or increased energy exports at an industrial plant. The proposed modelling approach enables long-term planning of industrial, energy-related investments through the simultaneous optimization of immediate and later decisions. The stochastic programming approach is also suitable for modelling possibly complex process integration constraints. The general model formulation presented here is a suitable basis for more specialized case studies dealing with optimization of investments in energy efficiency. -- Highlights: → Stochastic programming approach to long-term planning of process integration investments. → Extensive mathematical model formulation. → Multi-stage investment decisions and scenario-based modelling of uncertain energy prices. → Results illustrate how investments made now affect later investment and operation opportunities. → Approach for evaluation of robustness with respect to variations in probability distribution.
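
    A minimal two-stage scenario program in this spirit can be written with the PuLP modelling library; the investment cost, prices and probabilities below are invented toy numbers, not data from the article:

        from pulp import LpProblem, LpMaximize, LpVariable, lpSum

        # Toy two-stage problem: decide now whether to build a heat-recovery
        # unit (binary), then operate it under three energy-price scenarios.
        p = {"low": 0.3, "mid": 0.5, "high": 0.2}        # scenario probabilities
        price = {"low": 10.0, "mid": 20.0, "high": 35.0} # value of saved heat
        invest_cost, heat_saved = 100.0, 8.0

        problem = LpProblem("process_integration", LpMaximize)
        build = LpVariable("build", cat="Binary")        # first-stage decision
        run = {s: LpVariable(f"run_{s}", lowBound=0, upBound=1) for s in p}

        # Objective: expected net present value over the scenarios.
        problem += (lpSum(p[s] * price[s] * heat_saved * run[s] for s in p)
                    - invest_cost * build)

        # Second-stage operation is only possible after the investment.
        for s in p:
            problem += run[s] <= build

        problem.solve()
        print(build.value(), {s: run[s].value() for s in p})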

  8. Feedback versus uncertainty

    NARCIS (Netherlands)

    Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.

    2014-01-01

    Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, the transition from continuous time and space to discrete time and space, which causes loss of

  9. A risk-averse optimization model for trading wind energy in a market environment under uncertainty

    International Nuclear Information System (INIS)

    Pousinho, H.M.I.; Mendes, V.M.F.; Catalao, J.P.S.

    2011-01-01

    In this paper, a stochastic programming approach is proposed for trading wind energy in a market environment under uncertainty. Uncertainty in energy market prices is the main cause of the high volatility of profits achieved by power producers. The volatile and intermittent nature of wind energy represents another source of uncertainty. Hence, each uncertain parameter is modeled by scenarios, where each scenario represents a plausible realization of the uncertain parameters with an associated occurrence probability. An appropriate risk measure is also considered. The proposed approach is applied to a realistic case study, based on a wind farm in Portugal. Finally, conclusions are duly drawn. -- Highlights: → We model uncertainties in energy market prices and wind power production. → A hybrid intelligent approach generates price-wind power scenarios. → Risk aversion is also incorporated in the proposed stochastic programming approach. → A realistic case study, based on a wind farm in Portugal, is provided. → Our approach allows selecting the best solution according to the desired risk exposure level.
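
    Scenario-based risk measures of the kind used in such models are easy to compute once profit scenarios exist. The sketch below draws invented price and wind scenarios and evaluates Value-at-Risk and Conditional Value-at-Risk at the 95% level; it illustrates the risk-measure step only, not the full stochastic program:

        import numpy as np

        rng = np.random.default_rng(2)

        # Hypothetical joint scenarios of market price and wind output
        # (in practice produced by a scenario-generation tool).
        n = 1000
        price = rng.normal(50.0, 10.0, n)          # EUR/MWh
        wind = rng.uniform(0.0, 100.0, n)          # MWh delivered
        profit = price * wind                      # profit per scenario
        w = np.full(n, 1.0 / n)                    # equal scenario weights

        beta = 0.95                                # confidence level
        # Value-at-Risk: the beta-quantile of the loss (negative profit).
        var = -np.quantile(profit, 1.0 - beta)
        # Conditional Value-at-Risk: expected loss beyond the VaR.
        tail = profit <= -var
        cvar = -(w[tail] * profit[tail]).sum() / w[tail].sum()
        print(var, cvar)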

  10. Parameter estimation techniques and uncertainty in ground water flow model predictions

    International Nuclear Information System (INIS)

    Zimmerman, D.A.; Davis, P.A.

    1990-01-01

    Quantification of uncertainty in predictions of nuclear waste repository performance is a requirement of Nuclear Regulatory Commission regulations governing the licensing of proposed geologic repositories for high-level radioactive waste disposal. One of the major uncertainties in these predictions is in estimating the ground-water travel time of radionuclides migrating from the repository to the accessible environment. The cause of much of this uncertainty has been attributed to a lack of knowledge about the hydrogeologic properties that control the movement of radionuclides through the aquifers. A major reason for this lack of knowledge is the paucity of data that is typically available for characterizing complex ground-water flow systems. Because of this, considerable effort has been put into developing parameter estimation techniques that infer property values in regions where no measurements exist. Currently, no single technique has been shown to be superior or even consistently conservative with respect to predictions of ground-water travel time. This work was undertaken to compare a number of parameter estimation techniques and to evaluate how differences in the parameter estimates and the estimation errors are reflected in the behavior of the flow model predictions. That is, we wished to determine to what degree uncertainties in flow model predictions may be affected simply by the choice of parameter estimation technique used. 3 refs., 2 figs

  11. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    Science.gov (United States)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  12. Uncertainty Estimation in SiGe HBT Small-Signal Modeling

    DEFF Research Database (Denmark)

    Masood, Syed M.; Johansen, Tom Keinicke; Vidkjær, Jens

    2005-01-01

    An uncertainty estimation and sensitivity analysis is performed on multi-step de-embedding for SiGe HBT small-signal modeling. The uncertainty estimation in combination with uncertainty model for deviation in measured S-parameters, quantifies the possible error value in de-embedded two-port param...

  13. Innovative supply chain optimization models with multiple uncertainty factors

    DEFF Research Database (Denmark)

    Choi, Tsan Ming; Govindan, Kannan; Li, Xiang

    2017-01-01

    Uncertainty is an inherent factor that affects all dimensions of supply chain activities. In today’s business environment, initiatives to deal with one specific type of uncertainty might not be effective since other types of uncertainty factors and disruptions may be present. These factors relate...

  14. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte...... of design performance criteria differs significantly. The implication for the practical applications of uncertainty analysis in the wastewater industry is profound: (i) as the uncertainty analysis results are specific to the framing used, the results must be interpreted within the context of that framing...... to stoichiometric, biokinetic and influent parameters; (2) uncertainty due to hydraulic behaviour of the plant and mass transfer parameters; (3) uncertainty due to the combination of (1) and (2). The results demonstrate that depending on the way the uncertainty analysis is framed, the estimated uncertainty...

  15. TECHNICAL PRODUCT RISK ASSESSMENT: STANDARDS, INTEGRATION IN THE ERM MODEL AND UNCERTAINTY MODELING

    Directory of Open Access Journals (Sweden)

    Mirko Djapic

    2016-03-01

    Full Text Available Through the introduction of the New Approach to technical harmonization and standardization, the European Union has accomplished a breakthrough in the field of technical product safety and conformity assessment, in that it integrated product safety requirements into the product development process. This is achieved by quantifying risk levels with the aim of determining the scope of the required safety measures and systems. The theory of probability is used as a tool for modeling uncertainties in the assessment of that risk. In the last forty years, however, new mathematical theories have been developed that have proven better at modeling uncertainty when not enough data about uncertain events are available, which is usually the case in product development. Bayesian networks, based on the modeling of subjective probability, and evidence networks, based on the Dempster-Shafer theory of belief functions, have proven to be excellent tools for modeling uncertainty when we do not have enough information about all aspects of the events.
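
    To make the Dempster-Shafer machinery concrete, the sketch below implements Dempster's rule of combination on a two-hypothesis frame {safe, unsafe}, where mass assigned to the whole frame expresses expert ignorance. The frame and the expert mass assignments are invented for the example:

        # Dempster's rule of combination for two mass functions on the frame
        # {safe, unsafe}; mass may also sit on the whole frame (ignorance).
        def combine(m1, m2):
            frame = {"safe", "unsafe", "either"}   # "either" = {safe, unsafe}

            def meet(a, b):
                if a == "either":
                    return b
                if b == "either":
                    return a
                return a if a == b else None       # None = empty intersection

            combined = {h: 0.0 for h in frame}
            conflict = 0.0
            for a, wa in m1.items():
                for b, wb in m2.items():
                    c = meet(a, b)
                    if c is None:
                        conflict += wa * wb
                    else:
                        combined[c] += wa * wb
            # Normalize by the non-conflicting mass.
            return {h: v / (1.0 - conflict) for h, v in combined.items()}

        # Two expert assessments of a product hazard, with explicit ignorance.
        expert1 = {"unsafe": 0.6, "either": 0.4}
        expert2 = {"unsafe": 0.5, "safe": 0.2, "either": 0.3}
        print(combine(expert1, expert2))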

  16. Uncertainties of Large-Scale Forcing Caused by Surface Turbulence Flux Measurements and the Impacts on Cloud Simulations at the ARM SGP Site

    Science.gov (United States)

    Tang, S.; Xie, S.; Tang, Q.; Zhang, Y.

    2017-12-01

    Two types of instruments, the eddy correlation flux measurement system (ECOR) and the energy balance Bowen ratio system (EBBR), are used at the Atmospheric Radiation Measurement (ARM) program Southern Great Plains (SGP) site to measure surface latent and sensible fluxes. ECOR and EBBR typically sample different land surface types, and the domain-mean surface fluxes derived from ECOR and EBBR are not always consistent. The uncertainties of the surface fluxes will have impacts on the derived large-scale forcing data and further affect the simulations of single-column models (SCM), cloud-resolving models (CRM) and large-eddy simulation models (LES), especially for the shallow-cumulus clouds which are mainly driven by surface forcing. This study aims to quantify the uncertainties of the large-scale forcing caused by surface turbulence flux measurements and investigate the impacts on cloud simulations using long-term observations from the ARM SGP site.

  17. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    Science.gov (United States)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic transmissivity fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really span the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  18. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks. © 2012 Society for Risk Analysis.
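
    Although the article's own benchmark-based adjustment is more elaborate, the basic object here — a quantile estimate and the uncertainty surrounding it — can be illustrated with a bootstrap on a profit-and-loss series. The data below are simulated, not taken from the paper:

        import numpy as np

        rng = np.random.default_rng(3)

        # Hypothetical daily P&L series for a trading book (heavy-tailed).
        pnl = rng.standard_t(df=4, size=1000) * 1e6

        alpha = 0.01                               # 99% Value-at-Risk
        var_point = -np.quantile(pnl, alpha)       # point estimate of VaR

        # Bootstrap the quantile to expose its estimation uncertainty.
        boot = np.array([
            -np.quantile(rng.choice(pnl, size=pnl.size, replace=True), alpha)
            for _ in range(2000)
        ])
        lo, hi = np.percentile(boot, [2.5, 97.5])
        print(var_point, (lo, hi))                 # VaR and a 95% interval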

  19. Quantifying uncertainty in LCA-modelling of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  20. Addressing model uncertainty in dose-response: The case of chloroform

    International Nuclear Information System (INIS)

    Evans, J.S.

    1994-01-01

    This paper discusses the issues involved in addressing model uncertainty in the analysis of dose-response relationships. A method for addressing model uncertainty is described and applied to characterize the uncertainty in estimates of the carcinogenic potency of chloroform. The approach, which is rooted in Bayesian concepts of subjective probability, uses probability trees and formally-elicited expert judgments to address model uncertainty. It is argued that a similar approach could be used to improve the characterization of model uncertainty in the dose-response relationships for health effects from ionizing radiation

  1. Model uncertainty in financial markets : Long run risk and parameter uncertainty

    NARCIS (Netherlands)

    de Roode, F.A.

    2014-01-01

    Uncertainty surrounding key parameters of financial markets, such as the in- flation and equity risk premium, constitute a major risk for institutional investors with long investment horizons. Hedging the investors’ inflation exposure can be challenging due to the lack of domestic inflation-linked

  2. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  3. Calibration under uncertainty for finite element models of masonry monuments

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sezer,; Hemez, Francois,; Unal, Cetin

    2010-02-01

    Historical unreinforced masonry buildings often include features such as load bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges while defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from the lack of knowledge in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.

  4. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    Groundwater modeling plays an essential role in modern subsurface hydrology research. It’s generally recognized that simulations and predictions by groundwater models are associated with uncertainties that originate from various sources. The two major uncertainty sources are related to model...... parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...

  5. Status of standard model predictions and uncertainties for electroweak observables

    International Nuclear Information System (INIS)

    Kniehl, B.A.

    1993-11-01

    Recent progress in theoretical predictions of electroweak parameters beyond one loop in the standard model is reviewed. The topics include universal corrections of O(G_F^2 M_H^2 M_W^2), O(G_F^2 m_t^4), O(α_s G_F M_W^2), and those due to virtual t anti-t threshold effects, as well as specific corrections to Γ(Z → b anti-b) of O(G_F^2 m_t^4), O(α_s G_F m_t^2), and O(α_s^2 m_b^2/M_Z^2). An update of the hadronic contributions to Δα is presented. Theoretical uncertainties, other than those due to the lack of knowledge of M_H and m_t, are estimated. (orig.)

  6. Nuclear fuel rod stored-energy uncertainty analysis: a study of error propagation in complicated models

    International Nuclear Information System (INIS)

    Olsen, A.R.; Cunningham, M.E.

    1980-01-01

    With the increasing sophistication and use of computer codes in the nuclear industry, there is a growing awareness of the need to identify and quantify the uncertainties of these codes. In any effort to model physical mechanisms, the results obtained from the model are subject to some degree of uncertainty. This uncertainty has two primary sources. First, there is uncertainty in the model's representation of reality. Second, there is an uncertainty in the input data required by the model. If individual models are combined into a predictive sequence, the uncertainties from an individual model will propagate through the sequence and add to the uncertainty of results later obtained. Nuclear fuel rod stored-energy models, characterized as a combination of numerous submodels, exemplify models so affected. Each submodel depends on output from previous calculations and may involve iterative interdependent submodel calculations for the solution. The iterative nature of the model and the cost of running the model severely limit the uncertainty analysis procedures. An approach for uncertainty analysis under these conditions was designed for the particular case of stored-energy models. It is assumed that the complicated model is correct, that a simplified model based on physical considerations can be designed to approximate the complicated model, and that linear error propagation techniques can be used on the simplified model
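
    The simplified-model strategy described above typically reduces to first-order (linear) error propagation, var(f) ≈ J C J^T, with J the Jacobian of the simplified model and C the input covariance. A minimal sketch, with an invented surrogate and made-up numbers rather than an actual stored-energy submodel:

        import numpy as np

        # Toy surrogate standing in for a simplified stored-energy model f(p).
        def f(p):
            temp, conductivity, gap = p
            return temp**2 / conductivity + 50.0 * gap

        p0 = np.array([600.0, 3.0, 0.1])                  # nominal inputs
        C = np.diag([25.0**2, 0.3**2, 0.02**2])           # input covariance

        # Numerical Jacobian by central differences.
        eps = 1e-5 * np.maximum(np.abs(p0), 1.0)
        J = np.array([
            (f(p0 + dp) - f(p0 - dp)) / (2.0 * e)
            for e, dp in zip(eps, np.diag(eps))
        ])

        # First-order propagated variance of the output.
        var_f = J @ C @ J.T
        print(f(p0), np.sqrt(var_f))                      # mean and 1-sigma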

  7. Estimating the Uncertainty In Diameter Growth Model Predictions and Its Effects On The Uncertainty of Annual Inventory Estimates

    Science.gov (United States)

    Ronald E. McRoberts; Veronica C. Lessard

    2001-01-01

    Uncertainty in diameter growth predictions is attributed to three general sources: measurement error or sampling variability in predictor variables, parameter covariances, and residual or unexplained variation around model expectations. Using measurement error and sampling variability distributions obtained from the literature and Monte Carlo simulation methods, the...

  8. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I.

    Science.gov (United States)

    Bautista, M. A.; Fivet, V.; Quinet, P.; Dunn, J.; Gull, T. R.; Kallman, T. R.; Mendoza, C.

    2013-01-01

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II]. Key words: atomic data - atomic processes - line: formation - methods: data analysis - molecular data - molecular processes - techniques: spectroscopic

  9. Quantum-memory-assisted entropic uncertainty in spin models with Dzyaloshinskii-Moriya interaction

    Science.gov (United States)

    Huang, Zhiming

    2018-02-01

    In this article, we investigate the dynamics and correlations of quantum-memory-assisted entropic uncertainty, the tightness of the uncertainty, entanglement, quantum correlation and mixedness for various spin chain models with Dzyaloshinskii-Moriya (DM) interaction, namely the XXZ, XY and Ising models with DM interaction. We find that the uncertainty grows to a stable value with increasing temperature but decreases as the coupling coefficient, anisotropy parameter and DM interaction strength increase. The entropic uncertainty is found to be closely correlated with the mixedness of the system. Increasing quantum correlation can reduce the uncertainty, and quantum correlation is more robust than entanglement, since entanglement undergoes sudden death and birth. The tightness of the uncertainty drops to zero, apart from slight fluctuations, as the various parameters increase. Furthermore, we propose an effective approach to steering the uncertainty by weak measurement reversal.

  10. Common Cause Failure Modeling: Aerospace Versus Nuclear

    Science.gov (United States)

    Stott, James E.; Britton, Paul; Ring, Robert W.; Hark, Frank; Hatfield, G. Spencer

    2010-01-01

    Aggregate nuclear plant failure data are used to produce generic common-cause factors intended specifically for use in the common-cause failure models of NUREG/CR-5485. Furthermore, the models presented in NUREG/CR-5485 are specifically designed to incorporate two significantly distinct assumptions about the methods of surveillance testing from which this aggregate failure data came. What are the implications of using these NUREG generic factors to model the common-cause failures of aerospace systems? Herein, the implications of using the NUREG generic factors in the modeling of aerospace systems are investigated in detail, and strong recommendations for modeling the common-cause failures of aerospace systems are given.

  11. Assessing uncertainty in SRTM elevations for global flood modelling

    Science.gov (United States)

    Hawker, L. P.; Rougier, J.; Neal, J. C.; Bates, P. D.

    2017-12-01

    The SRTM DEM is widely used as the topography input to flood models in data-sparse locations. Understanding spatial error in the SRTM product is crucial for constraining uncertainty about elevations and for assessing the impact of that uncertainty on flood prediction. An assessment of SRTM error was carried out by Rodriguez et al. (2006), but it did not explicitly quantify the spatial structure of vertical errors in the DEM, nor did it distinguish between errors over different types of landscape. As a result, there is a lack of information about the spatial structure of vertical errors in the landscape that matters most to flood models - the floodplain. This study therefore attempts that task by comparing SRTM, an error-corrected SRTM product (the MERIT DEM of Yamazaki et al., 2017) and near-truth LIDAR elevations for three deltaic floodplains (Mississippi, Po, Wax Lake) and a large lowland region (the Fens, UK). Using the error covariance function, calculated by comparing SRTM elevations to the near-truth LIDAR, perturbations of the 90 m SRTM DEM were generated, producing a catalogue of plausible DEMs. This allows modellers to simulate a suite of plausible DEMs at any aggregated block size above the native SRTM resolution. Finally, the generated DEMs were input into a hydrodynamic model of the Mekong Delta, built using the LISFLOOD-FP hydrodynamic model, to assess how DEM error affects the hydrodynamics and inundation extent across the domain. The end product is an inundation map giving the probability of each pixel being flooded based on the catalogue of DEMs. In a world of increasing computer power but a lack of detailed datasets, this powerful approach can be used throughout natural hazard modelling to understand how errors in the SRTM DEM impact hazard assessment.
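
    One way to realize the perturbation step described above is to draw spatially correlated vertical errors from a Gaussian field whose covariance has been fitted to the near-truth data. The sketch below uses an exponential covariance with invented parameters and a random stand-in DEM, not the fitted covariances from the study:

        import numpy as np

        rng = np.random.default_rng(4)

        # Perturb a DEM with spatially correlated vertical errors drawn
        # from a Gaussian field with exponential covariance (made-up values).
        n = 32                                     # grid is n x n cells
        cell = 90.0                                # m, SRTM-like resolution
        sigma = 3.0                                # m, error std deviation
        corr_len = 500.0                           # m, correlation length

        xx, yy = np.meshgrid(np.arange(n) * cell, np.arange(n) * cell)
        pts = np.column_stack([xx.ravel(), yy.ravel()])
        d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        cov = sigma**2 * np.exp(-d / corr_len)     # exponential covariance

        L = np.linalg.cholesky(cov + 1e-8 * np.eye(n * n))
        error_field = (L @ rng.standard_normal(n * n)).reshape(n, n)

        dem = rng.uniform(0.0, 10.0, (n, n))       # stand-in for an SRTM tile
        dem_realization = dem + error_field        # one plausible DEM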

  12. Effect of Baseflow Separation on Uncertainty of Hydrological Modeling in the Xinanjiang Model

    Directory of Open Access Journals (Sweden)

    Kairong Lin

    2014-01-01

    Full Text Available Based on the idea that feeding more of the available, useful information into the evaluation reduces uncertainty, this study focuses on how much the uncertainty can be reduced by considering baseflow estimation information obtained from the smoothed minima method (SMM). The Xinanjiang model and the generalized likelihood uncertainty estimation (GLUE) method, with the shuffled complex evolution Metropolis (SCEM-UA) sampling algorithm, were used for hydrological modeling and uncertainty analysis, respectively. The Jiangkou basin, located in the upper reaches of the Hanjiang River, was selected as the case study. It was found that the number and standard deviation of behavioral parameter sets both decreased as the threshold value for the baseflow efficiency index increased, and that high Nash-Sutcliffe efficiency coefficients corresponded well with high baseflow efficiency coefficients. The results also showed that the uncertainty interval width decreased significantly, while the containing ratio decreased only slightly and the runoff simulated with the behavioral parameter sets fitted the observed runoff better, when the threshold for the baseflow efficiency index was taken into consideration. This implies that using baseflow estimation information can reduce the uncertainty in hydrological modeling to some degree and yield more reasonable prediction bounds.
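
    A stripped-down GLUE loop of the kind used here — Monte Carlo sampling, a behavioral threshold on an efficiency index, and percentile prediction bounds — looks as follows. The two-parameter toy model, thresholds and data are invented, standing in for the Xinanjiang model:

        import numpy as np

        rng = np.random.default_rng(5)

        def nse(sim, obs):
            return 1.0 - np.sum((sim - obs)**2) / np.sum((obs - obs.mean())**2)

        # Toy stand-in for the rainfall-runoff model: two parameters.
        def model(k, c, rain):
            return k * rain + c

        rain = rng.gamma(2.0, 5.0, 200)
        obs = model(0.7, 3.0, rain) + rng.normal(0.0, 2.0, 200)  # synthetic truth

        # GLUE: Monte Carlo sampling, keep "behavioral" parameter sets.
        samples = rng.uniform([0.0, 0.0], [2.0, 10.0], size=(5000, 2))
        scores = np.array([nse(model(k, c, rain), obs) for k, c in samples])
        behavioral = samples[scores > 0.7]                       # threshold

        # Prediction bounds from the behavioral parameter sets.
        sims = np.array([model(k, c, rain) for k, c in behavioral])
        lower, upper = np.percentile(sims, [5, 95], axis=0)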

  13. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functi...

  14. Uncertainty modeling dedicated to professor Boris Kovalerchuk on his anniversary

    CERN Document Server

    2017-01-01

    This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts.

  15. Uncertainty analysis of hydrological modeling in a tropical area using different algorithms

    Science.gov (United States)

    Rafiei Emam, Ammar; Kappas, Martin; Fassnacht, Steven; Linh, Nguyen Hoang Khanh

    2018-01-01

    Hydrological modeling outputs are subject to uncertainty resulting from different sources of error (e.g., errors in input data, model structure, and model parameters), making quantification of uncertainty in hydrological modeling imperative and meant to improve the reliability of modeling results. Uncertainty analysis must overcome difficulties in the calibration of hydrological models, which increase further in areas with data scarcity. The purpose of this study is to apply four uncertainty analysis algorithms to a semi-distributed hydrological model, quantifying different sources of uncertainty (especially parameter uncertainty) and evaluating their performance. In this study, the Soil and Water Assessment Tool (SWAT) eco-hydrological model was implemented for a watershed in the center of Vietnam. The sensitivity of parameters was analyzed, and the model was calibrated. The uncertainty analysis for the hydrological model was conducted based on four algorithms: Generalized Likelihood Uncertainty Estimation (GLUE), Sequential Uncertainty Fitting (SUFI), Parameter Solution method (ParaSol) and Particle Swarm Optimization (PSO). The performance of the algorithms was compared using the P-factor and R-factor, the coefficient of determination (R2), the Nash-Sutcliffe coefficient of efficiency (NSE) and Percent Bias (PBIAS). The results showed the high performance of SUFI and PSO, with P-factor > 0.83, R-factor < 0.91, NSE > 0.89, and PBIAS < 0.18, supporting the model's use for policy or management decisions.

  16. Parameter Uncertainty for Aircraft Aerodynamic Modeling using Recursive Least Squares

    Science.gov (United States)

    Grauer, Jared A.; Morelli, Eugene A.

    2016-01-01

    A real-time method was demonstrated for determining accurate uncertainty levels of stability and control derivatives estimated using recursive least squares and time-domain data. The method uses a recursive formulation of the residual autocorrelation to account for colored residuals, which are routinely encountered in aircraft parameter estimation and change the predicted uncertainties. Simulation data and flight test data for a subscale jet transport aircraft were used to demonstrate the approach. Results showed that the corrected uncertainties matched the observed scatter in the parameter estimates, and did so more accurately than conventional uncertainty estimates that assume white residuals. Only small differences were observed between batch estimates and recursive estimates at the end of the maneuver. It was also demonstrated that the autocorrelation could be reduced to a small number of lags to minimize computation and memory storage requirements without significantly degrading the accuracy of predicted uncertainty levels.
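
    For context, the sketch below implements plain recursive least squares, in which the parameter covariance P is propagated with each sample under the white-residual assumption that the paper's autocorrelation correction relaxes. The regression data are simulated, not flight-test data:

        import numpy as np

        rng = np.random.default_rng(6)

        # Recursive least squares for y = X theta + noise, updating the
        # estimate and its covariance one sample at a time.
        theta_true = np.array([1.5, -0.8])
        n, p = 300, 2
        X = rng.normal(size=(n, p))
        y = X @ theta_true + rng.normal(0.0, 0.1, n)

        theta = np.zeros(p)
        P = 1e3 * np.eye(p)                  # large initial covariance
        sigma2 = 0.1**2                      # assumed residual variance

        for x_k, y_k in zip(X, y):
            # Gain, then joint update of estimate and covariance.
            gain = P @ x_k / (sigma2 + x_k @ P @ x_k)
            theta = theta + gain * (y_k - x_k @ theta)
            P = P - np.outer(gain, x_k) @ P

        print(theta, np.sqrt(np.diag(P)))    # estimates and their 1-sigma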

  17. Modeling Uncertainty of Directed Movement via Markov Chains

    Directory of Open Access Journals (Sweden)

    YIN Zhangcai

    2015-10-01

    Full Text Available Probabilistic time geography (PTG) is suggested as an extension of (classical) time geography, in order to express, as a probability, the uncertainty of an agent being located at an accessible position. This may provide a quantitative basis for finding the most likely location of an agent. In recent years, PTG based on the normal distribution or the Brownian bridge has been proposed; its variance, however, is either unrelated to the agent's speed or diverges as the speed increases, so these formulations struggle to combine applicability with stability. In this paper, a new method is proposed to model PTG based on Markov chains. First, a bidirectionally conditioned Markov chain is modeled, whose limit, when the moving speed is large enough, can be regarded as the Brownian bridge and thus has the property of numerical stability. Then, the directed movement is mapped to Markov chains. The essential part is to build the step length, the state space and the transfer matrix of the Markov chain according to the space-time position of the directed movement and the movement speed information, so that the Markov chain is related to the movement speed. Finally, by continuously calculating the probability distribution of the directed movement at any time using the Markov chains, the probability of an agent being located at an accessible position is obtained. Experimental results show that the variance based on Markov chains is not only related to speed, but also tends towards stability as the agent's maximum speed increases.
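
    The propagation step at the core of this construction is ordinary Markov-chain algebra: build a speed-dependent transition matrix and repeatedly multiply the position distribution by it. A one-dimensional sketch with invented movement probabilities (not the paper's bidirectionally conditioned chain):

        import numpy as np

        # 1-D directed movement as a Markov chain: from each cell the agent
        # steps forward, stays, or steps back with fixed probabilities.
        n = 21                                   # discrete positions 0..20
        p_fwd, p_stay, p_back = 0.6, 0.3, 0.1    # assumed movement behavior

        T = np.zeros((n, n))
        for i in range(n):
            T[i, min(i + 1, n - 1)] += p_fwd
            T[i, i] += p_stay
            T[i, max(i - 1, 0)] += p_back

        # Start with certainty at position 0 and propagate the distribution.
        dist = np.zeros(n)
        dist[0] = 1.0
        for _ in range(10):
            dist = dist @ T                      # distribution after each step

        print(dist.argmax(), dist.max())         # most likely position at t=10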

  18. Mesh refinement for uncertainty quantification through model reduction

    International Nuclear Information System (INIS)

    Li, Jing; Stinis, Panos

    2015-01-01

    We present a novel way of deciding when and where to refine a mesh in probability space in order to facilitate uncertainty quantification in the presence of discontinuities in random space. A discontinuity in random space makes the application of generalized polynomial chaos expansion techniques prohibitively expensive. The reason is that for discontinuous problems, the expansion converges very slowly. An alternative to using higher terms in the expansion is to divide the random space in smaller elements where a lower degree polynomial is adequate to describe the randomness. In general, the partition of the random space is a dynamic process since some areas of the random space, particularly around the discontinuity, need more refinement than others as time evolves. In the current work we propose a way to decide when and where to refine the random space mesh based on the use of a reduced model. The idea is that a good reduced model can monitor accurately, within a random space element, the cascade of activity to higher degree terms in the chaos expansion. In turn, this facilitates the efficient allocation of computational sources to the areas of random space where they are more needed. For the Kraichnan–Orszag system, the prototypical system to study discontinuities in random space, we present theoretical results which show why the proposed method is sound and numerical results which corroborate the theory

  19. The role of uncertainty in supply chains under dynamic modeling

    Directory of Open Access Journals (Sweden)

    M. Fera

    2017-01-01

    Full Text Available The uncertainty in the supply chains (SCs) of manufacturing and services firms is going to become, over the coming decades, more important for companies called to compete in a newly globalized economy. Risky situations for manufacturing are considered in trying to identify the optimal positioning of the order penetration point (OPP), i.e. the level of the supply chain at which the information in the client's order is defined as it propagates back through the several SC phases: engineering, procurement, production and distribution. This work aims at defining a system dynamics model to assess the competitiveness resulting from positioning the order at different SC locations. A Taguchi analysis has been implemented to create a decision map for identifying possible strategic decisions under different scenarios and with alternatives for order location in the SC levels. Centralized and decentralized strategies for SC integration are discussed. In the proposed model, the location of the OPP is influenced by demand variation, production time, stock-outs and stock amount. The results of this research are as follows: (i) customer-oriented strategies are preferable under high volatility of demand; (ii) production-focused strategies are suggested when the probability of stock-outs is high; (iii) no specific location is preferable if a centralized control architecture is implemented; (iv) centralization requires cooperation among partners to achieve the SC optimum point; (v) the producer must not prefer the OPP location at the retailer level when the general strategy is focused on a decentralized approach.

  20. Spatial uncertainty of a geoid undulation model in Guayaquil, Ecuador

    Science.gov (United States)

    Chicaiza, E. G.; Leiva, C. A.; Arranz, J. J.; Buenaño, X. E.

    2017-06-01

    Geostatistics is a discipline that deals with the statistical analysis of regionalized variables. In this case study, geostatistics is used to estimate geoid undulation in the rural area of the town of Guayaquil, Ecuador. The geostatistical approach was chosen because it provides the estimation error of the prediction map. The open-source statistical software R, mainly the geoR, gstat and RGeostats libraries, was used. Exploratory data analysis (EDA), trend analysis and structural analysis were carried out. An automatic model fitting by iterative least squares, among other fitting procedures, was employed to fit the variogram. Finally, kriging using the Bouguer gravity anomaly as external drift, as well as universal kriging, was used to produce a detailed map of geoid undulation. The estimation uncertainty fell in the interval [-0.5; +0.5] m for errors, with a maximum estimation standard deviation of 2 mm in relation to the interpolation method applied. The error distribution of the geoid undulation map obtained in this study provides a better result than publicly available Earth gravitational models for the study area, according to a comparison with independent validation points. The main goal of this paper is to confirm the feasibility of using geoid undulations from Global Navigation Satellite Systems and leveling field measurements, together with geostatistical techniques, in high-accuracy engineering projects.
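
    The kriging prediction and its variance — the quantity this study exploits — come from solving one small linear system per target point. The sketch below implements ordinary kriging in plain NumPy with an invented exponential covariance and synthetic data (the study itself used R with geoR/gstat/RGeostats and an external drift):

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic observations of a regionalized variable (e.g., undulation).
        obs_xy = rng.uniform(0.0, 10.0, (30, 2))      # locations, km
        obs_z = np.sin(obs_xy[:, 0]) + 0.05 * rng.normal(size=30)
        target = np.array([5.0, 5.0])

        sill, range_par = 1.0, 3.0                    # invented variogram fit
        cov = lambda h: sill * np.exp(-h / range_par)

        d = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
        d0 = np.linalg.norm(obs_xy - target, axis=1)

        # Ordinary-kriging system with a Lagrange multiplier enforcing
        # that the weights sum to one.
        n = len(obs_z)
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = cov(d)
        A[n, n] = 0.0
        b = np.append(cov(d0), 1.0)
        sol = np.linalg.solve(A, b)
        w, mu = sol[:n], sol[n]

        z_hat = w @ obs_z                             # prediction
        var_hat = sill - w @ cov(d0) - mu             # kriging variance
        print(z_hat, var_hat)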

  2. Model and parametric uncertainty in source-based kinematic models of earthquake ground motion

    Science.gov (United States)

    Hartzell, Stephen; Frankel, Arthur; Liu, Pengcheng; Zeng, Yuehua; Rahman, Shariftur

    2011-01-01

    Four independent ground-motion simulation codes are used to model the strong ground motion for three earthquakes: 1994 Mw 6.7 Northridge, 1989 Mw 6.9 Loma Prieta, and 1999 Mw 7.5 Izmit. These 12 sets of synthetics are used to make estimates of the variability in ground-motion predictions. In addition, ground-motion predictions over a grid of sites are used to estimate parametric uncertainty for changes in rupture velocity. We find that the combined model uncertainty and random variability of the simulations is in the same range as the variability of regional empirical ground-motion data sets. The majority of the standard deviations lie between 0.5 and 0.7 natural-log units for response spectra and 0.5 and 0.8 for Fourier spectra. The estimate of model epistemic uncertainty, based on the different model predictions, lies between 0.2 and 0.4, which is about one-half of the estimates for the standard deviation of the combined model uncertainty and random variability. Parametric uncertainty, based on variation of just the average rupture velocity, is shown to be consistent in amplitude with previous estimates, showing percentage changes in ground motion from 50% to 300% when rupture velocity changes from 2.5 to 2.9 km/s. In addition, there is some evidence that mean biases can be reduced by averaging ground-motion estimates from different methods.

  3. Predictive uncertainty analysis of a saltwater intrusion model using null-space Monte Carlo

    Science.gov (United States)

    Herckenrath, Daan; Langevin, Christian D.; Doherty, John

    2011-01-01

    Because of the extensive computational burden and perhaps a lack of awareness of existing methods, rigorous uncertainty analyses are rarely conducted for variable-density flow and transport models. For this reason, a recently developed null-space Monte Carlo (NSMC) method for quantifying prediction uncertainty was tested for a synthetic saltwater intrusion model patterned after the Henry problem. Saltwater intrusion caused by a reduction in fresh groundwater discharge was simulated for 1000 randomly generated hydraulic conductivity distributions, representing a mildly heterogeneous aquifer. From these 1000 simulations, the hydraulic conductivity distribution giving rise to the most extreme case of saltwater intrusion was selected and was assumed to represent the "true" system. Head and salinity values from this true model were then extracted and used as observations for subsequent model calibration. Random noise was added to the observations to approximate realistic field conditions. The NSMC method was used to calculate 1000 calibration-constrained parameter fields. If the dimensionality of the solution space was set appropriately, the estimated uncertainty range from the NSMC analysis encompassed the truth. Several variants of the method were implemented to investigate their effect on the efficiency of the NSMC method. Reducing the dimensionality of the null-space for the processing of the random parameter sets did not result in any significant gains in efficiency and compromised the ability of the NSMC method to encompass the true prediction value. The addition of intrapilot point heterogeneity to the NSMC process was also tested. According to a variogram comparison, this provided the same scale of heterogeneity that was used to generate the truth. However, incorporation of intrapilot point variability did not make a noticeable difference to the uncertainty of the prediction. With this higher level of heterogeneity, however, the computational burden of

  4. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  5. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    Science.gov (United States)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurements uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. The BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.

  6. Modeling Input Errors to Improve Uncertainty Estimates for Sediment Transport Model Predictions

    Science.gov (United States)

    Jung, J. Y.; Niemann, J. D.; Greimann, B. P.

    2016-12-01

    Bayesian methods using Markov chain Monte Carlo algorithms have recently been applied to sediment transport models to assess the uncertainty in the model predictions due to the parameter values. Unfortunately, the existing approaches can only attribute overall uncertainty to the parameters. This limitation is critical because no model can produce accurate forecasts if forced with inaccurate input data, even if the model is well founded in physical theory. In this research, an existing Bayesian method is modified to consider the potential errors in input data during the uncertainty evaluation process. The input error is modeled using Gaussian distributions, and the means and standard deviations are treated as uncertain parameters. The proposed approach is tested by coupling it to the Sedimentation and River Hydraulics - One Dimension (SRH-1D) model and simulating a 23-km reach of the Tachia River in Taiwan. The Wu equation in SRH-1D is used for computing the transport capacity for a bed material load of non-cohesive material. Three types of input data are considered uncertain: (1) the input flowrate at the upstream boundary, (2) the water surface elevation at the downstream boundary, and (3) the water surface elevation at a hydraulic structure in the middle of the reach. The benefits of modeling the input errors in the uncertainty analysis are evaluated by comparing the accuracy of the most likely forecast and the coverage of the observed data by the credible intervals to those of the existing method. The results indicate that the internal boundary condition has the largest uncertainty among those considered. Overall, the uncertainty estimates from the new method are notably different from those of the existing method for both the calibration and forecast periods.
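
    The modification described — treating input-error means and standard deviations as extra parameters in the posterior — can be sketched with a random-walk Metropolis sampler on a toy forward model. The model, priors and data below are invented and do not involve SRH-1D:

        import numpy as np

        rng = np.random.default_rng(8)

        # Toy forward model with an uncertain input: the observed output
        # depends on a physical parameter a and an upstream inflow measured
        # with an unknown bias (treated as a parameter to infer).
        inflow_meas = 10.0
        obs = 2.0 * (inflow_meas + 1.0) + rng.normal(0.0, 0.5, 50)

        def log_post(a, bias, log_sigma):
            sigma = np.exp(log_sigma)
            pred = a * (inflow_meas + bias)
            loglik = (-0.5 * np.sum(((obs - pred) / sigma)**2)
                      - obs.size * log_sigma)
            # Weak priors regularize the (partially confounded) parameters.
            logprior = -0.5 * (bias / 2.0)**2 - 0.5 * ((a - 2.0) / 5.0)**2
            return loglik + logprior

        # Random-walk Metropolis over (a, bias, log_sigma).
        theta = np.array([1.0, 0.0, 0.0])
        lp = log_post(*theta)
        chain = []
        for _ in range(20000):
            prop = theta + rng.normal(0.0, 0.05, 3)
            lp_prop = log_post(*prop)
            if np.log(rng.uniform()) < lp_prop - lp:
                theta, lp = prop, lp_prop
            chain.append(theta)
        chain = np.array(chain)[5000:]           # discard burn-in
        print(chain.mean(axis=0))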

  7. Modelling sensitivity and uncertainty in a LCA model for waste management systems - EASETECH

    DEFF Research Database (Denmark)

    Damgaard, Anders; Clavreul, Julie; Baumeister, Hubert

    2013-01-01

    In the new model, EASETECH, developed for LCA modelling of waste management systems, a general approach for sensitivity and uncertainty assessment for waste management studies has been implemented. First general contribution analysis is done through a regular interpretation of inventory and impact...

  8. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  9. Decision-making model of generation technology under uncertainty based on real option theory

    International Nuclear Information System (INIS)

    Ming, Zeng; Ping, Zhang; Shunkun, Yu; Ge, Zhang

    2016-01-01

    Highlights: • A decision-making model for generation technology investment is proposed. • The irreversible investment concept and real option theory are introduced. • Practical data were used to prove the validity of the model. • The impact of electricity and fuel price fluctuations on investment was analyzed. - Abstract: The introduction of market competition and increased uncertainty means that generators must decide not only whether to invest in generation capacity but also what kind of generation technology to choose. In this paper, a decision-making model for generation technology investment is proposed, with the irreversible investment concept and real option theory as its foundations. To explain the decision-making process of the generator's investment, a decision-making optimization model was built considering two generation technologies, i.e., a heat-only system and combined heat and power generation. We also discuss the theoretical derivation, which explains how to eliminate the overestimated economic potential caused by risk, based on an economic evaluation of both generation technologies. Finally, practical data from the electricity market of Inner Mongolia were used to prove the validity of the model, and the impact of electricity and fuel price fluctuations on investment was analyzed according to the simulated results.
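
    The real-option core of such a model is the comparison, at each decision point, of the value of investing now against the discounted expected value of waiting. A binomial-lattice sketch with invented numbers (illustrative dynamics, not a risk-neutral calibration of the Inner Mongolia market):

        import numpy as np

        # Value of the option to defer an irreversible plant investment,
        # on a binomial lattice for the project value (toy numbers).
        V0, up, down, p_up = 100.0, 1.2, 0.8, 0.5
        invest_cost, rate, steps = 95.0, 0.05, 3

        disc = np.exp(-rate)
        # Payoffs from investing at the final stage.
        values = np.array([V0 * up**j * down**(steps - j)
                           for j in range(steps + 1)])
        option = np.maximum(values - invest_cost, 0.0)

        # Roll back: at each node, choose max(invest now, wait).
        for step in range(steps - 1, -1, -1):
            values = np.array([V0 * up**j * down**(step - j)
                               for j in range(step + 1)])
            cont = disc * (p_up * option[1:] + (1 - p_up) * option[:-1])
            option = np.maximum(values - invest_cost, cont)

        # The result exceeds the static NPV of 5, so waiting has value.
        print(option[0])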

  10. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models. This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles......

  11. Evaluation of uncertainties originating from the different modeling approaches applied to analyze regional groundwater flow in the Tono area of Japan

    Science.gov (United States)

    Ijiri, Yuji; Saegusa, Hiromitsu; Sawada, Atsushi; Ono, Makoto; Watanabe, Kunio; Karasaki, Kenzi; Doughty, Christine; Shimo, Michito; Fumimura, Kenichi

    2009-01-01

    Qualitative evaluation of the effects of uncertainties originating from scenario development, modeling approaches, and parameter values is an important subject in the area of safety assessment for high-level nuclear waste disposal sites. In this study, regional-scale groundwater flow analyses for the Tono area, Japan were conducted using three continuous models designed to handle heterogeneous porous media. We evaluated the simulation results to quantitatively analyze uncertainties originating from modeling approaches. We found that porous media heterogeneity is the main factor which causes uncertainties. We also found that uncertainties originating from modeling approaches greatly depend on the types of hydrological structures and heterogeneity of hydraulic conductivity values in the domain assigned by modelers. Uncertainties originating from modeling approaches decrease as the amount of labor and time spent increase, and iterations between investigation and analyses increases.

  12. Fast uncertainty reduction strategies relying on Gaussian process models

    International Nuclear Information System (INIS)

    Chevalier, Clement

    2013-01-01

    This work deals with sequential and batch-sequential evaluation strategies of real-valued functions under a limited evaluation budget, using Gaussian process models. Optimal Stepwise Uncertainty Reduction (SUR) strategies are investigated for two different problems, motivated by real test cases in nuclear safety. First we consider the problem of identifying the excursion set above a given threshold T of a real-valued function f. Then we study the question of finding the set of 'safe controlled configurations', i.e. the set of controlled inputs where the function remains below T whatever the values of the other, non-controlled inputs. New SUR strategies are presented, together with efficient procedures and formulas to compute and use them in real-world applications. The use of fast formulas to quickly recalculate the posterior mean or covariance function of a Gaussian process (referred to as the 'kriging update formulas') not only provides substantial computational savings; it is also one of the key tools for deriving closed-form formulas that enable the practical use of computationally intensive sampling strategies. A contribution in batch-sequential optimization (with the multi-points Expected Improvement) is also presented. (author)
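
    The 'kriging update formulas' mentioned in the record can be illustrated with a rank-one Gaussian-process update: after one new noise-free observation, the posterior mean and covariance follow from the previous posterior without refactorizing the full covariance. A minimal numpy sketch (kernel, lengthscale and data are illustrative assumptions), checked against a full recomputation:

```python
# Rank-one kriging update of a Gaussian-process posterior after one new
# (noise-free) observation, verified against a full recomputation.
import numpy as np

def k(a, b, ell=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

def gp_posterior(X, y, Xs):
    K = k(X, X) + 1e-10 * np.eye(len(X))
    Ks = k(Xs, X)
    m = Ks @ np.linalg.solve(K, y)
    C = k(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)
    return m, C

X = np.array([0.1, 0.5, 0.9]); y = np.sin(2 * np.pi * X)
Xs = np.linspace(0, 1, 50)
m_n, C_n = gp_posterior(X, y, Xs)

# One new observation at x_new
x_new = np.array([0.3]); y_new = np.sin(2 * np.pi * x_new)
m_at_new, _ = gp_posterior(X, y, x_new)
_, C_joint = gp_posterior(X, y, np.concatenate([Xs, x_new]))
c_cross = C_joint[:50, 50]   # posterior cross-covariance c_n(Xs, x_new)
c_new = C_joint[50, 50]      # posterior variance c_n(x_new, x_new)

# Kriging update formulas for mean and covariance
m_upd = m_n + c_cross * (y_new[0] - m_at_new[0]) / c_new
C_upd = C_n - np.outer(c_cross, c_cross) / c_new

# Full recomputation agrees with the fast update
m_full, C_full = gp_posterior(np.append(X, x_new), np.append(y, y_new), Xs)
print(np.allclose(m_upd, m_full, atol=1e-6),
      np.allclose(C_upd, C_full, atol=1e-6))
```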

  13. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2, along with air pollutants traditionally studied using CMAQ, at relatively high spatial and temporal resolution, with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches to atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use an Eulerian chemical transport model, CMAQ, and a Lagrangian particle dispersion model, FLEXPART-WRF. These two models share the same WRF
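
    As a toy illustration of the Bayesian-inference side of this record (not the CMAQ/FLEXPART system itself), the sketch below inverts a linear source-receptor relationship d = G s + noise for source strengths with a Metropolis sampler; the footprint matrix, priors, and all values are invented.

```python
# Toy Bayesian source inversion with a Metropolis MCMC sampler; the linear
# model d = G s + noise stands in for an atmospheric transport model.
import numpy as np

rng = np.random.default_rng(1)

G = rng.uniform(0.1, 1.0, size=(20, 3))   # toy transport (footprint) matrix
s_true = np.array([2.0, 0.5, 1.5])        # "true" source strengths
sigma = 0.1
d = G @ s_true + sigma * rng.standard_normal(20)

def log_post(s):
    if np.any(s < 0):                      # nonnegative sources
        return -np.inf
    resid = d - G @ s
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * np.sum(s**2) / 10.0

s = np.ones(3); lp = log_post(s); chain = []
for _ in range(20_000):
    prop = s + 0.05 * rng.standard_normal(3)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:    # Metropolis accept/reject
        s, lp = prop, lp_prop
    chain.append(s.copy())

chain = np.array(chain[5_000:])                # discard burn-in
print("posterior mean:", chain.mean(axis=0).round(2))
```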

  14. Impact on Model Uncertainty of Diabatization in Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2014-01-01

    This work provides uncertainty and sensitivity analysis of the design of conventional and heat-integrated distillation columns using Monte Carlo simulations. Selected uncertain parameters are relative volatility, heat of vaporization, the overall heat transfer coefficient, tray hold-up, and adiabat ...

  15. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors introducing uncertainty to the measurement results. Different experimental techniques applied to the same test item, or testing numerous nominally identical specimens, yield different test results...

  16. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  17. An Efficient Deterministic Approach to Model-based Prediction Uncertainty

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the...

  18. Quantification of Uncertainties in Integrated Spacecraft System Models, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective for the Phase II effort will be to develop a comprehensive, efficient, and flexible uncertainty quantification (UQ) framework implemented within a...

  19. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Madsen, H.; Rosbjerg, Dan

    are independent. The effects of this assumption on the uncertainty quantification of extreme rainfall projections are addressed in this study. First, the interdependency of the 95% quantile of wet days in the ENSEMBLES RCMs is estimated. For this statistic and the region studied, the RCMs cannot be assumed......Changes in rainfall extremes under climate change conditions are subject to numerous uncertainties. One of the most important uncertainties arises from the inherent uncertainty in climate models. In recent years, many efforts have been made in creating large multi-model ensembles of both Regional...... independent. Then, a Bayesian approach that accounts for the interdependency of the climate models is developed in order to quantify the uncertainty. The results of the Bayesian approach show that the uncertainty is narrower when the models are considered independent. These results highlight the importance...

  20. Generic uncertainty model for DETRA for environmental consequence analyses. Application and sample outputs

    Energy Technology Data Exchange (ETDEWEB)

    Suolanen, V.; Ilvonen, M. [VTT Energy, Espoo (Finland). Nuclear Energy

    1998-10-01

    The computer model DETRA applies a dynamic compartment modelling approach. The compartment structure of each considered application can be tailored individually. This flexible modelling method makes it possible to consider the transfer of radionuclides in various cases: the aquatic environment and related food chains, the terrestrial environment, food chains in general and foodstuffs, body burden analyses of humans, etc. In an earlier study on this subject, the user interface of the DETRA code was modernized. The new interface works in the Windows environment and the usability of the code has been improved. The objective of this study has been to further develop and diversify the user interface so that probabilistic uncertainty analyses can also be performed with DETRA. The most common probability distributions are available: uniform, truncated Gaussian and triangular. The corresponding logarithmic distributions are also available. All input data related to a considered case can be varied, although this option is seldom needed. The calculated output values can be selected as monitored values at simulation time points defined by the user. The results of a sensitivity run are immediately available after simulation as graphical presentations. These outcomes are distributions generated for varied parameters, density functions of monitored parameters and complementary cumulative density functions (CCDF). An application considered in connection with this work was the estimation of the contamination of milk caused by radioactive deposition of Cs (10 kBq(Cs-137)/m²). The multi-sequence calculation model applied consisted of a pasture modelling part and a dormant-season modelling part. These two sequences were linked periodically, simulating the realistic practice of caring for domestic animals in Finland. The most important parameters were varied in this exercise. The performed diversification of the user interface of the DETRA code seems to provide an
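
    An illustrative sketch (not DETRA itself) of the probabilistic machinery the record describes: sample uncertain parameters from the distribution types named (uniform, truncated Gaussian, triangular) and build a CCDF for a monitored output. All parameter names and values below are invented placeholders.

```python
# Monte Carlo sampling from uniform, truncated-Gaussian and triangular
# distributions, followed by a CCDF of a monitored output.
import numpy as np

rng = np.random.default_rng(2)
n = 10_000

def trunc_gauss(mu, sd, lo, hi, size):
    # simple rejection sampler for a truncated Gaussian
    out = np.empty(0)
    while out.size < size:
        s = mu + sd * rng.standard_normal(size)
        out = np.concatenate([out, s[(s >= lo) & (s <= hi)]])
    return out[:size]

transfer = rng.uniform(0.02, 0.08, n)              # uniform: feed-to-milk
feed_factor = trunc_gauss(0.5, 0.2, 0.1, 1.5, n)   # truncated Gaussian
halflife = rng.triangular(5.0, 14.0, 30.0, n)      # triangular, days

deposition = 10.0  # kBq/m^2, as in the record's Cs-137 milk example
milk = deposition * transfer * feed_factor * (halflife / 14.0)

# complementary cumulative distribution function (CCDF) of the output
x = np.sort(milk)
ccdf = 1.0 - np.arange(1, n + 1) / n
print("P(milk concentration > 0.5):", (milk > 0.5).mean())
```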

  1. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition as universally applicable or able to deal with case studies involving spatial models and spatial model inputs. Given the growing popularity and applicability of the open-source R programming language, we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining uncertainty propagation from input data and model parameters, via the environmental model, onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package implements the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in

  2. Estimation of Uncertainties in the Global Distance Test (GDT_TS) for CASP Models.

    Directory of Open Access Journals (Sweden)

    Wenlin Li

    The Critical Assessment of techniques for protein Structure Prediction (CASP) is a community-wide blind test experiment to reveal the best accomplishments of structure modeling. Assessors have been using the Global Distance Test (GDT_TS) measure to quantify prediction performance since CASP3 in 1998. However, identifying significant score differences between close models is difficult because of the lack of uncertainty estimations for this measure. Here, we utilized the atomic fluctuations caused by structure flexibility to estimate the uncertainty of GDT_TS scores. Structures determined by nuclear magnetic resonance are deposited as ensembles of alternative conformers that reflect the structural flexibility, whereas standard X-ray refinement produces the static structure averaged over time and space for the dynamic ensembles. To recapitulate the structurally heterogeneous ensemble in the crystal lattice, we performed time-averaged refinement for X-ray datasets to generate structural ensembles for our GDT_TS uncertainty analysis. Using those generated ensembles, our study demonstrates that the time-averaged refinements produced structure ensembles in better agreement with the experimental datasets than the averaged X-ray structures with B-factors. The uncertainty of the GDT_TS scores, quantified by their standard deviations (SDs), increases for scores lower than 50 and 70, with maximum SDs of 0.3 and 1.23 for X-ray and NMR structures, respectively. We also applied our procedure to the high-accuracy version of the GDT-based score and produced similar results with slightly higher SDs. To facilitate score comparisons by the community, we developed a user-friendly web server that produces structure ensembles for NMR and X-ray structures and is accessible at http://prodata.swmed.edu/SEnCS. Our work helps to identify the significance of GDT_TS score differences, as well as to provide structure ensembles for estimating SDs of any scores.

  3. GLUE Based Uncertainty Estimation of Urban Drainage Modeling Using Weather Radar Precipitation Estimates

    DEFF Research Database (Denmark)

    Nielsen, Jesper Ellerbæk; Thorndahl, Søren Liedtke; Rasmussen, Michael R.

    2011-01-01

    the uncertainty of the weather radar rainfall input. The main finding of this work is that the input uncertainty propagates through the urban drainage model with significant effects on the model result. The GLUE methodology is in general a usable way to explore this uncertainty, although the exact width......Distributed weather radar precipitation measurements are used as rainfall input for an urban drainage model to simulate the runoff from a small catchment in Denmark. It is demonstrated how the Generalized Likelihood Uncertainty Estimation (GLUE) methodology can be implemented and used to estimate...
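
    A minimal sketch of the GLUE methodology this record applies: sample parameter sets, score them with an informal likelihood, keep the behavioural ones, and form likelihood-weighted prediction bounds. The linear-reservoir "model" is a toy stand-in for the urban drainage model, and all numbers are illustrative.

```python
# GLUE sketch: Monte Carlo sampling, behavioural threshold on an informal
# (Nash-Sutcliffe-type) likelihood, and weighted prediction bounds.
import numpy as np

rng = np.random.default_rng(4)

def runoff_model(rain, k):
    q = np.zeros_like(rain)
    s = 0.0
    for t, r in enumerate(rain):
        s += r                  # storage gains the rainfall
        q[t] = k * s            # outflow proportional to storage
        s -= q[t]
    return q

rain = rng.gamma(2.0, 1.0, 24)                       # synthetic storm
q_obs = runoff_model(rain, 0.3) + 0.05 * rng.standard_normal(24)

ks = rng.uniform(0.05, 0.8, 2000)                    # sampled parameter sets
sims = np.array([runoff_model(rain, k) for k in ks])
nse = 1 - ((sims - q_obs) ** 2).sum(1) / ((q_obs - q_obs.mean()) ** 2).sum()

keep = nse > 0.7                                     # behavioural threshold
w = nse[keep] / nse[keep].sum()                      # likelihood weights

def wquantile(v, w, q):
    i = np.argsort(v)
    return v[i][np.searchsorted(np.cumsum(w[i]), q)]

lo = [wquantile(sims[keep, t], w, 0.05) for t in range(24)]
hi = [wquantile(sims[keep, t], w, 0.95) for t in range(24)]
print(f"t=0 bounds: {lo[0]:.3f} .. {hi[0]:.3f} (obs {q_obs[0]:.3f})")
```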

  4. Dealing with uncertainty in landscape genetic resistance models: a case of three co-occurring marsupials.

    Science.gov (United States)

    Dudaniec, Rachael Y; Worthington Wilmer, Jessica; Hanson, Jeffrey O; Warren, Matthew; Bell, Sarah; Rhodes, Jonathan R

    2016-01-01

    Landscape genetics lacks explicit methods for dealing with the uncertainty in landscape resistance estimation, which is particularly problematic when sample sizes of individuals are small. Unless uncertainty can be quantified, valuable but small data sets may be rendered unusable for conservation purposes. We offer a method to quantify uncertainty in landscape resistance estimates using multimodel inference as an improvement over single model-based inference. We illustrate the approach empirically using co-occurring, woodland-preferring Australian marsupials within a common study area: two arboreal gliders (Petaurus breviceps, and Petaurus norfolcensis) and one ground-dwelling antechinus (Antechinus flavipes). First, we use maximum-likelihood and a bootstrap procedure to identify the best-supported isolation-by-resistance model out of 56 models defined by linear and non-linear resistance functions. We then quantify uncertainty in resistance estimates by examining parameter selection probabilities from the bootstrapped data. The selection probabilities provide estimates of uncertainty in the parameters that drive the relationships between landscape features and resistance. We then validate our method for quantifying uncertainty using simulated genetic and landscape data showing that for most parameter combinations it provides sensible estimates of uncertainty. We conclude that small data sets can be informative in landscape genetic analyses provided uncertainty can be explicitly quantified. Being explicit about uncertainty in landscape genetic models will make results more interpretable and useful for conservation decision-making, where dealing with uncertainty is critical. © 2015 John Wiley & Sons Ltd.

  5. Modelling uncertainty due to imperfect forward model and aerosol microphysical model selection in the satellite aerosol retrieval

    Science.gov (United States)

    Määttä, Anu; Laine, Marko; Tamminen, Johanna

    2015-04-01

    This study aims to characterize the uncertainty related to aerosol microphysical model selection and the modelling error due to approximations in the forward modelling. Many satellite aerosol retrieval algorithms rely on pre-calculated look-up tables of model parameters representing various atmospheric conditions. In the retrieval we need to choose the most appropriate aerosol microphysical models from the pre-defined set of models by fitting them to the observations. The aerosol properties, e.g. AOD, are then determined from the best models. This choice of an appropriate aerosol model constitutes a notable part of the AOD retrieval uncertainty. The motivation in our study was to account for these two sources in the total uncertainty budget: uncertainty in selecting the most appropriate model, and uncertainty resulting from the approximations in the pre-calculated aerosol microphysical model. The systematic model error was analysed by studying the behaviour of the model residuals, i.e. the differences between modelled and observed reflectances, by statistical methods. We utilised Gaussian processes to characterize the uncertainty related to approximations in aerosol microphysics modelling due to the use of look-up tables and other non-modelled systematic features in the Level 1 data. The modelling error is described by a non-diagonal covariance matrix parameterised by a correlation length, which is estimated from the residuals using computational tools from spatial statistics. In addition, we utilised Bayesian model selection and model averaging methods to account for the uncertainty due to aerosol model selection. By acknowledging the modelling error as a source of uncertainty in the retrieval of AOD from observed spectral reflectance, we allow the observed values to deviate from the modelled values within limits determined by both the measurement and modelling errors. This results in a more realistic uncertainty level for the retrieved AOD. The method is illustrated by both
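
    The record's key device, treating modelling error as a non-diagonal covariance parameterised by a correlation length, can be sketched directly. The exponential correlation form, wavelength grid and magnitudes below are illustrative assumptions, not the paper's actual choices.

```python
# Non-diagonal error covariance over wavelengths with a correlation length,
# combined with diagonal measurement error, as used in residual weighting.
import numpy as np

wavelengths = np.linspace(400, 800, 40)           # nm
sigma_meas = 0.01                                 # measurement error (diag)
sigma_mod = 0.02                                  # modelling error amplitude
ell = 50.0                                        # correlation length (nm)

d = np.abs(wavelengths[:, None] - wavelengths[None, :])
C_model = sigma_mod**2 * np.exp(-d / ell)         # exponential correlation
C_total = C_model + sigma_meas**2 * np.eye(len(wavelengths))

# A retrieval would then minimise r^T C_total^{-1} r for residuals r,
# allowing correlated deviations between modelled and observed reflectance.
r = 0.015 * np.ones(len(wavelengths))             # toy residual vector
chi2 = r @ np.linalg.solve(C_total, r)
print(f"chi-square with correlated errors: {chi2:.1f}")
```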

  6. Uncertainties in predicting rice yield by current crop models under a wide range of climatic conditions

    NARCIS (Netherlands)

    Li, T.; Hasegawa, T.; Yin, X.; Zhu, Y.; Boote, K.; Adam, M.; Bregaglio, S.; Buis, S.; Confalonieri, R.; Fumoto, T.; Gaydon, D.; Marcaida III, M.; Nakagawa, H.; Oriol, P.; Ruane, A.C.; Ruget, F.; Singh, B.; Singh, U.; Tang, L.; Yoshida, H.; Zhang, Z.; Bouman, B.

    2015-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We

  7. Modeling of CO2 plasma: effect of uncertainties in the plasma chemistry

    Science.gov (United States)

    Berthelot, Antonin; Bogaerts, Annemie

    2017-11-01

    Low-temperature plasma chemical kinetic models are particularly important to the plasma community. These models typically require dozens of inputs, especially rate coefficients. The latter are not always precisely known, and it is not surprising that the error in the rate coefficient data can propagate to the model output. In this paper, we present a model that uses N = 400 different combinations of rate coefficients, sampled according to the uncertainty attributed to each rate coefficient, giving a good estimate of the uncertainty in the model output due to the rate coefficients. We demonstrate that the uncertainty varies considerably with the conditions and the type of output. Relatively low uncertainties (about 15%) are found for electron density and temperature, while the uncertainty can reach more than an order of magnitude for the population of the vibrational levels in some cases and can rise to 100% for the CO2 conversion. The reactions mainly responsible for the largest uncertainties are identified. We show that the conditions of pressure, gas temperature and power density have a great effect on the uncertainty and on which reactions lead to this uncertainty. In all the cases tested here, while the absolute values may suffer from large uncertainties, the trends observed in previous modeling work remain valid. Finally, in accordance with the work of Turner, a number of 'good practices' are recommended.
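
    A minimal sketch of the ensemble idea in this record: perturb each rate coefficient by a log-normal factor set by its uncertainty, rerun the model N = 400 times, and report the output spread. The two-reaction "chemistry" and all rates here are invented stand-ins for the real CO2 plasma mechanism.

```python
# Rate-coefficient uncertainty propagation by a perturbed-rate ensemble.
import numpy as np

rng = np.random.default_rng(5)
N = 400                                  # ensemble size, as in the record

k1_nom, k2_nom = 1e-3, 5e-4              # nominal rate coefficients
u1, u2 = 1.3, 2.0                        # multiplicative uncertainty factors

def conversion(k1, k2, t=100.0):
    # conversion of a toy A ->(k1) B ->(k2) A cycle after time t
    return k1 / (k1 + k2) * (1 - np.exp(-(k1 + k2) * t))

out = np.empty(N)
for i in range(N):
    k1 = k1_nom * u1 ** rng.standard_normal()    # log-normal perturbation
    k2 = k2_nom * u2 ** rng.standard_normal()
    out[i] = conversion(k1, k2)

print(f"conversion: {out.mean():.3f} +/- {out.std():.3f}")
```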

  8. A review of different perspectives on uncertainty and risk and an alternative modeling paradigm

    International Nuclear Information System (INIS)

    Samson, Sundeep; Reneke, James A.; Wiecek, Margaret M.

    2009-01-01

    The literature in economics, finance, operations research, engineering and in general mathematics is first reviewed on the subject of defining uncertainty and risk. The review goes back to 1901. Different perspectives on uncertainty and risk are examined, and a new paradigm to model uncertainty and risk is proposed using relevant ideas from this study. This new paradigm is used to represent, aggregate and propagate uncertainty and to interpret the resulting variability in a challenge problem developed by Oberkampf et al. [Challenge problems: uncertainty in system response given uncertain parameters. Reliab Eng Syst Safety 2004; 85(1): 11-9]. The challenge problem is further extended into a decision problem that is treated within a multicriteria decision-making framework to illustrate how the new paradigm yields optimal decisions under uncertainty. The accompanying risk is defined as the probability of an unsatisfactory system response, quantified by a random function of the uncertainty.

  9. Climate modelling, uncertainty and responses to predictions of change

    International Nuclear Information System (INIS)

    Henderson-Sellers, A.

    1996-01-01

    Article 4.1(F) of the Framework Convention on Climate Change commits all parties to take climate change considerations into account, to the extent feasible, in relevant social, economic and environmental policies and actions, and to employ methods such as impact assessments to minimize adverse effects of climate change. This could be achieved by, inter alia, incorporating climate change risk assessment into development planning processes, i.e. relating climatic change to issues of habitability and sustainability. Adaptation is a ubiquitous and beneficial natural and human strategy. Future adaptation (adjustment) to climate is inevitable, at the least to decrease vulnerability to current climatic impacts. An urgent issue is the mismatch between predictions of global climatic change and the need for information on local-to-regional change in order to develop adaptation strategies. Mitigation efforts are essential, since the more successful mitigation activities are, the less need there will be for adaptation responses. Mitigation responses can be global (e.g. a uniform percentage reduction in greenhouse gas emissions), while adaptation responses will be local to regional in character and therefore depend upon confident predictions of regional climatic change. The dilemma facing policymakers is that scientists have considerable confidence in likely global climatic changes but virtually zero confidence in regional changes. Mitigation and adaptation strategies relevant to climatic change can most usefully be developed in the context of a sound understanding of climate, especially the near-surface continental climate, permitting discussion of societally relevant issues. But climate models cannot yet deliver this type of regionally and locationally specific prediction, and some aspects of current research even seem to indicate increased uncertainty. These topics are explored in this paper using the specific example of the prediction of land-surface climate changes

  10. How uncertainty in input and parameters influences transport model output: four-stage model case-study

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    -year model outputs uncertainty. More precisely, this study contributes to the existing literature on the topic by investigating the effects on model output uncertainty deriving from the use of (i) different probability distributions in the sampling process, (ii) different assignment algorithms, and (iii... of the coefficient of variation, resulting from stochastic user equilibrium and user equilibrium, is 0.425 and 0.468, respectively. Finally, network congestion does not show a strong effect on model output uncertainty at the network level. However, the final uncertainty of links with a higher volume/capacity ratio......If not properly quantified, the uncertainty inherent to transport models makes analyses based on their output highly unreliable. This study investigated uncertainty in four-stage transport models by analysing a Danish case study: the Næstved model. The model describes the demand for transport...

  11. Uncertainty analysis in rainfall-runoff modelling : Application of machine learning techniques

    NARCIS (Netherlands)

    Shrestha, D.L.

    2009-01-01

    This thesis presents powerful machine learning (ML) techniques to build predictive models of uncertainty with application to hydrological models. Two different methods are developed and tested. The first focuses on parameter uncertainty analysis by emulating the results of Monte Carlo simulations of

  13. Leaf area index uncertainty estimates for model-data fusion applications

    Science.gov (United States)

    Andrew D. Richardson; D. Bryan Dail; D.Y. Hollinger

    2011-01-01

    Estimates of data uncertainties are required to integrate different observational data streams as model constraints using model-data fusion. We describe an approach with which random and systematic uncertainties in optical measurements of leaf area index (LAI) can be quantified. We use data from a measurement campaign at the spruce-dominated Howland Forest AmeriFlux...

  14. Uncertainty in eddy covariance measurements and its application to physiological models

    Science.gov (United States)

    D.Y. Hollinger; A.D. Richardson; A.D. Richardson

    2005-01-01

    Flux data are noisy, and this uncertainty is largely due to random measurement error. Knowledge of uncertainty is essential for the statistical evaluation of modeled and measured fluxes, for comparison of parameters derived by fitting models to measured fluxes, and in formal data-assimilation efforts. We used the difference between simultaneous measurements from two...
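
    The paired-measurement idea this record builds on (truncated above) admits a simple estimator: if two independent instruments measure the same true flux, then var(x1 - x2) = 2 var(eps), so one instrument's random error follows from the spread of the differences. A synthetic sketch; the Laplace error distribution and all numbers are illustrative choices, not the paper's data.

```python
# Estimating random measurement error from paired simultaneous measurements.
import numpy as np

rng = np.random.default_rng(6)
true_flux = rng.uniform(-5, 15, 1000)          # "true" half-hourly fluxes
x1 = true_flux + rng.laplace(0, 1.0, 1000)     # instrument 1
x2 = true_flux + rng.laplace(0, 1.0, 1000)     # instrument 2

diff = x1 - x2
sigma_eps = diff.std() / np.sqrt(2)            # one-instrument random error
print(f"estimated random measurement error: {sigma_eps:.2f}")
```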

  15. Quantifying uncertainty of geological 3D layer models, constructed with a-priori geological expertise

    NARCIS (Netherlands)

    Gunnink, J.J.; Maljers, D.; Hummelman, J.

    2010-01-01

    Uncertainty quantification of geological models that are constructed with additional geological expert-knowledge is not straightforward. To construct sound geological 3D layer models we use a lot of additional knowledge, with an uncertainty that is hard to quantify. Examples of geological expert

  16. Uncertainty in the environmental modelling process – A framework and guidance

    NARCIS (Netherlands)

    Refsgaard, J.C.; van der Sluijs, J.P.|info:eu-repo/dai/nl/073427489; Hojberg, A.L.; Vanrolleghem, P.

    2007-01-01

    A terminology and typology of uncertainty is presented together with a framework for the modelling process, its interaction with the broader water management process and the role of uncertainty at different stages in the modelling processes. Brief reviews have been made of 14 different (partly

  17. Evaluating critical uncertainty thresholds in a spatial model of forest pest invasion risk

    Science.gov (United States)

    Frank H. Koch; Denys Yemshanov; Daniel W. McKenney; William D. Smith

    2009-01-01

    Pest risk maps can provide useful decision support in invasive species management, but most do not adequately consider the uncertainty associated with predicted risk values. This study explores how increased uncertainty in a risk model’s numeric assumptions might affect the resultant risk map. We used a spatial stochastic model, integrating components for...

  18. Matching experimental and three dimensional numerical models for structural vibration problems with uncertainties

    Science.gov (United States)

    Langer, P.; Sepahvand, K.; Guist, C.; Bär, J.; Peplow, A.; Marburg, S.

    2018-03-01

    A simulation model that examines the dynamic behavior of real structures needs to address the impact of uncertainty in both geometry and material parameters. This article investigates three-dimensional finite element models for structural dynamics problems with respect to both model and parameter uncertainties. The parameter uncertainties are determined via laboratory measurements on several beam-like samples. The parameters are then treated as random variables in the finite element model in order to explore the effects of uncertainty on the quality of the model outputs, i.e. natural frequencies. The accuracy of the output predictions from the model is compared with the experimental results. To this end, a non-contact experimental modal analysis is conducted to identify the natural frequencies of the samples. The results show good agreement with the experimental data. Furthermore, it is demonstrated that geometrical uncertainties have more influence on the natural frequencies than material parameters, even though the material uncertainties are about two times larger than the geometrical ones. This gives valuable insight for improving the finite element model over the various parameter ranges required in a modeling process involving uncertainty.

  19. Uncertainty modelling and analysis of environmental systems: a river sediment yield example

    NARCIS (Netherlands)

    Keesman, K.J.; Koskela, J.; Guillaume, J.H.; Norton, J.P.; Croke, B.; Jakeman, A.

    2011-01-01

    Abstract: Throughout the last decades, uncertainty analysis has become an essential part of environmental model building (e.g. Beck 1987; Refsgaard et al., 2007). The objective of the paper is to introduce stochastic and set-membership uncertainty modelling concepts, which basically differ in the

  20. Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System

    Science.gov (United States)

    2017-08-01

    Report by Daniel J Hornbaker, Weapons and Materials Research, November 2016: Quantifying Uncertainty from Computational Factors in Simulations of a Model Ballistic System. Approved for public release; distribution is unlimited.

  1. Sensitivity of a radiative transfer model to the uncertainty in the aerosol optical depth used as input

    Science.gov (United States)

    Román, Roberto; Bilbao, Julia; de Miguel, Argimiro; Pérez-Burgos, Ana

    2014-05-01

    Radiative transfer models can be used to obtain solar radiative quantities at the Earth's surface, such as the erythemal ultraviolet (UVER) irradiance, which is the spectral irradiance weighted with the erythemal (sunburn) action spectrum, and the total shortwave irradiance (SW; 305–2,800 nm). Aerosol and atmospheric properties are necessary as model inputs in order to calculate the UVER and SW irradiances under cloudless conditions; however, the uncertainty in these inputs propagates into the simulations. The objective of this work is to quantify the uncertainty in UVER and SW simulations generated by the aerosol optical depth (AOD) uncertainty. Data from different satellite retrievals were downloaded for nine Spanish sites in the Iberian Peninsula: total ozone column from different databases, spectral surface albedo and water vapour column from the MODIS instrument, AOD at 443 nm and Angström exponent (between 443 nm and 670 nm) from the MISR instrument onboard the Terra satellite, and single scattering albedo from the OMI instrument onboard the Aura satellite. The MISR AOD at 443 nm was compared with AERONET measurements at six Spanish sites, yielding an uncertainty in the MISR AOD of 0.074. In this work the radiative transfer model UVSPEC/Libradtran (version 1.7) was used to obtain the SW and UVER irradiance under cloudless conditions for each month and for different solar zenith angles (SZA) at the nine locations. The inputs used for these simulations were monthly climatology tables obtained from the available data at each location. The UVER and SW simulations were then repeated twice, changing the monthly AOD values to the same AOD plus or minus its uncertainty. The maximum difference between the irradiance run with the AOD and the irradiance run with the AOD plus/minus its uncertainty was calculated for each month, SZA, and location. This difference was considered the uncertainty in the model caused by the AOD

  2. Uncertainty in the learning rates of energy technologies: An experiment in a global multi-regional energy system model

    International Nuclear Information System (INIS)

    Rout, Ullash K.; Blesl, Markus; Fahl, Ulrich; Remme, Uwe; Voss, Alfred

    2009-01-01

    The diffusion of promising energy technologies in the market depends on their future energy production-cost development. When analyzing these technologies in an integrated assessment model with endogenous technological learning, the uncertainty in the assumed learning rates (LRs) plays a crucial role in the production-cost development and the model outcomes. This study examines the uncertainty in the LRs of some energy technologies under an endogenous global learning implementation and presents a floor-cost modeling procedure to systematically regulate that uncertainty. The article describes the difficulties of assembling data compatible with mixed-integer programming segmentations and comprehensively presents the causes of uncertainty in LRs. This work is executed using a multi-regional, long-horizon energy system model based on the 'TIMES' framework. All regions receive an economic advantage from learning in a common domain, and resource-ample regions obtain a marginal advantage from better exploitation of the learning technologies, due to lower supply-side fuel-cost development. The lowest learning investment, associated with the maximum LR, mobilizes more deployment of the learning technologies. The uncertainty in LRs has an impact on the diffusion of the energy technologies tested, and this study therefore scrutinizes the role of policy support for some of the technologies investigated.
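
    The one-factor learning curve behind such LRs is standard: unit cost falls by the learning rate for each doubling of cumulative capacity, C(x) = C0 (x/x0)^(-b) with LR = 1 - 2^(-b), and a floor cost can cap the decline as in the record's floor-cost procedure. The sketch below uses entirely hypothetical costs and an assumed LR uncertainty range.

```python
# One-factor learning curve with a floor cost, over a range of learning
# rates representing the LR uncertainty.
import numpy as np

C0, x0 = 1000.0, 1.0          # initial unit cost and cumulative capacity
floor = 300.0                 # floor cost capping the decline

def unit_cost(x, LR):
    b = -np.log2(1.0 - LR)    # learning index from the learning rate
    return np.maximum(C0 * (x / x0) ** (-b), floor)

for LR in (0.05, 0.15, 0.25):              # assumed LR uncertainty range
    print(f"LR={LR:.0%}: cost at 100x capacity = {unit_cost(100.0, LR):7.1f}")
```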

  3. Impact of Damping Uncertainty on SEA Model Response Variance

    Science.gov (United States)

    Schiller, Noah; Cabell, Randolph; Grosveld, Ferdinand

    2010-01-01

    Statistical Energy Analysis (SEA) is commonly used to predict high-frequency vibroacoustic levels. This statistical approach provides the mean response over an ensemble of random subsystems that share the same gross system properties, such as density, size, and damping. Recently, techniques have been developed to predict the ensemble variance as well as the mean response. However, these techniques do not account for uncertainties in the system properties. In the present paper, uncertainty in the damping loss factor is propagated through SEA to obtain more realistic prediction bounds that account for both ensemble and damping variance. The analysis is performed on a floor-equipped cylindrical test article that resembles an aircraft fuselage. Realistic bounds on the damping loss factor are determined from measurements acquired on the sidewall of the test article. The analysis demonstrates that uncertainties in damping have the potential to significantly impact the mean and variance of the predicted response.

  4. Uncertainty Quantification and Learning in Geophysical Modeling: How Information is Coded into Dynamical Models

    Science.gov (United States)

    Gupta, H. V.

    2014-12-01

    There is a clear need for comprehensive quantification of simulation uncertainty when using geophysical models to support and inform decision-making. Further, it is clear that the nature of such uncertainty depends on the quality of information in (a) the forcing data (driver information), (b) the model code (prior information), and (c) the specific values of inferred model components that localize the model to the system of interest (inferred information). Of course, the relative quality of each varies with geophysical discipline and specific application. In this talk I will discuss a structured approach to characterizing how 'Information', and hence 'Uncertainty', is coded into the structures of physics-based geophysical models. I propose that a better understanding of what is meant by "Information", and how it is embodied in models and data, can offer a structured (less ad-hoc), robust and insightful basis for diagnostic learning through the model-data juxtaposition. In some fields, a natural consequence may be to emphasize the a priori role of System Architecture (Process Modeling) over that of the selection of System Parameterization, thereby emphasizing the more creative aspect of scientific investigation - the use of models for Discovery and Learning.

  5. Model structural uncertainty quantification and hydrogeophysical data integration using airborne electromagnetic data (Invited)

    DEFF Research Database (Denmark)

    Minsley, Burke; Christensen, Nikolaj Kruse; Christensen, Steen

    estimates of model structural uncertainty are then combined with hydrologic observations to assess the impact of model structural error on hydrologic calibration and prediction errors. Using a synthetic numerical model, we describe a sequential hydrogeophysical approach that: (1) uses Bayesian Markov chain...... Monte Carlo (McMC) methods to produce a robust estimate of uncertainty in electrical resistivity parameter values, (2) combines geophysical parameter uncertainty estimates with borehole observations of lithology to produce probabilistic estimates of model structural uncertainty over the entire AEM...... of airborne electromagnetic (AEM) data to estimate large-scale model structural geometry, i.e. the spatial distribution of different lithological units based on assumed or estimated resistivity-lithology relationships, and the uncertainty in those structures given imperfect measurements. Geophysically derived...

  6. Host Model Uncertainty in Aerosol Radiative Effects: the AeroCom Prescribed Experiment and Beyond

    Science.gov (United States)

    Stier, Philip; Schutgens, Nick; Bian, Huisheng; Boucher, Olivier; Chin, Mian; Ghan, Steven; Huneeus, Nicolas; Kinne, Stefan; Lin, Guangxing; Myhre, Gunnar; Penner, Joyce; Randles, Cynthia; Samset, Bjorn; Schulz, Michael; Yu, Hongbin; Zhou, Cheng; Bellouin, Nicolas; Ma, Xiaoyan; Yu, Fangqun; Takemura, Toshihiko

    2013-04-01

    Anthropogenic and natural aerosol radiative effects are recognized to affect global and regional climate. Multi-model "diversity" in estimates of the aerosol radiative effect is often perceived as a measure of the uncertainty in modelling aerosol itself. However, current aerosol models vary considerably in model components relevant for the calculation of aerosol radiative forcings and feedbacks and the associated "host-model uncertainties" are generally convoluted with the actual uncertainty in aerosol modelling. In the AeroCom Prescribed intercomparison study we systematically isolate and quantify host model uncertainties on aerosol forcing experiments through prescription of identical aerosol radiative properties in eleven participating models. Host model errors in aerosol radiative forcing are largest in regions of uncertain host model components, such as stratocumulus cloud decks or areas with poorly constrained surface albedos, such as sea ice. Our results demonstrate that host model uncertainties are an important component of aerosol forcing uncertainty that require further attention. However, uncertainties in aerosol radiative effects also include short-term and long-term feedback processes that will be systematically explored in future intercomparison studies. Here we will present an overview of the proposals for discussion and results from early scoping studies.

  7. Holistic uncertainty analysis in river basin modeling for climate vulnerability assessment

    Science.gov (United States)

    Taner, M. U.; Wi, S.; Brown, C.

    2017-12-01

    The challenges posed by an uncertain future climate are a prominent concern for water resources managers. A number of frameworks exist for assessing the impacts of climate-related uncertainty, including internal climate variability and anthropogenic climate change, such as scenario-based and vulnerability-based approaches. While in many cases climate uncertainty may be dominant, other factors such as the future evolution of the river basin, the hydrologic response and reservoir operations are potentially significant sources of uncertainty. While uncertainty associated with modeling hydrologic response has received attention, very little attention has focused on the range of uncertainty and possible effects of the water resources infrastructure and management. This work presents a holistic framework that allows analysis of climate, hydrologic and water management uncertainty in water resources systems analysis, with the aid of a water system model designed to integrate component models for hydrologic processes and water management activities. The uncertainties explored include those associated with climate variability and change, hydrologic model parameters, and water system operation rules. A Bayesian framework is used to quantify and model the uncertainties at each modeling step in an integrated fashion, including prior and likelihood information about model parameters. The framework is demonstrated in a case study for the St. Croix Basin, located on the border between the United States and Canada.

  8. Multiple indicators, multiple causes measurement error models.

    Science.gov (United States)

    Tekwe, Carmen D; Carter, Randy L; Cullings, Harry M; Carroll, Raymond J

    2014-11-10

    Multiple indicators, multiple causes (MIMIC) models are often employed by researchers studying the effects of an unobservable latent variable on a set of outcomes, when causes of the latent variable are observed. There are times, however, when the causes of the latent variable are not observed because measurements of the causal variable are contaminated by measurement error. The objectives of this paper are as follows: (i) to develop a novel model by extending the classical linear MIMIC model to allow both Berkson and classical measurement errors, defining the MIMIC measurement error (MIMIC ME) model; (ii) to develop likelihood-based estimation methods for the MIMIC ME model; and (iii) to apply the newly defined MIMIC ME model to atomic bomb survivor data to study the impact of dyslipidemia and radiation dose on the physical manifestations of dyslipidemia. As a by-product of our work, we also obtain a data-driven estimate of the variance of the classical measurement error associated with an estimate of the amount of radiation dose received by atomic bomb survivors at the time of their exposure. Copyright © 2014 John Wiley & Sons, Ltd.
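
    A sketch of the structure the record describes, in our own notation (not necessarily the paper's): a latent variable driven by observed causes, manifesting in several indicators, with the causes themselves observed only through an error-prone proxy. The MIMIC ME extension admits both of the error links shown.

```latex
% Linear MIMIC structure with measurement error (illustrative notation).
% eta: latent variable; X: causes; Y_j: indicators; W: observed proxy of X.
\begin{align*}
  \eta &= \alpha^{\top} X + \zeta
    && \text{(structural equation, } \zeta \sim N(0,\sigma_\zeta^2)\text{)} \\
  Y_j  &= \lambda_j \eta + \varepsilon_j
    && \text{(indicators, } j = 1,\dots,p\text{)} \\
  W    &= X + U
    && \text{(classical measurement error)} \\
  X    &= W + U_B
    && \text{(Berkson measurement error)}
\end{align*}
```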

  9. Assessment of structural model and parameter uncertainty with a multi-model system for soil water balance models

    Science.gov (United States)

    Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz

    2016-04-01

    Water for agriculture is strongly limited in arid and semi-arid regions and is often of low quality in terms of salinity. The application of saline waters for irrigation increases the salt load in the rooting zone and has to be managed by leaching to maintain a healthy soil, i.e. to wash out salts by additional irrigation. Dynamic simulation models are helpful tools to calculate the root zone water fluxes and soil salinity content in order to investigate best management practices. However, there is little information on structural and parameter uncertainty for simulations regarding the water and salt balance of saline irrigation. Hence, we established a multi-model system with four different models (AquaCrop, RZWQM, SWAP, Hydrus1D/UNSATCHEM) to analyze the structural and parameter uncertainty using the Generalized Likelihood Uncertainty Estimation (GLUE) method. Hydrus1D/UNSATCHEM and SWAP were set up with multiple sets of different implemented functions (e.g. matric and osmotic stress for root water uptake), which results in a broad range of different model structures. The simulations were evaluated against soil water and salinity content observations. The posterior distribution of the GLUE analysis gives behavioral parameter sets and reveals uncertainty intervals for the parameters. Throughout all of the model sets, most parameters accounting for the soil water balance show low uncertainty; only one or two out of five to six parameters in each model set display high uncertainty (e.g. the pore-size distribution index in SWAP and Hydrus1D/UNSATCHEM). The differences between the models and model setups reveal the structural uncertainty. The highest structural uncertainty is observed for deep percolation fluxes between the model sets of Hydrus1D/UNSATCHEM (~200 mm) and RZWQM (~500 mm), which are more than twice as high for the latter. The model sets also show a high variation in uncertainty intervals for deep percolation, with an interquartile range (IQR) of

  10. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation... data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified... uncertainties can be implemented in probabilistic reliability assessments....
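
    The validation statistics named in this record are straightforward to compute; the sketch below uses one common definition of the scatter index (RMSE normalised by the observed mean) on synthetic significant-wave-height data, so all numbers are placeholders.

```python
# Bias, RMSE, and scatter index of modelled vs. observed significant
# wave height, on synthetic data.
import numpy as np

rng = np.random.default_rng(7)
hs_obs = rng.gamma(4.0, 0.5, 500)                        # observed Hs (m)
hs_mod = 0.95 * hs_obs + 0.2 * rng.standard_normal(500)  # wave-model Hs (m)

bias = np.mean(hs_mod - hs_obs)
rmse = np.sqrt(np.mean((hs_mod - hs_obs) ** 2))
si = rmse / np.mean(hs_obs)                              # scatter index

print(f"bias = {bias:.3f} m, RMSE = {rmse:.3f} m, scatter index = {si:.3f}")
```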

  11. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Considered are four different wave models and validation...... data is collected from published scientific research. The bias, the root-mean-square error as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can...... be implemented in probabilistic reliability assessments....

  12. A GLUE uncertainty analysis of a drying model of pharmaceutical granules

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn

    2013-01-01

    uncertainty) originating from uncertainty in input data, model parameters, model structure, boundary conditions and software. In this paper, the model prediction uncertainty is evaluated for a model describing the continuous drying of single pharmaceutical wet granules in a six-segmented fluidized bed drying...... unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level...

  13. On Evaluation of Recharge Model Uncertainty: a Priori and a Posteriori

    International Nuclear Information System (INIS)

    Ming Ye; Karl Pohlmann; Jenny Chapman; David Shafer

    2006-01-01

    Hydrologic environments are open and complex, rendering them prone to multiple interpretations and mathematical descriptions. Hydrologic analyses typically rely on a single conceptual-mathematical model, which ignores conceptual model uncertainty and may result in biased predictions and under-estimation of predictive uncertainty. This study assesses the conceptual model uncertainty residing in five recharge models developed to date by different researchers, based on different theories, for Nevada and the Death Valley area, CA. A recently developed statistical method, Maximum Likelihood Bayesian Model Averaging (MLBMA), is utilized for this analysis. In a Bayesian framework, the recharge model uncertainty is assessed, a priori, using expert judgments collected through an expert elicitation in the form of prior probabilities of the models. The uncertainty is then evaluated, a posteriori, by updating the prior probabilities to estimate posterior model probabilities. The updating is conducted through maximum likelihood inverse modeling, by calibrating the Death Valley Regional Flow System (DVRFS) model corresponding to each recharge model against observations of head and flow. Calibration results of the DVRFS for the five recharge models are used to estimate three information criteria (AIC, BIC, and KIC) used to rank and discriminate these models. Posterior probabilities of the five recharge models, evaluated using KIC, are used as weights to average head predictions, which gives the posterior mean and variance. The posterior quantities incorporate both parametric and conceptual model uncertainties
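
    The averaging step described here, turning per-model KIC values into posterior probabilities and a model-averaged prediction, can be sketched with the standard information-criterion weighting w_i proportional to exp(-ΔKIC_i/2). The KIC values, predictions and variances below are invented placeholders, not results from the study.

```python
# Information-criterion-based model averaging: weights from KIC differences,
# then a model-averaged mean and a within- plus between-model variance.
import numpy as np

kic = np.array([412.0, 405.3, 431.8, 407.1, 409.9])   # one per recharge model
mean = np.array([21.0, 24.5, 18.2, 23.0, 22.1])       # head predictions (m)
var = np.array([1.2, 0.8, 2.5, 1.0, 1.1])             # within-model variances

d = kic - kic.min()
w = np.exp(-0.5 * d)
w /= w.sum()                                  # posterior model probabilities

m_avg = np.sum(w * mean)
v_avg = np.sum(w * var) + np.sum(w * (mean - m_avg) ** 2)
print("weights:", w.round(3))
print(f"averaged head: {m_avg:.2f} m, total variance: {v_avg:.2f}")
```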

  14. Integrating uncertainty in time series population forecasts: An illustration using a simple projection model

    Directory of Open Access Journals (Sweden)

    Guy J. Abel

    2013-12-01

    Background: Population forecasts are widely used for public policy purposes. Methods to quantify the uncertainty in forecasts tend to ignore model uncertainty and to be based on a single model. Objective: In this paper, we use Bayesian time series models to obtain future population estimates with associated measures of uncertainty. The models are compared based on Bayesian posterior model probabilities, which are then used to provide model-averaged forecasts. Methods: The focus is on a simple projection model with the historical data representing population change in England and Wales from 1841 to 2007. Bayesian forecasts to the year 2032 are obtained based on a range of models, including autoregression models, stochastic volatility models and random variance shift models. The computational steps to fit each of these models using the OpenBUGS software via R are illustrated. Results: We show that the Bayesian approach is adept in capturing multiple sources of uncertainty in population projections, including model uncertainty. The inclusion of non-constant variance improves the fit of the models and provides more realistic predictive uncertainty levels. The forecasting methodology is assessed through fitting the models to various truncated data series.

  15. A model of entry-exit decisions and capacity choice under demand uncertainty

    OpenAIRE

    Isik, Murat; Coble, Keith H.; Hudson, Darren; House, Lisa O.

    2003-01-01

    Many investment decisions of agribusiness firms, such as when to invest in an emerging market or whether to expand the capacity of the firm, involve irreversible investment and uncertainty about demand, cost or competition. This paper uses an option-value model to examine the factors affecting an agribusiness firm's decision whether and how much to invest in an emerging market under demand uncertainty. Demand uncertainty and irreversibility of investment make investment less desirable than th...

  16. Accelerating a hydrological uncertainty ensemble model using graphics processing units (GPUs)

    Science.gov (United States)

    Tristram, D.; Hughes, D.; Bradshaw, K.

    2014-01-01

    The practical application of hydrological uncertainty models that are designed to generate multiple ensembles can be severely restricted by the available computer processing power and thus, the time taken to generate the results. CPU clusters can help in this regard, but are often costly to use continuously and maintain, causing scientists to look elsewhere for speed improvements. The use of powerful graphics processing units (GPUs) for application acceleration has become a recent trend, owing to their low cost per FLOP, and their highly parallel and throughput-oriented architecture, which makes them ideal for many scientific applications. However, programming these devices efficiently is non-trivial, seemingly making their use impractical for many researchers. In this study, we investigate whether redesigning the CPU code of an adapted Pitman rainfall-runoff uncertainty model is necessary to obtain a satisfactory speedup on GPU devices. A twelvefold speedup over a multithreaded CPU implementation was achieved by using a modern GPU with minimal changes to the model code. This success leads us to believe that redesigning code for the GPU is not always necessary to obtain a worthwhile speedup.

  17. Uncertainty modelling of critical column buckling for reinforced ...

    Indian Academy of Sciences (India)

    Buckling is a critical issue for structural stability in structural design. In most of the buckling analyses, applied loads, structural and material properties are considered certain. However, in reality, these parameters are uncertain. Therefore, a prognostic solution is necessary and uncertainties have to be considered. Fuzzy logic ...

  18. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Czech Academy of Sciences Publication Activity Database

    Yu, X.; Lamačová, Anna; Duffy, Ch.; Krám, P.; Hruška, Jakub

    2016-01-01

    Roč. 90, part B (2016), s. 90-101 ISSN 0098-3004 R&D Projects: GA MŠk(CZ) LO1415 Institutional support: RVO:67179843 Keywords : Uncertainty * Evapotranspiration * Forest management * PIHM * Biome-BGC Subject RIV: DA - Hydrology ; Limnology OBOR OECD: Hydrology Impact factor: 2.533, year: 2016

  19. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    , such as time-evolving shorelines and paleo-coastlines. In this study we quantify these uncertainties and their propagation in GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimates in ice mass balance in Antarctica and Greenland...

  20. Uncertainty Aware Structural Topology Optimization Via a Stochastic Reduced Order Model Approach

    Science.gov (United States)

    Aguilo, Miguel A.; Warner, James E.

    2017-01-01

    This work presents a stochastic reduced order modeling strategy for the quantification and propagation of uncertainties in topology optimization. Uncertainty aware optimization problems can be computationally complex due to the substantial number of model evaluations that are necessary to accurately quantify and propagate uncertainties. This computational complexity is greatly magnified if a high-fidelity, physics-based numerical model is used for the topology optimization calculations. Stochastic reduced order model (SROM) methods are applied here to effectively 1) alleviate the prohibitive computational cost associated with an uncertainty aware topology optimization problem; and 2) quantify and propagate the inherent uncertainties due to design imperfections. A generic SROM framework that transforms the uncertainty aware, stochastic topology optimization problem into a deterministic optimization problem that relies only on independent calls to a deterministic numerical model is presented. This approach facilitates the use of existing optimization and modeling tools to accurately solve the uncertainty aware topology optimization problems in a fraction of the computational demand required by Monte Carlo methods. Finally, an example in structural topology optimization is presented to demonstrate the effectiveness of the proposed uncertainty aware structural topology optimization approach.
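
    A minimal sketch of the SROM idea under stated assumptions (a single scalar uncertain input and a toy deterministic model; this is not the authors' framework): a small set of sample points receives weights chosen to match the input's first two moments, after which the stochastic objective reduces to a weighted sum of independent deterministic model calls.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(1)
    theta = rng.lognormal(sigma=0.25, size=20_000)            # "true" uncertain input
    samples = np.quantile(theta, np.linspace(0.05, 0.95, 7))  # m = 7 SROM points

    m1_target, m2_target = theta.mean(), (theta**2).mean()

    def moment_error(w):
        # Mismatch of the SROM's first two moments against the target input
        return (w @ samples - m1_target)**2 + (w @ samples**2 - m2_target)**2

    m = len(samples)
    res = minimize(moment_error, np.full(m, 1 / m), bounds=[(0, 1)] * m,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1}])
    weights = res.x

    def deterministic_model(x, th):
        # Placeholder for one expensive deterministic analysis (e.g., an FE solve)
        return (x - th)**2 + 0.1 * x

    # Uncertainty-aware objective = weighted sum of independent deterministic calls
    objective = lambda x: sum(w * deterministic_model(x, th)
                              for w, th in zip(weights, samples))
    print(objective(1.0))
    ```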

  1. Assessing uncertainties in crop and pasture ensemble model simulations of productivity and N2O emissions

    Science.gov (United States)

    Simulation models are extensively used to predict agricultural productivity and greenhouse gas (GHG) emissions. However, the uncertainties of (reduced) model ensemble simulations have not been assessed systematically for variables affecting food security and climate change mitigation, within multisp...

  2. Context, Experience, Expectation, and Action—Towards an Empirically Grounded, General Model for Analyzing Biographical Uncertainty

    Directory of Open Access Journals (Sweden)

    Herwig Reiter

    2010-01-01

    Full Text Available The article proposes a general, empirically grounded model for analyzing biographical uncertainty. The model is based on findings from a qualitative-explorative study of transforming meanings of unemployment among young people in post-Soviet Lithuania. In a first step, the particular features of the uncertainty puzzle in post-communist youth transitions are briefly discussed. A historical event like the collapse of state socialism in Europe, similar to the recent financial and economic crisis, is a generator of uncertainty par excellence: it undermines the foundations of societies and the taken-for-grantedness of related expectations. Against this background, the case of a young woman and how she responds to the novel threat of unemployment in the transition to the world of work is introduced. Her uncertainty management in the specific time perspective of certainty production is then conceptually rephrased by distinguishing three types or levels of biographical uncertainty: knowledge, outcome, and recognition uncertainty. Biographical uncertainty, it is argued, is empirically observable through the analysis of acting and projecting at the biographical level. The final part synthesizes the empirical findings and the conceptual discussion into a stratification model of biographical uncertainty as a general tool for the biographical analysis of uncertainty phenomena. URN: urn:nbn:de:0114-fqs100120

  3. Accounting for methodological, structural, and parameter uncertainty in decision-analytic models: a practical guide.

    Science.gov (United States)

    Bilcke, Joke; Beutels, Philippe; Brisson, Marc; Jit, Mark

    2011-01-01

    Accounting for uncertainty is now a standard part of decision-analytic modeling and is recommended by many health technology agencies and published guidelines. However, the scope of such analyses is often limited, even though techniques have been developed for presenting the effects of methodological, structural, and parameter uncertainty on model results. To help bring these techniques into mainstream use, the authors present a step-by-step guide that offers an integrated approach to account for different kinds of uncertainty in the same model, along with a checklist for assessing the way in which uncertainty has been incorporated. The guide also addresses special situations such as when a source of uncertainty is difficult to parameterize, resources are limited for an ideal exploration of uncertainty, or evidence to inform the model is not available or not reliable. Methods for identifying the sources of uncertainty that influence results most are also described. Besides guiding analysts, the guide and checklist may be useful to decision makers who need to assess how well uncertainty has been accounted for in a decision-analytic model before using the results to make a decision.

  4. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    DEFF Research Database (Denmark)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.

    2015-01-01

    The ensemble Kalman filter (EnKF) is a popular data assimilation (DA) technique that has been extensively used in environmental sciences for combining complementary information from model predictions and observations. One of the major challenges in EnKF applications is the description of model uncertainty. ... Alternative model uncertainty descriptions are evaluated with respect to performance and sensitivity. Results show that inappropriate definition of model uncertainty can greatly degrade the assimilation performance, and an appropriate combination of different model uncertainty sources is advised....
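
    For orientation, a textbook EnKF analysis step with perturbed observations is sketched below in plain NumPy; the dimensions, observation operator, and error covariances are invented for illustration and do not reflect the MIKE SHE configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_state, n_obs, n_ens = 50, 3, 40

    X = rng.normal(10.0, 1.0, (n_state, n_ens))        # forecast ensemble of heads
    H = np.zeros((n_obs, n_state)); H[[0, 1, 2], [5, 20, 35]] = 1.0  # obs. operator
    R = 0.1 * np.eye(n_obs)                            # observation error covariance
    y = np.array([9.8, 10.4, 10.1])                    # observed hydraulic heads

    A = X - X.mean(axis=1, keepdims=True)              # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                          # sample forecast covariance

    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)       # Kalman gain
    Y = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
    X_analysis = X + K @ (Y - H @ X)                   # perturbed-observation update

    print(X_analysis.mean(axis=1)[[5, 20, 35]])        # updated heads at obs. points
    ```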

  5. Managing structural uncertainty in health economic decision models: a discrepancy approach

    OpenAIRE

    Strong, M.; Oakley, J.; Chilcott, J.

    2012-01-01

    Healthcare resource allocation decisions are commonly informed by computer model predictions of population mean costs and health effects. It is common to quantify the uncertainty in the prediction due to uncertain model inputs, but methods for quantifying uncertainty due to inadequacies in model structure are less well developed. We introduce an example of a model that aims to predict the costs and health effects of a physical activity promoting intervention. Our goal is to develop a framewor...

  6. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    International Nuclear Information System (INIS)

    Darcel, C.; Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O.

    2009-11-01

    the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameter or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to data inherent uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data including recording of fracture intercept positions, pole orientation and relative uncertainties are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect - locally - geology and fracturing properties main characteristics. From that

  7. Uncertainty Analysis Framework - Hanford Site-Wide Groundwater Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    Cole, Charles R.; Bergeron, Marcel P.; Murray, Christopher J.; Thorne, Paul D.; Wurstner, Signe K.; Rogers, Phillip M.

    2001-11-09

    Pacific Northwest National Laboratory (PNNL) embarked on a new initiative to strengthen the technical defensibility of the predictions being made with a site-wide groundwater flow and transport model at the U.S. Department of Energy Hanford Site in southeastern Washington State. In FY 2000, the focus of the initiative was on the characterization of major uncertainties in the current conceptual model that would affect model predictions. The long-term goals of the initiative are the development and implementation of an uncertainty estimation methodology in future assessments and analyses using the site-wide model. This report focuses on the development and implementation of an uncertainty analysis framework.

  8. Can Bayesian Belief Networks help tackling conceptual model uncertainties in contaminated site risk assessment?

    DEFF Research Database (Denmark)

    Troldborg, Mads; Thomsen, Nanna Isbak; McKnight, Ursula S.

    A key component in risk assessment of contaminated sites is the formulation of a conceptual site model. The conceptual model is a simplified representation of reality and forms the basis for the mathematical modelling of contaminant fate and transport at the site. A conceptual model should...... therefore identify the most important site-specific features and processes that may affect the contaminant transport behaviour at the site. The development of a conceptual model will always be associated with uncertainties due to lack of data and understanding of the site conditions, and often many...... different conceptual models may describe the same contaminated site equally well. In many cases, conceptual model uncertainty has been shown to be one of the dominant sources for uncertainty and is therefore essential to account for when quantifying uncertainties in risk assessments. We present here...

  9. Propagation of Uncertainty in System Parameters of a LWR Model by Sampling MCNPX Calculations - Burnup Analysis

    Science.gov (United States)

    Campolina, Daniel de A. M.; Lima, Claubia P. B.; Veloso, Maria Auxiliadora F.

    2014-06-01

    For all the physical components that comprise a nuclear system there is an associated uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for the best-estimate calculations that have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for a 95th percentile and a two-sided statistical tolerance interval of 95%. Uncertainties in input parameters of the reactor considered included geometry dimensions and densities. The results demonstrated the capability of the sampling-based method for burnup calculations when the sample size is optimized and many parameter uncertainties are investigated together in the same input.
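
    The sample-size optimization mentioned above is a standard calculation; a small sketch follows (generic parameter names). First-order Wilks formulas require the smallest N with 1 - g**N >= beta (one-sided) or 1 - N*g**(N-1) + (N-1)*g**N >= beta (two-sided) for coverage g and confidence beta, which yields the well-known 59 and 93 runs at the 95%/95% level.

    ```python
    def wilks_one_sided(coverage=0.95, confidence=0.95):
        # Smallest N with 1 - coverage**N >= confidence
        n = 1
        while 1 - coverage**n < confidence:
            n += 1
        return n

    def wilks_two_sided(coverage=0.95, confidence=0.95):
        # Smallest N with 1 - N*g**(N-1) + (N-1)*g**N >= confidence
        g, n = coverage, 2
        while 1 - n * g**(n - 1) + (n - 1) * g**n < confidence:
            n += 1
        return n

    print(wilks_one_sided())   # 59 runs for one-sided 95%/95%
    print(wilks_two_sided())   # 93 runs for two-sided 95%/95%
    ```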

  10. Statistical methodology for discrete fracture model - including fracture size, orientation uncertainty together with intensity uncertainty and variability

    Energy Technology Data Exchange (ETDEWEB)

    Darcel, C. (Itasca Consultants SAS (France)); Davy, P.; Le Goc, R.; Dreuzy, J.R. de; Bour, O. (Geosciences Rennes, UMR 6118 CNRS, Univ. de Rennes, Rennes (France))

    2009-11-15

    the lineament scale (k_t = 2) on the other, addresses the issue of the nature of the transition. We develop a new 'mechanistic' model that could help in modeling why and where this transition can occur. The transition between both regimes would occur for a fracture length of 1-10 m and even at a smaller scale for the few outcrops that follow the self-similar density model. A consequence for the disposal issue is that the model that is likely to apply in the 'blind' scale window between 10-100 m is the self-similar model as it is defined for large-scale lineaments. The self-similar model, as it is measured for some outcrops and most lineament maps, is definitely worth being investigated as a reference for scales above 1-10 m. In the rest of the report, we develop a methodology for incorporating uncertainty and variability into the DFN modeling. Fracturing properties arise from complex processes which produce an intrinsic variability; characterizing this variability as an admissible variation of model parameter or as the division of the site into subdomains with distinct DFN models is a critical point of the modeling effort. Moreover, the DFN model encompasses a part of uncertainty, due to data inherent uncertainties and sampling limits. Both effects must be quantified and incorporated into the DFN site model definition process. In that context, all available borehole data including recording of fracture intercept positions, pole orientation and relative uncertainties are used as the basis for the methodological development and further site model assessment. An elementary dataset contains a set of discrete fracture intercepts from which a parent orientation/density distribution can be computed. The elementary bricks of the site, from which these initial parent density distributions are computed, rely on the former Single Hole Interpretation division of the boreholes into sections whose local boundaries are expected to reflect - locally - geology

  11. Multidisciplinary Design Optimization Under Uncertainty: An Information Model Approach (PREPRINT)

    Science.gov (United States)

    2011-03-01

    preferences in his Postulates I-IV to ensure rational decision making. Our approach to decision making satisfies the Savage postulates. Multicriteria ... Challenges associated with decision making for large complex systems in the presence of uncertainty and risk have been of special interest to scientists ... is heavily influenced by the government reports cited. Of special interest to us are the challenges associated with decision making under

  12. Identifying and assessing critical uncertainty thresholds in a forest pest risk model

    Science.gov (United States)

    Frank H. Koch; Denys Yemshanov

    2015-01-01

    Pest risk maps can provide helpful decision support for invasive alien species management, but often fail to address adequately the uncertainty associated with their predicted risk values. This chapter explores how increased uncertainty in a risk model's numeric assumptions (i.e. its principal parameters) might affect the resulting risk map. We used a spatial...

  13. Application of Uncertainty and Sensitivity Analysis to a Kinetic Model for Enzymatic Biodiesel Production

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Nordblad, Mathias; Woodley, John

    2014-01-01

    This paper demonstrates the added benefits of using uncertainty and sensitivity analysis in the kinetics of enzymatic biodiesel production. For this study, a kinetic model by Fedosov and co-workers is used. For the uncertainty analysis the Monte Carlo procedure was used to statistically quantify...

  14. Application of the emission inventory model TEAM: Uncertainties in dioxin emission estimates for central Europe

    NARCIS (Netherlands)

    Pulles, M.P.J.; Kok, H.; Quass, U.

    2006-01-01

    This study uses an improved emission inventory model to assess the uncertainties in emissions of dioxins and furans associated with both knowledge on the exact technologies and processes used, and with the uncertainties of both activity data and emission factors. The annual total emissions for the

  15. Estimation of uncertainties in predictions of environmental transfer models: evaluation of methods and application to CHERPAC

    International Nuclear Information System (INIS)

    Koch, J.; Peterson, S-R.

    1995-10-01

    Models used to simulate environmental transfer of radionuclides typically include many parameters, the values of which are uncertain. An estimation of the uncertainty associated with the predictions is therefore essential. Different methods to quantify the uncertainty in predictions due to parameter uncertainties are reviewed. A statistical approach using random sampling techniques is recommended for complex models with many uncertain parameters. In this approach, the probability density function of the model output is obtained from multiple realizations of the model according to a multivariate random sample of the different input parameters. Sampling efficiency can be improved by using a stratified scheme (Latin Hypercube Sampling). Sample size can also be restricted when statistical tolerance limits need to be estimated. Methods to rank parameters according to their contribution to uncertainty in the model prediction are also reviewed. Recommended are measures of sensitivity, correlation and regression coefficients that can be calculated on values of input and output variables generated during the propagation of uncertainties through the model. A parameter uncertainty analysis is performed for the CHERPAC food chain model which estimates subjective confidence limits and intervals on the predictions at a 95% confidence level. A sensitivity analysis is also carried out using partial rank correlation coefficients. This identifies and ranks the parameters which are the main contributors to uncertainty in the predictions, thereby guiding further research efforts. (author). 44 refs., 2 tabs., 4 figs
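
    A compact sketch of the recommended workflow, using a toy multiplicative transfer model rather than CHERPAC: Latin Hypercube samples are mapped through invented input distributions, subjective 95% limits are read from the output quantiles, and inputs are ranked by rank correlation (plain Spearman coefficients stand in here for the partial rank correlation coefficients used in the report).

    ```python
    import numpy as np
    from scipy.stats import qmc, lognorm, spearmanr

    n = 1000
    u = qmc.LatinHypercube(d=3, seed=7).random(n)      # stratified uniforms in [0,1)

    # Invented lognormal inputs: two transfer factors and an intake rate
    tf_soil  = lognorm(s=0.5, scale=0.20).ppf(u[:, 0])
    tf_plant = lognorm(s=0.8, scale=1.00).ppf(u[:, 1])
    intake   = lognorm(s=0.3, scale=0.05).ppf(u[:, 2])

    dose = tf_soil * tf_plant * intake                 # toy multiplicative model

    lo, hi = np.percentile(dose, [2.5, 97.5])          # subjective 95% interval
    print(f"95% interval: [{lo:.2e}, {hi:.2e}]")

    # Rank inputs by (rank) correlation with the prediction
    for name, x in [("tf_soil", tf_soil), ("tf_plant", tf_plant), ("intake", intake)]:
        rho, _ = spearmanr(x, dose)
        print(name, round(rho, 2))
    ```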

  16. Addressing imperfect maintenance modelling uncertainty in unavailability and cost based optimization

    International Nuclear Information System (INIS)

    Sanchez, Ana; Carlos, Sofia; Martorell, Sebastian; Villanueva, Jose F.

    2009-01-01

    Optimization of testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest as the plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic uncertainties. The second group applies when there is limited knowledge on the proper model to represent a problem, and/or the values associated to the model parameters, so the results of the calculation performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria and considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple criteria decision making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance interval based approach is used to address uncertainty with regard to effectiveness parameter and imperfect maintenance model embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, if the decision maker is risk averse or risk neutral

  17. Addressing imperfect maintenance modelling uncertainty in unavailability and cost based optimization

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, Ana [Department of Statistics and Operational Research, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain); Carlos, Sofia [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain); Martorell, Sebastian [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain)], E-mail: smartore@iqn.upv.es; Villanueva, Jose F. [Department of Chemical and Nuclear Engineering, Polytechnic University of Valencia, Camino de Vera, s/n, 46071 Valencia (Spain)

    2009-01-15

    Optimization of testing and maintenance activities performed in the different systems of a complex industrial plant is of great interest as the plant availability and economy strongly depend on the maintenance activities planned. Traditionally, two types of models, i.e. deterministic and probabilistic, have been considered to simulate the impact of testing and maintenance activities on equipment unavailability and the cost involved. Both models present uncertainties that are often categorized as either aleatory or epistemic uncertainties. The second group applies when there is limited knowledge on the proper model to represent a problem, and/or the values associated to the model parameters, so the results of the calculation performed with them incorporate uncertainty. This paper addresses the problem of testing and maintenance optimization based on unavailability and cost criteria and considering epistemic uncertainty in the imperfect maintenance modelling. It is framed as a multiple criteria decision making problem where unavailability and cost act as uncertain and conflicting decision criteria. A tolerance interval based approach is used to address uncertainty with regard to effectiveness parameter and imperfect maintenance model embedded within a multiple-objective genetic algorithm. A case of application for a stand-by safety related system of a nuclear power plant is presented. The results obtained in this application show the importance of considering uncertainties in the modelling of imperfect maintenance, as the optimal solutions found are associated with a large uncertainty that influences the final decision making depending on, for example, if the decision maker is risk averse or risk neutral.

  18. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    time, especially with respect to large-scale transport models. The study described in this paper contributes to filling the gap by investigating the effects of uncertainty in socio-economic variables growth rate projections on large-scale transport model forecasts, using the Danish National Transport......A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, by modelling complex systems, transport models have an inherent uncertainty which increases over time. As a consequence, the longer...... the period forecasted the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature only a few studies analyze uncertainty propagation patterns over...

  19. Considering the Epistemic Uncertainties of the Variogram Model in Locating Additional Exploratory Drillholes

    Directory of Open Access Journals (Sweden)

    Saeed Soltani

    2015-06-01

    Full Text Available To enhance the certainty of the grade block model, it is necessary to increase the number of exploratory drillholes and collect more data from the deposit. The inputs of the process of locating these additional drillholes include the variogram model parameters, locations of the samples taken from the initial drillholes, and the geological block model. The uncertainties of these inputs will lead to uncertainties in the optimal locations of additional drillholes. Meanwhile, the locations of the initial data are crisp, but the variogram model parameters and the geological model have uncertainties due to the limitation of the number of initial data. In this paper, effort has been made to consider the effects of variogram uncertainties on the optimal location of additional drillholes using the fuzzy kriging and solve the locating problem with the genetic algorithm (GA) optimization method. A bauxite deposit case study has shown the efficiency of the proposed model.

  20. Model structural uncertainty quantification and hydrologic parameter and prediction error analysis using airborne electromagnetic data

    DEFF Research Database (Denmark)

    Minsley, B. J.; Christensen, Nikolaj Kruse; Christensen, Steen

    electromagnetic (AEM) data. Our estimates of model structural uncertainty follow a Bayesian framework that accounts for both the uncertainties in geophysical parameter estimates given AEM data, and the uncertainties in the relationship between lithology and geophysical parameters. Using geostatistical sequential......Model structure, or the spatial arrangement of subsurface lithological units, is fundamental to the hydrological behavior of Earth systems. Knowledge of geological model structure is critically important in order to make informed hydrological predictions and management decisions. Model structure...... is never perfectly known, however, and incorrect assumptions can be a significant source of error when making model predictions. We describe a systematic approach for quantifying model structural uncertainty that is based on the integration of sparse borehole observations and large-scale airborne...

  1. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    International Nuclear Information System (INIS)

    Sig Drellack, Lance Prothro

    2007-01-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  2. Assessment and Reduction of Model Parametric Uncertainties: A Case Study with A Distributed Hydrological Model

    Science.gov (United States)

    Gan, Y.; Liang, X. Z.; Duan, Q.; Xu, J.; Zhao, P.; Hong, Y.

    2017-12-01

    The uncertainties associated with the parameters of a hydrological model need to be quantified and reduced for it to be useful for operational hydrological forecasting and decision support. An uncertainty quantification framework is presented to facilitate practical assessment and reduction of model parametric uncertainties. A case study, using the distributed hydrological model CREST for daily streamflow simulation during the period 2008-2010 over ten watersheds, was used to demonstrate the performance of this new framework. Model behaviors across watersheds were analyzed by a two-stage stepwise sensitivity analysis procedure, using the LH-OAT method for screening out insensitive parameters, followed by MARS-based Sobol' sensitivity indices for quantifying each parameter's contribution to the response variance due to its first-order and higher-order effects. Pareto optimal sets of the influential parameters were then found by the adaptive surrogate-based multi-objective optimization procedure, using a MARS model for approximating the parameter-response relationship and the SCE-UA algorithm for searching the optimal parameter sets of the adaptively updated surrogate model. The final optimal parameter sets were validated against the daily streamflow simulation of the same watersheds during the period 2011-2012. The stepwise sensitivity analysis procedure efficiently reduced the number of parameters that need to be calibrated from twelve to seven, which helps to limit the dimensionality of the calibration problem and serves to enhance the efficiency of parameter calibration. The adaptive MARS-based multi-objective calibration exercise provided satisfactory solutions to the reproduction of the observed streamflow for all watersheds. The final optimal solutions showed significant improvement when compared to the default solutions, with about 65-90% reduction in 1-NSE and 60-95% reduction in |RB|. The validation exercise indicated a large improvement in model performance with about 40
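
    As a hedged illustration of the variance-based screening step, the sketch below uses SALib's Saltelli sampling and Sobol' analysis (assuming SALib is installed) in place of the paper's MARS-based estimator; the parameter names, bounds, and toy response are invented, not CREST.

    ```python
    import numpy as np
    from SALib.sample import saltelli
    from SALib.analyze import sobol

    problem = {
        "num_vars": 3,
        "names": ["infiltration_k", "overland_n", "soil_depth"],   # hypothetical
        "bounds": [[0.1, 1.0], [0.01, 0.5], [0.2, 3.0]],
    }

    X = saltelli.sample(problem, 1024)         # N * (2D + 2) parameter sets

    def toy_response(x):
        # Placeholder for one hydrological model run returning, e.g., 1 - NSE
        return x[0] * np.sin(10 * x[1]) + x[2]**2

    Y = np.apply_along_axis(toy_response, 1, X)
    Si = sobol.analyze(problem, Y)
    print(dict(zip(problem["names"], np.round(Si["ST"], 3))))   # total-order indices
    ```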

  3. Uncertainty in Various Habitat Suitability Models and Its Impact on Habitat Suitability Estimates for Fish

    OpenAIRE

    Lin, Yu-Pin; Lin, Wei-Chih; Wu, Wei-Yao

    2015-01-01

    Species distribution models (SDMs) are extensively used to project habitat suitability of species in stream ecological studies. Owing to complex sources of uncertainty, such models may yield projections with varying degrees of uncertainty. To better understand projected spatial distributions and the variability between habitat suitability projections, this study uses five SDMs that are based on the outputs of a two-dimensional hydraulic model to project the suitability of habitats and to eval...

  4. Crop Model Improvement Reduces the Uncertainty of the Response to Temperature of Multi-Model Ensembles

    Science.gov (United States)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold; Ewert, Frank; Mueller, Christoph; Roetter, Reimund P.; Ruane, Alex C.; Semenov, Mikhail A.; Wallach, Daniel; Wang, Enli

    2016-01-01

    To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of models needed in an MME. Herein, 15 wheat growth models of a larger MME were improved through re-parameterization and/or incorporating or modifying heat stress effects on phenology, leaf growth and senescence, biomass growth, and grain number and size using detailed field experimental data from the USDA Hot Serial Cereal experiment (calibration data set). Simulation results from before and after model improvement were then evaluated with independent field experiments from a CIMMYT worldwide field trial network (evaluation data set). Model improvements decreased the variation (10th to 90th model ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures greater than 24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME uncertainty range by 27% increased MME prediction skills by 47%. Results suggest that the mean level of variation observed in field experiments and used as a benchmark can be reached with half the number of models in the MME. Improving crop models is therefore important to increase the certainty of model-based impact assessments and allow more practical, i.e. smaller MMEs to be used effectively.

  5. Scenario and modelling uncertainty in global mean temperature change derived from emission driven Global Climate Models

    OpenAIRE

    B. B. B. Booth; D. Bernie; D. McNeall; E. Hawkins; J. Caesar; C. Boulton; P. Friedlingstein; D. Sexton

    2012-01-01

    We compare future changes in global mean temperature in response to different future scenarios which, for the first time, arise from an emission-driven rather than concentration-driven perturbed parameter ensemble of a Global Climate Model (GCM). These new GCM simulations sample uncertainties in atmospheric feedbacks, land carbon cycle, ocean physics and aerosol sulphur cycle processes. We find broader ranges of projected temperature responses arising when considering emission rather than concen...

  6. Evaluating prediction uncertainty

    International Nuclear Information System (INIS)

    McKay, M.D.

    1995-03-01

    The probability distribution of a model prediction is presented as a proper basis for evaluating the uncertainty in a model prediction that arises from uncertainty in input values. Determination of important model inputs and subsets of inputs is made through comparison of the prediction distribution with conditional prediction probability distributions. Replicated Latin hypercube sampling and variance ratios are used in estimation of the distributions and in construction of importance indicators. The assumption of a linear relation between model output and inputs is not necessary for the indicators to be effective. A sequential methodology which includes an independent validation step is applied in two analysis applications to select subsets of input variables which are the dominant causes of uncertainty in the model predictions. Comparison with results from methods which assume linearity shows how those methods may fail. Finally, suggestions for treating structural uncertainty for submodels are presented
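
    A small numeric illustration of the core idea, with an invented three-input model: the ratio Var(E[Y|X_i])/Var(Y), estimated here by binning sampled runs, flags the inputs that dominate prediction uncertainty without assuming a linear input-output relation.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 50_000
    x1, x2, x3 = rng.uniform(0, 1, (3, n))
    y = 4 * x1**2 + x2 + 0.1 * x3                 # toy model; x1 dominates

    var_y = y.var()
    edges = np.linspace(0, 1, 21)
    for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
        bins = np.digitize(x, edges)              # 20 bins over [0, 1)
        cond_means = np.array([y[bins == b].mean() for b in range(1, 21)])
        print(name, round(cond_means.var() / var_y, 2))  # ~ Var(E[Y|X_i]) / Var(Y)
    ```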

  7. Characterizing parameter sensitivity and uncertainty for a snow model across hydroclimatic regimes

    Science.gov (United States)

    He, Minxue; Hogue, Terri S.; Franz, Kristie J.; Margulis, Steven A.; Vrugt, Jasper A.

    2011-01-01

    The National Weather Service (NWS) uses the SNOW17 model to forecast snow accumulation and ablation processes in snow-dominated watersheds nationwide. Successful application of the SNOW17 relies heavily on site-specific estimation of model parameters. The current study undertakes a comprehensive sensitivity and uncertainty analysis of SNOW17 model parameters using forcing and snow water equivalent (SWE) data from 12 sites with differing meteorological and geographic characteristics. The Generalized Sensitivity Analysis and the recently developed Differential Evolution Adaptive Metropolis (DREAM) algorithm are utilized to explore the parameter space and assess model parametric and predictive uncertainty. Results indicate that SNOW17 parameter sensitivity and uncertainty generally vary between sites. Of the six hydroclimatic characteristics studied, only air temperature shows strong correlation with the sensitivity and uncertainty ranges of two parameters, while precipitation is highly correlated with the uncertainty of one parameter. Posterior marginal distributions of two parameters are also shown to be site-dependent in terms of distribution type. The SNOW17 prediction ensembles generated by the DREAM-derived posterior parameter sets contain most of the observed SWE. The proposed uncertainty analysis provides posterior information on parameter uncertainty and distribution types that can serve as a foundation for a data assimilation framework for hydrologic models.

  8. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one......, such as time-evolving shorelines and paleo-coastlines. In this study we quantify these uncertainties and their propagation in GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimates in ice mass balance in Antarctica and Greenland...

  9. Estimation of Model Uncertainties in Closed-loop Systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2008-01-01

    This paper describes a method for estimation of parameters or uncertainties in closed-loop systems. The method is based on an application of the dual YJBK (after Youla, Jabr, Bongiorno and Kucera) parameterization of all systems stabilized by a given controller. The dual YJBK transfer function...... is a measure of the variation in the system seen through the feedback controller. It is shown that it is possible to isolate a certain number of parameters or uncertain blocks in the system exactly. This is obtained by modifying the feedback controller through the YJBK transfer function together with pre...

  10. Crop model improvement reduces the uncertainty of the response to temperature of multi-model ensembles

    DEFF Research Database (Denmark)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold

    2017-01-01

    ensemble percentile range) of grain yields simulated by the MME on average by 39% in the calibration data set and by 26% in the independent evaluation data set for crops grown in mean seasonal temperatures >24 °C. MME mean squared error in simulating grain yield decreased by 37%. A reduction in MME...... uncertainty range by 27% increased MME prediction skills by 47%. Results suggest that the mean level of variation observed in field experiments and used as a benchmark can be reached with half the number of models in the MME. Improving crop models is therefore important to increase the certainty of model...

  11. Entropic uncertainty relations in the Heisenberg XXZ model and its controlling via filtering operations

    Science.gov (United States)

    Ming, Fei; Wang, Dong; Shi, Wei-Nan; Huang, Ai-Jun; Sun, Wen-Yang; Ye, Liu

    2018-04-01

    The uncertainty principle is recognized as an elementary ingredient of quantum theory and sets up a significant bound on predicting the outcome of measurements for a pair of incompatible observables. In this work, we develop dynamical features of quantum memory-assisted entropic uncertainty relations (QMA-EUR) in a two-qubit Heisenberg XXZ spin chain with an inhomogeneous magnetic field. We specifically derive the dynamical evolutions of the entropic uncertainty with respect to the measurement in the Heisenberg XXZ model when spin A is initially correlated with quantum memory B. It has been found that a larger coupling strength J of the ferromagnetic (J < 0) and antiferromagnetic (J > 0) chains can effectively degrade the measurement uncertainty. Besides, it turns out that higher temperature can induce inflation of the uncertainty because the thermal entanglement becomes relatively weak in this scenario, and there exists a distinct dynamical behavior of the uncertainty when an inhomogeneous magnetic field emerges. With a growing magnetic field |B|, the variation of the entropic uncertainty is non-monotonic. Meanwhile, we compare several existing optimized bounds with the initial bound proposed by Berta et al. and conclude that Adabi et al.'s result is optimal. Moreover, we also investigate the mixedness of the system of interest, which is strongly associated with the uncertainty. Remarkably, we put forward a possible physical interpretation for the evolution of the uncertainty. Finally, we take advantage of a local filtering operation to steer the magnitude of the uncertainty. Therefore, our explorations may shed light on the entropic uncertainty in the Heisenberg XXZ model and hence be of importance to quantum precision measurement in solid-state quantum information processing.
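
    A numerical sketch of the setting in plain NumPy (illustrative couplings, field, and temperature; not the authors' code): the thermal state of a two-qubit XXZ Hamiltonian with an inhomogeneous field is constructed, and the measured uncertainty S(X|B) + S(Z|B) for Pauli X and Z on spin A is compared against Berta et al.'s bound log2(1/c) + S(A|B) with overlap c = 1/2.

    ```python
    import numpy as np

    I2 = np.eye(2)
    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]])
    sz = np.diag([1.0, -1.0]).astype(complex)

    def entropy(rho):
        w = np.linalg.eigvalsh(rho)
        w = w[w > 1e-12]
        return float(-(w * np.log2(w)).sum())

    J, delta, B, T = 1.0, 0.5, 0.3, 0.8       # coupling, anisotropy, field, temperature
    H = (J * (np.kron(sx, sx) + np.kron(sy, sy)) + delta * np.kron(sz, sz)
         + B * (np.kron(sz, I2) - np.kron(I2, sz)))   # inhomogeneous field term

    w, v = np.linalg.eigh(H)
    boltz = np.exp(-(w - w.min()) / T)
    rho = (v * boltz) @ v.conj().T / boltz.sum()      # thermal state exp(-H/T)/Z

    rho_B = np.trace(rho.reshape(2, 2, 2, 2), axis1=0, axis2=2)  # trace out spin A

    def cond_entropy_after(basis):
        # S(M|B) after a projective measurement on A in the given basis
        post = sum(np.kron(np.outer(e, e.conj()), I2) @ rho
                   @ np.kron(np.outer(e, e.conj()), I2) for e in basis)
        return entropy(post) - entropy(rho_B)

    z_basis = [np.array([1, 0], complex), np.array([0, 1], complex)]
    x_basis = [np.array([1, 1], complex) / np.sqrt(2),
               np.array([1, -1], complex) / np.sqrt(2)]

    lhs = cond_entropy_after(x_basis) + cond_entropy_after(z_basis)
    rhs = 1.0 + entropy(rho) - entropy(rho_B)         # log2(1/c) + S(A|B), c = 1/2
    print(f"uncertainty {lhs:.3f} >= bound {rhs:.3f}")
    ```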

  12. Mass discharge estimation from contaminated sites: Multi-model solutions for assessment of conceptual uncertainty

    Science.gov (United States)

    Thomsen, N. I.; Troldborg, M.; McKnight, U. S.; Binning, P. J.; Bjerg, P. L.

    2012-04-01

    Mass discharge estimates are increasingly being used in the management of contaminated sites. Such estimates have proven useful for supporting decisions related to the prioritization of contaminated sites in a groundwater catchment. Potential management options can be categorised as follows: (1) leave as is, (2) clean up, or (3) further investigation needed. However, mass discharge estimates are often very uncertain, which may hamper the management decisions. If option 1 is incorrectly chosen, soil and water quality will decrease, threatening or destroying drinking water resources. The risk of choosing option 2 is to spend money on remediating a site that does not pose a problem. Choosing option 3 will often be safest, but may not be the optimal economic solution. Quantification of the uncertainty in mass discharge estimates can therefore greatly improve the foundation for selecting the appropriate management option. The uncertainty of mass discharge estimates depends greatly on the extent of the site characterization. A good approach for uncertainty estimation will be flexible with respect to the investigation level, and account for both parameter and conceptual model uncertainty. We propose a method for quantifying the uncertainty of dynamic mass discharge estimates from contaminant point sources on the local scale. The method considers both parameter and conceptual uncertainty through a multi-model approach. The multi-model approach evaluates multiple conceptual models for the same site. The different conceptual models consider different source characterizations and hydrogeological descriptions. The idea is to include a set of essentially different conceptual models, where each model is believed to be a realistic representation of the given site based on the current level of information. Parameter uncertainty is quantified using Monte Carlo simulations. For each conceptual model we calculate a transient mass discharge estimate with uncertainty bounds resulting from
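
    A minimal sketch of the combination step, with invented numbers: given a mean and a parameter-uncertainty variance per conceptual model plus subjective model weights, the law of total variance splits the overall uncertainty into within-model and between-model parts.

    ```python
    import numpy as np

    means = np.array([12.0, 20.0, 35.0])       # mass discharge mean per model (g/day)
    stds  = np.array([4.0, 6.0, 15.0])         # within-model (parameter) uncertainty
    w     = np.array([0.5, 0.3, 0.2])          # subjective conceptual-model weights

    mean_total  = w @ means
    var_within  = w @ stds**2                  # expected within-model variance
    var_between = w @ (means - mean_total)**2  # variance from conceptual-model choice
    print(mean_total, np.sqrt(var_within + var_between))
    ```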

  13. A tool for efficient, model-independent management optimization under uncertainty

    Science.gov (United States)

    White, Jeremy; Fienen, Michael N.; Barlow, Paul M.; Welter, Dave E.

    2018-01-01

    To fill a need for risk-based environmental management optimization, we have developed PESTPP-OPT, a model-independent tool for resource management optimization under uncertainty. PESTPP-OPT solves a sequential linear programming (SLP) problem and also implements (optional) efficient, “on-the-fly” (without user intervention) first-order, second-moment (FOSM) uncertainty techniques to estimate model-derived constraint uncertainty. Combined with a user-specified risk value, the constraint uncertainty estimates are used to form chance-constraints for the SLP solution process, so that any optimal solution includes contributions from model input and observation uncertainty. In this way, a “single answer” that includes uncertainty is yielded from the modeling analysis. PESTPP-OPT uses the familiar PEST/PEST++ model interface protocols, which makes it widely applicable to many modeling analyses. The use of PESTPP-OPT is demonstrated with a synthetic, integrated surface-water/groundwater model. The function and implications of chance constraints for this synthetic model are discussed.
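
    A hedged sketch of how a FOSM-derived constraint uncertainty becomes a chance constraint in one linear-programming step (an invented two-well pumping problem, not the PESTPP-OPT interface): the constraint bound is tightened by a z-score times the constraint's standard deviation before the LP is solved.

    ```python
    import numpy as np
    from scipy.optimize import linprog
    from scipy.stats import norm

    c = [-1.0, -1.0]                  # maximize q1 + q2 (linprog minimizes)
    a_drawdown = [[0.8, 0.5]]         # assumed model-derived sensitivities
    limit = 10.0                      # drawdown limit at a constraint location
    sigma = 1.2                       # FOSM std. dev. of the simulated constraint

    risk = 0.95                       # require the constraint to hold w.p. 95%
    b_chance = limit - norm.ppf(risk) * sigma   # tightened (chance) constraint bound

    res = linprog(c, A_ub=a_drawdown, b_ub=[b_chance], bounds=[(0, 8), (0, 8)])
    print(res.x, -res.fun)            # risk-aware optimal pumping rates and total
    ```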

  14. Future bloom and blossom frost risk for Malus domestica considering climate model and impact model uncertainties.

    Directory of Open Access Journals (Sweden)

    Holger Hoffmann

    Full Text Available The future bloom and risk of blossom frosts for Malus domestica were projected using regional climate realizations and phenological (= impact) models. As climate impact projections are susceptible to uncertainties of climate and impact models and model concatenation, the significant horizon of the climate impact signal was analyzed by applying 7 impact models, including two new developments, on 13 climate realizations of the IPCC emission scenario A1B. Advancement of phenophases and a decrease in blossom frost risk for Lower Saxony (Germany) for early and late ripeners was determined by six out of seven phenological models. Single model/single grid point time series of bloom showed significant trends by 2021-2050 compared to 1971-2000, whereas the joint signal of all climate and impact models did not stabilize until 2043. Regarding blossom frost risk, joint projection variability exceeded the projected signal. Thus, blossom frost risk cannot be stated to be lower by the end of the 21st century despite a negative trend. As a consequence it is however unlikely to increase. Uncertainty of temperature, blooming date and blossom frost risk projection reached a minimum at 2078-2087. The projected phenophases advanced by 5.5 d K(-1), showing partial compensation of delayed fulfillment of the winter chill requirement and faster completion of the following forcing phase in spring. Finally, phenological model performance was improved by considering the length of day.

  15. Locating and quantifying geological uncertainty in three-dimensional models : analysis of the Gippsland Basin, southeastern Australia

    OpenAIRE

    Lindsay, M. D.; Ailleres, L.; Jessell, Mark; de Kemp, E. A.; Betts, P. G.

    2012-01-01

    Geological three-dimensional (3D) models are constructed to reliably represent a given geological target. The reliability of a model is heavily dependent on the input data and is sensitive to uncertainty. This study examines the uncertainty introduced by geological orientation data by producing a suite of implicit 3D models generated from orientation measurements subjected to uncertainty simulations. The resulting uncertainty associated with different regions of the geological model can be lo...

  16. Review of uncertainty estimates associated with models for assessing the impact of breeder reactor radioactivity releases

    International Nuclear Information System (INIS)

    Miller, C.; Little, C.A.

    1982-08-01

    The purpose is to summarize estimates, based on currently available data, of the uncertainty associated with radiological assessment models. The models being examined herein are those recommended previously for use in breeder reactor assessments. Uncertainty estimates are presented for models of atmospheric and hydrologic transport, terrestrial and aquatic food-chain bioaccumulation, and internal and external dosimetry. Both long-term and short-term release conditions are discussed. The uncertainty estimates presented in this report indicate that, for many sites, generic models and representative parameter values may be used to calculate doses from annual average radionuclide releases when these calculated doses are on the order of one-tenth or less of a relevant dose limit. For short-term, accidental releases, especially those from breeder reactors located in sites dominated by complex terrain and/or coastal meteorology, the uncertainty in the dose calculations may be much larger than an order of magnitude. As a result, it may be necessary to incorporate site-specific information into the dose calculation under these circumstances to reduce this uncertainty. However, even using site-specific information, natural variability and the uncertainties in the dose conversion factor will likely result in an overall uncertainty of greater than an order of magnitude for predictions of dose or concentration in environmental media following short-term releases

  17. Theoretical uncertainties of the Duflo–Zuker shell-model mass formulae

    International Nuclear Information System (INIS)

    Qi, Chong

    2015-01-01

    It is becoming increasingly important to understand the uncertainties of nuclear mass model calculations and their limitations when extrapolating to driplines. In this paper we evaluate the parameter uncertainties of the Duflo–Zuker (DZ) shell model mass formulae by fitting to the latest experimental mass compilation AME2012 using the least-squares and minimax fitting procedures. We also analyze the propagation of the uncertainties in binding energy calculations when extrapolated to driplines. The parameter uncertainties and their propagation are evaluated with the help of the covariance matrix thus derived. Large deviations from the extrapolations of AME2012 are seen in superheavy nuclei. A simplified version of the DZ model (DZ19) with much smaller uncertainties than that of DZ33 is proposed. Calculations are compared with results from other mass formulae. Systematics on the uncertainty propagation as well as the positions of the driplines are also presented. The DZ mass formulae are shown to be well defined with good extrapolation properties and rather small uncertainties, even though some of the parameters of the full DZ33 model cannot be fully determined by fitting to available experimental data. (paper)
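
    A generic least-squares sketch of the two steps described (not the DZ fitting code): the parameter covariance follows from a linearized fit as C = sigma^2 (J^T J)^-1, and the variance of an extrapolated prediction follows as g^T C g for a sensitivity vector g.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n_data, n_par = 200, 3
    J = rng.normal(size=(n_data, n_par))        # design/sensitivity matrix of the fit
    p_true = np.array([1.0, -0.5, 2.0])
    y = J @ p_true + rng.normal(0, 0.3, n_data) # synthetic "masses" with noise

    p_hat, ssr, *_ = np.linalg.lstsq(J, y, rcond=None)
    sigma2 = ssr[0] / (n_data - n_par)          # residual variance estimate
    C = sigma2 * np.linalg.inv(J.T @ J)         # parameter covariance matrix

    g = np.array([1.0, 4.0, -2.0])              # sensitivities of an extrapolated case
    pred, pred_sd = g @ p_hat, np.sqrt(g @ C @ g)
    print(f"prediction {pred:.2f} +/- {pred_sd:.2f}")
    ```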

  18. Uncertainties in model predictions of nitrogen fluxes from agro-ecosystems in Europe

    Directory of Open Access Journals (Sweden)

    J. Kros

    2012-11-01

    Full Text Available To assess the responses of nitrogen and greenhouse gas emissions to pan-European changes in land cover, land management and climate, an integrated dynamic model, INTEGRATOR, has been developed. This model includes both simple process-based descriptions and empirical relationships and uses detailed GIS-based environmental and farming data in combination with various downscaling methods. This paper analyses the propagation of uncertainties in model inputs and parameters to outputs of INTEGRATOR, using a Monte Carlo analysis. Uncertain model inputs and parameters were represented by probability distributions, while spatial correlation in these uncertainties was taken into account by assigning correlation coefficients at various spatial scales. The uncertainty propagation was analysed for the emissions of NH3, N2O and NOx, N leaching to groundwater and N runoff to surface water for the entire EU27 and for individual countries. Results show large uncertainties for N leaching and runoff (relative errors of ∼ 19%) for Europe as a whole, and smaller uncertainties for emission of N2O, NH3 and NOx (relative errors of ∼ 12%). Uncertainties for Europe as a whole were much smaller compared to uncertainties at country level, because errors partly cancelled out due to spatial aggregation.
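
    A short sketch of the spatial-correlation device, with invented region count, inputs, and correlation: correlated Gaussian input errors are drawn via a Cholesky factor, illustrating why relative errors shrink under aggregation only to the extent the errors are uncorrelated.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    n_regions, n_draws = 6, 10_000
    cv = 0.2                                      # 20% relative input uncertainty

    rho = 0.7                                     # between-region error correlation
    corr = np.full((n_regions, n_regions), rho) + (1 - rho) * np.eye(n_regions)
    L = np.linalg.cholesky(corr)

    z = rng.standard_normal((n_draws, n_regions)) @ L.T
    inputs = 100.0 * (1 + cv * z)                 # perturbed N inputs per region

    total = inputs.sum(axis=1)                    # aggregate over regions
    print(total.std() / total.mean())             # relative error after aggregation
    ```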

  19. Impact of rainfall temporal resolution on urban water quality modelling performance and uncertainties.

    Science.gov (United States)

    Manz, Bastian Johann; Rodríguez, Juan Pablo; Maksimović, Cedo; McIntyre, Neil

    2013-01-01

    A key control on the response of an urban drainage model is how well the observed rainfall records represent the real rainfall variability. Particularly in urban catchments with fast response flow regimes, the selection of temporal resolution in rainfall data collection is critical. Furthermore, the impact of the rainfall variability on the model response is amplified for water quality estimates, as uncertainty in rainfall intensity affects both the rainfall-runoff and pollutant wash-off sub-models, thus compounding uncertainties. A modelling study was designed to investigate the impact of altering rainfall temporal resolution on the magnitude and behaviour of uncertainties associated with the hydrological modelling compared with water quality modelling. The case study was an 85-ha combined sewer sub-catchment in Bogotá (Colombia). Water quality estimates showed greater sensitivity to the inter-event variability in rainfall hyetograph characteristics than to changes in the rainfall input temporal resolution. Overall, uncertainties from the water quality model were two- to five-fold those of the hydrological model. However, owing to the intrinsic scarcity of observations in urban water quality modelling, total model output uncertainties, especially from the water quality model, were too large to make recommendations for particular model structures or parameter values with respect to rainfall temporal resolution.

  20. A stochastic multi-agent optimization model for energy infrastructure planning under uncertainty and competition.

    Science.gov (United States)

    2017-07-04

    This paper presents a stochastic multi-agent optimization model that supports energy infrastructure planning under uncertainty. The interdependence between different decision entities in the system is captured in an energy supply chain network, w...

  1. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Madsen, H.; Rosbjerg, Dan

    The inherent uncertainty in climate models is one of the most important uncertainties in climate change impact studies. In recent years, several uncertainty quantification methods based on multi-model ensembles have been suggested. Most of these methods assume that the climate models...... are independent. This study investigates the validity of this assumption and its effects on the estimated probabilistic projections of the changes in the 95% quantile of wet days. The methodology is divided in two main parts. First, the interdependency of the ENSEMBLES RCMs is estimated using the methodology...... developed by Pennell and Reichler (2011). The results show that the projections from the ENSEMBLES RCMs cannot be assumed independent. This result is then used to estimate the uncertainty in climate model projections. A Bayesian approach has been developed using the procedure suggested by Tebaldi et al...

  2. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer, M. A.; Rosbjerg, Dan; Arnbjerg-Nielsen, Karsten

    2017-01-01

    The inherent uncertainty in climate models is one of the most important uncertainties in climate change impact studies. In recent years, several uncertainty quantification methods based on multi-model ensembles have been suggested. Most of these methods assume that the climate models...... are independent. This study investigates the validity of this assumption and its effects on the estimated probabilistic projections of the changes in the 95% quantile of wet days. The methodology is divided in two main parts. First, the interdependency of the ENSEMBLES RCMs is estimated using the methodology...... developed by Pennell and Reichler (2011). The results show that the projections from the ENSEMBLES RCMs cannot be assumed independent. This result is then used to estimate the uncertainty in climate model projections. A Bayesian approach has been developed using the procedure suggested by Tebaldi et al...

  3. Simultaneous Experimentation as a Learning Strategy: Business Model Development Under Uncertainty

    NARCIS (Netherlands)

    Andries, Petra; Debackere, Koenraad; van Looy, Bart

    2013-01-01

    Ventures operating under uncertainty face challenges defining a sustainable value proposition. Six longitudinal case studies reveal two approaches to business model development: focused commitment and simultaneous experimentation. While focused commitment positively affects initial growth, this

  4. Uncertainty Representation and Interpretation in Model-based Prognostics Algorithms based on Kalman Filter Estimation

    Data.gov (United States)

    National Aeronautics and Space Administration — This article discusses several aspects of uncertainty representation and management for model-based prognostics methodologies based on our experience with Kalman...

  5. Uncertainty and Sensitivity Analysis of Filtration Models for Non-Fickian transport and Hyperexponential deposition

    DEFF Research Database (Denmark)

    Yuan, Hao; Sin, Gürkan

    2011-01-01

    Uncertainty and sensitivity analyses are carried out to investigate the predictive accuracy of the filtration models for describing non-Fickian transport and hyperexponential deposition. Five different modeling approaches, involving the elliptic equation with different types of distributed filtration coefficients and the CTRW equation expressed in Laplace space, are selected to simulate eight experiments. These experiments involve both porous media and colloid-medium interactions of different heterogeneity degrees. The uncertainty of elliptic equation predictions with distributed filtration coefficients is larger than that with a single filtration coefficient. The uncertainties of model predictions from the elliptic equation and CTRW equation in Laplace space are minimal for solute transport. Higher uncertainties of parameter estimation and model outputs are observed in the cases with the porous...

  6. Essays in energy policy and planning modeling under uncertainty: Value of information, optimistic biases, and simulation of capacity markets

    Science.gov (United States)

    Hu, Ming-Che

    Theoretical analyses prove the general existence of this bias for both competitive and oligopolistic models when production costs and demand curves are uncertain. Also demonstrated is an optimistic bias for the net benefits of introducing a new technology into a market when the cost of the new technology is uncertain. The optimistic biases are quantified for a model of the northwest European electricity market (including Belgium, France, Germany and the Netherlands). Demand uncertainty results in an optimistic bias of 150,000-220,000 [Euro]/hr of total surplus, and natural gas price uncertainty yields a smaller bias of 8,000-10,000 [Euro]/hr for total surplus. Further, adding a new uncertain technology (biomass) to the set of possible generation methods almost doubles the optimistic bias (14,000-18,000 [Euro]/hr). The third question concerns ex ante evaluation of the Reliability Pricing Model (RPM), the new PJM capacity market launched in June 2007. A Monte Carlo simulation model is developed to simulate the PJM capacity market and predict market performance, producer revenue, and consumer payments. An important input to RPM is a demand curve for capacity; several alternative demand curves are compared, and sensitivity analyses of those conclusions are conducted. One conclusion is that sloped demand curves are more robust because they give higher reliability with lower consumer payments. In addition, the performance of the curves is evaluated for a more sophisticated market design in which the demand curve can be adjusted in response to previous market outcomes and where capital costs may change unexpectedly. The simulation shows that curve adjustment increases system reliability with lower consumer payments. The effect of learning-by-doing, which leads to lower plant capital costs, also yields a higher average reserve margin and lower consumer payments. In contrast, a sudden rise in capital costs causes a decrease in reliability and an increase in

  7. A Study on Uncertainty Quantification of Reflood Model using CIRCE Methodology

    International Nuclear Information System (INIS)

    Jeon, Seongsu; Hong, Soonjoon; Oh, Deogyeon; Bang, Youngseok

    2013-01-01

    The CIRCE method is intended to quantify the uncertainties of the correlations of a code; it may replace the expert judgment generally used. In this study, an uncertainty quantification of the reflood model was performed using the CIRCE methodology, and the application process and main results are briefly described. The application of CIRCE provided satisfactory results. This research is expected to be useful for improving the present audit calculation methodology, KINS-REM.

  8. A Bayesian-based multilevel factorial analysis method for analyzing parameter uncertainty of hydrological model

    Science.gov (United States)

    Liu, Y. R.; Li, Y. P.; Huang, G. H.; Zhang, J. L.; Fan, Y. R.

    2017-10-01

    In this study, a Bayesian-based multilevel factorial analysis (BMFA) method is developed to assess parameter uncertainties and their effects on hydrological model responses. In BMFA, the Differential Evolution Adaptive Metropolis (DREAM) algorithm is employed to approximate the posterior distributions of model parameters with Bayesian inference; the factorial analysis (FA) technique is used for measuring the specific variations of hydrological responses in terms of posterior distributions, to investigate the individual and interactive effects of parameters on model outputs. BMFA is then applied to a case study of the Jinghe River watershed in the Loess Plateau of China to demonstrate its validity and applicability. The uncertainties of four sensitive parameters, including the soil conservation service runoff curve number for moisture condition II (CN2), soil hydraulic conductivity (SOL_K), plant available water capacity (SOL_AWC), and soil depth (SOL_Z), are investigated. Results reveal that (i) CN2 has a positive effect on peak flow, implying that concentrated rainfall during the rainy season can cause infiltration-excess surface flow, which is a considerable contributor to peak flow in this watershed; (ii) SOL_K has a positive effect on average flow, implying that the widely distributed cambisols can lead to medium percolation capacity; (iii) the interaction between SOL_AWC and SOL_Z has a noticeable effect on peak flow, and their effects are dependent upon each other, which discloses that soil depth can significantly influence the processes of plant uptake of soil water in this watershed. Based on the above findings, the significant parameters and the relationships among uncertain parameters can be specified, such that the hydrological model's capability for simulating and predicting the water resources of the Jinghe River watershed can be improved.
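
    A compact sketch of the two ingredients combined here: MCMC sampling of a posterior (plain random-walk Metropolis standing in for DREAM) followed by a two-level factorial evaluation of main and interaction effects. The model, likelihood and parameter names (CN2, SOL_K) are illustrative only.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def model(cn2, sol_k):
        """Hypothetical stand-in for a hydrological response (peak flow)."""
        return 2.0 * cn2 + 0.5 * sol_k + 0.3 * cn2 * sol_k

    obs, sigma = 3.1, 0.2                           # synthetic observation
    log_post = lambda th: -0.5 * ((model(*th) - obs) / sigma) ** 2

    # Random-walk Metropolis as a simple stand-in for DREAM.
    theta, chain = np.array([1.0, 1.0]), []
    for _ in range(20000):
        prop = theta + rng.normal(scale=0.05, size=2)
        if np.log(rng.random()) < log_post(prop) - log_post(theta):
            theta = prop
        chain.append(theta)
    chain = np.array(chain)[5000:]                  # discard burn-in

    # Two-level factorial design at the posterior 10%/90% quantiles.
    lo, hi = np.quantile(chain, 0.1, axis=0), np.quantile(chain, 0.9, axis=0)
    y = {(a, b): model(hi[0] if a > 0 else lo[0], hi[1] if b > 0 else lo[1])
         for a in (-1, 1) for b in (-1, 1)}
    main_cn2 = (y[1, -1] + y[1, 1] - y[-1, -1] - y[-1, 1]) / 2
    interaction = (y[1, 1] - y[1, -1] - y[-1, 1] + y[-1, -1]) / 2
    print(f"main effect of CN2: {main_cn2:.3f}, interaction: {interaction:.3f}")
    ```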

  9. Examination of the uncertainty in contaminant fate and transport modeling: a case study in the Venice Lagoon.

    Science.gov (United States)

    Sommerfreund, J; Arhonditsis, G B; Diamond, M L; Frignani, M; Capodaglio, G; Gerino, M; Bellucci, L; Giuliani, S; Mugnai, C

    2010-03-01

    A Monte Carlo analysis is used to quantify environmental parametric uncertainty in a multi-segment, multi-chemical model of the Venice Lagoon. Scientific knowledge, expert judgment and observational data are used to formulate prior probability distributions that characterize the uncertainty pertaining to 43 environmental system parameters. The propagation of this uncertainty through the model is then assessed by a comparative analysis of the moments (central tendency, dispersion) of the model output distributions. We also apply principal component analysis in combination with correlation analysis to identify the most influential parameters, thereby gaining mechanistic insights into the ecosystem functioning. We found that modeled concentrations of Cu, Pb, OCDD/F and PCB-180 varied by up to an order of magnitude, exhibiting both contaminant- and site-specific variability. These distributions generally overlapped with the measured concentration ranges. We also found that the uncertainty of the contaminant concentrations in the Venice Lagoon was characterized by two modes of spatial variability, mainly driven by the local hydrodynamic regime, which separate the northern and central parts of the lagoon and the more isolated southern basin. While spatial contaminant gradients in the lagoon were primarily shaped by hydrology, our analysis also shows that the interplay amongst the in-place historical pollution in the central lagoon, the local suspended sediment concentrations and the sediment burial rates exerts significant control on the variability of the contaminant concentrations. We conclude that the probabilistic analysis presented herein is valuable for quantifying uncertainty and probing its cause in over-parameterized models, while some of our results can be used to dictate where additional data collection efforts should focus and the directions that future model refinement should follow.
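
    A minimal sketch of the propagation-plus-screening pattern described above: sample lognormal priors, push them through a toy contaminant model, and rank parameter influence with rank correlations (a simpler screen than the paper's PCA-plus-correlation analysis). The three parameters and the model are hypothetical stand-ins for the 43 elicited distributions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 5000

    # Hypothetical lognormal priors standing in for the elicited parameters.
    sed_rate   = rng.lognormal(np.log(0.5), 0.4, n)    # burial rate
    susp_conc  = rng.lognormal(np.log(20.0), 0.5, n)   # suspended sediment
    hydro_flux = rng.lognormal(np.log(100.0), 0.3, n)  # water exchange

    # Toy contaminant-concentration model (illustrative only).
    conc = 50.0 * susp_conc / (sed_rate * hydro_flux)

    print(f"median {np.median(conc):.1f}, 5-95% range "
          f"{np.percentile(conc, 5):.1f}-{np.percentile(conc, 95):.1f}")

    # Rank correlations flag the most influential parameters.
    for name, x in [("sed_rate", sed_rate), ("susp_conc", susp_conc),
                    ("hydro_flux", hydro_flux)]:
        rho = stats.spearmanr(x, conc).correlation
        print(f"{name:>10s}: rho = {rho:+.2f}")
    ```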

  10. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Science.gov (United States)

    Franz, K. J.; Hogue, T. S.

    2011-11-01

    The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Likelihood Uncertainty Estimation (GLUE), and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.

  11. Evaluating uncertainty estimates in hydrologic models: borrowing measures from the forecast verification community

    Directory of Open Access Journals (Sweden)

    K. J. Franz

    2011-11-01

    Full Text Available The hydrologic community is generally moving towards the use of probabilistic estimates of streamflow, primarily through the implementation of Ensemble Streamflow Prediction (ESP) systems, ensemble data assimilation methods, or multi-modeling platforms. However, evaluation of probabilistic outputs has not necessarily kept pace with ensemble generation. Much of the modeling community is still performing model evaluation using standard deterministic measures, such as error, correlation, or bias, typically applied to the ensemble mean or median. Probabilistic forecast verification methods have been well developed, particularly in the atmospheric sciences, yet few have been adopted for evaluating uncertainty estimates in hydrologic model simulations. In the current paper, we overview existing probabilistic forecast verification methods and apply the methods to evaluate and compare model ensembles produced from two different parameter uncertainty estimation methods: the Generalized Likelihood Uncertainty Estimation (GLUE) and the Shuffle Complex Evolution Metropolis (SCEM). Model ensembles are generated for the National Weather Service SACramento Soil Moisture Accounting (SAC-SMA) model for 12 forecast basins located in the Southeastern United States. We evaluate the model ensembles using relevant metrics in the following categories: distribution, correlation, accuracy, conditional statistics, and categorical statistics. We show that the presented probabilistic metrics are easily adapted to model simulation ensembles and provide a robust analysis of model performance associated with parameter uncertainty. Application of these methods requires no information in addition to what is already available as part of traditional model validation methodology and considers the entire ensemble or uncertainty range in the approach.
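
    One widely used ensemble verification score of the kind surveyed here is the continuous ranked probability score (CRPS). The sketch below computes it for three synthetic ensembles via its kernel representation; the flows and ensembles are invented, and the papers themselves cover a broader set of distributional, correlation and categorical metrics.

    ```python
    import numpy as np

    def ensemble_crps(members, obs):
        """CRPS via the kernel form: E|X - y| - 0.5 E|X - X'| (lower is better)."""
        m = np.asarray(members, dtype=float)
        return np.abs(m - obs).mean() - 0.5 * np.abs(m[:, None] - m[None, :]).mean()

    rng = np.random.default_rng(4)
    obs = 120.0                                   # observed flow, m3/s
    ensembles = {
        "sharp":  rng.normal(118.0, 5.0, 50),     # accurate and narrow
        "biased": rng.normal(140.0, 5.0, 50),     # shifted
        "wide":   rng.normal(120.0, 40.0, 50),    # unbiased but diffuse
    }
    for name, ens in ensembles.items():
        print(f"{name:>6s}: CRPS = {ensemble_crps(ens, obs):.2f}")
    ```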

  12. Some concepts of model uncertainty for performance assessments of nuclear waste repositories

    International Nuclear Information System (INIS)

    Eisenberg, N.A.; Sagar, B.; Wittmeyer, G.W.

    1994-01-01

    Models of the performance of nuclear waste repositories will be central to making regulatory decisions regarding the safety of such facilities. The conceptual model of repository performance is represented by mathematical relationships, which are usually implemented as one or more computer codes. A geologic system may allow many conceptual models which are consistent with the observations. These conceptual models may or may not have the same mathematical representation. Experiences in modeling the performance of a waste repository (which is, in part, a geologic system) show that this non-uniqueness of conceptual models is a significant source of model uncertainty. At the same time, each conceptual model has its own set of parameters and, usually, it is not possible to completely separate model uncertainty from parameter uncertainty for the repository system. Issues related to the origin of model uncertainty, its relation to parameter uncertainty, and its incorporation in safety assessments are discussed from a broad regulatory perspective. An extended example in which these issues are explored numerically is also provided

  13. Reducing Uncertainty in Chemistry Climate Model Predictions of Stratospheric Ozone

    Science.gov (United States)

    Douglass, A. R.; Strahan, S. E.; Oman, L. D.; Stolarski, R. S.

    2014-01-01

    Chemistry climate models (CCMs) are used to predict the future evolution of stratospheric ozone as ozone-depleting substances decrease and greenhouse gases increase, cooling the stratosphere. CCM predictions exhibit many common features, but also a broad range of values for quantities such as year of ozone-return-to-1980 and global ozone level at the end of the 21st century. Multiple linear regression is applied to each of 14 CCMs to separate ozone response to chlorine change from that due to climate change. We show that the sensitivity of lower atmosphere ozone to chlorine change ΔO3/ΔCly is a near linear function of partitioning of total inorganic chlorine (Cly) into its reservoirs; both Cly and its partitioning are controlled by lower atmospheric transport. CCMs with realistic transport agree with observations for chlorine reservoirs and produce similar ozone responses to chlorine change. After 2035 differences in response to chlorine contribute little to the spread in CCM results as the anthropogenic contribution to Cly becomes unimportant. Differences among upper stratospheric ozone increases due to temperature decreases are explained by differences in ozone sensitivity to temperature change ΔO3/ΔT due to different contributions from various ozone loss processes, each with their own temperature dependence. In the lower atmosphere, tropical ozone decreases caused by a predicted speed-up in the Brewer-Dobson circulation may or may not be balanced by middle and high latitude increases, contributing most to the spread in late 21st century predictions.
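
    The regression step can be illustrated in a few lines: given time series of ozone, Cly and temperature from one model, ordinary least squares separates the two sensitivities. The synthetic series below are stand-ins, not CCM output.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    years = np.arange(1960, 2101)

    # Synthetic CCM-like series: Cly rises then declines, T cools steadily.
    cly = np.clip((years - 1960) * 0.05, 0, 3.5) - np.clip((years - 2000) * 0.04, 0, 3.0)
    temp = -0.02 * (years - 1960)
    ozone = -8.0 * cly + 3.0 * temp + rng.normal(0, 2.0, years.size)

    # Ordinary least squares separates the two sensitivities.
    X = np.column_stack([cly, temp, np.ones_like(temp)])
    coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)
    print(f"dO3/dCly = {coef[0]:.2f}, dO3/dT = {coef[1]:.2f}")
    ```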

  14. Predicting Cumulative Incidence Probability: Marginal and Cause-Specific Modelling

    DEFF Research Database (Denmark)

    Scheike, Thomas H.; Zhang, Mei-Jie

    2005-01-01

    cumulative incidence probability; cause-specific hazards; subdistribution hazard; binomial modelling

  15. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    2012-01-01

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one......, such as time-evolving shorelines and paleo coastlines. In this study we quantify these uncertainties and their propagation in the GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimate in ice mass balance in Antarctica and Greenland due to GIA. GIA errors are also important in the far field of previously glaciated areas and in the time evolution of global indicators. In this regard we also account for other possible error sources which can impact global indicators, like the sea level history related to GIA...

  16. Accounting for uncertainty in ecological analysis: the strengths and limitations of hierarchical statistical modeling.

    Science.gov (United States)

    Cressie, Noel; Calder, Catherine A; Clark, James S; Ver Hoef, Jay M; Wikle, Christopher K

    2009-04-01

    Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.

  17. A novel approach to parameter uncertainty analysis of hydrological models using neural networks

    Directory of Open Access Journals (Sweden)

    D. P. Solomatine

    2009-07-01

    Full Text Available In this study, a methodology has been developed to emulate a time-consuming Monte Carlo (MC) simulation by using an Artificial Neural Network (ANN) for the assessment of model parametric uncertainty. First, an MC simulation of a given process model is run. Then an ANN is trained to approximate the functional relationships between the input variables of the process model and the synthetic uncertainty descriptors estimated from the MC realizations. The trained ANN model encapsulates the underlying characteristics of the parameter uncertainty and can be used to predict uncertainty descriptors for the new data vectors. This approach was validated by comparing the uncertainty descriptors in the verification data set with those obtained by the MC simulation. The method is applied to estimate the parameter uncertainty of a lumped conceptual hydrological model, HBV, for the Brue catchment in the United Kingdom. The results are quite promising as the prediction intervals estimated by the ANN are reasonably accurate. The proposed techniques could be useful in real-time applications when it is not practicable to run a large number of simulations for complex hydrological models and when the forecast lead time is very short.
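
    A minimal sketch of the emulation idea, with scikit-learn standing in for the ANN: run the expensive Monte Carlo loop on a modest training set, then let the network predict the uncertainty descriptor (here a 90% interval width) for new inputs. The process model is a toy, not HBV.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(6)

    def mc_interval_width(x, n_mc=2000):
        """Expensive step: Monte Carlo over an uncertain parameter gives a
        90% prediction-interval width for input x (toy process model)."""
        theta = rng.normal(1.0, 0.2, n_mc)
        y = theta * x[0] + np.sqrt(np.abs(theta)) * x[1]
        return np.percentile(y, 95) - np.percentile(y, 5)

    # Run the full MC only for a modest training set of input vectors ...
    X_train = rng.uniform(0, 10, size=(300, 2))
    w_train = np.array([mc_interval_width(x) for x in X_train])

    # ... then train the ANN to emulate the uncertainty descriptor.
    ann = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000, random_state=0)
    ann.fit(X_train, w_train)

    X_new = rng.uniform(0, 10, size=(5, 2))
    print("emulated:", ann.predict(X_new).round(2))
    print("full MC :", np.array([mc_interval_width(x) for x in X_new]).round(2))
    ```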

  18. Development of mechanistic sorption model and treatment of uncertainties for Ni sorption on montmorillonite/bentonite

    International Nuclear Information System (INIS)

    Ochs, Michael; Ganter, Charlotte; Tachi, Yukio; Suyama, Tadahiro; Yui, Mikazu

    2011-02-01

    Sorption and diffusion of radionuclides in buffer materials (bentonite) are the key processes in the safe geological disposal of radioactive waste, because migration of radionuclides in this barrier is expected to be diffusion-controlled and retarded by sorption processes. It is therefore necessary to understand the detailed/coupled processes of sorption and diffusion in compacted bentonite and develop mechanistic/predictive models, so that reliable parameters can be set under a variety of geochemical conditions relevant to performance assessment (PA). For this purpose, JAEA has developed the integrated sorption and diffusion (ISD) model/database in montmorillonite/bentonite systems. The main goal of the mechanistic model/database development is to provide a tool for a consistent explanation, prediction, and uncertainty assessment of Kd as well as diffusion parameters needed for the quantification of radionuclide transport. The present report focuses on developing the thermodynamic sorption model (TSM) and on the quantification and handling of model uncertainties in applications, based on the illustrating example of Ni sorption on montmorillonite/bentonite. This includes 1) a summary of the present state of the art of thermodynamic sorption modeling, 2) a discussion of the selection of surface species and model design appropriate for the present purpose, 3) possible sources and representations of TSM uncertainties, and 4) details of modeling, testing and uncertainty evaluation for Ni sorption. Two fundamentally different approaches are presented and compared for representing TSM uncertainties: 1) TSM parameter uncertainties calculated by FITEQL optimization routines and some statistical procedure, 2) overall error estimated by direct comparison of modeled and experimental Kd values. The overall error in Kd is viewed as the best representation of model uncertainty in ISD model/database development. (author)

  19. Uncertainty in projected climate change caused by methodological discrepancy in estimating CO2 emissions from fossil fuel combustion

    Science.gov (United States)

    Quilcaille, Yann; Gasser, Thomas; Ciais, Philippe; Lecocq, Franck; Janssens-Maenhout, Greet; Mohr, Steve; Andres, Robert J.; Bopp, Laurent

    2016-04-01

    There are different methodologies to estimate CO2 emissions from fossil fuel combustion. The term "methodology" refers to the way subtypes of fossil fuels are aggregated and their implied emissions factors. This study investigates how the choice of a methodology impacts historical and future CO2 emissions, and ensuing climate change projections. First, we use fossil fuel extraction data from the Geologic Resources Supply-Demand model of Mohr et al. (2015). We compare four different methodologies to transform amounts of fossil fuel extracted into CO2 emissions based on the methodologies used by Mohr et al. (2015), CDIAC, EDGARv4.3, and IPCC 1996. We thus obtain 4 emissions pathways, for the historical period 1750-2012, that we compare to the emissions timeseries from EDGARv4.3 (1970-2012) and CDIACv2015 (1751-2011). Using the 3 scenarios by Mohr et al. (2015) for projections till 2300 under the assumption of an Early (Low emission), Best Guess or Late (High emission) extraction peaking, we obtain 12 different pathways of CO2 emissions over 1750-2300. Second, we extend these CO2-only pathways to all co-emitted and climatically active species. Co-emission ratios for CH4, CO, BC, OC, SO2, VOC, N2O, NH3, NOx are calculated on the basis of the EDGAR v4.3 dataset, and are then used to produce complementary pathways of non-CO2 emissions from fossil fuel combustion only. Finally, the 12 emissions scenarios are integrated using the compact Earth system model OSCAR v2.2, in order to quantify the impact of the selected driver onto climate change projections. We find historical cumulative fossil fuel CO2 emissions from 1750 to 2012 ranging from 365 GtC to 392 GtC depending upon the methodology used to convert fossil fuel into CO2 emissions. We notice a drastic increase of the impact of the methodology in the projections. For the High emission scenario with Late fuel extraction peaking, cumulated CO2 emissions from 1700 to 2100 range from 1505 GtC to 1685 GtC; this corresponds
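
    The methodological spread can be illustrated with a toy calculation: the same extraction amounts converted with two different fuel aggregations and implied emission factors yield different emission totals. All numbers below are invented for illustration, not taken from the study.

    ```python
    # Toy illustration: identical extraction, two emission-factor sets.
    extraction = {"hard_coal": 5.0e9, "lignite": 1.0e9, "oil": 4.0e9}  # t/yr

    emission_factors = {                       # tC per tonne of fuel
        "methodology_A": {"hard_coal": 0.70, "lignite": 0.30, "oil": 0.85},
        "methodology_B": {"hard_coal": 0.75, "lignite": 0.25, "oil": 0.83},
    }

    for name, ef in emission_factors.items():
        total_gtc = sum(extraction[f] * ef[f] for f in extraction) / 1e9
        print(f"{name}: {total_gtc:.2f} GtC/yr")
    ```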

  20. A method for analyzing geothermal gradient histories using the statistical assessment of uncertainties in maturity models

    Energy Technology Data Exchange (ETDEWEB)

    Huvaz, O. [Turkish Petroleum Corp., Ankara (Turkey). Exploration Group]; Thomsen, R.O. [Maersk Oil and Gas AS, Copenhagen (Denmark)]; Noeth, S. [Schlumberger Data and Consulting Services, Houston, TX (United States)]

    2005-04-01

    A major factor contributing to uncertainty in basin modelling is the determination of the parameters necessary to reconstruct the basin's thermal history. Thermal maturity modelling is widely used in basin modelling for assessing the exploration risk. Of the available models, the chemical kinetic model Easy%Ro has gained wide acceptance. In this study, the thermal gradient at five wells in the Danish North Sea is calibrated against vitrinite reflectance using the Easy%Ro model coupled with an inverse scheme in order to perform sensitivity analysis and to assess the uncertainty. The mean squared residual (MSR) is used as a quantitative measure of mismatch between the modelled and measured reflectance values. A 90% confidence interval is constructed for the determined mean of the squared residuals to assess the uncertainty for the given level of confidence. The sensitivity of the Easy%Ro model to variations in the thermal gradient is investigated using the uncertainty associated with scatter in the calibration data. The best thermal gradient (minimum MSR) is obtained from the MSR curve for each well. The aim is to show how the reconstruction of the thermal gradient is related to the control data and the applied model. The applied method helps not only to determine the average thermal gradient history of a basin, but also helps to investigate the quality of the calibration data and provides a quick assessment of the uncertainty and sensitivity of any parameter in a forward deterministic model. (author)
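
    The calibration loop can be sketched as a one-dimensional search: evaluate the mean squared residual (MSR) between modelled and measured reflectance for candidate thermal gradients and keep the minimizer. The forward model below is a greatly simplified stand-in for Easy%Ro, which integrates chemical kinetics over burial history, and the well data are hypothetical.

    ```python
    import numpy as np

    # Hypothetical well data: vitrinite reflectance (%Ro) versus depth.
    depth_m = np.array([1000, 2000, 3000, 4000])
    ro_obs = np.array([0.45, 0.65, 0.95, 1.40])

    def ro_model(depth, grad_c_per_km, surface_t=10.0):
        """Greatly simplified stand-in for Easy%Ro (which integrates
        chemical kinetics over the thermal history)."""
        t = surface_t + grad_c_per_km * depth / 1000.0
        return 0.2 * np.exp(t / 60.0)

    grads = np.linspace(20, 50, 301)               # candidate gradients, C/km
    msr = np.array([np.mean((ro_model(depth_m, g) - ro_obs) ** 2) for g in grads])

    best = grads[msr.argmin()]
    # Crude interval: gradients whose MSR stays within 50% of the minimum
    # (the paper builds a formal 90% confidence interval on the MSR instead).
    ok = grads[msr <= 1.5 * msr.min()]
    print(f"best gradient {best:.1f} C/km, interval ~{ok.min():.1f}-{ok.max():.1f}")
    ```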

  1. Bias and uncertainty in regression-calibrated models of groundwater flow in heterogeneous media

    Science.gov (United States)

    Cooley, R.L.; Christensen, S.

    2006-01-01

    Groundwater models need to account for detailed but generally unknown spatial variability (heterogeneity) of the hydrogeologic model inputs. To address this problem we replace the large, m-dimensional stochastic vector β that reflects both small and large scales of heterogeneity in the inputs by a lumped or smoothed m-dimensional approximation Kβ*, where K is an interpolation matrix and β* is a stochastic vector of parameters. Vector β* has small enough dimension to allow its estimation with the available data. The consequence of the replacement is that the model function f(Kβ*) written in terms of the approximate inputs is in error with respect to the same model function written in terms of β, f(β), which is assumed to be nearly exact. The difference f(β) - f(Kβ*), termed model error, is spatially correlated, generates prediction biases, and causes standard confidence and prediction intervals to be too small. Model error is accounted for in the weighted nonlinear regression methodology developed to estimate β* and assess model uncertainties by incorporating the second-moment matrix of the model errors into the weight matrix. Techniques developed by statisticians to analyze classical nonlinear regression methods are extended to analyze the revised method. The analysis develops analytical expressions for bias terms reflecting the interaction of model nonlinearity and model error, for correction factors needed to adjust the sizes of confidence and prediction intervals for this interaction, and for correction factors needed to adjust the sizes of confidence and prediction intervals for possible use of a diagonal weight matrix in place of the correct one. If terms expressing the degree of intrinsic nonlinearity for f(β) and f(Kβ*) are small, then most of the biases are small and the correction factors are reduced in magnitude. Biases, correction factors, and confidence and prediction intervals were obtained for a test problem for which model error is

  2. The role of hydrological model complexity and uncertainty in climate change impact assessment

    Directory of Open Access Journals (Sweden)

    D. Caya

    2009-08-01

    Full Text Available Little quantitative knowledge is as yet available about the role of hydrological model complexity in climate change impact assessment. This study investigates and compares the responses of three hydrological models (PROMET, Hydrotel, HSAMI), each representing a different model complexity in terms of process description, parameter space and spatial and temporal scale. The study is performed in the Ammer watershed, a 709 km2 catchment in the Bavarian alpine forelands, Germany. All models are driven and validated by a 30-year time series (1971–2000) of observation data. Objective functions show that all models (HSAMI and Hydrotel owing to calibration) perform almost equally well for runoff simulation over the validation period. Some systematic deviances in the hydrographs and the spatial patterns of hydrologic variables are, however, quite distinct and are therefore discussed further.

    Virtual future climate (2071–2100) is generated by the Canadian Regional Climate Model (vers. 3.7.1), driven by the Coupled Global Climate Model (vers. 2) based on an A2 emission scenario (IPCC 2007). The hydrological model performance is evaluated by flow indicators, such as flood frequency, annual 7-day and 30-day low flow and maximum seasonal flows. The modified climatic boundary conditions cause dramatic deviances in hydrologic model response. HSAMI shows tremendous overestimation of evapotranspiration, while Hydrotel and PROMET behave in a comparable range. Still, their significant differences, like spatially explicit patterns of summer water shortage or spring flood intensity, highlight the necessity to extend and quantify the uncertainty discussion in climate change impact analysis towards the remarkable effect of hydrological model complexity. It is obvious that for specific application purposes, water resources managers need to be made aware of this effect and have to take its implications into account for

  3. The cascade of uncertainty in modeling the impacts of climate change on Europe's forests

    Science.gov (United States)

    Reyer, Christopher; Lasch-Born, Petra; Suckow, Felicitas; Gutsch, Martin

    2015-04-01

    Projecting the impacts of global change on forest ecosystems is a cornerstone for designing sustainable forest management strategies and paramount for assessing the potential of Europe's forests to contribute to the EU bioeconomy. Research on climate change impacts on forests relies to a large extent on model applications along a model chain from Integrated Assessment Models to General and Regional Circulation Models that provide important driving variables for forest models, or on decision support systems that synthesize findings of more detailed forest models to inform forest managers. At each step in the model chain, model-specific uncertainties about, amongst others, parameter values, input data or model structure accumulate, leading to a cascade of uncertainty. For example, climate change impacts on forests strongly depend on the in- or exclusion of CO2 effects or on the use of an ensemble of climate models rather than relying on one particular climate model. In the past, these uncertainties have not or have only partly been considered in studies of climate change impacts on forests. This has left managers and decision-makers in doubt of how robust the projected impacts on forest ecosystems are. We deal with this cascade of uncertainty in a structured way, and the objective of this presentation is to assess how different types of uncertainties affect projections of the effects of climate change on forest ecosystems. To address this objective we synthesized a large body of scientific literature on modeled productivity changes and the effects of extreme events on plant processes. Furthermore, we apply the process-based forest growth model 4C to forest stands all over Europe and assess how different climate models, emission scenarios and assumptions about the parameters and structure of 4C affect the uncertainty of the model projections. We show that there are consistent regional changes in forest productivity, such as an increase in NPP in cold and wet regions, while

  4. Propagation of uncertainty in system parameters of a LWR model by sampling MCNPX calculations - Burnup analysis

    International Nuclear Information System (INIS)

    Campolina, D. de A. M.; Lima, C.P.B.; Veloso, M.A.F.

    2013-01-01

    For all the physical components that comprise a nuclear system there is an uncertainty. Assessing the impact of uncertainties in the simulation of fissionable material systems is essential for the best-estimate calculations that have been replacing conservative model calculations as computational power increases. The propagation of uncertainty in a simulation using a Monte Carlo code by sampling the input parameters is recent because of the huge computational effort required. In this work a sample space of MCNPX calculations was used to propagate the uncertainty. The sample size was optimized using the Wilks formula for a 95th percentile and a two-sided statistical tolerance interval of 95%. Uncertainties in input parameters of the reactor considered included geometry dimensions and densities. The capacity of the sampling-based method for burnup calculations was demonstrated when the sample size is optimized and many parameter uncertainties are investigated together in the same input. In particular, it was shown that during burnup the variance obtained when considering all parameter uncertainties together is equivalent to the sum of the variances obtained when the parameter uncertainties are sampled separately
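
    The Wilks sample-size calculation referred to above is short enough to show directly; at first order, the familiar 59 (one-sided) and 93 (two-sided) runs for a 95%/95% statement fall out.

    ```python
    def wilks_n(beta=0.95, gamma=0.95, two_sided=True):
        """Smallest n such that the sample extremes form a two-sided
        (or one-sided) gamma-content tolerance interval at confidence beta."""
        n = 1
        while True:
            if two_sided:
                conf = 1 - gamma**n - n * (1 - gamma) * gamma ** (n - 1)
            else:
                conf = 1 - gamma**n
            if conf >= beta:
                return n
            n += 1

    print(wilks_n(two_sided=False))  # 59  (one-sided 95%/95%)
    print(wilks_n(two_sided=True))   # 93  (two-sided 95%/95%)
    ```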

  5. Assessing the DICE model: uncertainty associated with the emission and retention of greenhouse gases

    International Nuclear Information System (INIS)

    Kaufmann, R.K.

    1997-01-01

    Analysis of the DICE model indicates that it contains unsupported assumptions, simple extrapolations, and mis-specifications that cause it to understate the rate at which economic activity emits greenhouse gases and the rate at which the atmosphere retains greenhouse gases. The model assumes a world population that is 2 billion people lower than the 'base case' projected by demographers. The model extrapolates a decline in the quantity of greenhouse gases emitted per unit of economic activity that is possible only if there is a structural break in the economic and engineering factors that have determined this ratio over the last century. The model uses a single equation to simulate the rate at which greenhouse gases accumulate in the atmosphere. The forecast for the airborne fraction generated by this equation contradicts forecasts generated by models that represent the physical and chemical processes which determine the movement of carbon from the atmosphere to the ocean. When these unsupported assumptions, simple extrapolations, and mis-specifications are remedied with simple fixes, the economic impact of global climate change increases severalfold. Similarly, these remedies increase the impact of uncertainty on estimates of the economic impact of global climate change. Together, these results indicate that considerable scientific and economic research is needed before the threat of climate change can be dismissed with any degree of certainty. 23 refs., 3 figs

  6. Uncertainty assessment of integrated distributed hydrological models using GLUE with Markov chain Monte Carlo sampling

    DEFF Research Database (Denmark)

    Blasone, Roberta-Serena; Madsen, Henrik; Rosbjerg, Dan

    2008-01-01

    ......-distributed responses are, however, still quite unexplored. Especially for complex models, rigorous parameterization, reduction of the parameter space and use of efficient and effective algorithms are essential to facilitate the calibration process and make it more robust. Moreover, for these models multi...... uncertainty estimation (GLUE) procedure based on Markov chain Monte Carlo sampling is applied in order to improve the performance of the methodology in estimating parameters and posterior output distributions. The description of the spatial variations of the hydrological processes is accounted for by defining...... the identifiability of the parameters and results in satisfactory multi-variable simulations and uncertainty estimates. However, the parameter uncertainty alone cannot explain the total uncertainty at all the sites, due to limitations in the distributed data included in the model calibration. The study also indicates...

  7. Spatial uncertainty modeling of fuzzy information in images for pattern classification.

    Directory of Open Access Journals (Sweden)

    Tuan D Pham

    Full Text Available The modeling of the spatial distribution of image properties is important for many pattern recognition problems in science and engineering. Mathematical methods are needed to quantify the variability of this spatial distribution based on which a decision of classification can be made in an optimal sense. However, image properties are often subject to uncertainty due to both incomplete and imprecise information. This paper presents an integrated approach for estimating the spatial uncertainty of vagueness in images using the theory of geostatistics and the calculus of probability measures of fuzzy events. Such a model for the quantification of spatial uncertainty is utilized as a new image feature extraction method, based on which classifiers can be trained to perform the task of pattern recognition. Applications of the proposed algorithm to the classification of various types of image data suggest the usefulness of the proposed uncertainty modeling technique for texture feature extraction.

  8. Sliding mode fault tolerant control dealing with modeling uncertainties and actuator faults.

    Science.gov (United States)

    Wang, Tao; Xie, Wenfang; Zhang, Youmin

    2012-05-01

    In this paper, two sliding mode control algorithms are developed for nonlinear systems with both modeling uncertainties and actuator faults. The first algorithm is developed under the assumption that the uncertainty bounds are known; different design parameters are utilized to deal with modeling uncertainties and actuator faults, respectively. The second algorithm is an adaptive version of the first one, developed to accommodate uncertainties and faults without exact bounds information. The stability of the overall control systems is proved using a Lyapunov function. The effectiveness of the developed algorithms has been verified on a nonlinear longitudinal model of the Boeing 747-100/200.

  9. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  10. Balancing the stochastic description of uncertainties as a function of hydrologic model complexity

    Science.gov (United States)

    Del Giudice, D.; Reichert, P.; Albert, C.; Kalcic, M.; Logsdon Muenich, R.; Scavia, D.; Bosch, N. S.; Michalak, A. M.

    2016-12-01

    Uncertainty analysis is becoming an important component of forecasting water and pollutant fluxes in urban and rural environments. Properly accounting for errors in the modeling process can help to robustly assess the uncertainties associated with the inputs (e.g. precipitation) and outputs (e.g. runoff) of hydrological models. In recent years we have investigated several Bayesian methods to infer the parameters of a mechanistic hydrological model along with those of the stochastic error component. The latter describes the uncertainties of model outputs and possibly inputs. We have adapted our framework to a variety of applications, ranging from predicting floods in small stormwater systems to nutrient loads in large agricultural watersheds. Given practical constraints, we discuss how in general the number of quantities to infer probabilistically varies inversely with the complexity of the mechanistic model. Most often, when evaluating a hydrological model of intermediate complexity, we can infer the parameters of the model as well as of the output error model. Describing the output errors as a first order autoregressive process can realistically capture the "downstream" effect of inaccurate inputs and structure. With simpler runoff models we can additionally quantify input uncertainty by using a stochastic rainfall process. For complex hydrologic transport models, instead, we show that keeping model parameters fixed and just estimating time-dependent output uncertainties could be a viable option. The common goal across all these applications is to create time-dependent prediction intervals which are both reliable (cover the nominal amount of validation data) and precise (are as narrow as possible). In conclusion, we recommend focusing both on the choice of the hydrological model and of the probabilistic error description. The latter can include output uncertainty only, if the model is computationally-expensive, or, with simpler models, it can separately account
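
    The first-order autoregressive description of output errors mentioned above can be sketched as follows: fit e_t = phi * e_{t-1} + eta_t to simulation residuals, then widen the prediction band by the implied marginal error standard deviation. Data and model are synthetic placeholders, not one of the study's applications.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Synthetic "simulated" runoff and observations with AR(1) errors.
    sim = rng.gamma(2.0, 5.0, 200)
    err = np.zeros(200)
    for t in range(1, 200):
        err[t] = 0.8 * err[t - 1] + rng.normal(0, 1.0)
    obs = sim + err

    # Fit the AR(1) output-error model e_t = phi * e_{t-1} + eta_t.
    e = obs - sim
    phi = np.sum(e[1:] * e[:-1]) / np.sum(e[:-1] ** 2)
    sigma_eta = np.std(e[1:] - phi * e[:-1], ddof=1)
    sigma_marg = sigma_eta / np.sqrt(1 - phi**2)   # marginal error std

    lower, upper = sim - 1.96 * sigma_marg, sim + 1.96 * sigma_marg
    coverage = np.mean((obs >= lower) & (obs <= upper))
    print(f"phi = {phi:.2f}, empirical coverage of 95% band = {coverage:.2f}")
    ```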

  11. Bayesian uncertainty assessment of a semi-distributed integrated catchment model of phosphorus transport.

    Science.gov (United States)

    Starrfelt, Jostein; Kaste, Øyvind

    2014-07-01

    Process-based models of nutrient transport are often used as tools for management of eutrophic waters, as decision makers need to judge the potential effects of alternative remediation measures, under current conditions and with future land use and climate change. All modelling exercises entail uncertainty arising from various sources, such as the input data, selection of parameter values and the choice of model itself. Here we perform Bayesian uncertainty assessment of an integrated catchment model of phosphorus (INCA-P). We use an auto-calibration procedure and an algorithm for including parametric uncertainty to simulate phosphorus transport in a Norwegian lowland river basin. Two future scenarios were defined to exemplify the importance of parametric uncertainty in generating predictions. While a worst case scenario yielded a robust prediction of increased loading of phosphorus, a best case scenario only gave rise to a reduction in load with probability 0.78, highlighting the importance of taking parametric uncertainty into account in process-based catchment scale modelling of possible remediation scenarios. Estimates of uncertainty can be included in information provided to decision makers, thus making a stronger scientific basis for sound decisions to manage water resources.

  12. Uncertainty in urban stormwater quality modelling: the influence of likelihood measure formulation in the GLUE methodology.

    Science.gov (United States)

    Freni, Gabriele; Mannina, Giorgio; Viviani, Gaspare

    2009-12-15

    In recent years, attention to the integrated analysis of sewer networks, wastewater treatment plants and receiving waters has been growing. However, the common lack of data in the urban water-quality field and the incomplete knowledge regarding the interpretation of the main phenomena taking part in integrated urban water systems draw attention to the necessity of evaluating the reliability of model results. Uncertainty analysis can provide useful hints and information regarding the best model approach to be used by assessing its degrees of significance and reliability. Few studies deal with uncertainty assessment in the integrated urban-drainage field. In order to fill this gap, there has been a general trend towards transferring the knowledge and the methodologies from other fields. In this respect, the Generalised Likelihood Uncertainty Evaluation (GLUE) methodology, which is widely applied in the field of hydrology, can be a possible candidate for providing a solution to the above problem. However, the methodology relies on several user-defined hypotheses in the selection of a specific formulation of the likelihood measure. This paper presents a survey aimed at evaluating the influence of the likelihood measure formulation on the assessment of uncertainty in integrated urban-drainage modelling. To accomplish this objective, a home-made integrated urban-drainage model was applied to the Savena case study (Bologna, IT). In particular, the integrated urban-drainage model uncertainty was evaluated employing different likelihood measures. The results demonstrate that the subjective selection of the likelihood measure greatly affects the GLUE uncertainty analysis.
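
    The sensitivity to the likelihood formulation can be demonstrated with a toy GLUE run: the same Monte Carlo sample weighted by two different likelihood measures yields different posterior parameter bounds. The model, data and the two formulations below are illustrative choices, not the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)
    t = np.linspace(0, 6, 100)
    obs = 5.0 * np.sin(t) + 10.0                    # synthetic "observations"

    # Monte Carlo sampling of a toy 2-parameter model y = a sin(t) + b.
    pars = rng.uniform([0.0, 5.0], [10.0, 15.0], size=(20000, 2))
    sims = pars[:, [0]] * np.sin(t) + pars[:, [1]]
    sse = ((sims - obs) ** 2).sum(axis=1)

    # Two user-defined likelihood formulations (the subjective GLUE choice).
    like_nse = np.maximum(0.0, 1 - sse / ((obs - obs.mean()) ** 2).sum())
    like_exp = np.exp(-sse / sse.min())

    for name, w in [("NSE-based", like_nse), ("exponential", like_exp)]:
        w = w / w.sum()
        order = np.argsort(pars[:, 0])              # posterior bounds for 'a'
        cdf = np.cumsum(w[order])
        lo = pars[order[np.searchsorted(cdf, 0.05)], 0]
        hi = pars[order[np.searchsorted(cdf, 0.95)], 0]
        print(f"{name:>11s}: a in [{lo:.2f}, {hi:.2f}]")
    ```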

  13. Model Uncertainty and Test of a Segmented Mirror Telescope

    Science.gov (United States)

    2014-03-01

    the eigenanalysis of large systems. A NASTRAN compatible model was provided by NPS. The model consists of elements representing all the... First attempts at model validation are presented in Ref. 10. The model consists of the full telescope...

  14. Sensitivity to plant modelling uncertainties in optimal feedback control of sound radiation from a panel

    DEFF Research Database (Denmark)

    Mørkholt, Jakob

    1997-01-01

    Optimal feedback control of broadband sound radiation from a rectangular baffled panel has been investigated through computer simulations. Special emphasis has been put on the sensitivity of the optimal feedback control to uncertainties in the modelling of the system under control. A model...... in terms of a set of radiation filters modelling the radiation dynamics. Linear quadratic feedback control applied to the panel in order to minimise the radiated sound power has then been simulated. The sensitivity of the model-based controller to modelling uncertainties when using feedback from actual...

  15. Uncertainty analysis on simple mass balance model to calculate critical loads for soil acidity

    International Nuclear Information System (INIS)

    Li Harbin; McNulty, Steven G.

    2007-01-01

    Simple mass balance equations (SMBE) of critical acid loads (CAL) in forest soil were developed to assess potential risks of air pollutants to ecosystems. However, to apply SMBE reliably at large scales, SMBE must be tested for adequacy and uncertainty. Our goal was to provide a detailed analysis of uncertainty in SMBE so that sound strategies for scaling up CAL estimates to the national scale could be developed. Specifically, we wanted to quantify CAL uncertainty under natural variability in 17 model parameters, and determine their relative contributions in predicting CAL. Results indicated that uncertainty in CAL came primarily from components of base cation weathering (BCw; 49%) and acid neutralizing capacity (46%), whereas the most critical parameters were the BCw base rate (62%), soil depth (20%), and soil temperature (11%). Thus, improvements in estimates of these factors are crucial to reducing uncertainty and successfully scaling up SMBE for national assessments of CAL. - A comprehensive uncertainty analysis, with advanced techniques and a full list and full value ranges of all individual parameters, was used to examine a simple mass balance model and address questions of error partition and uncertainty reduction in critical acid load estimates that were not fully answered by previous studies

  16. Propagation of hydrological modeling uncertainties on bed load transport simulations in steep mountain streams

    Science.gov (United States)

    Eichner, Bernhard; Koller, Julian; Kammerlander, Johannes; Schöber, Johannes; Achleitner, Stefan

    2017-04-01

    As mountain streams are sources of both water and sediment, they strongly influence the whole downstream river network. Besides large flood flow events, the continuous transport of sediments during the year is the focus of this work. Since small mountain streams are usually not gauged, spatially distributed hydrological models are used to assess the internal discharges triggering the sediment transport. In general, model calibration will never be perfect and is focused on specific criteria such as mass balance or peak flow. The remaining uncertainties influence the subsequent applications where the simulation results are used. The presented work focuses on the question of how modelling uncertainties in hydrological modelling impact the subsequent simulation of sediment transport. The applied auto-calibration by means of Monte Carlo simulation optimizes the model parameters for different aspects (efficiency criteria) of the runoff time series. In this case, we investigated the impacts of different hydrological criteria on a subsequent bed load transport simulation in the catchment of the Längentaler Bach, a small catchment in the Stubai Alps. The hydrologic model used, HQSim, is a physically based semi-distributed water balance model. Different hydrologic response units (HRUs), which are characterized by elevation, orientation, vegetation, soil type and depth, drain with various delays into specified river reaches. The runoff results of the Monte Carlo simulation are evaluated in comparison to a runoff gauge where water is collected by the Tiroler Wasserkraft AG (TIWAG). Using the Nash-Sutcliffe efficiency (NSE) on events and the main runoff period (summer), the weighted root mean squared error (RMSE) on the duration curve, and a combination of different criteria, a set of best-fit parametrizations with varying runoff series was obtained as input for the bed load transport simulation. These simulations are performed with sedFlow, a tool especially developed for bed load

  17. Measure of Uncertainty in Process Models Using Stochastic Petri Nets and Shannon Entropy

    Directory of Open Access Journals (Sweden)

    Martin Ibl

    2016-01-01

    Full Text Available When modelling and analysing business processes, the main emphasis is usually put on model validity and accuracy, i.e., the model meets the formal specification and also models the relevant system. In recent years, a series of metrics has begun to emerge that allows the quantification of specific properties of process models. These characteristics are, for instance, complexity, comprehensibility, cohesion, and uncertainty. This work is focused on defining a method that allows us to measure the uncertainty of a process model that was modelled by using stochastic Petri nets (SPN). The principle of this method consists of mapping all reachable markings of the SPN onto a continuous-time Markov chain and then calculating its stationary probabilities. The uncertainty is then measured as the entropy of the Markov chain (it is possible to calculate the uncertainty of a specific subset of places as well as of the whole net). Alternatively, the uncertainty index is quantified as a percentage of the calculated entropy against the maximum entropy (the resulting value is normalized to the interval ⟨0,1⟩). The calculated entropy can also be used as a measure of the model complexity.
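
    A minimal sketch of the pipeline, assuming a reachability graph with just three markings: solve pi Q = 0 for the stationary distribution of the continuous-time Markov chain, then report its Shannon entropy and the normalized uncertainty index. The rate values are illustrative.

    ```python
    import numpy as np

    # Generator matrix Q of the CTMC derived from the reachability graph of
    # a small stochastic Petri net (three markings; rates are illustrative).
    Q = np.array([[-2.0,  2.0,  0.0],
                  [ 1.0, -3.0,  2.0],
                  [ 0.5,  0.5, -1.0]])

    # Stationary distribution: solve pi Q = 0 with sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(3)])
    b = np.array([0.0, 0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)

    # Shannon entropy of pi and the normalized uncertainty index.
    h = -np.sum(pi * np.log2(pi))
    index = h / np.log2(pi.size)
    print(f"pi = {pi.round(3)}, entropy = {h:.3f} bits, index = {index:.2f}")
    ```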

  18. Validation and uncertainty analysis of a pre-treatment 2D dose prediction model

    Science.gov (United States)

    Baeza, Jose A.; Wolfs, Cecile J. A.; Nijsten, Sebastiaan M. J. J. G.; Verhaegen, Frank

    2018-02-01

    Independent verification of complex treatment delivery with megavolt photon beam radiotherapy (RT) has been effectively used to detect and prevent errors. This work presents the validation and uncertainty analysis of a model that predicts 2D portal dose images (PDIs) without a patient or phantom in the beam. The prediction model is based on an exponential point dose model with separable primary and secondary photon fluence components. The model includes a scatter kernel, off-axis ratio map, transmission values and penumbra kernels for beam-delimiting components. These parameters were derived through a model fitting procedure supplied with point dose and dose profile measurements of radiation fields. The model was validated against a treatment planning system (TPS; Eclipse) and radiochromic film measurements for complex clinical scenarios, including volumetric modulated arc therapy (VMAT). Confidence limits on fitted model parameters were calculated based on simulated measurements. A sensitivity analysis was performed to evaluate the effect of the parameter uncertainties on the model output. For the maximum uncertainty, the maximum deviating measurement sets were propagated through the fitting procedure and the model. The overall uncertainty was assessed using all simulated measurements. The validation of the prediction model against the TPS and the film showed a good agreement, with on average 90.8% and 90.5% of pixels passing a (2%,2 mm) global gamma analysis respectively, with a low dose threshold of 10%. The maximum and overall uncertainty of the model is dependent on the type of clinical plan used as input. The results can be used to study the robustness of the model. A model for predicting accurate 2D pre-treatment PDIs in complex RT scenarios can be used clinically and its uncertainties can be taken into account.

  19. Stochastic modelling of landfill processes incorporating waste heterogeneity and data uncertainty

    International Nuclear Information System (INIS)

    Zacharof, A.I.; Butler, A.P.

    2004-01-01

    A landfill is a very complex heterogeneous environment and as such it presents many modelling challenges. Attempts to develop models that reproduce these complexities generally involve the use of large numbers of spatially dependent parameters that cannot be properly characterised in the face of data uncertainty. An alternative method is presented, which couples a simplified microbial degradation model with a stochastic hydrological and contaminant transport model. This provides a framework for incorporating the complex effects of spatial heterogeneity within the landfill in a simplified manner, along with other key variables. A methodology for handling data uncertainty is also integrated into the model structure. Illustrative examples of the model's output are presented to demonstrate effects of data uncertainty on leachate composition and gas volume prediction

  20. A model of designing as the intersection between uncertainty perception, information processing, and coevolution

    DEFF Research Database (Denmark)

    Lasso, Sarah Venturim; Cash, Philip; Daalhuizen, Jaap

    2016-01-01

    …activity can be modelled by describing the steps and phases that a specific design activity entails (Bedny & Harris, 2005). In all aspects and stages of the NPD process, uncertainty plays a key role, both within the project itself and in relation to the project environment (Huang, Liu & Ho, 2015)… a different perception of uncertainty. This difference can, for example, lead to a lack of agreement on the best solution. In NPD projects, information is used to minimize the uncertainties inherent to innovation (Stockstrom & Herstatt, 2008; Huang, Liu & Ho, 2015); however, it is important to accept…

  1. Fuzzy uncertainty modeling applied to AP1000 nuclear power plant LOCA

    International Nuclear Information System (INIS)

    Ferreira Guimaraes, Antonio Cesar; Franklin Lapa, Celso Marcelo; Lamego Simoes Filho, Francisco Fernando; Cabral, Denise Cunha

    2011-01-01

    Research highlights: → This article presents an uncertainty modelling study using a fuzzy approach. → The AP1000 Westinghouse NPP was used; it is provided with passive safety systems. → The use of advanced passive safety systems in NPPs has limited operational experience. → Failure rates and basic event probabilities are used in the fault tree analysis. → A fuzzy uncertainty approach was applied to the reliability of the AP1000 large LOCA. - Abstract: This article presents an uncertainty modeling study using a fuzzy approach applied to the Westinghouse advanced nuclear reactor. The AP1000 Westinghouse Nuclear Power Plant (NPP) is provided with passive safety systems, based on thermophysical phenomena, that require no operator action soon after an incident has been detected. The use of advanced passive safety systems in NPPs has limited operational experience. As in any reliability study, statistically non-significant event reports introduce a significant level of uncertainty in the failure rates and basic event probabilities used in the fault tree analysis (FTA). In order to model this uncertainty, a fuzzy approach was employed in the reliability analysis of the AP1000 large-break Loss of Coolant Accident (LOCA). The final results have revealed that the proposed approach may be successfully applied to the modeling of uncertainties in safety studies.
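
    To make the idea concrete, here is a hedged sketch of one common fuzzy reliability technique, alpha-cut propagation of triangular fuzzy failure probabilities through AND/OR gates; the gate structure and the event probabilities are hypothetical, not those of the AP1000 fault tree.

```python
import numpy as np

def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

# Hypothetical basic events: two feed an OR gate, whose output is AND-ed
# with a third event.
e1 = (1e-4, 5e-4, 1e-3)
e2 = (2e-4, 4e-4, 9e-4)
e3 = (1e-2, 2e-2, 5e-2)

for alpha in (0.0, 0.5, 1.0):
    (a1, b1), (a2, b2), (a3, b3) = (alpha_cut(e, alpha) for e in (e1, e2, e3))
    # OR gate: p = 1 - (1-p1)(1-p2); AND gate: product. Both gates are
    # monotone increasing in each input, so interval ends map to ends.
    lo = (1 - (1 - a1) * (1 - a2)) * a3
    hi = (1 - (1 - b1) * (1 - b2)) * b3
    print(f"alpha={alpha:.1f}: top event probability in [{lo:.3e}, {hi:.3e}]")
```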

  2. Propagation of uncertainty and sensitivity analysis in an integral oil-gas plume model

    KAUST Repository

    Wang, Shitao

    2016-05-27

    Polynomial Chaos expansions are used to analyze uncertainties in an integral oil-gas plume model simulating the Deepwater Horizon oil spill. The study focuses on six uncertain input parameters—two entrainment parameters, the gas-to-oil ratio, two parameters associated with the droplet-size distribution, and the flow rate—that impact the model's estimates of the plume's trap and peel heights, and of its various gas fluxes. The ranges of the uncertain inputs were determined by experimental data. Ensemble calculations were performed to construct polynomial chaos-based surrogates that describe the variations in the outputs due to variations in the uncertain inputs. The surrogates were then used to estimate reliably the statistics of the model outputs, and to perform an analysis of variance. Two experiments were performed to study the impacts of high and low flow rate uncertainties. The analysis shows that in the former case the flow rate is the largest contributor to output uncertainties, whereas in the latter case, with the uncertainty range constrained by a posteriori analyses, the flow rate's contribution becomes negligible. The trap and peel height uncertainties are then mainly due to uncertainties in the 95th percentile of the droplet size and in the entrainment parameters.
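
    The following is a deliberately simplified, one-dimensional illustration of a regression-based polynomial chaos surrogate; the model function is a cheap stand-in for the integral plume model, and the order and sample size are arbitrary choices.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)

def model(xi):                       # placeholder for the expensive model
    return np.exp(0.3 * xi) + 0.1 * xi**2

xi = rng.standard_normal(200)        # standard Gaussian germ
y = model(xi)

P = 5                                # polynomial order of the expansion
Psi = hermevander(xi, P)             # probabilists' Hermite basis He_0..He_P
coef, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# Orthogonality of He_k under N(0,1): E[He_k^2] = k!, so the PCE mean is
# the first coefficient and the variance follows from the remaining ones.
norms = np.array([float(factorial(k)) for k in range(P + 1)])
mean = coef[0]
var = np.sum(coef[1:]**2 * norms[1:])
print(f"surrogate mean ~ {mean:.3f}, surrogate variance ~ {var:.3f}")
```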

  3. Uncertainty assessment and sensitivity analysis of soil moisture based on model parameter errors - Results from four regions in China

    Science.gov (United States)

    Sun, Guodong; Peng, Fei; Mu, Mu

    2017-12-01

    Model parameter errors are an important cause of uncertainty in soil moisture simulation. In this study, a conditional nonlinear optimal perturbation related to parameters (CNOP-P) approach and a sophisticated land surface model (the Common Land Model, CoLM) are employed in four regions in China to explore the extent of uncertainty in soil moisture simulations due to model parameter errors. The CNOP-P approach facilitates calculation of the upper bounds of uncertainty due to parameter errors and investigation of the nonlinear effects of parameter combinations on uncertainties in simulation and prediction. The range of uncertainty for simulated soil moisture was found to be from 0.04 to 0.58 m3 m-3. Based on the CNOP-P approach, a new approach is applied to identify a relatively sensitive and important parameter combination for soil moisture simulations and predictions. The relatively sensitive parameter combination is found to be region- and season-dependent. Furthermore, the results show that the simulation of soil moisture could be improved if the errors in these important parameter combinations were reduced. In the four study regions, the average improvement (61.6%) in simulating soil moisture using the new CNOP-P-based approach is larger than that (53.4%) using the one-at-a-time (OAT) approach. These results indicate that simulation and prediction of soil moisture are improved by considering the nonlinear effects of important physical parameter combinations. In addition, the new CNOP-P-based approach is found to be an effective method for discerning the nonlinear effects of important physical parameter combinations on numerical simulation and prediction.

  4. Uncertainty estimation and ensemble forecast with a chemistry-transport model - Application to air-quality modeling and simulation

    International Nuclear Information System (INIS)

    Mallet, Vivien

    2005-01-01

    The thesis deals with the evaluation of a chemistry-transport model, not primarily through classical comparisons to observations, but through the estimation of its a priori uncertainties due to input data, model formulation and numerical approximations. These three uncertainty sources are studied, respectively, on the basis of Monte Carlo simulations, multi-model simulations and inter-comparisons of numerical schemes. A high uncertainty is found in output ozone concentrations. In order to overcome the limitations due to this uncertainty, one solution is ensemble forecasting. Through combinations of several models (up to forty-eight) on the basis of past observations, the forecast can be significantly improved. This work has also led to the development of the innovative modelling system Polyphemus. (author) [fr
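
    The ensemble-combination step can be sketched as follows, assuming (as a simplification) that member weights are fitted to past observations by least squares; the synthetic data stand in for the ozone fields and for the member models.

```python
import numpy as np

rng = np.random.default_rng(1)
n_time, n_models = 100, 5

# Synthetic "truth" and ensemble members with different error levels.
truth = 40 + 10 * np.sin(np.linspace(0, 6, n_time))
members = truth[:, None] + rng.normal(0, [2, 4, 3, 6, 5], (n_time, n_models))

# Fit combination weights on a training period of past observations.
train = slice(0, 70)
w, *_ = np.linalg.lstsq(members[train], truth[train], rcond=None)

# Apply the weights to the remaining period and compare skill.
forecast = members[70:] @ w
rmse_best = np.sqrt(((members[70:] - truth[70:, None])**2).mean(axis=0)).min()
rmse_comb = np.sqrt(((forecast - truth[70:])**2).mean())
print(f"best single member RMSE {rmse_best:.2f}, combined ensemble {rmse_comb:.2f}")
```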

  5. Uncertainty analysis in Titan ionospheric simulated ion mass spectra: unveiling a set of issues for models accuracy improvement

    Science.gov (United States)

    Hébrard, Eric; Carrasco, Nathalie; Dobrijevic, Michel; Pernot, Pascal

    Ion Neutral Mass Spectrometer (INMS) aboard Cassini revealed a rich coupled ion-neutral chemistry in the ionosphere, producing heavy hydrocarbon and nitrile ions. The modeling of such a complex environment is challenging, as it requires a detailed and accurate description of the different relevant processes, such as photodissociation cross sections and neutral-neutral reaction rates on one hand, and ionisation cross sections, ion-molecule and recombination reaction rates on the other hand. Underpinning model calculations, each of these processes is parameterized by kinetic constants which, when known, have been studied experimentally and/or theoretically over a range of temperatures and pressures that are most often not representative of Titan's atmosphere. The sizeable experimental and theoretical uncertainties reported in the literature therefore merge with the uncertainties resulting from the unavoidable estimations or extrapolations to Titan's atmospheric conditions. Such large overall uncertainties have to be accounted for in all resulting inferences, most of all to evaluate the quality of the model definition. We have undertaken a systematic study of the uncertainty sources in the simulation of ion mass spectra as recorded by Cassini/INMS in Titan's ionosphere during the T5 flyby at 1200 km. Our simulated spectra seem much less affected by the uncertainties in ion-molecule reactions than by those in neutral-neutral reactions. Photochemical models of Titan's atmosphere are indeed so poorly predictive at high altitudes, in the sense that their computed predictions display such large uncertainties, that we found them to give rise to bimodal and hypersensitive abundance distributions for some major compounds like acetylene (C2H2) and ethylene (C2H4). We will show to what extent global uncertainty and sensitivity analysis enabled us to identify the causes of this bimodality and to pinpoint the key processes that most contribute to limit the accuracy of the

  6. Uncertainties in future-proof decision-making: the Dutch Delta Model

    Science.gov (United States)

    IJmker, Janneke; Snippen, Edwin; Ruijgh, Erik

    2013-04-01

    In 1953, a number of European countries experienced flooding after a major storm event coming from the northwest. Over 2100 people died in the resulting floods, 1800 of them Dutch. This gave rise to the development of the so-called Delta Works and Zuiderzee Works, which strongly reduced the flood risk in the Netherlands. These measures were a response to a large flooding event. As boundary conditions have changed (increasing population, increasing urban development, etc.), the flood risk should be evaluated continuously, and measures should be taken if necessary. The Delta Programme was designed to prepare for future changes and to limit the flood risk, taking into account economics, nature, landscape, residence and recreation. To support decisions in the Delta Programme, the Delta Model was developed. By using four different input scenarios (extremes in climate and economics) and variations in system setup, the outcomes of the Delta Model represent a range of possible outcomes for the hydrological situation in 2050 and 2100. These results feed into effect models that give insight into the integrated effects on freshwater supply (including navigation, industry and ecology) and flood risk. As the long-term water management policy of the Netherlands for the next decades will be based on these results, they have to be reliable. Therefore, a study was carried out to investigate the impact of uncertainties on the model outcomes. The study focused on "known unknowns": uncertainties in the boundary conditions, in the parameterization and in the model itself. This showed that for different parts of the Netherlands the total uncertainty is on the order of meters! Nevertheless, (1) the total uncertainty is dominated by uncertainties in the boundary conditions; internal model uncertainties are subordinate to that. Furthermore, (2) the model responses develop in a logical way, such that the exact model outcomes might be uncertain, but the outcomes of different model runs

  7. Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies (Final Report)

    Science.gov (United States)

    EPA announced the availability of the final report, Uncertainty and Variability in Physiologically-Based Pharmacokinetic (PBPK) Models: Key Issues and Case Studies. This report summarizes some of the recent progress in characterizing uncertainty and variability in physi...

  8. Effects of uncertainty in model predictions of individual tree volume on large area volume estimates

    Science.gov (United States)

    Ronald E. McRoberts; James A. Westfall

    2014-01-01

    Forest inventory estimates of tree volume for large areas are typically calculated by adding model predictions of volumes for individual trees. However, the uncertainty in the model predictions is generally ignored with the result that the precision of the large area volume estimates is overestimated. The primary study objective was to estimate the effects of model...

  9. Use of Paired Simple and Complex Models to Reduce Predictive Bias and Quantify Uncertainty

    DEFF Research Database (Denmark)

    Doherty, John; Christensen, Steen

    2011-01-01

    …into the costs of model simplification, and into how some of these costs may be reduced. It then describes a methodology for paired model usage through which the predictive bias of a simplified model can be detected and corrected, and post-calibration predictive uncertainty can be quantified. The methodology…

  10. Application of probabilistic modelling for the uncertainty evaluation of alignment measurements of large accelerator magnets assemblies

    Science.gov (United States)

    Doytchinov, I.; Tonnellier, X.; Shore, P.; Nicquevert, B.; Modena, M.; Mainaud Durand, H.

    2018-05-01

    Micrometric assembly and alignment requirements for future particle accelerators, and especially large assemblies, create the need for accurate uncertainty budgeting of alignment measurements. Measurements and their uncertainties have to be accurately stated and traceable to international standards, for metre-long assemblies, at the level of tens of µm. Indeed, these hundreds of assemblies will be produced and measured by several suppliers around the world, and will have to be integrated into a single machine. As part of the PACMAN project at CERN, we proposed and studied a practical application of probabilistic modelling of task-specific alignment uncertainty by applying a simulation-by-constraints calibration method. Using this method, we calibrated our measurement model using available data from ISO standardised tests (the 10360 series) for the metrology equipment. We combined this model with reference measurements and analysis of the measured data to quantify the actual specific uncertainty of each alignment measurement procedure. Our methodology was successfully validated against a calibrated and traceable 3D artefact as part of an international inter-laboratory study. The validated models were used to study the expected alignment uncertainty and important sensitivity factors in measuring the shortest and longest of the Compact Linear Collider study assemblies, 0.54 m and 2.1 m long respectively. In both cases, the laboratory alignment uncertainty was within the targeted uncertainty budget of 12 µm (68% confidence level). It was found that the remaining uncertainty budget for any additional alignment error compensations, such as thermal drift error due to variation in machine operation heat-load conditions, must be within 8.9 µm and 9.8 µm (68% confidence level), respectively.

  11. Implications of Uncertainty in Fossil Fuel Emissions for Terrestrial Ecosystem Modeling

    Science.gov (United States)

    King, A. W.; Ricciuto, D. M.; Mao, J.; Andres, R. J.

    2017-12-01

    Given observations of the increase in atmospheric CO2, estimates of anthropogenic emissions and models of oceanic CO2 uptake, one can estimate net global CO2 exchange between the atmosphere and terrestrial ecosystems as the residual of the balanced global carbon budget. Estimates from the Global Carbon Project 2016 show that terrestrial ecosystems are a growing sink for atmospheric CO2 (averaging 2.12 Gt C y-1 for the period 1959-2015 with a growth rate of 0.03 Gt C y-1 per year) but with considerable year-to-year variability (standard deviation of 1.07 Gt C y-1). Within the uncertainty of the observations, emissions estimates and ocean modeling, this residual calculation is a robust estimate of a global terrestrial sink for CO2. A task of terrestrial ecosystem science is to explain the trend and variability in this estimate. However, "within the uncertainty" is an important caveat. The uncertainty (2σ; 95% confidence interval) in fossil fuel emissions is 8.4% (±0.8 Gt C in 2015). Combined with uncertainty in other carbon budget components, the 2σ uncertainty surrounding the global net terrestrial ecosystem CO2 exchange is ±1.6 Gt C y-1. Ignoring the uncertainty, the estimate of a general terrestrial sink includes 2 years (1987 and 1998) in which terrestrial ecosystems are a small source of CO2 to the atmosphere. However, with 2σ uncertainty, terrestrial ecosystems may have been a source in as many as 18 years. We examine how well global terrestrial biosphere models simulate the trend and interannual variability of the global-budget estimate of the terrestrial sink within the context of this uncertainty (e.g., which models fall outside the 2σ uncertainty and in what years). Models are generally capable of reproducing the trend in net terrestrial exchange, but are less able to capture interannual variability and often fall outside the 2σ uncertainty. The trend in the residual carbon budget estimate is primarily associated with the increase in atmospheric CO2
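
    The residual-budget arithmetic and its 2-sigma error propagation can be sketched in a few lines; the flux values below are illustrative placeholders, and independence of the component errors is assumed.

```python
import math

# Illustrative global carbon budget terms, Gt C/yr, with 2-sigma errors.
fossil, u_fossil = 9.8, 0.8        # fossil fuel emissions
atm_growth, u_atm = 6.3, 0.4       # atmospheric CO2 growth
ocean, u_ocean = 2.6, 1.0          # ocean uptake
land_use, u_lu = 1.3, 1.0          # land-use change emissions

# Residual terrestrial sink: S = fossil + land_use - atm_growth - ocean,
# with uncertainties combined in quadrature (assumed independent).
sink = fossil + land_use - atm_growth - ocean
u_sink = math.sqrt(u_fossil**2 + u_lu**2 + u_atm**2 + u_ocean**2)
print(f"terrestrial sink = {sink:.1f} +/- {u_sink:.1f} Gt C/yr (2-sigma)")
```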

  12. Using finite mixture models in thermal-hydraulics system code uncertainty analysis

    Energy Technology Data Exchange (ETDEWEB)

    Carlos, S., E-mail: scarlos@iqn.upv.es [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Sánchez, A. [Department d’Estadística Aplicada i Qualitat, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Ginestar, D. [Department de Matemàtica Aplicada, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain); Martorell, S. [Department d’Enginyeria Química i Nuclear, Universitat Politècnica de València, Camí de Vera s.n, 46022 València (Spain)

    2013-09-15

    Highlights: • Best estimate code simulations need uncertainty quantification. • The output variables can present multimodal probability distributions. • The analysis of multimodal distributions is performed using finite mixture models. • Two methods to reconstruct the output variable probability distribution are used. -- Abstract: Nuclear power plant safety analysis is mainly based on the use of best estimate (BE) codes that predict the plant behavior under normal or accidental conditions. As the BE codes introduce uncertainties due to uncertainty in input parameters and modeling, it is necessary to perform uncertainty assessment (UA), and eventually sensitivity analysis (SA), of the results obtained. These analyses are part of the appropriate treatment of uncertainties imposed by current regulation based on the adoption of the best estimate plus uncertainty (BEPU) approach. The most popular approach for uncertainty assessment, based on Wilks' method, obtains a tolerance/confidence interval, but it does not completely characterize the output variable behavior, which is required for an extended UA and SA. However, standard UA and SA impose a high computational cost due to the large number of simulations needed. In order to obtain more information about the output variable and, at the same time, to keep the computational cost as low as possible, there has been a recent shift toward developing metamodels (models of models), or surrogate models, that approximate or emulate complex computer codes. In this context, different techniques exist to reconstruct the probability distribution using the information provided by a sample of values, for example, finite mixture models. In this paper, the Expectation-Maximization and the k-means algorithms are used to obtain a finite mixture model that reconstructs the output variable probability distribution from data obtained with RELAP-5 simulations. Both methodologies have been applied to a separated
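
    A minimal sketch of the mixture-reconstruction step, using scikit-learn's EM-based GaussianMixture on a synthetic bimodal sample standing in for the RELAP-5 output values:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Synthetic, bimodal "output variable" sample (e.g. a temperature, K);
# in the study this would come from repeated RELAP-5 runs.
rng = np.random.default_rng(2)
sample = np.concatenate([rng.normal(620, 15, 60),
                         rng.normal(700, 25, 40)])

# Fit a two-component Gaussian mixture by Expectation-Maximization.
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(sample.reshape(-1, 1))

for w, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
    print(f"weight {w:.2f}: mean {mu:.1f}, std {np.sqrt(var):.1f}")
```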

  13. Parametric uncertainty analysis of pulse wave propagation in a model of a human arterial network

    Science.gov (United States)

    Xiu, Dongbin; Sherwin, Spencer J.

    2007-10-01

    Reduced models of human arterial networks are an efficient approach to analyzing quantitative macroscopic features of human arterial flows. The justification for such models typically arises from the significantly long wavelength associated with the system in comparison to the lengths of the arteries in the network. Although these types of models have been employed extensively and many issues associated with their implementation have been widely researched, the issue of data uncertainty has received comparatively little attention. As in many biological systems, a large amount of uncertainty exists in the values of the parameters associated with the models. Clearly, reliable assessment of the system behaviour cannot be made unless the effect of such data uncertainty is quantified. In this paper we present a study of parametric data uncertainty in reduced modelling of human arterial networks, which are governed by a hyperbolic system. The uncertain parameters are modelled as random variables, and the governing equations for the arterial network therefore become stochastic. This type of stochastic hyperbolic system has not previously been studied systematically, owing to the difficulties introduced by the uncertainty, such as a potential change in the mathematical character of the system and complications in imposing boundary conditions. We demonstrate how the application of a high-order stochastic collocation method based on the generalized polynomial chaos expansion, combined with a discontinuous Galerkin spectral/hp element discretization in physical space, can successfully simulate this type of hyperbolic system subject to uncertain inputs with bounds. Building upon a numerical study of the propagation of uncertainty and sensitivity in a simplified model with a single bifurcation, a systematic parameter sensitivity analysis is conducted on the wave dynamics in a multiple-bifurcation human arterial network. Using the physical understanding of the dynamics of pulse waves in these types of

  14. A systematic approach to the modelling of measurements for uncertainty evaluation

    International Nuclear Information System (INIS)

    Sommer, K D; Weckenmann, A; Siebert, B R L

    2005-01-01

    The evaluation of measurement uncertainty is based on knowledge about both the measuring process and the quantities that influence the measurement result. The knowledge about the measuring process is represented by the model equation, which expresses the interrelation between the measurand and the input quantities. Therefore, the modelling of the measurement is a key element of modern uncertainty evaluation. A modelling concept has been developed that is based on the idea of the measuring chain and manages with only a few generic model structures. From this concept, a practical stepwise procedure has been derived
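
    A generic example of such a model equation and the associated GUM-style propagation (not taken from the paper) is sketched below, with sensitivity coefficients obtained by central differences.

```python
import numpy as np

def f(x):
    R, I = x                      # example model equation: power P = R * I**2
    return R * I**2

x = np.array([100.0, 0.5])        # best estimates of the input quantities
u = np.array([0.5, 0.002])        # their standard uncertainties

# Sensitivity coefficients c_i = df/dx_i by central differences.
c = np.empty_like(x)
for i in range(x.size):
    dx = np.zeros_like(x)
    dx[i] = 1e-6 * max(abs(x[i]), 1.0)
    c[i] = (f(x + dx) - f(x - dx)) / (2 * dx[i])

# Combined standard uncertainty for uncorrelated inputs.
u_y = np.sqrt(np.sum((c * u)**2))
print(f"y = {f(x):.3f}, u(y) = {u_y:.3f}")
```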

  15. Comparative uncertainty analysis of copper loads in stormwater systems using GLUE and grey-box modeling

    DEFF Research Database (Denmark)

    Lindblom, Erik Ulfson; Madsen, Henrik; Mikkelsen, Peter Steen

    2007-01-01

    …of the measurements. In the second attempt, the conceptual model is reformulated as a grey-box model, followed by parameter estimation. Given data from an extensive measurement campaign, the two methods suggest that the output of the stormwater pollution model is associated with significant uncertainty… With the proposed model and input data, the GLUE analysis shows that the total sampled copper mass can be predicted within a range of +/- 50% of the median value (385 g), whereas the grey-box analysis showed a prediction uncertainty of less than +/- 30%. Future work will clarify the pros and cons of the two methods…
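
    The GLUE side of such a comparison can be sketched generically as follows; the exponential toy model, the NSE likelihood and the 0.9 behavioural threshold are all illustrative choices, not the paper's copper model.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 50)
obs = np.exp(-2.0 * t) + rng.normal(0, 0.02, t.size)   # synthetic observations

def simulate(k):
    """Toy one-parameter model standing in for the pollution model."""
    return np.exp(-k * t)

# GLUE: Monte Carlo sampling of the prior, informal NSE likelihood,
# behavioural threshold, and percentile prediction bounds.
ks = rng.uniform(0.5, 4.0, 5000)
sims = np.array([simulate(k) for k in ks])
nse = 1 - ((sims - obs)**2).sum(axis=1) / ((obs - obs.mean())**2).sum()

behavioural = sims[nse > 0.9]
lower, upper = np.percentile(behavioural, [5, 95], axis=0)
print(f"{len(behavioural)} behavioural sets; "
      f"90% band width at t=0: {upper[0] - lower[0]:.3f}")
```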

  16. LOCALITY UNCERTAINTY AND THE DIFFERENTIAL PERFORMANCE OF FOUR COMMON NICHE-BASED MODELING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Miguel Fernandez

    2009-09-01

    Full Text Available We address a poorly understood aspect of ecological niche modeling: its sensitivity to different levels of geographic uncertainty in organism occurrence data. Our primary interest was to assess how accuracy degrades under increasing uncertainty, with performance measured indirectly through model consistency. We used Monte Carlo simulations and a similarity measure to assess model sensitivity across three variables: locality accuracy, niche modeling method, and species. Randomly generated data sets with known levels of locality uncertainty were compared to an original prediction using Fuzzy Kappa. Data sets where locality uncertainty is low were expected to produce distribution maps similar to the original. In contrast, data sets where locality uncertainty is high were expected to produce less similar maps. BIOCLIM, DOMAIN, Maxent and GARP were used to predict the distributions for 1200 simulated datasets (3 species x 4 buffer sizes x 100 randomized data sets). Thus, our experimental design produced a total of 4800 similarity measures, with each of the simulated distributions compared to the prediction of the original data set and corresponding modeling method. A general linear model (GLM) analysis was performed, which enabled us to simultaneously measure the effect of buffer size, modeling method, and species, as well as interactions among all variables. Our results show that modeling method has the largest effect on similarity scores and uniquely accounts for 40% of the total variance in the model. The second most important factor was buffer size, but it uniquely accounts for only 3% of the variation in the model. The newer and currently more popular methods, GARP and Maxent, were shown to produce more inconsistent predictions than the earlier and simpler methods, BIOCLIM and DOMAIN. Understanding the performance of different niche modeling methods under varying levels of geographic uncertainty is an important step toward more productive

  17. Global Sensitivity Analysis for Identifying Important Parameters of Nitrogen Nitrification and Denitrification under Model and Scenario Uncertainties

    Science.gov (United States)

    Ye, M.; Chen, Z.; Shi, L.; Zhu, Y.; Yang, J.

    2017-12-01

    Nitrogen reactive transport modeling is subject to uncertainty in model parameters, structures, and scenarios. While global sensitivity analysis is a vital tool for identifying the parameters important to nitrogen reactive transport, conventional global sensitivity analysis considers only parametric uncertainty. This may result in inaccurate selection of important parameters, because parameter importance may vary under different models and modeling scenarios. By using a recently developed variance-based global sensitivity analysis method, this paper identifies important parameters with simultaneous consideration of parametric uncertainty, model uncertainty, and scenario uncertainty. In a numerical example of nitrogen reactive transport modeling, a combination of three scenarios of soil temperature and two scenarios of soil moisture leads to a total of six scenarios. Four alternative models are used to evaluate the reduction functions used for calculating actual rates of nitrification and denitrification. The model uncertainty is entangled with scenario uncertainty, as the reduction functions depend on soil temperature and moisture content. The results of the sensitivity analysis show that parameter importance varies substantially between different models and modeling scenarios, which may lead to inaccurate selection of important parameters if model and scenario uncertainties are not considered. This problem is avoided by using the new method of sensitivity analysis in the context of model averaging and scenario averaging. The new method of sensitivity analysis can be applied to other problems of contaminant transport modeling when model uncertainty and/or scenario uncertainty are present.
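
    The idea of scenario-averaged sensitivity can be illustrated with a toy, binning-based estimator of the first-order index S_i = Var(E[Y|X_i]) / Var(Y); the two scenarios, the stand-in rate function and the scenario weights are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

def first_order(x, y, bins=20):
    """Binning estimator of S = Var(E[Y|X]) / Var(Y)."""
    edges = np.quantile(x, np.linspace(0, 1, bins + 1))
    idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
    cond_means = np.array([y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / y.var()

# Hypothetical (temperature, moisture, probability) scenarios.
scenarios = {"warm-wet": (25.0, 0.30, 0.5), "cold-dry": (5.0, 0.10, 0.5)}

S_avg = 0.0
for name, (temp, theta, prob) in scenarios.items():
    x1 = rng.uniform(0.5, 2.0, 20000)      # uncertain maximum rate
    x2 = rng.uniform(0.05, 1.0, 20000)     # uncertain half-saturation constant
    # Stand-in "nitrification rate" responding to temperature and moisture.
    y = x1 * np.exp(-((temp - 20.0) / 10.0)**2) * theta / (x2 + theta)
    S = first_order(x1, y)
    print(f"{name}: first-order index of x1 = {S:.2f}")
    S_avg += prob * S
print(f"scenario-averaged index: {S_avg:.2f}")
```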

  18. MODELS OF AIR TRAFFIC CONTROLLERS ERRORS PREVENTION IN TERMINAL CONTROL AREAS UNDER UNCERTAINTY CONDITIONS

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2017-03-01

    Full Text Available Purpose: the aim of this study is to research applied models of air traffic controllers' error prevention in terminal control areas (TMA) under uncertainty conditions. A theoretical framework describing safety events and errors of air traffic controllers connected with operations in the TMA is proposed. Methods: optimisation of the formal description of the terminal control area based on the Threat and Error Management (TEM) model and a TMA network model of air traffic flows. Results: the human-factor variables associated with safety events in the work of air traffic controllers under uncertainty conditions were obtained. Principles for applying the Threat and Error Management model to air traffic controller operations and the TMA network model of air traffic flows were proposed. Discussion: the information-processing context for preventing air traffic controller errors and examples of threats in the work of air traffic controllers relevant to TMA operations under uncertainty conditions are discussed.

  19. What Makes Hydrologic Models Differ? Using SUMMA to Systematically Explore Model Uncertainty and Error

    Science.gov (United States)

    Bennett, A.; Nijssen, B.; Chegwidden, O.; Wood, A.; Clark, M. P.

    2017-12-01

    Model intercomparison experiments have been conducted to quantify the variability introduced during the model development process, but have had limited success in identifying the sources of this model variability. The Structure for Unifying Multiple Modeling Alternatives (SUMMA) has been developed as a framework which defines a general set of conservation equations for mass and energy as well as a common core of numerical solvers along with the ability to set options for choosing between different spatial discretizations and flux parameterizations. SUMMA can be thought of as a framework for implementing meta-models which allows for the investigation of the impacts of decisions made during the model development process. Through this flexibility we develop a hierarchy of definitions which allows for models to be compared to one another. This vocabulary allows us to define the notion of weak equivalence between model instantiations. Through this weak equivalence we develop the concept of model mimicry, which can be used to investigate the introduction of uncertainty and error during the modeling process as well as provide a framework for identifying modeling decisions which may complement or negate one another. We instantiate SUMMA instances that mimic the behaviors of the Variable Infiltration Capacity (VIC) model and the Precipitation Runoff Modeling System (PRMS) by choosing modeling decisions which are implemented in each model. We compare runs from these models and their corresponding mimics across the Columbia River Basin located in the Pacific Northwest of the United States and Canada. From these comparisons, we are able to determine the extent to which model implementation has an effect on the results, as well as determine the changes in sensitivity of parameters due to these implementation differences. By examining these changes in results and sensitivities we can attempt to postulate changes in the modeling decisions which may provide better estimation of

  20. Eliciting geologists' tacit model of the uncertainty of mapped geological boundaries

    Science.gov (United States)

    Lark, R. M.; Lawley, R. S.; Barron, A. J. M.; Aldiss, D. T.; Ambrose, K.; Cooper, A. H.; Lee, J. R.; Waters, C. N.

    2015-01-01

    It is generally accepted that geological linework, such as mapped boundaries, is uncertain for various reasons. It is difficult to quantify this uncertainty directly, because the investigation of error in a boundary at a single location may be costly and time consuming, and many such observations are needed to estimate an uncertainty model with confidence. However, it is also recognized across many disciplines that experts generally have a tacit model of the uncertainty of the information that they produce (interpretations, diagnoses, etc.), and formal methods exist to extract this model in usable form by elicitation. In this paper we report a trial in which uncertainty models for mapped boundaries in six geological scenarios were elicited from a group of five experienced geologists. In five cases a consensus distribution was obtained, which reflected both the initial individually elicited distributions and a structured process of group discussion in which individuals revised their opinions. In a sixth case a consensus was not reached. This concerned a boundary between superficial deposits where the geometry of the contact is hard to visualize. The trial showed that the geologists' tacit model of uncertainty in mapped boundaries reflects factors in addition to the cartographic error usually treated by buffering linework or in written guidance on its application. It suggests that further application of elicitation, to scenarios at an appropriate level of generalization, could be useful to provide working error models for the application and interpretation of linework.

  1. Coupled Monte Carlo simulation and Copula theory for uncertainty analysis of multiphase flow simulation models.

    Science.gov (United States)

    Jiang, Xue; Na, Jin; Lu, Wenxi; Zhang, Yu

    2017-11-01

    Simulation-optimization techniques are effective in identifying an optimal remediation strategy. Simulation models with uncertainty, primarily in the form of parameter uncertainty with different degrees of correlation, influence the reliability of the optimal remediation strategy. In this study, an approach coupling Monte Carlo simulation and copula theory is proposed for uncertainty analysis of a simulation model whose parameters are correlated. Using the self-adaptive-weight particle swarm optimization Kriging method, a surrogate model was constructed to replace the simulation model and reduce the computational burden and time consumption resulting from repeated and multiple Monte Carlo simulations. The Akaike information criterion (AIC) and the Bayesian information criterion (BIC) were employed to identify whether the t copula or the Gaussian copula is the optimal function for matching the dependence structure of the parameters. Both the AIC and BIC values of the t copula were found to be less than those of the Gaussian copula, indicating that the t copula is the optimal function for matching the dependence structure of the parameters. The outputs of the simulation model when parameter correlation was considered and when it was ignored were compared. The results show that the amplitude of the fluctuation interval when parameter correlation was considered is smaller than the corresponding amplitude when parameter correlation was ignored. Moreover, it was demonstrated that considering the correlation among parameters is essential for uncertainty analysis of a simulation model, and the results of uncertainty analysis should be incorporated into the remediation strategy optimization process.
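
    Only the correlated-sampling step is sketched here, using a Gaussian copula (the t copula chosen in the study differs mainly in its tail dependence); the marginal distributions and the correlation value are placeholder assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
rho = np.array([[1.0, 0.7],
                [0.7, 1.0]])                      # copula correlation parameter

# Gaussian copula: correlated standard normals -> correlated uniforms.
z = rng.multivariate_normal(np.zeros(2), rho, size=10000)
u = stats.norm.cdf(z)

# Transform to the physical marginals, e.g. a lognormal conductivity and
# a beta-distributed porosity (placeholder choices).
K = stats.lognorm(s=0.5, scale=1e-4).ppf(u[:, 0])
phi = stats.beta(a=5, b=2).ppf(u[:, 1])
print(f"sample Spearman correlation: {stats.spearmanr(K, phi).correlation:.2f}")
```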

  2. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2014-02-01

    Full Text Available Assessing climate change impacts on pesticide leaching requires careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-western Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate projections of the regional climate model RCA3 were available, driven by different combinations of global climate models (GCMs), greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO model were generated by scaling a reference climate data set (1970–1999) for an important agricultural production area in south-western Sweden, based on monthly change factors for 2070–2099. Thirty-year simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of the predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes in pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios has the potential to provide robust probabilistic estimates of future pesticide losses.
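
    The delta-change (change-factor) construction of the future weather series can be sketched as follows; the monthly factors and the synthetic reference series are placeholders, not the RCA3 values.

```python
import numpy as np

rng = np.random.default_rng(6)
months = rng.integers(1, 13, 365)                  # month label of each day (random for brevity)
precip_ref = rng.gamma(0.8, 4.0, 365)              # reference daily precipitation, mm

# Multiplicative monthly change factors (future / present), e.g. wetter
# winters and drier summers.
factor = {m: 1.2 if m in (12, 1, 2) else 0.9 if m in (6, 7, 8) else 1.0
          for m in range(1, 13)}

# Scale each day of the reference series by its month's change factor.
precip_future = precip_ref * np.array([factor[m] for m in months])
print(f"annual precip: ref {precip_ref.sum():.0f} mm -> "
      f"future {precip_future.sum():.0f} mm")
```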

  3. Refining the Committee Approach and Uncertainty Prediction in Hydrological Modelling

    NARCIS (Netherlands)

    Kayastha, N.

    2014-01-01

    Due to the complexity of hydrological systems, a single model may be unable to capture the full range of a catchment response and accurately predict the streamflows. The multi-modelling approach opens up possibilities for handling such difficulties and allows improving the predictive capability of

  5. Quantifying uncertainty and sensitivity in sea ice models

    Energy Technology Data Exchange (ETDEWEB)

    Urrego Blanco, Jorge Rolando [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hunke, Elizabeth Clare [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Urban, Nathan Mark [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-15

    The Los Alamos Sea Ice model has a number of input parameters for which accurate values are not always well established. We conduct a variance-based sensitivity analysis of hemispheric sea ice properties to 39 input parameters. The method accounts for non-linear and non-additive effects in the model.

  6. Separation of uncertainty and interindividual variability in human exposure modeling.

    NARCIS (Netherlands)

    Ragas, A.M.J.; Brouwer, F.P.E.; Buchner, F.L.; Hendriks, H.W.; Huijbregts, M.A.J.

    2009-01-01

    The NORMTOX model predicts the lifetime-averaged exposure to contaminants through multiple environmental media, that is, food, air, soil, drinking and surface water. The model was developed to test the coherence of Dutch environmental quality objectives (EQOs). A set of EQOs is called coherent if

  7. Noodles: a tool for visualization of numerical weather model ensemble uncertainty.

    Science.gov (United States)

    Sanyal, Jibonananda; Zhang, Song; Dyer, Jamie; Mercer, Andrew; Amburn, Philip; Moorhead, Robert J

    2010-01-01

    Numerical weather prediction ensembles are routinely used for operational weather forecasting. The members of these ensembles are individual simulations with either slightly perturbed initial conditions or different model parameterizations, or occasionally both. Multi-member ensemble output is usually large, multivariate, and challenging to interpret interactively. Forecast meteorologists are interested in understanding the uncertainties associated with numerical weather prediction, specifically the variability between ensemble members. Currently, visualization of ensemble members is mostly accomplished through spaghetti plots of a single mid-troposphere pressure-surface height contour. In order to explore new uncertainty visualization methods, the Weather Research and Forecasting (WRF) model was used to create a 48-hour, 18-member parameterization ensemble of the 13 March 1993 "Superstorm". A tool was designed to interactively explore the ensemble uncertainty of three important weather variables: water-vapor mixing ratio, perturbation potential temperature, and perturbation pressure. Uncertainty was quantified using individual ensemble member standard deviation, inter-quartile range, and the width of the 95% confidence interval. Bootstrapping was employed to overcome the dependence on normality in the uncertainty metrics. A coordinated view of ribbon- and glyph-based uncertainty visualization, spaghetti plots, iso-pressure colormaps, and data transect plots was provided to two meteorologists for expert evaluation. They found it useful in assessing uncertainty in the data, especially in finding outliers in the ensemble run and thereby avoiding the WRF parameterizations that led to these outliers. Additionally, the meteorologists could identify spatial regions where the uncertainty was significantly high, allowing for identification of poorly simulated storm environments and physical interpretation of these model issues.

  8. Dealing with unquantifiable uncertainties in landslide modelling for urban risk reduction in developing countries

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in

  9. Evaluation of kinetic uncertainty in numerical models of petroleum generation

    Science.gov (United States)

    Peters, K.E.; Walters, C.C.; Mankiewicz, P.J.

    2006-01-01

    Oil-prone marine petroleum source rocks contain type I or type II kerogen having Rock-Eval pyrolysis hydrogen indices greater than 600 or 300-600 mg hydrocarbon/g total organic carbon (HI, mg HC/g TOC), respectively. Samples from 29 marine source rocks worldwide that contain mainly type II kerogen (HI = 230-786 mg HC/g TOC) were subjected to open-system programmed pyrolysis to determine the activation energy distributions for petroleum generation. Assuming a burial heating rate of 1°C/m.y. for each measured activation energy distribution, the calculated average temperature for 50% fractional conversion of the kerogen in the samples to petroleum is approximately 136 ± 7°C, but the range spans about 30°C (~121-151°C). Fifty-two outcrop samples of thermally immature Jurassic Oxford Clay Formation were collected from five locations in the United Kingdom to determine the variations of kinetic response for one source rock unit. The samples contain mainly type I or type II kerogens (HI = 230-774 mg HC/g TOC). At a heating rate of 1°C/m.y., the calculated temperatures for 50% fractional conversion of the Oxford Clay kerogens to petroleum differ by as much as 23°C (127-150°C). The data indicate that kerogen type, as defined by hydrogen index, is not systematically linked to kinetic response, and that default kinetics for the thermal decomposition of type I or type II kerogen can introduce unacceptable errors into numerical simulations. Furthermore, custom kinetics based on one or a few samples may be inadequate to account for variations in organofacies within a source rock. We propose three methods to evaluate the uncertainty contributed by kerogen kinetics to numerical simulations: (1) use the average kinetic distribution for multiple samples of source rock and the standard deviation for each activation energy in that distribution; (2) use source rock kinetics determined at several locations to describe different parts of the study area; and (3) use a weighted

  10. Can hydraulic-modelled rating curves reduce uncertainty in high flow data?

    Science.gov (United States)

    Westerberg, Ida; Lam, Norris; Lyon, Steve W.

    2017-04-01

    Flood risk assessments rely on accurate discharge data records. Establishing a reliable rating curve for calculating discharge from stage at a gauging station normally takes years of data collection efforts. Estimation of high flows is particularly difficult, as high flows occur rarely and are often practically difficult to gauge. Hydraulically-modelled rating curves can be derived from as few as two concurrent stage-discharge and water-surface slope measurements at different flow conditions. This means that a reliable rating curve can, potentially, be derived much faster than a traditional rating curve based on numerous stage-discharge gaugings. In this study we compared the uncertainty in discharge data that resulted from these two rating curve modelling approaches. We applied both methods to a Swedish catchment, accounting for uncertainties in the stage-discharge gauging and water-surface slope data for the hydraulic model, and in the stage-discharge gauging data and rating-curve parameters for the traditional method. We focused our analyses on high-flow uncertainty and the factors that could reduce this uncertainty. In particular, we investigated which data uncertainties were most important, and at what flow conditions the gaugings should preferably be taken. First results show that the hydraulically-modelled rating curves were more sensitive to uncertainties in the calibration measurements of discharge than to those in water-surface slope. The uncertainty of the hydraulically-modelled rating curves was lowest within the range of the three calibration stage-discharge gaugings (i.e. between median and two times median flow), whereas uncertainties were higher outside of this range. For instance, at the highest observed stage of the 24-year stage record, the 90% uncertainty band was -15% to +40% of the official rating curve. Additional gaugings at high flows (i.e. four to five times median flow) would likely substantially reduce those uncertainties. These first results show
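
    The traditional approach referred to above can be illustrated by fitting a power-law rating curve Q = a(h - h0)^b to a handful of gaugings; the stage-discharge pairs below are synthetic, and the covariance-based parameter errors are only a first-order indication of uncertainty.

```python
import numpy as np
from scipy.optimize import curve_fit

def rating(h, a, h0, b):
    """Power-law rating curve; the clip guards against h <= h0."""
    return a * np.clip(h - h0, 1e-6, None)**b

# Synthetic gaugings (stage in m, discharge in m3/s).
h_obs = np.array([0.42, 0.55, 0.71, 0.90, 1.10, 1.35])
q_obs = np.array([0.7, 1.7, 3.7, 7.2, 12.4, 21.3])

popt, pcov = curve_fit(rating, h_obs, q_obs, p0=[10.0, 0.1, 2.0])
perr = np.sqrt(np.diag(pcov))
print("a, h0, b =", np.round(popt, 3), "+/-", np.round(perr, 3))

# Extrapolation beyond the gauged range, where uncertainty grows quickly.
h_high = 2.0
print(f"Q({h_high} m) ~ {rating(h_high, *popt):.1f} m3/s (extrapolated)")
```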

  11. Efficient Methods for Bayesian Uncertainty Analysis and Global Optimization of Computationally Expensive Environmental Models

    Science.gov (United States)

    Shoemaker, Christine; Espinet, Antoine; Pang, Min

    2015-04-01

    Models of complex environmental systems can be computationally expensive when they describe the dynamic interactions of many components over a sizeable time period. Diagnostics of these systems can include forward simulations of calibrated models under uncertainty and analysis of management alternatives. This discussion will focus on applications of new surrogate optimization and uncertainty analysis methods to environmental models that can enhance our ability to extract information and understanding. For complex models, optimization and especially uncertainty analysis can require a large number of model simulations, which is not feasible for computationally expensive models. Surrogate response surfaces can be used in global optimization and uncertainty methods to obtain accurate answers with far fewer model evaluations, which makes these methods practical for computationally expensive models for which conventional methods are not feasible. In this paper we will discuss the application of the SOARS surrogate method for estimating Bayesian posterior density functions of model parameters for a TOUGH2 model of geologic carbon sequestration. We will also briefly discuss a new parallel surrogate global optimization algorithm, applied to two groundwater remediation sites, that was implemented on a supercomputer with up to 64 processors. The applications will illustrate the use of these methods to predict the impact of monitoring and management on subsurface contaminants.

  12. Model Justified Search Algorithms for Scheduling Under Uncertainty

    National Research Council Canada - National Science Library

    Howe, Adele; Whitley, L. D

    2008-01-01

    …We also identified plateaus as a significant barrier to superb performance of local search on scheduling and have studied several canonical discrete optimization problems to discover and model the nature of plateaus…

  13. Model Updating and Uncertainty Management for Aircraft Prognostic Systems Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This proposal addresses the integration of physics-based damage propagation models with diagnostic measures of current state of health in a mathematically rigorous...

  14. A Parallel Disintegrated Model for Uncertainty Analysis in Estimating Electrical Power Outage Areas

    Science.gov (United States)

    Omitaomu, O. A.

    2008-05-01

    The electrical infrastructure is central to national economy and security in view of the interdependencies that exist between it and other critical infrastructures. Recent studies show that the proportion of electric power outages attributed to weather-related events such as tornadoes, hurricanes, floods, and fires appears to be growing; this increase is attributed to the steady increase in the number of most-severe weather events over the past few decades. Assessing the impacts of an actual extreme weather event on electrical infrastructure is usually not a difficult task; however, such an after-the-fact assessment is not useful for disaster preparedness. Therefore, the ability to estimate the possible power outage areas and affected population in case of an anticipated extreme weather event is a necessary tool for effective disaster preparedness and response management. Data for electrical substations are publicly available through the annual Federal Energy Regulatory Commission filings. However, the data do not include substations' service areas; therefore, the geographical area served by each substation, which is critical for estimating outage areas, is unknown. As a result, a cellular automata (CA) model was developed by the author for estimating substations' service areas using a modified Moore neighborhood approach. The outputs of the CA model have recently been used for estimating power outage areas during the February 5/6, 2008 tornado outbreaks in Tennessee and for estimating the number of affected population during the February 26, 2008 substation failure in Florida. The estimation of these outage areas, like all assessments of impacts of extreme weather events, is subject to several sources of uncertainty. The uncertainty is due largely to event variability and incomplete knowledge about the events. Event variability (temporal and spatial) is attributed to inherent fluctuations or differences in the variable of concern; incomplete knowledge about
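
    A toy version of the cellular-automata idea (not the author's model) is sketched below: substation seeds grow service areas over a grid, each unassigned cell joining the label that dominates its Moore (8-cell) neighbourhood. Grid size and seed locations are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(7)
grid = np.zeros((50, 50), dtype=int)              # 0 = unassigned cell
seeds = {1: (10, 10), 2: (40, 15), 3: (25, 40)}   # substation cells
for label, (r, c) in seeds.items():
    grid[r, c] = label

changed = True
while changed:                                    # grow until the grid is full
    changed = False
    new = grid.copy()
    for r in range(grid.shape[0]):
        for c in range(grid.shape[1]):
            if grid[r, c] == 0:
                # Labels present in the Moore neighbourhood of (r, c).
                nb = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].ravel()
                nb = nb[nb > 0]
                if nb.size:
                    # Assign to the most frequent neighbouring label.
                    new[r, c] = np.bincount(nb).argmax()
                    changed = True
    grid = new

areas = {label: int((grid == label).sum()) for label in seeds}
print("service-area cell counts:", areas)
```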

  15. Deterministic sensitivity and uncertainty analysis for large-scale computer models

    International Nuclear Information System (INIS)

    Worley, B.A.; Pin, F.G.; Oblow, E.M.; Maerker, R.E.; Horwedel, J.E.; Wright, R.Q.

    1988-01-01

    This paper presents a comprehensive approach to sensitivity and uncertainty analysis of large-scale computer models that is analytic (deterministic) in principle and that is firmly based on the model equations. The theory and application of two systems based upon computer calculus, GRESS and ADGEN, are discussed relative to their role in calculating model derivatives and sensitivities without a prohibitive initial manpower investment. Storage and computational requirements for these two systems are compared for a gradient-enhanced version of the PRESTO-II computer model. A Deterministic Uncertainty Analysis (DUA) method that retains the characteristics of analytically computing result uncertainties based upon parameter probability distributions is then introduced and results from recent studies are shown. 29 refs., 4 figs., 1 tab

  16. Statistical model uncertainty and OPERA-like time-of-flight measurements

    CERN Document Server

    Riordan, Oliver

    2011-01-01

    Time-of-flight measurements such as the OPERA and MINOS experiments rely crucially on statistical analysis (as well as many other ingredients) for their conclusions. The nature of these experiments leads to a simple class of statistical models for the results; however, which model in the class is appropriate is not known exactly, as this depends on information obtained experimentally, which is subject to noise and other errors. To obtain robust conclusions, this problem, known as "model uncertainty," needs to be addressed, with quantitative bounds on the effect such uncertainty may have on the final result. The OPERA (and MINOS) analysis appears to take steps to mitigate the effects of model uncertainty, though without quantifying any remaining effect. We describe one of the strategies used (averaging individual probability distributions), and point out a potential source of error if this is not carried out correctly. We then argue that the correct version of this strategy is not the most effective, and sugge...

  17. α-Decomposition for estimating parameters in common cause failure modeling based on causal inference

    International Nuclear Information System (INIS)

    Zheng, Xiaoyu; Yamaguchi, Akira; Takata, Takashi

    2013-01-01

    The traditional α-factor model has focused on the occurrence frequencies of common cause failure (CCF) events. Global α-factors in the α-factor model are defined as fractions of failure probability for particular groups of components. However, there are unknown uncertainties in CCF parameter estimation owing to the scarcity of available failure data. Joint distributions of CCF parameters are actually determined by a set of possible causes, which are characterized by their CCF-triggering abilities and occurrence frequencies. In the present paper, the process of α-decomposition (the Kelly-CCF method) is developed to learn about sources of uncertainty in CCF parameter estimation. Moreover, it aims to evaluate the CCF risk significance of different causes, expressed as decomposed α-factors. Firstly, a hybrid Bayesian network is adopted to reveal the relationship between potential causes and failures. Secondly, because the potential causes have different occurrence frequencies and different abilities to trigger dependent or independent failures, a regression model is provided and proved by conditional probability. Global α-factors are expressed in terms of explanatory variables (the causes' occurrence frequencies) and parameters (the decomposed α-factors). Finally, an example is provided to illustrate the process of hierarchical Bayesian inference for the α-decomposition process. This study shows that the α-decomposition method can integrate failure information from the cause, component and system levels. It can parameterize the CCF risk significance of possible causes and can update the probability distributions of global α-factors. In addition, it provides a reliable way to evaluate uncertainty sources and reduce uncertainty in probabilistic risk assessment. It is recommended to build databases that include CCF parameters and the occurrence frequencies of the corresponding causes for each targeted system.
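
    The standard Bayesian update for global α-factors, which the paper extends by decomposing them over causes, can be sketched with a conjugate Dirichlet-multinomial model; the prior and the event counts below are illustrative placeholders.

```python
import numpy as np

rng = np.random.default_rng(8)

# For a common cause group of size 3: n_k = number of events in which
# exactly k components failed. A Dirichlet prior on (alpha_1, alpha_2,
# alpha_3) is conjugate to this multinomial likelihood.
prior = np.array([1.0, 1.0, 1.0])        # uniform Dirichlet prior
counts = np.array([180, 12, 3])          # illustrative event counts

posterior = prior + counts
samples = rng.dirichlet(posterior, size=100000)

mean = posterior / posterior.sum()
lo, hi = np.percentile(samples, [5, 95], axis=0)
for k in range(3):
    print(f"alpha_{k+1}: mean {mean[k]:.3f}, 90% CI [{lo[k]:.3f}, {hi[k]:.3f}]")
```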

  18. Reducing uncertainty in high-resolution sea ice models.

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, Kara J.; Bochev, Pavel Blagoveston

    2013-07-01

    Arctic sea ice is an important component of the global climate system, reflecting a significant amount of solar radiation, insulating the ocean from the atmosphere and influencing ocean circulation by modifying the salinity of the upper ocean. The thickness and extent of Arctic sea ice have shown a significant decline in recent decades, with implications for global climate as well as regional geopolitics. Increasing interest in exploration as well as climate feedback effects make predictive mathematical modeling of sea ice a task of tremendous practical import. Satellite data obtained over the last few decades have provided a wealth of information on sea ice motion and deformation. The data clearly show that ice deformation is focused along narrow linear features, and this type of deformation is not well represented in existing models. To improve sea ice dynamics we have incorporated an anisotropic rheology into the Los Alamos National Laboratory global sea ice model, CICE. Sensitivity analyses were performed using the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) to determine the impact of material parameters on sea ice response functions. Two material strength parameters that exhibited the most significant impact on responses were further analyzed to evaluate their influence on quantitative comparisons between model output and data. The sensitivity analysis, along with ten-year model runs, indicates that while the anisotropic rheology provides some benefit in velocity predictions, additional improvements are required to make this material model a viable alternative for global sea ice simulations.
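
    A hand-rolled stand-in for the kind of DAKOTA parameter study described above (not DAKOTA's actual interface): sample two strength-like parameters and rank their influence on a toy response via standardized regression coefficients. The parameter names, ranges and response function are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 2000

# Hypothetical material-strength parameters, sampled uniformly over
# assumed elicitation ranges.
P_star = rng.uniform(2.0e4, 5.0e4, N)    # ice strength (N/m^2)
e_f    = rng.uniform(1.0, 3.0, N)        # yield-curve eccentricity

# Toy response function standing in for a sea ice drift metric.
response = np.log(P_star) / e_f + 0.1 * rng.standard_normal(N)

# Standardized regression coefficients as a cheap main-effect measure.
X = np.column_stack([(P_star - P_star.mean()) / P_star.std(),
                     (e_f - e_f.mean()) / e_f.std()])
src, *_ = np.linalg.lstsq(X, (response - response.mean()) / response.std(),
                          rcond=None)
print(dict(zip(["P_star", "e_f"], np.round(src, 3))))
```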

  19. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    CERN Document Server

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering systems.

  20. Application Of Global Sensitivity Analysis And Uncertainty Quantification In Dynamic Modelling Of Micropollutants In Stormwater Runoff

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen

    2012-01-01

    The need for estimating micropollutant fluxes in stormwater systems increases the role of stormwater quality models as support for urban water managers, although the application of such models is affected by high uncertainty. This study presents a procedure for identifying the major sources of uncertainty in a conceptual lumped dynamic stormwater runoff quality model that is used in a study catchment to estimate (i) copper loads, (ii) compliance with dissolved Cu concentration limits on stormwater discharge and (iii) the fraction of Cu loads potentially intercepted by a planned treatment facility.
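
    As an illustration of the kind of Monte Carlo uncertainty and sensitivity screening such a procedure involves (a toy stand-in, not the study's actual model), the sketch below pushes two uncertain parameters of a simple exponential buildup/washoff formulation through an event-scale load calculation and apportions output variance with rank correlations; all parameter names, distributions and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def washoff_load(rain, k_acc, k_wash):
    """Toy exponential buildup/washoff model for an event-scale Cu load
    (a stand-in for a conceptual stormwater quality model)."""
    return k_acc * (1.0 - np.exp(-k_wash * rain))

# Monte Carlo over uncertain parameters for a fixed 20 mm event.
N = 5000
k_acc = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=N)   # g/ha built up
k_wash = rng.uniform(0.05, 0.3, size=N)                        # 1/mm
loads = washoff_load(20.0, k_acc, k_wash)

# Crude sensitivity measure: squared rank correlation of each parameter
# with the output, read as a share of explained variance.
def rank_corr(a, b):
    ra, rb = a.argsort().argsort(), b.argsort().argsort()
    return np.corrcoef(ra, rb)[0, 1]

print(f"load: median {np.median(loads):.1f} g/ha, "
      f"90% interval [{np.percentile(loads, 5):.1f}, {np.percentile(loads, 95):.1f}]")
print(f"rho^2 k_acc:  {rank_corr(k_acc, loads)**2:.2f}")
print(f"rho^2 k_wash: {rank_corr(k_wash, loads)**2:.2f}")
```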

  1. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    Science.gov (United States)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is hereby used as one of the model components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from the uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, we first identified independent rainfall periods, based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input-error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, given the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique.
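
    A minimal sketch of the multiplicative input-error treatment described above, assuming one latent multiplier per independent rainfall period; the rainfall series, period boundaries and multiplier values are invented. In a calibration or uncertainty analysis these multipliers would be sampled or optimized alongside the model parameters.

```python
import numpy as np

def corrupt_rainfall(rain, period_ids, multipliers):
    """Apply one latent multiplier per independent rainfall period,
    implementing the multiplicative input-error corruption."""
    rain = np.asarray(rain, dtype=float)
    return rain * np.asarray(multipliers)[np.asarray(period_ids)]

# Daily rainfall (mm) split into three independent periods.
rain = [12.0, 5.5, 0.0, 30.2, 8.1, 2.0, 14.7]
period_ids = [0, 0, 0, 1, 1, 2, 2]

# Candidate multipliers proposed by the calibration/UA algorithm
# (e.g., drawn from a lognormal prior centred on 1.0).
multipliers = [0.9, 1.25, 1.05]
print(corrupt_rainfall(rain, period_ids, multipliers))
```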

  2. Accounting for Uncertainty in Decision Analytic Models Using Rank Preserving Structural Failure Time Modeling: Application to Parametric Survival Models.

    Science.gov (United States)

    Bennett, Iain; Paracha, Noman; Abrams, Keith; Ray, Joshua

    2018-01-01

    Rank Preserving Structural Failure Time models are one of the most commonly used statistical methods to adjust for treatment switching in oncology clinical trials. The method is often applied in a decision analytic model without appropriately accounting for additional uncertainty when determining the allocation of health care resources. The aim of the study is to describe novel approaches to adequately account for uncertainty when using a Rank Preserving Structural Failure Time model in a decision analytic model. Using two examples, we tested and compared the performance of the novel test-based method with the resampling bootstrap method and with the conventional approach of no adjustment. In the first example, we simulated life expectancy using a simple decision analytic model based on a hypothetical oncology trial with treatment switching. In the second example, we applied the adjustment method to published data when no individual patient data were available. Mean estimates of overall and incremental life expectancy were similar across methods. However, the bootstrapped and test-based estimates consistently produced greater estimates of uncertainty compared with the estimate without any adjustment applied. Similar results were observed when using the test-based approach on published data, showing that failing to adjust for uncertainty led to smaller confidence intervals. Both the bootstrapping and test-based approaches provide a solution to appropriately incorporate uncertainty, with the benefit that the latter can be implemented by researchers in the absence of individual patient data.
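
    A hedged sketch of the resampling idea, not the authors' implementation: bootstrap the whole adjust-then-evaluate pipeline so the decision model inherits the switching-adjustment uncertainty instead of treating the acceleration factor as fixed. The simplified counterfactual shrinkage, the stand-in for re-estimating psi on each resample, and all data are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def life_expectancy(surv_times):
    """Stand-in for the decision model output: mean survival."""
    return float(np.mean(surv_times))

def adjusted_times(surv_times, on_treatment_frac, psi):
    """RPSFT-style counterfactual: rescale the on-treatment portion of
    each switcher's time by exp(psi) (illustrative, not g-estimation)."""
    off = surv_times * (1 - on_treatment_frac)
    on = surv_times * on_treatment_frac
    return off + np.exp(psi) * on

# Hypothetical control-arm data with treatment switching.
t = rng.exponential(24.0, 200)           # survival times (months)
frac_on = rng.uniform(0.0, 0.5, 200)     # fraction of time post-switch
psi_hat = -0.3                           # point estimate of acceleration factor

# Bootstrap the whole pipeline so the life-expectancy estimate carries
# the adjustment uncertainty rather than treating psi_hat as fixed.
boot = []
for _ in range(1000):
    idx = rng.integers(0, len(t), len(t))
    psi_b = psi_hat + rng.normal(0.0, 0.1)   # crude stand-in for re-estimation
    boot.append(life_expectancy(adjusted_times(t[idx], frac_on[idx], psi_b)))
print(f"life expectancy 95% CI: [{np.percentile(boot, 2.5):.1f}, "
      f"{np.percentile(boot, 97.5):.1f}] months")
```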

  3. Model uncertainty of various settlement estimation methods in shallow tunnels excavation; case study: Qom subway tunnel

    Science.gov (United States)

    Khademian, Amir; Abdollahipour, Hamed; Bagherpour, Raheb; Faramarzi, Lohrasb

    2017-10-01

    In addition to numerous planning and executive challenges, underground excavation in urban areas is always accompanied by certain destructive effects, especially at the ground surface; ground settlement is the most important of these effects, and different empirical, analytical and numerical methods exist for its estimation. Since geotechnical models are associated with considerable model uncertainty, this study characterized the model uncertainty of settlement estimation models through a systematic comparison between model predictions and past performance data derived from instrumentation. To do so, the surface settlement induced by excavation of the Qom subway tunnel was estimated via empirical (Peck), analytical (Loganathan and Poulos) and numerical (FDM) methods; the resulting maximum settlement values were 1.86, 2.02 and 1.52 cm, respectively. Comparing these predictions with the actual instrumentation data quantified the uncertainty of each model. The numerical model, with a relative error of 3.8%, best matched reality, while the analytical method, with a relative error of 27.8%, yielded the highest level of model uncertainty.
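
    For a worked check of the quoted errors: back-solving from the 3.8% and 27.8% figures, both are consistent with a measured maximum settlement of roughly 1.58 cm (an inference; the record does not state the measured value). The snippet below reproduces the stated relative errors under that assumption.

```python
# Back-solving from the two quoted relative errors, both are consistent
# with a measured maximum settlement of about 1.58 cm (an inference,
# not a figure stated in the record).
measured = 1.58
for name, pred in [("empirical (Peck)", 1.86),
                   ("analytical (Loganathan and Poulos)", 2.02),
                   ("numerical (FDM)", 1.52)]:
    err = abs(pred - measured) / measured * 100
    print(f"{name}: predicted {pred:.2f} cm, relative error {err:.1f}%")
```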

  4. Hierarchical mixture of experts and diagnostic modeling approach to reduce hydrologic model structural uncertainty: STRUCTURAL UNCERTAINTY DIAGNOSTICS

    Energy Technology Data Exchange (ETDEWEB)

    Moges, Edom [Civil and Environmental Engineering Department, Washington State University, Richland Washington USA; Demissie, Yonas [Civil and Environmental Engineering Department, Washington State University, Richland Washington USA; Li, Hong-Yi [Hydrology Group, Pacific Northwest National Laboratory, Richland Washington USA

    2016-04-01

    In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and to adaptively integrate them based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied to two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach outperforms the single model for the Guadalupe catchment, where diagnostic measures indicate multiple dominant processes. In contrast, the diagnostics and aggregated performance measures show that the French Broad catchment has a homogeneous response, making the single model adequate to capture it.
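
    As a minimal sketch of the combination step in a mixture-of-experts scheme (a one-level toy, not the study's HBV-based HME), the snippet below soft-gates two invented 'expert' responses on an indicator variable so that each expert dominates in its own regime.

```python
import numpy as np

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def moe_predict(x, experts, gate_weights):
    """Combine expert predictions with a soft gate driven by an
    indicator variable x (one gating level; the full HME nests this)."""
    g = softmax(np.array([w0 + w1 * x for w0, w1 in gate_weights]))
    preds = np.array([f(x) for f in experts])
    return float(g @ preds), g

# Two toy 'expert' runoff responses: quick (flashy) and slow (baseflow).
experts = [lambda x: 2.0 * x,          # quick-response expert
           lambda x: 0.5 * x + 1.0]    # slow-response expert
gate = [(0.0, 1.5), (0.0, -1.5)]       # gate favors the first expert for large x

for x in (0.2, 2.0):
    y, g = moe_predict(x, experts, gate)
    print(f"x={x}: prediction {y:.2f}, gate weights {np.round(g, 2)}")
```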

  5. Sensitivity, Error and Uncertainty Quantification: Interfacing Models at Different Scales

    International Nuclear Information System (INIS)

    Krstic, Predrag S.

    2014-01-01

    Discussion of the accuracy of AMO data to be used in plasma modeling codes for astrophysics and nuclear fusion applications, including plasma-material interfaces (PMI), involves many orders of magnitude in energy, spatial and temporal scales: energies run from tens of K to hundreds of millions of K, while temporal and spatial scales span femtoseconds to years and nanometers to meters and beyond. The key challenge for theory and simulation in this field is the consistent integration of all processes and scales, i.e. an “integrated AMO science” (IAMO). The principal goal of IAMO science is to enable accurate studies of the interactions of electrons, atoms, molecules and photons in a many-body environment, including the complex collision physics of plasma-material interfaces, leading to the best decisions and predictions. However, the accuracy required of particular data strongly depends on the sensitivity of the respective plasma modeling applications to those data, which stresses the need for immediate sensitivity-analysis feedback from the plasma modeling and material design communities. Thus, where data accuracy is concerned, data provision to the plasma modeling community is a “two-way road,” requiring close interaction between the AMO and plasma modeling communities.

  6. Interval Predictor Models for Data with Measurement Uncertainty

    Science.gov (United States)

    Lacerda, Marcio J.; Crespo, Luis G.

    2017-01-01

    An interval predictor model (IPM) is a computational model that predicts the range of an output variable given input-output data. This paper proposes strategies for constructing IPMs based on semidefinite programming and sum of squares (SOS). The models are optimal in the sense that they yield an interval valued function of minimal spread containing all the observations. Two different scenarios are considered. The first one is applicable to situations where the data is measured precisely whereas the second one is applicable to data subject to known biases and measurement error. In the latter case, the IPMs are designed to fully contain regions in the input-output space where the data is expected to fall. Moreover, we propose a strategy for reducing the computational cost associated with generating IPMs as well as means to simulate them. Numerical examples illustrate the usage and performance of the proposed formulations.
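
    A simplified sketch of the IPM idea using linear programming rather than the paper's semidefinite/SOS machinery: fit upper and lower basis-function bounds of minimal average spread that contain every observation. Unlike the SOS formulation, this toy version enforces containment only at the sample points; the data and basis are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Data with an input-dependent spread (hypothetical).
rng = np.random.default_rng(3)
x = np.sort(rng.uniform(-1, 1, 60))
y = x**2 + (0.1 + 0.2 * np.abs(x)) * rng.uniform(-1, 1, 60)

# Quadratic basis; bounds u(x) = a^T phi(x), l(x) = b^T phi(x).
# Decision vector z = [a, b]; objective: minimal average spread mean(u - l).
Phi = np.column_stack([np.ones_like(x), x, x**2])
c = np.concatenate([Phi.mean(axis=0), -Phi.mean(axis=0)])

# Containment constraints: u(x_i) >= y_i and l(x_i) <= y_i for all i.
A_ub = np.block([[-Phi, np.zeros_like(Phi)],    # -u(x_i) <= -y_i
                 [np.zeros_like(Phi), Phi]])    #  l(x_i) <=  y_i
b_ub = np.concatenate([-y, y])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(None, None)] * 6)
a, b = res.x[:3], res.x[3:]
print("upper coeffs:", np.round(a, 3), " lower coeffs:", np.round(b, 3))
```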

  7. Predictive uncertainty reduction in coupled neutron-kinetics/thermal hydraulics modeling of the BWR-TT2 benchmark

    Energy Technology Data Exchange (ETDEWEB)

    Badea, Aurelian F., E-mail: aurelian.badea@kit.edu [Karlsruhe Institute of Technology, Vincenz-Prießnitz-Str. 3, 76131 Karlsruhe (Germany); Cacuci, Dan G. [Center for Nuclear Science and Energy/Dept. of ME, University of South Carolina, 300 Main Street, Columbia, SC 29208 (United States)

    2017-03-15

    Highlights: • BWR Turbine Trip 2 (BWR-TT2) benchmark. • Substantial (up to 50%) reduction of uncertainties in the predicted transient power. • 6660 uncertain model parameters were calibrated. - Abstract: By applying a comprehensive predictive modeling methodology, this work demonstrates a substantial (up to 50%) reduction of uncertainties in the predicted total transient power in the BWR Turbine Trip 2 (BWR-TT2) benchmark while calibrating the numerical simulation of this benchmark, comprising 6090 macroscopic cross sections, and 570 thermal-hydraulics parameters involved in modeling the phase-slip correlation, transient outlet pressure, and total mass flow. The BWR-TT2 benchmark is based on an experiment that was carried out in 1977 in the NPP Peach Bottom 2, involving the closure of the turbine stop valve which caused a pressure wave that propagated with attenuation into the reactor core. The condensation of the steam in the reactor core caused by the pressure increase led to a positive reactivity insertion. The subsequent rise of power was limited by the feedback and the insertion of the control rods. The BWR-TT2 benchmark was modeled with the three-dimensional reactor physics code system DYN3D, by coupling neutron kinetics with two-phase thermal-hydraulics. All 6660 DYN3D model parameters were calibrated by applying a predictive modeling methodology that combines experimental and computational information to produce optimally predicted best-estimate results with reduced predicted uncertainties. Simultaneously, the predictive modeling methodology yields optimally predicted values for the BWR total transient power while reducing significantly the accompanying predicted standard deviations.
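
    The generic linear-Gaussian core of such calibration (a sketch of the principle, not the cited methodology or its 6660-parameter application): combining prior parameter covariance with a measurement through a least-squares/Kalman-type update shrinks the covariance of the predicted response. All matrices below are invented, two-parameter stand-ins.

```python
import numpy as np

# Prior parameter covariance (two illustrative parameters) and a linear
# sensitivity G of the predicted response to those parameters.
C_prior = np.diag([0.04, 0.09])          # prior parameter covariance
G = np.array([[1.2, 0.8]])               # response sensitivities
C_meas = np.array([[0.02]])              # measurement variance

# Best-estimate update: combining computation with measurement shrinks
# both the parameter and the predicted-response covariances.
S = G @ C_prior @ G.T + C_meas
K = C_prior @ G.T @ np.linalg.inv(S)
C_post = C_prior - K @ G @ C_prior

sd_prior = float(np.sqrt(G @ C_prior @ G.T))
sd_post = float(np.sqrt(G @ C_post @ G.T))
print(f"predicted response sd: {sd_prior:.3f} -> {sd_post:.3f} "
      f"({100 * (1 - sd_post / sd_prior):.0f}% reduction)")
```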

  8. Hierarchical Multi-Scale Approach To Validation and Uncertainty Quantification of Hyper-Spectral Image Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Engel, David W.; Reichardt, Thomas A.; Kulp, Thomas J.; Graff, David; Thompson, Sandra E.

    2016-09-17

    Validating predictive models and quantifying uncertainties inherent in the