WorldWideScience

Sample records for model structure uncertainty

  1. Numerical Modelling of Structures with Uncertainties

    Directory of Open Access Journals (Sweden)

    Kahsin Maciej

    2017-04-01

    Full Text Available The nature of environmental interactions, as well as the large dimensions and complex structure of marine offshore objects, makes designing, building and operating these objects a great challenge. This is why the vast majority of investment cases of this type include structural analysis, performed using scaled laboratory models and complemented by extended computer simulations. The present paper focuses on FEM modelling of an offshore wind turbine supporting structure. The problem is studied using modal analysis and sensitivity analysis, as well as the design of experiment (DOE) and response surface model (RSM) methods. The results of modal-analysis-based simulations were used to assess the quality of the FEM model against data measured during experimental modal analysis of the scaled laboratory model under different support conditions. The sensitivity analysis, in turn, provided opportunities for assessing the effect of individual FEM model parameters on the dynamic response of the examined supporting structure. The DOE and RSM methods made it possible to determine the effect of model parameter changes on the supporting structure response.
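
The DOE-plus-RSM workflow described in this record can be sketched in a few lines. The single spring-mass "model", the parameter ranges and the quadratic surface below are illustrative stand-ins, not the paper's actual FEM setup:

```python
import numpy as np

# Toy stand-in for the expensive FEM run: first natural frequency (Hz)
# of a single spring-mass oscillator, f = sqrt(k/m) / (2*pi).
def natural_frequency(k, m):
    return np.sqrt(k / m) / (2.0 * np.pi)

# Design of experiment: a full-factorial grid over stiffness and mass.
ks = np.linspace(0.8e6, 1.2e6, 5)      # N/m
ms = np.linspace(900.0, 1100.0, 5)     # kg
K, M = np.meshgrid(ks, ms)
F = natural_frequency(K, M)

# Code the factors to [-1, 1] (standard DOE practice) and fit a quadratic
# response surface model by least squares.
def code(v, lo, hi):
    return (v - (lo + hi) / 2.0) / ((hi - lo) / 2.0)

kc, mc = code(K.ravel(), ks[0], ks[-1]), code(M.ravel(), ms[0], ms[-1])
X = np.column_stack([np.ones_like(kc), kc, mc, kc**2, mc**2, kc * mc])
coef, *_ = np.linalg.lstsq(X, F.ravel(), rcond=None)

# The cheap surrogate now stands in for the FEM model in parameter studies.
def rsm(k, m):
    a, b = code(k, ks[0], ks[-1]), code(m, ms[0], ms[-1])
    return np.array([1.0, a, b, a**2, b**2, a * b]) @ coef

f_true = natural_frequency(1.05e6, 950.0)
f_rsm = rsm(1.05e6, 950.0)
```

Over these narrow factor ranges the quadratic surface reproduces the "simulation" to well under one percent, which is what makes the surrogate useful for fast what-if studies.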

  2. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    Science.gov (United States)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive error, but few studies have examined how errors propagate from model structure to nonpoint source (NPS) predictions. In this study, we focused on the structural uncertainty caused by the algorithms and equations used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the C:N and P:N ratios of humic materials, as well as the fertilization and P-leaching algorithms, contributed the largest output uncertainties. In comparison, the initialization of inorganic P in the soil layer and the transformation algorithms between P pools were less influential for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to the NPS-P prediction uncertainty caused by the model inputs and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and its findings can be extrapolated to other model-based studies.
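
The coefficient of variation used above as the measure of structure-induced spread is simple to compute: the standard deviation of the outputs from the alternative model structures divided by their mean. A sketch with invented load values (the study's actual simulated loads are not reproduced here):

```python
import statistics

# Hypothetical NPS-P loads (kg/ha/yr) from the same watershed simulated
# with five alternative P-cycle structures; numbers invented for illustration.
loads = [1.92, 2.05, 1.98, 2.10, 1.95]
cv = statistics.pstdev(loads) / statistics.mean(loads)
```

A CV on the order of a few percent, as here, would indicate that switching equations moves the prediction far less than typical input and parameter uncertainty does.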

  3. Inspection Uncertainty and Model Uncertainty Updating for Ship Structures Subjected to Corrosion Deterioration

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; ZHANG Sheng-kun

    2004-01-01

    Classical probability theory cannot effectively quantify the parameter uncertainty in the probability of detection. Furthermore, the conventional data analysis and expert judgment methods fail to handle the problem of updating model uncertainty with information from nondestructive inspection. To overcome these disadvantages, a Bayesian approach was proposed to quantify the parameter uncertainty in the probability of detection. Formulae were derived for the multiplication factors that measure the statistical uncertainties in a probability of detection following the Weibull distribution. A Bayesian updating method was applied to compute the posterior probabilities of model weights and the posterior probability density functions of the distribution parameters of the probability of detection. A total probability model method was proposed to analyze the problem of multi-layered model uncertainty updating. This method was then applied to multi-layered corrosion model uncertainty updating for ship structures. The results indicate that the proposed method is very effective in analyzing multi-layered model uncertainty updating.
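
The core of the weight-updating step is just Bayes' rule: each candidate corrosion model's prior weight is multiplied by the likelihood of the inspection data under that model and renormalized. A minimal sketch with invented numbers (not the paper's models or data):

```python
# Posterior model weights via Bayes' rule: w_i ∝ prior_i * likelihood_i.
priors = [0.5, 0.3, 0.2]            # prior weights of three corrosion models
likelihoods = [0.02, 0.10, 0.05]    # likelihood of inspection data under each
unnorm = [p * l for p, l in zip(priors, likelihoods)]
weights = [u / sum(unnorm) for u in unnorm]
# The second model dominates once the data favor it: weights ≈ [0.2, 0.6, 0.2].
```

The multi-layered version in the paper repeats this update at each layer of the model hierarchy via the total probability theorem.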

  4. Nonlinear structural finite element model updating and uncertainty quantification

    Science.gov (United States)

    Ebrahimian, Hamed; Astroza, Rodrigo; Conte, Joel P.

    2015-04-01

    This paper presents a framework for nonlinear finite element (FE) model updating, in which state-of-the-art nonlinear structural FE modeling and analysis techniques are combined with the maximum likelihood estimation method (MLE) to estimate time-invariant parameters governing the nonlinear hysteretic material constitutive models used in the FE model of the structure. The estimation uncertainties are evaluated based on the Cramer-Rao lower bound (CRLB) theorem. A proof-of-concept example, consisting of a cantilever steel column representing a bridge pier, is provided to verify the proposed nonlinear FE model updating framework.
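
The Cramér-Rao lower bound used above to quantify estimation uncertainty can be sketched on a deliberately simple case, a one-parameter Gaussian model rather than the paper's FE example: the observed Fisher information is taken as a finite-difference second derivative of the negative log-likelihood at the MLE, and its inverse bounds the estimator variance.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, n = 0.5, 200                      # known noise level, sample size
theta_true = 2.0
y = theta_true + sigma * rng.standard_normal(n)   # simulated measurements

def negloglik(theta):
    # Gaussian negative log-likelihood (constant terms dropped)
    return 0.5 * np.sum((y - theta) ** 2) / sigma**2

theta_mle = y.mean()                     # MLE of the single parameter
h = 1e-4                                 # finite-difference step
fisher = (negloglik(theta_mle + h) - 2.0 * negloglik(theta_mle)
          + negloglik(theta_mle - h)) / h**2      # observed information
crlb = 1.0 / fisher                      # CRLB on the estimator variance
# Analytically, the bound is sigma**2 / n = 0.00125 here.
```

For nonlinear FE models the same recipe applies, except that the log-likelihood and its curvature must be evaluated through repeated FE runs.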

  5. Predicting the Term Structure of Interest Rates: Incorporating parameter uncertainty, model uncertainty and macroeconomic information

    NARCIS (Netherlands)

    M.D. de Pooter (Michiel); F. Ravazzolo (Francesco); D.J.C. van Dijk (Dick)

    2007-01-01

    We forecast the term structure of U.S. Treasury zero-coupon bond yields by analyzing a range of models that have been used in the literature. We assess the relevance of parameter uncertainty by examining the added value of using Bayesian inference compared to frequentist estimation

  6. Comparing Two Strategies to Model Uncertainties in Structural Dynamics

    Directory of Open Access Journals (Sweden)

    Rubens Sampaio

    2010-01-01

    Full Text Available In the modeling of dynamical systems, uncertainties are present and must be taken into account to improve the predictions of the models. Several strategies have been used to model uncertainties, and the aim of this work is to discuss and compare two of them. This is done using the simplest model possible: a two-d.o.f. (degree-of-freedom) dynamical system. A simple system is used because it is very helpful for assuring a better understanding and, consequently, comparison of the strategies. The first strategy (called the parametric strategy) consists in taking each spring stiffness as uncertain and associating a random variable with each one of them. The second strategy (called the nonparametric strategy) is more general: it considers the whole stiffness matrix as uncertain and associates a random matrix with it. In both cases, the probability density functions, either of the random parameters or of the random matrix, are deduced from the Maximum Entropy Principle using only the available information. With this example, some important results can be discussed that cannot be assessed when complex structures are used, as has been done so far in the literature. One important element for the comparison of the two strategies is the analysis of the sample spaces and how to compare them.
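
The two strategies can be sketched side by side for a 2-d.o.f. chain with unit masses. The distributions below (gamma for the parametric case, a Wishart-type matrix ensemble for the nonparametric case) are common choices consistent with the Maximum Entropy reasoning, but the specific parameters are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(1)
k1, k2 = 1.0, 1.0                            # nominal spring stiffnesses
K0 = np.array([[k1 + k2, -k2], [-k2, k2]])   # nominal stiffness matrix

def frequencies(K):
    # natural frequencies (rad/s) for unit masses
    return np.sqrt(np.sort(np.linalg.eigvalsh(K)))

# Parametric strategy: each spring stiffness is an independent random
# variable (gamma-type, positive, with the nominal value as mean).
samples_param = []
for _ in range(500):
    r1 = rng.gamma(shape=50.0, scale=k1 / 50.0)
    r2 = rng.gamma(shape=50.0, scale=k2 / 50.0)
    samples_param.append(frequencies(np.array([[r1 + r2, -r2], [-r2, r2]])))

# Nonparametric strategy: randomize the whole stiffness matrix with a
# Wishart-type ensemble preserving symmetry, positivity and the mean K0.
L = np.linalg.cholesky(K0)
p = 100                                      # dispersion parameter
samples_nonparam = []
for _ in range(500):
    G = rng.standard_normal((p, 2)) / np.sqrt(p)   # E[G.T @ G] = I
    samples_nonparam.append(frequencies(L @ (G.T @ G) @ L.T))

f1_param = np.mean([s[0] for s in samples_param])
f1_nonparam = np.mean([s[0] for s in samples_nonparam])
# Both ensembles scatter around the nominal first frequency, ~0.618 rad/s.
```

The key structural difference is visible in the samples: the parametric ensemble can only produce matrices of the chain's exact connectivity pattern, while the nonparametric ensemble also perturbs the off-diagonal coupling.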

  7. An Uncertainty Structure Matrix for Models and Simulations

    Science.gov (United States)

    Green, Lawrence L.; Blattnig, Steve R.; Hemsch, Michael J.; Luckring, James M.; Tripathi, Ram K.

    2008-01-01

    Software that is used for aerospace flight control and to display information to pilots and crew is expected to be correct and credible at all times. This type of software is typically developed under strict management processes, which are intended to reduce defects in the software product. However, modeling and simulation (M&S) software may exhibit varying degrees of correctness and credibility, depending on a large and complex set of factors. These factors include its intended use, the known physics and numerical approximations within the M&S, and the referent data set against which the M&S correctness is compared. The correctness and credibility of an M&S effort is closely correlated to the uncertainty management (UM) practices that are applied to the M&S effort. This paper describes an uncertainty structure matrix for M&S, which provides a set of objective descriptions for the possible states of UM practices within a given M&S effort. The columns in the uncertainty structure matrix contain UM elements or practices that are common across most M&S efforts, and the rows describe the potential levels of achievement in each of the elements. A practitioner can quickly look at the matrix to determine where an M&S effort falls based on a common set of UM practices that are described in absolute terms that can be applied to virtually any M&S effort. The matrix can also be used to plan those steps and resources that would be needed to improve the UM practices for a given M&S effort.
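
One minimal way to encode such a matrix in software is a mapping from UM practices to ordered level descriptions, with an M&S effort scored by the level it achieves in each practice. The practice names and level wordings below are invented placeholders, not the paper's actual elements:

```python
# Uncertainty structure matrix: {practice: ordered level descriptions}.
matrix = {
    "verification":   ["none", "informal checks", "systematic benchmarks"],
    "validation":     ["none", "qualitative comparison", "quantified vs. referent"],
    "input pedigree": ["unknown sources", "documented sources", "traceable, reviewed"],
}

effort = {"verification": 2, "validation": 1, "input pedigree": 2}

def describe(effort):
    """Map achieved levels back to the matrix wording."""
    return {p: matrix[p][lvl] for p, lvl in effort.items()}

def gaps(effort):
    """Practices where the effort is below the top level, and by how much."""
    return {p: len(matrix[p]) - 1 - lvl
            for p, lvl in effort.items() if lvl < len(matrix[p]) - 1}
```

The `gaps` view corresponds to the planning use described above: it lists exactly which UM practices would need investment to raise the effort's credibility.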

  8. A Bayesian Chance-Constrained Method for Hydraulic Barrier Design Under Model Structure Uncertainty

    Science.gov (United States)

    Chitsazan, N.; Pham, H. V.; Tsai, F. T. C.

    2014-12-01

    The groundwater community has widely recognized model structure uncertainty as the major source of model uncertainty in groundwater modeling. Previous studies of aquifer remediation design, however, rarely discuss the impact of model structure uncertainty. This study combines chance-constrained (CC) programming with Bayesian model averaging (BMA) in a BMA-CC framework to assess the effect of model structure uncertainty on remediation design. To investigate this impact, we compare the BMA-CC method with traditional CC programming, which considers only model parameter uncertainty. The BMA-CC method is employed to design a hydraulic barrier to protect public supply wells of the Government St. pump station from saltwater intrusion in the "1,500-foot" sand and the "1,700-foot" sand of the Baton Rouge area, southeastern Louisiana. To address the model structure uncertainty, we develop three conceptual groundwater models based on three different hydrostratigraphic structures. The results show that using traditional CC programming overestimates design reliability. They also show that at least five additional connector wells are needed to achieve a design reliability above 90%. The total amount of water injected through the connector wells is higher than the total pumpage of the protected public supply wells. While the injection rate can be reduced by lowering the reliability level, the study finds that the hydraulic barrier design to protect the Government St. pump station is not economically attractive.

  9. An Updating Method for Structural Dynamics Models with Uncertainties

    Directory of Open Access Journals (Sweden)

    B. Faverjon

    2008-01-01

    Full Text Available One challenge in the numerical simulation of industrial structures is model validation based on experimental data. Among the indirect, or parametric, methods available, one is based on the "mechanical" concept of the constitutive relation error estimator, introduced in order to quantify the quality of finite element analyses. In the case of uncertain measurements obtained from a family of quasi-identical structures, parameters need to be modeled randomly. In this paper, we consider the case of a damped structure modeled with stochastic variables. Polynomial chaos expansion and reduced bases are used to solve the stochastic problems involved in the calculation of the error.
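
The polynomial chaos expansion mentioned above can be illustrated on a scalar toy problem (not the paper's damped structure): a response g(ξ) = exp(0.3ξ) of a standard normal variable is expanded in probabilists' Hermite polynomials, with coefficients obtained by Gauss-Hermite quadrature.

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as He

# Degree-4 Hermite chaos expansion of g(xi) = exp(0.3*xi), xi ~ N(0,1).
deg = 4
x, w = He.hermegauss(deg + 5)          # nodes/weights for weight exp(-x^2/2)
w = w / np.sqrt(2.0 * np.pi)           # normalize to the N(0,1) measure

def g(xi):
    return np.exp(0.3 * xi)

# c_k = E[g(xi) He_k(xi)] / k!, since E[He_k(xi)^2] = k!.
coeffs = [np.sum(w * g(x) * He.hermeval(x, [0] * k + [1])) / math.factorial(k)
          for k in range(deg + 1)]

mean_pce = coeffs[0]                   # the PCE mean is the zeroth coefficient
var_pce = sum(c**2 * math.factorial(k) for k, c in enumerate(coeffs) if k > 0)
# Exact values: mean = exp(0.045), variance = exp(0.09) * (exp(0.09) - 1).
```

Once the coefficients are known, statistics of the response come for free from the orthogonality of the basis, which is precisely what makes PCE attractive inside an error estimator.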

  10. Hierarchical mixture of experts and diagnostic modeling approach to reduce hydrologic model structural uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Moges, Edom [Civil and Environmental Engineering Department, Washington State University, Richland Washington USA; Demissie, Yonas [Civil and Environmental Engineering Department, Washington State University, Richland Washington USA; Li, Hong-Yi [Hydrology Group, Pacific Northwest National Laboratory, Richland Washington USA

    2016-04-01

    In most water resources applications, a single model structure might be inadequate to capture the dynamic multi-scale interactions among different hydrological processes. Calibrating single models for dynamic catchments, where multiple dominant processes exist, can result in displacement of errors from structure to parameters, which in turn leads to over-correction and biased predictions. An alternative to a single model structure is to develop local expert structures that are effective in representing the dominant components of the hydrologic process and to integrate them adaptively based on an indicator variable. In this study, the Hierarchical Mixture of Experts (HME) framework is applied to integrate expert model structures representing the different components of the hydrologic process. Various signature diagnostic analyses are used to assess the presence of multiple dominant processes and the adequacy of a single model, as well as to identify the structures of the expert models. The approaches are applied to two distinct catchments, the Guadalupe River (Texas) and the French Broad River (North Carolina) from the Model Parameter Estimation Experiment (MOPEX), using different structures of the HBV model. The results show that the HME approach outperforms the single model for the Guadalupe catchment, where diagnostic measures indicate the presence of multiple dominant processes. In contrast, the diagnostics and aggregated performance measures show that the French Broad catchment has a homogeneous response, making the single model adequate to capture it.
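
The mixture-of-experts idea reduces to a gate that weights the expert models as a function of the indicator variable. A toy sketch with two invented "experts" and a logistic gate on antecedent wetness (all functional forms and numbers are illustrative, not the HBV structures used in the study):

```python
import math

def expert_fast(precip):     # flashy runoff response
    return 0.8 * precip

def expert_slow(precip):     # damped runoff response
    return 0.3 * precip

def gate(wetness, a=10.0, b=0.5):
    # logistic weight given to the fast expert as the catchment wets up
    return 1.0 / (1.0 + math.exp(-a * (wetness - b)))

def hme_runoff(precip, wetness):
    w = gate(wetness)
    return w * expert_fast(precip) + (1.0 - w) * expert_slow(precip)
```

For the same 10 mm of rain, the combined model behaves like the slow expert under dry conditions and like the fast expert under wet conditions, which is the adaptive integration the abstract describes.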

  11. Framework system and research flow of uncertainty in 3D geological structure models

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    Uncertainty in 3D geological structure models has become a bottleneck that restricts the development and application of 3D geological modeling. In order to solve this problem for accuracy assessment, error detection and dynamic correction in 3D geological structure models, we have reviewed the current situation and development trends in 3D geological modeling. The main context of uncertainty in 3D geological structure models is discussed. Major research issues and a general framework system of unce...

  12. Multilevel model reduction for uncertainty quantification in computational structural dynamics

    Science.gov (United States)

    Ezvan, O.; Batou, A.; Soize, C.; Gagliardini, L.

    2016-11-01

    Within the continuum mechanics framework, there are two main approaches to model interfaces: classical cohesive zone modeling (CZM) and interface elasticity theory. The classical CZM deals with geometrically non-coherent interfaces for which the constitutive relation is expressed in terms of traction-separation laws. However, CZM lacks any response related to the stretch of the mid-plane of the interface. This issue becomes problematic particularly at small scales with increasing interface area to bulk volume ratios, where interface elasticity is no longer negligible. The interface elasticity theory, in contrast to CZM, deals with coherent interfaces that are endowed with their own energetic structures, and thus is capable of capturing elastic resistance to tangential stretch. Nonetheless, the interface elasticity theory suffers from the lack of inelastic material response, regardless of the strain level. The objective of this contribution therefore is to introduce a generalized mechanical interface model that couples both the elastic response along the interface and the cohesive response across the interface whereby interface degradation is taken into account. The material degradation of the interface mid-plane is captured by a non-local damage model of integral-type. The out-of-plane decohesion is described by a classical cohesive zone model. These models are then coupled through their corresponding damage variables. The non-linear governing equations and the weak forms thereof are derived. The numerical implementation is carried out using the finite element method and consistent tangents are derived. Finally, a series of numerical examples is studied to provide further insight into the problem and to carefully elucidate key features of the proposed theory.

  13. Model structural uncertainty quantification and hydrologic parameter and prediction error analysis using airborne electromagnetic data

    DEFF Research Database (Denmark)

    Minsley, B. J.; Christensen, Nikolaj Kruse; Christensen, Steen

    Model structure, or the spatial arrangement of subsurface lithological units, is fundamental to the hydrological behavior of Earth systems. Knowledge of geological model structure is critically important in order to make informed hydrological predictions and management decisions. Model structure is never perfectly known, however, and incorrect assumptions can be a significant source of error when making model predictions. We describe a systematic approach for quantifying model structural uncertainty that is based on the integration of sparse borehole observations and large-scale airborne electromagnetic data... indicator simulation, we produce many realizations of model structure that are consistent with observed datasets and prior knowledge. Given estimates of model structural uncertainty, we incorporate hydrologic observations to evaluate the hydrologic parameter or prediction errors that occur when...

  14. Partitioning uncertainty in ocean carbon uptake projections: Internal variability, emission scenario, and model structure

    Science.gov (United States)

    Lovenduski, Nicole S.; McKinley, Galen A.; Fay, Amanda R.; Lindsay, Keith; Long, Matthew C.

    2016-09-01

    We quantify and isolate the sources of projection uncertainty in annual-mean sea-air CO2 flux over the period 2006-2080 on global and regional scales using output from two sets of ensembles with the Community Earth System Model (CESM) and models participating in the 5th Coupled Model Intercomparison Project (CMIP5). For annual-mean, globally-integrated sea-air CO2 flux, uncertainty grows with prediction lead time and is primarily attributed to uncertainty in emission scenario. At the regional scale of the California Current System, we observe relatively high uncertainty that is nearly constant for all prediction lead times, and is dominated by internal climate variability and model structure, respectively in the CESM and CMIP5 model suites. Analysis of CO2 flux projections over 17 biogeographical biomes reveals a spatially heterogenous pattern of projection uncertainty. On the biome scale, uncertainty is driven by a combination of internal climate variability and model structure, with emission scenario emerging as the dominant source for long projection lead times in both modeling suites.
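
A common way to perform the partitioning described above is to attribute variance across ensemble members (internal variability), across models (structure), and across scenarios. A sketch with invented numbers, not CESM/CMIP5 output:

```python
import statistics as st

# Illustrative CO2-flux projections at one lead time:
# ens[scenario][model] = internal-variability ensemble members.
ens = {
    "rcp45": {"m1": [2.0, 2.1, 1.9], "m2": [2.3, 2.4, 2.2]},
    "rcp85": {"m1": [2.6, 2.7, 2.5], "m2": [3.0, 3.1, 2.9]},
}

# Internal variability: member variance, averaged over scenarios and models.
internal = st.mean(st.pvariance(mem) for s in ens.values() for mem in s.values())
# Model structure: variance of model means, averaged over scenarios.
model = st.mean(st.pvariance([st.mean(m) for m in s.values()]) for s in ens.values())
# Emission scenario: variance of the scenario means.
scen_means = [st.mean([x for m in s.values() for x in m]) for s in ens.values()]
scenario = st.pvariance(scen_means)
```

With these toy numbers the scenario term dominates, mirroring the globally integrated result quoted above; at regional scales the same decomposition can instead be led by the internal or structural terms.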

  15. Assessment of structural model and parameter uncertainty with a multi-model system for soil water balance models

    Science.gov (United States)

    Michalik, Thomas; Multsch, Sebastian; Frede, Hans-Georg; Breuer, Lutz

    2016-04-01

    Water for agriculture is strongly limited in arid and semi-arid regions and is often of low quality in terms of salinity. The application of saline water for irrigation increases the salt load in the rooting zone and has to be managed by leaching, i.e. washing out salts with additional irrigation, to maintain a healthy soil. Dynamic simulation models are helpful tools for calculating root zone water fluxes and soil salinity in order to investigate best management practices. However, there is little information on structural and parameter uncertainty for simulations of the water and salt balance under saline irrigation. Hence, we established a multi-model system with four different models (AquaCrop, RZWQM, SWAP, Hydrus1D/UNSATCHEM) to analyze structural and parameter uncertainty using the Generalized Likelihood Uncertainty Estimation (GLUE) method. Hydrus1D/UNSATCHEM and SWAP were set up with multiple sets of different implemented functions (e.g. matric and osmotic stress for root water uptake), which results in a broad range of different model structures. The simulations were evaluated against observations of soil water and salinity content. The posterior distribution of the GLUE analysis gives behavioral parameter sets and reveals the intervals of parameter uncertainty. Across all model sets, most parameters accounting for the soil water balance show low uncertainty; only one or two out of five to six parameters in each model set display high uncertainty (e.g. the pore-size distribution index in SWAP and Hydrus1D/UNSATCHEM). The differences between the models and model setups reveal the structural uncertainty. The highest structural uncertainty is observed for the deep percolation fluxes of Hydrus1D/UNSATCHEM (~200 mm) and RZWQM (~500 mm), the latter more than twice as high. The model sets also show a high variation in uncertainty intervals for deep percolation, with an interquartile range (IQR) of
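
The GLUE procedure referenced above follows a simple recipe: sample parameter sets from their prior ranges, keep the "behavioral" sets whose informal likelihood measure exceeds a threshold, and report prediction quantiles from the behavioral ensemble. A self-contained sketch with an invented one-parameter simulator:

```python
import random

random.seed(42)
obs = 10.0                               # observed flux (mm), toy value

def simulate(theta):                     # stand-in simulator, linear in theta
    return 4.0 * theta

def likelihood(theta):                   # crude GLUE likelihood measure
    return max(0.0, 1.0 - abs(simulate(theta) - obs) / obs)

# Sample the prior range, keep behavioral sets, report prediction quantiles.
thetas = [random.uniform(0.0, 5.0) for _ in range(10000)]
behavioral = [t for t in thetas if likelihood(t) > 0.5]
preds = sorted(simulate(t) for t in behavioral)
lo, hi = preds[int(0.05 * len(preds))], preds[int(0.95 * len(preds))]
```

The width of the resulting (lo, hi) band is the parameter-uncertainty interval; running the same recipe per model structure and comparing the bands is what exposes the structural uncertainty discussed above.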

  16. An Integrated Hydrologic Bayesian Multi-Model Combination Framework: Confronting Input, parameter and model structural uncertainty in Hydrologic Prediction

    Energy Technology Data Exchange (ETDEWEB)

    Ajami, N K; Duan, Q; Sorooshian, S

    2006-05-05

    This paper presents a new technique, the Integrated Bayesian Uncertainty Estimator (IBUNE), to account explicitly for the major uncertainties of hydrologic rainfall-runoff predictions. The uncertainties from the input (forcing) data, mainly the precipitation observations, and from the model parameters are reduced through a Markov chain Monte Carlo (MCMC) scheme named the Shuffled Complex Evolution Metropolis (SCEM) algorithm, which has been extended to include a precipitation error model. Afterwards, the Bayesian Model Averaging (BMA) scheme is employed to further improve the prediction skill and uncertainty estimation using multiple model outputs. A series of case studies using three rainfall-runoff models to predict streamflow in the Leaf River basin, Mississippi, examines the necessity and usefulness of this technique. The results suggest that ignoring either input forcing error or model structural uncertainty will lead to unrealistic model simulations and associated uncertainty bounds that do not consistently capture and represent the real-world behavior of the watershed.

  17. Understanding quantitative structure-property relationships uncertainty in environmental fate modeling.

    Science.gov (United States)

    Sarfraz Iqbal, M; Golsteijn, Laura; Öberg, Tomas; Sahlin, Ullrika; Papa, Ester; Kovarich, Simona; Huijbregts, Mark A J

    2013-04-01

    In cases in which experimental data on chemical-specific input parameters are lacking, chemical regulations allow the use of alternatives to testing, such as in silico predictions based on quantitative structure-property relationships (QSPRs). Such predictions are often given as point estimates; however, little is known about the extent to which uncertainties associated with QSPR predictions contribute to uncertainty in fate assessments. In the present study, QSPR-induced uncertainty in overall persistence (POV) and long-range transport potential (LRTP) was studied by integrating QSPRs into probabilistic assessments of five polybrominated diphenyl ethers (PBDEs), using the multimedia fate model Simplebox. The uncertainty analysis considered QSPR predictions of the fate input parameters' melting point, water solubility, vapor pressure, organic carbon-water partition coefficient, hydroxyl radical degradation, biodegradation, and photolytic degradation. Uncertainty in POV and LRTP was dominated by the uncertainty in direct photolysis and the biodegradation half-life in water. However, the QSPRs developed specifically for PBDEs had a relatively low contribution to uncertainty. These findings suggest that the reliability of the ranking of PBDEs on the basis of POV and LRTP can be substantially improved by developing better QSPRs to estimate degradation properties. The present study demonstrates the use of uncertainty and sensitivity analyses in nontesting strategies and highlights the need for guidance when compounds fall outside the applicability domain of a QSPR.
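
Propagating a QSPR's prediction uncertainty into a fate metric is, at its core, a Monte Carlo exercise: sample the predicted property from its error distribution and push each sample through the fate model. A toy one-box sketch (the half-life statistics and the proportionality to persistence are invented, not Simplebox behavior):

```python
import random

random.seed(3)
# A QSPR typically returns a point estimate with a residual standard error;
# here, log10 of a degradation half-life (values invented for illustration).
mu_log_hl, sd_log_hl = 2.0, 0.4          # log10(hours)

# Toy fate model: overall persistence proportional to the half-life.
pov = sorted(10 ** random.gauss(mu_log_hl, sd_log_hl) for _ in range(20000))
p5, p95 = pov[1000], pov[19000]          # 90% uncertainty interval
```

Even this modest log-scale input uncertainty spreads the persistence estimate over roughly a factor of twenty, which is why the degradation QSPRs dominate the output uncertainty in the study above.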

  18. Model structural uncertainty quantification and hydrogeophysical data integration using airborne electromagnetic data (Invited)

    DEFF Research Database (Denmark)

    Minsley, Burke; Christensen, Nikolaj Kruse; Christensen, Steen

    Detailed estimates of physical property distributions, such as electrical resistivity, are common end products of geophysical surveys, but are often of limited use for the geologist, hydrologist, or resource manager who is tasked with making decisions based on these data. Here, we focus on the use of airborne electromagnetic (AEM) data to estimate large-scale model structural geometry, i.e. the spatial distribution of different lithological units based on assumed or estimated resistivity-lithology relationships, and the uncertainty in those structures given imperfect measurements. Geophysically derived... that illustrate the complete workflow from geophysical parameter uncertainty analysis to the impact of model structural uncertainty on hydrologic parameter estimates. We also discuss some of the computational challenges associated with application to large AEM surveys with many thousands of data locations.

  19. Uncertainty modeling in vibration, control and fuzzy analysis of structural systems

    CERN Document Server

    Halder, Achintya; Ayyub, Bilal M

    1997-01-01

    This book gives an overview of the current state of uncertainty modeling in vibration, control, and fuzzy analysis of structural and mechanical systems. It is a coherent compendium written by leading experts and offers the reader a sampling of exciting research areas in several fast-growing branches in this field. Uncertainty modeling and analysis are becoming an integral part of system definition and modeling in many fields. The book consists of ten chapters that report the work of researchers, scientists and engineers on theoretical developments and diversified applications in engineering sy

  20. Evaluating uncertainty in simulation models

    Energy Technology Data Exchange (ETDEWEB)

    McKay, M.D.; Beckman, R.J.; Morrison, J.D.; Upton, S.C.

    1998-12-01

    The authors discussed some directions for research and development of methods for assessing simulation variability, input uncertainty, and structural model uncertainty. Variance-based measures of importance for input and simulation variables arise naturally when using the quadratic loss function of the difference between the full model prediction y and the restricted prediction ỹ. They concluded that generic methods for assessing structural model uncertainty do not currently exist. However, methods to analyze structural uncertainty for particular classes of models, like discrete event simulation models, may be attainable.
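
The quadratic-loss construction above can be made concrete on a toy model: the variance-based importance of an input is the expected squared gap between the full prediction y and the restricted prediction ỹ (input fixed at its mean), normalized by the total output variance. The model and numbers below are invented for illustration:

```python
import random
import statistics as st

random.seed(7)

def full_model(x1, x2):
    return 3.0 * x1 + 1.0 * x2           # toy model, inputs ~ N(0,1)

def restricted(x1):
    return 3.0 * x1                      # x2 replaced by E[x2] = 0

xs = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(20000)]
# Quadratic loss E[(y - y~)^2], estimated by Monte Carlo.
loss = st.mean((full_model(a, b) - restricted(a)) ** 2 for a, b in xs)
importance = loss / st.pvariance([full_model(a, b) for a, b in xs])
# Analytically: loss = 1 and Var(y) = 10, so importance ≈ 0.1.
```

For this linear model the measure recovers exactly the share of output variance carried by x2, which is the behavior that makes the quadratic loss a natural importance metric.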

  1. On the Uncertainty of Identification of Civil Engineering Structures Using ARMA Models

    DEFF Research Database (Denmark)

    Andersen, Palle; Brincker, Rune; Kirkegaard, Poul Henning

    1995-01-01

    In this paper the uncertainties of modal parameters estimated using ARMA models for identification of civil engineering structures are investigated. How to initialize the predictor part of a Gauss-Newton optimization algorithm is put in focus. A backward-forecasting procedure for initialization of the predictor is proposed. This procedure is compared with a standard prediction error method optimization algorithm in a simulation study. It is found that the uncertainties can be reduced by a proper selection of the initial conditions for the predictor.

  2. On the Uncertainty of Identification of Civil Engineering Structures using ARMA Models

    DEFF Research Database (Denmark)

    Andersen, P.; Brincker, Rune; Kirkegaard, Poul Henning

    In this paper the uncertainties of modal parameters estimated using ARMA models for identification of civil engineering structures are investigated. How to initialize the predictor part of a Gauss-Newton optimization algorithm is put in focus. A backward-forecasting procedure for initialization...

  4. Cost-Benefit Assessment of Inspection and Repair Planning for Ship Structures Considering Corrosion Model Uncertainty

    Institute of Scientific and Technical Information of China (English)

    LI Dian-qing; TANG Wen-yong; ZHANG Sheng-kun

    2005-01-01

    Because the traditional inspection planning for ship structures leads to high costs and unnecessary inspections, risk-based inspection and repair planning should be investigated to achieve the most cost-effective inspection. This paper proposes a cost-benefit assessment model of risk-based inspection and repair planning for ship structures subjected to corrosion deterioration. The benefit-cost ratio is taken as the index for selecting the optimal inspection and repair strategy. The planning problem is formulated as an optimization problem in which the benefit-cost ratio over the expected lifetime is maximized subject to a constraint on the minimum acceptable reliability index. To account for the effect of corrosion model uncertainty on the cost-benefit assessment, two corrosion models, namely Paik's model and Guedes Soares' model, are adopted for the analysis. A numerical example is presented to illustrate the proposed method, and sensitivity studies are also provided. The results indicate that the proposed risk-based cost-benefit analysis can effectively integrate economy with reliability in inspection and repair planning. With the proposed method, a balance can be achieved between the risk cost and the total expected inspection and repair costs, which is very effective in selecting the optimal inspection and repair strategy. It is pointed out that the corrosion model uncertainty and parametric uncertainty have a significant impact on the cost-benefit assessment of inspection and repair planning.
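
The decision rule described above, maximize the benefit-cost ratio subject to a minimum reliability index, can be sketched in a few lines. All strategy names, costs, benefits and reliability indices below are invented placeholders:

```python
# Choose the strategy with the highest benefit-cost ratio among those that
# meet a minimum reliability index (all numbers illustrative).
strategies = [
    {"name": "inspect every 5 yr", "benefit": 1.8e6, "cost": 0.9e6, "beta": 3.4},
    {"name": "inspect every 3 yr", "benefit": 2.0e6, "cost": 1.4e6, "beta": 3.9},
    {"name": "no inspection",      "benefit": 1.0e6, "cost": 0.2e6, "beta": 2.1},
]
beta_min = 3.0                           # minimum acceptable reliability index
feasible = [s for s in strategies if s["beta"] >= beta_min]
best = max(feasible, key=lambda s: s["benefit"] / s["cost"])
```

In the paper's setting the benefits and costs are themselves expectations over the uncertain corrosion models, which is where the reported sensitivity to model uncertainty enters.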

  5. Robust H∞ control for aseismic structures with uncertainties in model parameters

    Institute of Scientific and Technical Information of China (English)

    Song Gang; Lin Jiahao; Zhao Yan; W.Paul Howson; Fred W Williams

    2007-01-01

    This paper presents a robust H∞ output feedback control approach for structural systems with uncertainties in model parameters, using available acceleration measurements, and proposes conditions for the existence of such a robust output feedback controller. The uncertainties in the structural stiffness, damping and mass parameters are assumed to be norm-bounded. The proposed control approach is formulated within the framework of linear matrix inequalities, for which existing convex optimization techniques, such as the LMI toolbox in MATLAB, can be used effectively and conveniently. To illustrate the effectiveness of the proposed robust H∞ strategy, a six-story building was subjected both to the 1940 El Centro earthquake record and to a suddenly applied Kanai-Tajimi filtered white noise random excitation. The results show that the proposed robust H∞ controller provides satisfactory performance with or without variation of the structural stiffness, damping and mass parameters.

  6. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    In structural reliability analysis at least three types of uncertainty must be considered, namely physical uncertainty, statistical uncertainty, and model uncertainty (see e.g. Thoft-Christensen & Baker [1]). The physical uncertainty is usually modelled by a number of basic variables by predictive...... density functions, Veneziano [2]. In general, model uncertainty is the uncertainty connected with mathematical modelling of the physical reality. When structural reliability analysis is related to the concept of a failure surface (or limit state surface) in the n-dimension basic variable space then model...... uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used....

  7. Climate model uncertainty vs. conceptual geological uncertainty in hydrological modeling

    Directory of Open Access Journals (Sweden)

    T. O. Sonnenborg

    2015-04-01

    Projections of climate change impact are associated with a cascade of uncertainties, including the CO2 emission scenario, climate model, downscaling and impact model. The relative importance of the individual uncertainty sources is expected to depend on several factors, including the quantity that is projected. In the present study the impacts of climate model uncertainty and geological model uncertainty on hydraulic head, stream flow, travel time and capture zones are evaluated. Six versions of a physically based and distributed hydrological model, each containing a unique interpretation of the geological structure of the model area, are forced by 11 climate model projections. Each projection of future climate is the result of a GCM-RCM model combination (from the ENSEMBLES project) forced by the same CO2 scenario (A1B). The changes from the reference period (1991–2010) to the future period (2081–2100) in projected hydrological variables are evaluated, and the effects of geological model and climate model uncertainties are quantified. The results show that uncertainty propagation is context dependent. While the geological conceptualization is the dominating uncertainty source for projections of travel time and capture zones, the uncertainty in the climate models is more important for groundwater hydraulic heads and stream flow.

  8. Finite element model updating for large span spatial steel structure considering uncertainties

    Institute of Scientific and Technical Information of China (English)

    TENG Jun; ZHU Yan-huang; ZHOU Feng; LI Hui; OU Jin-ping

    2010-01-01

    In order to establish a baseline finite element model for structural health monitoring, a new method of model updating was proposed after analyzing the uncertainties of the measured data and the error of the finite element model. In the new method, the finite element model was replaced by a multi-output support vector regression machine (MSVR). The interval variables of the measured frequencies were sampled by the Latin hypercube sampling method. The samples of frequencies were regarded as the inputs of the trained MSVR, and its outputs were the target values of the design parameters. The steel structure of the National Aquatic Center for the Beijing Olympic Games was introduced as a case study for finite element model updating. The results show that the proposed method avoids complicated computation, and that both the estimated values and the associated uncertainties of the structural parameters can be obtained. The static and dynamic characteristics of the updated finite element model are in good agreement with the measured data.
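The sampling step of the procedure above can be sketched as follows: Latin hypercube samples are drawn from the measured-frequency intervals and fed to a surrogate that maps frequencies to design parameters. The frequency bounds are invented, and a dummy linear map stands in for the paper's trained MSVR.

```python
import numpy as np

rng = np.random.default_rng(0)

def latin_hypercube(bounds, n):
    """Draw n Latin-hypercube samples inside per-variable [lo, hi] bounds."""
    d = len(bounds)
    # One independent permutation of the n strata per variable, plus jitter
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.random((n, d))) / n
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    return lo + u * (hi - lo)

# Measured natural-frequency intervals (Hz), e.g. mean +/- measurement uncertainty
freq_bounds = [(2.9, 3.1), (7.8, 8.2), (12.5, 13.5)]
samples = latin_hypercube(freq_bounds, 100)

# Each frequency sample would be fed to the trained surrogate to obtain
# design parameters; this linear map is a placeholder for the MSVR.
params = samples @ np.array([[0.1], [0.05], [0.02]])
print(samples.shape, params.shape)
```

Stratifying each frequency dimension ensures the whole measured interval is covered with far fewer samples than plain Monte Carlo would need.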

  9. Managing Information Uncertainty in Wave Height Modeling for the Offshore Structural Analysis through Random Set

    Directory of Open Access Journals (Sweden)

    Keqin Yan

    2017-01-01

    This chapter presents a reliability study for an offshore jacket structure with emphasis on the features of nonconventional modeling. Firstly, a random set model is formulated for modeling the random waves at an ocean site. Then, a jacket structure is investigated in a pushover analysis, based on the ultimate base shear strength, to identify the critical wave direction and the key structural elements. The selected probabilistic models are adopted for the important structural members, and the wave direction is specified as the weakest direction of the structure for a conservative safety analysis. The wave height model is processed in a P-box format when it is used in the numerical analysis. The models are applied to find the bounds of the failure probabilities for the jacket structure. The propagation of this wave model to the uncertainty in the results is investigated both in an interval analysis and in a Monte Carlo simulation. The results are compared in the context of information content and numerical accuracy. Further, the failure probability bounds are compared with those of the conventional probabilistic approach.
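The idea of propagating a P-box to failure probability bounds can be illustrated with a toy Monte Carlo sketch: when the wave-height distribution is only known up to bounding distributions, each bound yields one end of the failure-probability interval. The Gumbel parameters, load model and capacity below are assumptions for illustration only.

```python
import numpy as np

# Toy p-box propagation: wave height H has an imprecise mean, represented
# by optimistic and pessimistic bounding distributions; failure occurs
# when the load induced by H exceeds the structural capacity.
rng = np.random.default_rng(1)
n = 100_000
capacity = 30.0

def load(h):
    return 1.5 * h ** 1.6   # assumed (illustrative) load model

# Bounding assumptions on the location parameter of H (metres)
h_lo = rng.gumbel(loc=4.0, scale=1.0, size=n)   # optimistic bound
h_hi = rng.gumbel(loc=5.0, scale=1.0, size=n)   # pessimistic bound

pf_lower = np.mean(load(h_lo) > capacity)
pf_upper = np.mean(load(h_hi) > capacity)
print(f"failure probability in [{pf_lower:.4f}, {pf_upper:.4f}]")
```

The output interval, rather than a single number, is what distinguishes the random-set/P-box treatment from the conventional probabilistic approach mentioned in the abstract.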

  10. Effect of in-structure damping uncertainty on semi-active control performance: a modeling perspective

    Science.gov (United States)

    Puthanpurayil, Arun M.; Reynolds, Paul; Nyawako, Donald

    2013-04-01

    The mathematical model of a vibrating structure includes mass, damping and stiffness, of which mass and stiffness can be defined as functions of the system geometry, whereas damping is more of an observed phenomenon. Despite a large literature on the subject, the underlying physics is known only in a phenomenological, ad-hoc manner, making damping an overall mystery in the general dynamic analysis of structures. A major reason for this could be the fact that there is no single universally accepted model for damping. Common practice is to use the classical viscous damping model originated by Rayleigh, through his famous `Rayleigh dissipation function', with a preconceived damping ratio, irrespective of the purpose or type of analysis involved. This paper investigates the effect of this modelling uncertainty on the analytical prediction of the required control force in a semi-active control application for civil structures. Global classical Rayleigh damping models and global non-viscous damping models are used in the present study. Responses of a laboratory slab strip are simulated and compared with experimental responses. The comparisons emphasise that the choice of in-structure damping model has a significant effect on the computation of the required control force, and clearly indicate that mathematically sophisticated models have better prediction capability than the classical Rayleigh model.
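The classical Rayleigh model the abstract refers to takes C = aM + bK, which implies a modal damping ratio ζ(ω) = a/(2ω) + bω/2. The standard recipe, fitting a and b to preconceived ratios at two anchor frequencies, can be sketched as follows; the frequencies and 2% target ratio are illustrative.

```python
import numpy as np

# Fit Rayleigh coefficients a, b so that zeta(w) = a/(2w) + b*w/2 matches
# the target damping ratios at two anchor modes (values illustrative).
w1, w2 = 2 * np.pi * 2.0, 2 * np.pi * 8.0   # rad/s for 2 Hz and 8 Hz modes
z1, z2 = 0.02, 0.02                          # 2% damping at both anchors

A = 0.5 * np.array([[1 / w1, w1],
                    [1 / w2, w2]])
a, b = np.linalg.solve(A, [z1, z2])

def zeta(w):
    return a / (2 * w) + b * w / 2

# Between the anchors the fitted curve dips below the target ratio,
# one concrete way this modelling choice distorts predicted forces.
print(round(zeta(2 * np.pi * 4.0), 4))  # → 0.016
```

The 4 Hz mode ends up at 1.6% damping instead of the intended 2%, illustrating how a single preconceived ratio is never honoured uniformly across modes under the Rayleigh model.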

  11. Accounting for environmental variability, modeling errors, and parameter estimation uncertainties in structural identification

    Science.gov (United States)

    Behmanesh, Iman; Moaveni, Babak

    2016-07-01

    This paper presents a hierarchical Bayesian model updating framework to account for the effects of ambient temperature and excitation amplitude. The proposed approach is applied for model calibration, response prediction and damage identification of a footbridge under changing environmental/ambient conditions. The concrete Young's modulus of the footbridge deck is the updating structural parameter considered, with its mean and variance modeled as functions of temperature and excitation amplitude. The modal parameters identified over 27 months of continuous monitoring of the footbridge are used to calibrate the updating parameters. One objective of this study is to show that increasing the level of information in the updating process reduces the posterior variation of the updating structural parameter (the concrete Young's modulus). To this end, the calibration is performed at three information levels, using (1) the identified modal parameters; (2) the modal parameters and ambient temperatures; and (3) the modal parameters, ambient temperatures, and excitation amplitudes. The calibrated model is then validated by comparing the model-predicted natural frequencies with those identified from measured data after a deliberate change to the structural mass. It is shown that accounting for modeling error uncertainties is crucial for reliable response prediction, and that accounting for only the estimated variability of the updating structural parameter is not sufficient for accurate response predictions. Finally, the calibrated model is used for damage identification of the footbridge.

  12. Modelling Inflation Uncertainty with Structural Breaks Case of Turkey (1994–2013

    Directory of Open Access Journals (Sweden)

    Pınar Göktaş

    2014-01-01

    In recent years, the importance attached to the concept of volatility has increased, and it has become a phenomenon frequently encountered in every field, ranging from financial markets to macroeconomic indicators. In this study, inflation data obtained from the CPI index for the period 1994:01–2013:12 in Turkey were used to determine the best representation of inflation uncertainty. To this end, both symmetric and asymmetric GARCH-type models were employed. Since there are many factors that may lead to structural change within the economic course of Turkey, a structural break in the series was first investigated. By applying the Bai-Perron structural break test, break points in the mean and in the variance were detected in February 2002 and June 2001, respectively. With those break points included in the related equations, appropriate forecasting models were specified. Moreover, it was found that, while in the periods prior to the breaks in both variance and mean the inflation itself was the reason for inflation uncertainty, after the break dates the relationship became bidirectional. When the series was taken as a whole without considering the breaks, a bidirectional causality relationship was also detected.
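The way a structural break enters a GARCH-type variance equation can be sketched with a GARCH(1,1) recursion whose constant term shifts at the break date. The parameter values and break date below are invented for illustration and are not estimates from the Turkish CPI data.

```python
import numpy as np

# GARCH(1,1) conditional-variance recursion with a structural-break dummy
# in the constant term: h_t = omega(t) + alpha*eps_{t-1}^2 + beta*h_{t-1}.
rng = np.random.default_rng(3)
T, t_break = 240, 90                 # monthly sample, break at month 90
omega0, omega1 = 0.2, 0.05           # variance constant before/after break
alpha, beta = 0.1, 0.85              # ARCH and GARCH coefficients

h = np.zeros(T)                      # conditional variance
eps = np.zeros(T)                    # innovations
h[0] = omega0 / (1 - alpha - beta)   # unconditional variance, pre-break regime
eps[0] = np.sqrt(h[0]) * rng.standard_normal()
for t in range(1, T):
    omega = omega0 if t < t_break else omega1
    h[t] = omega + alpha * eps[t - 1] ** 2 + beta * h[t - 1]
    eps[t] = np.sqrt(h[t]) * rng.standard_normal()

# Compare average conditional variance before vs after the break
print(h[:t_break].mean(), h[t_break:].mean())
```

Ignoring the dummy and fitting a single constant over the whole sample would misattribute the regime shift to volatility persistence, which is exactly why the abstract's break tests matter for uncertainty estimates.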

  13. Significance of uncertainties derived from settling tank model structure and parameters on predicting WWTP performance - A global sensitivity analysis study

    DEFF Research Database (Denmark)

    Ramin, Elham; Sin, Gürkan; Mikkelsen, Peter Steen

    2011-01-01

    Uncertainty derived from one of the process models – such as one-dimensional secondary settling tank (SST) models – can impact the output of the other process models, e.g., biokinetic (ASM1), as well as the integrated wastewater treatment plant (WWTP) models. The model structure and parameter...... uncertainty of settler models can therefore propagate, and add to the uncertainties in prediction of any plant performance criteria. Here we present an assessment of the relative significance of secondary settling model performance in WWTP simulations. We perform a global sensitivity analysis (GSA) based....... The outcome of this study contributes to a better understanding of uncertainty in WWTPs, and explicitly demonstrates the significance of secondary settling processes that are crucial elements of model prediction under dry and wet-weather loading conditions....

  14. Testing hypotheses of the functioning of a tropical catchment: evaluating the role of model-structural and observational uncertainties

    Science.gov (United States)

    Westerberg, I.; Birkel, C.

    2012-04-01

    Knowledge about hydrological processes and the spatial and temporal distribution of water resources is the basis for water management activities such as hydropower, agriculture and flood protection. Conceptual hydrological models may be used to infer knowledge on catchment functioning, but are affected by uncertainties in the model representation of reality as well as in the observational data used to drive the model and to evaluate model performance. Meaningful hypothesis testing of the hydrological functioning of a catchment therefore requires such uncertainties to be carefully estimated and accounted for in model calibration and evaluation. We investigated the hydrological functioning of the relatively data-scarce tropical Sarapiqui catchment in Costa Rica, Central America, where water resources play a vital role in hydropower production and livelihoods. Hypotheses on catchment functioning, embodied in different model structures, were tested within an uncertainty estimation framework that specifically accounts for observational uncertainties. The uncertainty in discharge data was estimated from a rating-curve analysis, and precipitation measurement errors through scenarios relating the error to, for example, the elevation gradient. The suitability of the different model structures as hypotheses about the functioning of the catchment was evaluated in a posterior analysis of the simulations. The performance of each simulation relative to the observational uncertainties was analysed for the entire hydrograph as well as for different aspects of the hydrograph (e.g. peak flows, recession periods, and base flow). This analysis enabled the identification of periods of likely model-structural errors and periods of probable data errors. We conclude that accounting for observational uncertainties led to improved hypothesis testing, with less risk of rejecting an acceptable model structure because of uncertainties in the forcing and evaluation data.
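Evaluating a simulation "relative to the observational uncertainties" often amounts to checking how much of the hydrograph falls inside observation-derived bounds. A minimal sketch, with invented discharge bounds standing in for rating-curve uncertainty limits:

```python
import numpy as np

# Limits-of-acceptability style check: does the simulated discharge fall
# within the observational uncertainty bounds at each time step?
# (All numbers are invented for illustration.)
obs_lower = np.array([10.0, 14.0, 30.0, 22.0, 12.0])   # m3/s, lower bound
obs_upper = np.array([12.0, 18.0, 40.0, 28.0, 15.0])   # m3/s, upper bound
simulated = np.array([11.0, 13.5, 35.0, 27.0, 16.0])   # model output

inside = (simulated >= obs_lower) & (simulated <= obs_upper)
score = inside.mean()           # fraction of time steps within bounds
print(inside.tolist(), score)
```

Time steps where the simulation misses the bounds can then be inspected separately for peaks, recessions and base flow, which is how periods of likely model-structural error are separated from probable data error.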

  15. Impact of influent data frequency and model structure on the quality of WWTP model calibration and uncertainty.

    Science.gov (United States)

    Cierkens, Katrijn; Plano, Salvatore; Benedetti, Lorenzo; Weijers, Stefan; de Jonge, Jarno; Nopens, Ingmar

    2012-01-01

    Application of activated sludge models (ASMs) to full-scale wastewater treatment plants (WWTPs) is still hampered by the problem of model calibration of these over-parameterised models. This either requires expert knowledge or global methods that explore a large parameter space. However, a better balance in structure between the submodels (ASM, hydraulic, aeration, etc.) and improved quality of influent data result in much smaller calibration efforts. In this contribution, a methodology is proposed that links data frequency and model structure to calibration quality and output uncertainty. It is composed of defining the model structure, the input data, an automated calibration, confidence interval computation and uncertainty propagation to the model output. Apart from the last step, the methodology is applied to an existing WWTP using three models differing only in the aeration submodel. A sensitivity analysis was performed on all models, allowing the ranking of the most important parameters to select in the subsequent calibration step. The aeration submodel proved very important to get good NH(4) predictions. Finally, the impact of data frequency was explored. Lowering the frequency resulted in larger deviations of parameter estimates from their default values and larger confidence intervals. Autocorrelation due to high frequency calibration data has an opposite effect on the confidence intervals. The proposed methodology opens doors to facilitate and improve calibration efforts and to design measurement campaigns.

  16. Uncertainties in repository modeling

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, J.R.

    1996-12-31

    The distant future is very difficult to predict. Unfortunately, our regulators are being encouraged to extend the regulatory period from the standard 10,000 years to 1 million years. Such overconfidence is not justified, given the uncertainties in dating, calibration, and modeling.

  17. Measures of Model Uncertainty in the Assessment of Primary Stresses in Ship Structures

    DEFF Research Database (Denmark)

    Östergaard, Carsten; Dogliani, Mario; Guedes Soares, Carlos;

    1996-01-01

    The paper considers various models and methods commonly used for linear elastic stress analysis and assesses the uncertainty involved in their application to the analysis of the distribution of primary stresses in the hull of a containership example, through statistical evaluations of the results...

  18. Constructive epistemic modeling of groundwater flow with geological structure and boundary condition uncertainty under the Bayesian paradigm

    Science.gov (United States)

    Elshall, Ahmed S.; Tsai, Frank T.-C.

    2014-09-01

    Constructive epistemic modeling is the idea that our understanding of a natural system through a scientific model is a mental construct that continually develops through learning about and from the model. Using hierarchical Bayesian model averaging (BMA), this study shows that segregating different uncertain model components through a BMA tree of posterior model probability, model prediction, within-model variance, between-model variance and total model variance serves as a learning tool. First, the BMA tree of posterior model probabilities permits the comparative evaluation of the candidate propositions of each uncertain model component. Second, systematic model dissection is imperative for understanding the individual contribution of each uncertain model component to the model prediction and variance. Third, the hierarchical representation of the between-model variance facilitates the prioritization of the contribution of each uncertain model component to the overall model uncertainty. We illustrate these concepts using the groundwater flow model of a siliciclastic aquifer-fault system. We consider four uncertain model components. With respect to geological structure uncertainty, we consider three methods for reconstructing the hydrofacies architecture of the aquifer-fault system, and two formation dips. We consider two uncertain boundary conditions, each having two candidate propositions. Through combinatorial design, these four uncertain model components with their candidate propositions result in 24 base models. The study shows that hierarchical BMA analysis helps in advancing knowledge about the model, rather than forcing the model to fit a particular understanding or merely averaging several candidate models.
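The BMA bookkeeping the abstract describes rests on the standard decomposition of total predictive variance into a within-model and a between-model part: Var = Σₖ pₖσₖ² + Σₖ pₖ(μₖ − μ̄)². A minimal numerical sketch, with invented posterior probabilities and per-model moments:

```python
import numpy as np

# BMA variance decomposition: posterior model probabilities p_k,
# per-model predictive means mu_k and variances var_k (all illustrative).
p = np.array([0.5, 0.3, 0.2])          # posterior model probabilities
mu = np.array([10.0, 12.0, 9.0])       # per-model predictive means
var = np.array([1.0, 2.0, 1.5])        # per-model (within-model) variances

mean_bma = np.sum(p * mu)                         # BMA predictive mean
within = np.sum(p * var)                          # within-model variance
between = np.sum(p * (mu - mean_bma) ** 2)        # between-model variance
total = within + between                          # total predictive variance
print(mean_bma, within, between, total)
```

Comparing `within` and `between` is what lets the hierarchical analysis prioritize uncertain model components: a large between-model term flags the component whose candidate propositions disagree most.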

  19. Survival under uncertainty an introduction to probability models of social structure and evolution

    CERN Document Server

    Volchenkov, Dimitri

    2016-01-01

    This book introduces and studies a number of stochastic models of subsistence, communication, social evolution and political transition that will allow the reader to grasp the role of uncertainty as a fundamental property of our irreversible world. At the same time, it aims to bring about a more interdisciplinary and quantitative approach across very diverse fields of research in the humanities and social sciences. Through the examples treated in this work – including anthropology, demography, migration, geopolitics, management, and bioecology, among other things – evidence is gathered to show that volatile environments may change the rules of the evolutionary selection and dynamics of any social system, creating a situation of adaptive uncertainty, in particular, whenever the rate of change of the environment exceeds the rate of adaptation. Last but not least, it is hoped that this book will contribute to the understanding that inherent randomness can also be a great opportunity – for social systems an...

  20. DNDC Model Calibration, Validation and Quantification of Structural Uncertainty to Support Rice Methane Offset Protocols

    Science.gov (United States)

    Salas, W.; Ducey, M. J.; Li, C.

    2014-12-01

    Agriculture represents an important near-term option for GHG offsets. Currently, the most widely accepted low-cost approaches to quantifying N2O and CH4 emissions are based on emission factors. Given that N2O and CH4 emissions from agricultural practices exhibit high spatial and temporal variability, emission factors are not sensitive enough to estimate this variability in emissions at the farm level, even when the emission factors are regional. It is clear that if agricultural offset projects are going to include N2O and CH4 reductions, then process-based biogeochemical models are potentially important tools for quantifying emission reductions within offset protocols. The question remains how good a model's performance is with respect to emission reductions. As process-based models are integrated into protocols for agricultural GHG offsets, comprehensive and systematic validation is needed to statistically quantify the uncertainties in model-based estimates of GHG emission reductions, obtained by a standardized approach to parameterization and calibration that can be applied across a whole region. The DNDC model was validated against 88 datasets of rice methane emissions, collected at sites in California and the MidSouth. In addition to examining the magnitude of the measured versus modeled emissions, we analyzed model performance for estimating the changes in emissions associated with a change in management practices (e.g. dry versus wet seeded rice, different fertilizer rates, etc.). We analyzed 100 pairs of modeled and measured emission reductions. DNDC model performance and uncertainty were quantified using a suite of statistical measures. First, we examined how well the modeled emission differences match the field-measured differences, on a case-by-case basis and also on average, using a combination of Monte Carlo approaches and equivalence testing.
Although modeled emissions for individual fields show a slight bias, emissions reductions for baseline:treatment pairs fall close

  1. Valuing structure, model uncertainty and model averaging in vector autoregressive processes

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2004-01-01

    textabstractEconomic policy decisions are often informed by empirical analysis based on accurate econometric modeling. However, a decision-maker is usually only interested in good estimates of outcomes, while an analyst must also be interested in estimating the model. Accurate inference on structura

  2. Wastewater treatment modelling: dealing with uncertainties

    DEFF Research Database (Denmark)

    Belia, E.; Amerlinck, Y.; Benedetti, L.;

    2009-01-01

    This paper serves as a problem statement of the issues surrounding uncertainty in wastewater treatment modelling. The paper proposes a structure for identifying the sources of uncertainty introduced during each step of an engineering project concerned with model-based design or optimisation...... of a wastewater treatment system. It briefly references the methods currently used to evaluate prediction accuracy and uncertainty and discusses the relevance of uncertainty evaluations in model applications. The paper aims to raise awareness and initiate a comprehensive discussion among professionals on model...

  3. Uncertainty quantification for environmental models

    Science.gov (United States)

    Hill, Mary C.; Lu, Dan; Kavetski, Dmitri; Clark, Martyn P.; Ye, Ming

    2012-01-01

    Environmental models are used to evaluate the fate of fertilizers in agricultural settings (including soil denitrification), the degradation of hydrocarbons at spill sites, and water supply for people and ecosystems in small to large basins and cities—to mention but a few applications of these models. They also play a role in understanding and diagnosing potential environmental impacts of global climate change. The models are typically mildly to extremely nonlinear. The persistent demand for enhanced dynamics and resolution to improve model realism [17] means that lengthy individual model execution times will remain common, notwithstanding continued enhancements in computer power. In addition, high-dimensional parameter spaces are often defined, which increases the number of model runs required to quantify uncertainty [2]. Some environmental modeling projects have access to extensive funding and computational resources; many do not. The many recent studies of uncertainty quantification in environmental model predictions have focused on uncertainties related to data error and sparsity of data, expert judgment expressed mathematically through prior information, poorly known parameter values, and model structure (see, for example, [1,7,9,10,13,18]). Approaches for quantifying uncertainty include frequentist (potentially with prior information [7,9]), Bayesian [13,18,19], and likelihood-based. 
A few of the numerous methods, including some sensitivity and inverse methods with consequences for understanding and quantifying uncertainty, are as follows: Bayesian hierarchical modeling and Bayesian model averaging; single-objective optimization with error-based weighting [7] and multi-objective optimization [3]; methods based on local derivatives [2,7,10]; screening methods like OAT (one at a time) and the method of Morris [14]; FAST (Fourier amplitude sensitivity testing) [14]; the Sobol' method [14]; randomized maximum likelihood [10]; Markov chain Monte Carlo (MCMC) [10
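Of the screening methods listed above, the method of Morris is the simplest to sketch: repeated one-at-a-time perturbations yield elementary effects whose mean absolute value ranks parameter influence. The toy model and parameter ranges below are assumptions for illustration; a production analysis would use a dedicated library rather than this bare-bones version.

```python
import numpy as np

# Minimal one-at-a-time elementary-effects screening in the spirit of the
# Morris method (toy model; unit hypercube parameter space assumed).
rng = np.random.default_rng(2)

def model(x):
    return x[0] ** 2 + 0.1 * x[1] + 5.0 * x[0] * x[2]

d, r, delta = 3, 20, 0.1       # dimensions, repetitions, step size
effects = np.zeros((r, d))
for i in range(r):
    x = rng.random(d)          # random base point in [0, 1)^d
    base = model(x)
    for j in range(d):
        xp = x.copy()
        xp[j] += delta         # perturb one parameter at a time
        effects[i, j] = (model(xp) - base) / delta

mu_star = np.abs(effects).mean(axis=0)   # mean absolute elementary effect
print(np.argsort(mu_star)[::-1])         # parameters ranked by influence
```

In this toy model the second parameter enters only through the small 0.1 coefficient, so the screening correctly flags it as the least influential, which is exactly the pre-calibration ranking step several of the records above rely on.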

  4. Chemical model reduction under uncertainty

    KAUST Repository

    Najm, Habib

    2016-01-05

    We outline a strategy for chemical kinetic model reduction under uncertainty. We present highlights of our existing deterministic model reduction strategy, and describe the extension of the formulation to include parametric uncertainty in the detailed mechanism. We discuss the utility of this construction, as applied to hydrocarbon fuel-air kinetics, and the associated use of uncertainty-aware measures of error between predictions from detailed and simplified models.

  5. Uncertainty in Air Quality Modeling.

    Science.gov (United States)

    Fox, Douglas G.

    1984-01-01

    Under the direction of the AMS Steering Committee for the EPA Cooperative Agreement on Air Quality Modeling, a small group of scientists convened to consider the question of uncertainty in air quality modeling. Because the group was particularly concerned with the regulatory use of models, its discussion focused on modeling tall stack, point source emissions.The group agreed that air quality model results should be viewed as containing both reducible error and inherent uncertainty. Reducible error results from improper or inadequate meteorological and air quality data inputs, and from inadequacies in the models. Inherent uncertainty results from the basic stochastic nature of the turbulent atmospheric motions that are responsible for transport and diffusion of released materials. Modelers should acknowledge that all their predictions to date contain some associated uncertainty and strive also to quantify uncertainty.How can the uncertainty be quantified? There was no consensus from the group as to precisely how uncertainty should be calculated. One subgroup, which addressed statistical procedures, suggested that uncertainty information could be obtained from comparisons of observations and predictions. Following recommendations from a previous AMS workshop on performance evaluation (Fox. 1981), the subgroup suggested construction of probability distribution functions from the differences between observations and predictions. Further, they recommended that relatively new computer-intensive statistical procedures be considered to improve the quality of uncertainty estimates for the extreme value statistics of interest in regulatory applications.A second subgroup, which addressed the basic nature of uncertainty in a stochastic system, also recommended that uncertainty be quantified by consideration of the differences between observations and predictions. 
They suggested that the average of the difference squared was appropriate to isolate the inherent uncertainty that

  6. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis numerical results are presented, comparisons...

  7. Uncertainties in Nuclear Proliferation Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chul Min; Yim, Man-Sung; Park, Hyeon Seok [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2015-05-15

    There have been various efforts in the research community to understand the determinants of nuclear proliferation and to develop quantitative tools to predict nuclear proliferation events. Such systematic approaches have shown the potential to provide warnings that help the international community prevent nuclear proliferation activities. However, there is still considerable debate about the robustness of the estimated effects of the determinants and of the projection results. Some studies have shown that several factors can cause uncertainties in previous quantitative nuclear proliferation modeling work. This paper analyzes the uncertainties in past approaches and suggests future work in terms of proliferation history, analysis methods, and variable selection. The research community still lacks knowledge of the sources of uncertainty in current models. Fundamental problems in modeling will remain even if more advanced modeling methods are developed. Before developing fancier models based on hypotheses about time-dependent proliferation determinants, graph theory, etc., it is important to analyze the uncertainty of current models in order to solve the fundamental problems of nuclear proliferation modeling. The uncertainty arising from different codings of proliferation history is small. The serious problems come from limited analysis methods and correlation among the variables. Problems in regression analysis and survival analysis cause large uncertainties even when the same dataset is used, which decreases the robustness of the results. Inaccurate variables for nuclear proliferation also increase the uncertainty. To overcome these problems, further quantitative research should focus on analyzing the knowledge suggested in qualitative nuclear proliferation studies.

  8. Uncertainty analysis of the 2009 L'Aquila rupture model using one- and three-dimensional crustal structure

    Science.gov (United States)

    Razafindrakoto, H. N. T.; Imperatori, W.; Mai, P. M.

    2014-12-01

    Finite-fault rupture models for the 2009 L'Aquila earthquake reveal considerable variability among the published solutions of kinematic source parameters. One potential source of this variability arises from the non-unique choice of crustal structure. This earthquake occurred in an area of complex geology, including a small sedimentary basin and pronounced topography. Therefore, the use of a one-dimensional crustal structure may be insufficient to accurately infer the earthquake rupture process. In this study, we examine the effects of crustal structure variability on the inversion for the rupture process of the 2009 L'Aquila earthquake. In particular, we assess the rupture model uncertainty related to the one- and three-dimensional Earth models that are used to compute Green's functions. In doing so, we evaluate the role of using more realistic crustal structure in resolving the rupture model parameters. We apply Bayesian inference to quantitatively assess the characteristics of the space-time rupture evolution (peak slip-rate, rupture time, and rise time) in terms of posterior density functions. We find that the use of a realistic 3D crustal structure, including topography and crustal heterogeneity, improves the earthquake source imaging. We also investigate the sensitivity of the rupture parameters with respect to variations in crustal structure.

  9. Shall we upgrade one-dimensional secondary settler models used in WWTP simulators? – An assessment of model structure uncertainty and its propagation

    DEFF Research Database (Denmark)

    Plósz, Benedek; De Clercq, Jeriffa; Nopens, Ingmar;

    2011-01-01

    ...results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer/winter sequence. The model prediction in terms of nitrogen removal, solids inventory in the bioreactors and solids retention time as a function...

  10. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    Hydrological change modelling methodologies generally use climate model outputs to force hydrological simulations under changed conditions. There are nested sources of uncertainty throughout this methodology, including the choice of climate model and subsequent bias correction methods. This Ph.D. study evaluates the uncertainty of the impact of climate change in hydrological simulations given multiple climate models and bias correction methods of varying complexity. Three distribution-based scaling (DBS) methods were developed and benchmarked against a more simplistic and commonly used delta change (DC) approach. These climate model projections were then used to force hydrological simulations under climate change for the island of Sjælland in Denmark to analyse the contribution of different climate models and bias correction methods to the overall uncertainty in the hydrological change modelling...

  11. Numerical modeling of economic uncertainty

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans

    2007-01-01

    Representation and modeling of economic uncertainty is addressed by different modeling methods, namely stochastic variables and probabilities, interval analysis, and fuzzy numbers, in particular triple estimates. Focusing on discounted cash flow analysis, numerical results are presented, comparisons are made between alternative modeling methods, and characteristics of the methods are discussed.

  12. Addressing Replication and Model Uncertainty

    DEFF Research Database (Denmark)

    Ebersberger, Bernd; Galia, Fabrice; Laursen, Keld

    Many fields of strategic management are subject to an important degree of model uncertainty. This is because the true model, and therefore the selection of appropriate explanatory variables, is essentially unknown. Drawing on the literature on the determinants of innovation, and by analyzing inno...

  14. Model uncertainty in growth empirics

    NARCIS (Netherlands)

    Prüfer, P.

    2008-01-01

    This thesis applies so-called Bayesian model averaging (BMA) to three different economic questions substantially exposed to model uncertainty. Chapter 2 addresses a major issue of modern development economics: the analysis of the determinants of pro-poor growth (PPG), which seeks to combine high gro

  15. Uncertainties in SDSS galaxy parameter determination: 3D photometrical modelling of test galaxies and restoration of their structural parameters

    CERN Document Server

    Tempel, Elmo; Kipper, Rain; Tenjes, Peeter

    2012-01-01

    Is it realistic to recover the 3D structure of galaxies from their images? To answer this question, we generate a sample of idealised model galaxies consisting of a disc-like component and a spheroidal component (bulge) with varying luminosities, inclination angles and structural parameters, and component density following the Einasto distribution. We simulate these galaxies as if observed in the SDSS project through ugriz filters, thus gaining a set of images of galaxies with known intrinsic properties. We remodel the galaxies with a 3D galaxy modelling procedure and compare the restored parameters to the initial ones in order to determine the uncertainties of the models. Down to the r-band limiting magnitude 18, errors of the restored integral luminosities and colour indices remain within 0.05 mag and errors of the luminosities of individual components within 0.2 mag. Accuracy of the restored bulge-to-disc ratios (B/D) is within 40% in most cases, and becomes even worse for galaxies with low B/D due to diff...

  16. Parametric uncertainty modeling for robust control

    DEFF Research Database (Denmark)

    Rasmussen, K.H.; Jørgensen, Sten Bay

    1999-01-01

    The dynamic behaviour of a non-linear process can often be approximated with a time-varying linear model. In the presented methodology the dynamics are modelled non-conservatively as parametric uncertainty in linear time-invariant models. The obtained uncertainty description makes it possible to perform robustness analysis on a control system using the structured singular value. The idea behind the proposed method is to fit a rational function to the parameter variation. The parameter variation can then be expressed as a linear fractional transformation (LFT). It is discussed how the proposed method can be utilized in identification of a nominal model with uncertainty description. The method is demonstrated on a binary distillation column operating in the LV configuration. The dynamics of the column is approximated by a second order linear model, wherein the parameters vary as the operating...

  17. Numerical Modeling of Inverse Problems under Uncertainty for Damage Detection in Aircraft Structures

    Science.gov (United States)

    2013-08-01

    [The record text consists of reference-list fragments, e.g. Banks, H. T.; Inman, D. J.; Leo, D. J. & Wang, Y. (1996): An experimentally validated damage detection theory in smart structures, Journal of Sound and..., and a Portuguese-language entry on multi-objective stochastic optimization of an aeronautical structure subject to aerodynamic loads with aeroelastic coupling.]

  18. Structural Uncertainty in Model-Simulated Trends of Global Gross Primary Production

    Directory of Open Access Journals (Sweden)

    Zaichun Zhu

    2013-03-01

    Projected changes in the frequency and severity of droughts as a result of the increase in greenhouse gases have a significant impact on the role of vegetation in regulating the global carbon cycle. The drought effect on vegetation Gross Primary Production (GPP) is usually modeled as a function of Vapor Pressure Deficit (VPD) and/or soil moisture. Climate projections suggest a strong likelihood of an increasing trend in VPD, while regional changes in precipitation are less certain. This difference in projections between VPD and precipitation can cause considerable discrepancies in the predictions of vegetation behavior depending on how ecosystem models represent the drought effect. In this study, we scrutinized the model responses to drought using the 30-year record of the Global Inventory Modeling and Mapping Studies (GIMMS) 3g Normalized Difference Vegetation Index (NDVI) dataset. A diagnostic ecosystem model, the Terrestrial Observation and Prediction System (TOPS), was used to estimate global GPP from 1982 to 2009 under nine different experimental simulations. The control run of global GPP increased until 2000 but stayed constant thereafter. Among the simulations with a single climate constraint (temperature, VPD, rainfall or solar radiation), only the VPD-driven simulation showed a decrease in the 2000s, while the other scenarios simulated an increase in GPP. The diverging responses in the 2000s can be attributed to differences in how models represent the impact of water stress on vegetation, i.e., using VPD and/or precipitation. The spatial map of the trend in GPP simulated using GIMMS 3g data is more consistent with the GPP driven by soil moisture than with the GPP driven by VPD, confirming the need for a soil moisture constraint in modeling global GPP.

  19. Statistical assessment of predictive modeling uncertainty

    Science.gov (United States)

    Barzaghi, Riccardo; Marotta, Anna Maria

    2017-04-01

    When the results of geophysical models are compared with data, the uncertainties of the model are typically disregarded. We propose a method for defining the uncertainty of a geophysical model based on a numerical procedure that estimates the empirical auto and cross-covariances of model-estimated quantities. These empirical values are then fitted by proper covariance functions and used to compute the covariance matrix associated with the model predictions. The method is tested using a geophysical finite element model in the Mediterranean region. Using a novel χ2 analysis in which both data and model uncertainties are taken into account, the model's estimated tectonic strain pattern due to the Africa-Eurasia convergence in the area that extends from the Calabrian Arc to the Alpine domain is compared with that estimated from GPS velocities while taking into account the model uncertainty through its covariance structure and the covariance of the GPS estimates. The results indicate that including the estimated model covariance in the testing procedure leads to lower observed χ2 values that have better statistical significance and might help a sharper identification of the best-fitting geophysical models.
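
A χ² comparison of the kind described, in which the model covariance is added to the data covariance, can be sketched as follows. This is a minimal illustration with invented numbers; the function name and toy inputs are not taken from the record.

```python
import numpy as np

def chi2_with_model_uncertainty(model_pred, observed, cov_model, cov_data):
    """Chi-square statistic in which the model covariance is added to the
    data covariance (assumes independent model and data errors)."""
    r = observed - model_pred
    cov_total = cov_model + cov_data
    return float(r @ np.linalg.solve(cov_total, r))

# invented numbers: three predicted vs. observed quantities
pred = np.array([1.0, 2.0, 3.0])
obs = np.array([1.1, 1.9, 3.2])
cov_m = np.diag([0.01, 0.01, 0.01])   # model covariance (e.g. from fitted covariance functions)
cov_d = np.diag([0.04, 0.04, 0.04])   # data (e.g. GPS) covariance
stat = chi2_with_model_uncertainty(pred, obs, cov_m, cov_d)
```

With these toy diagonal covariances the statistic is 1.2, versus 1.5 if the model covariance were ignored, mirroring the lower observed χ² values the abstract reports.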

  20. Structural Uncertainty in Antarctic sea ice simulations

    Science.gov (United States)

    Schneider, D. P.

    2016-12-01

    The inability of the vast majority of historical climate model simulations to reproduce the observed increase in Antarctic sea ice has motivated many studies about the quality of the observational record, the role of natural variability versus forced changes, and the possibility of missing or inadequate forcings in the models (such as freshwater discharge from thinning ice shelves or an inadequate magnitude of stratospheric ozone depletion). In this presentation I will highlight another source of uncertainty that has received comparatively little attention: structural uncertainty, that is, the systematic uncertainty in simulated sea ice trends that arises from model physics and mean-state biases. Using two large ensembles of experiments from the Community Earth System Model (CESM), I will show that the model is predisposed towards producing negative Antarctic sea ice trends during 1979-present, and that this outcome is not simply because the model's decadal variability is out-of-sync with that in nature. In the "Tropical Pacific Pacemaker" ensemble, in which observed tropical Pacific SST anomalies are prescribed, the model produces very realistic atmospheric circulation trends over the Southern Ocean, yet the sea ice trend is negative in every ensemble member. However, if the ensemble-mean trend (commonly interpreted as the forced response) is removed, some ensemble members show a sea ice increase that is very similar to the observed one. While this result does confirm the important role of natural variability, it also suggests a strong bias in the forced response. I will discuss the reasons for this systematic bias and explore possible remedies. This is an important problem to solve because projections of 21st-century changes in the Antarctic climate system (including ice sheet surface mass balance changes and related changes in the sea level budget) depend strongly on the mean state of and changes in the Antarctic sea ice cover. This problem is not unique to

  1. Methodology for characterizing modeling and discretization uncertainties in computational simulation

    Energy Technology Data Exchange (ETDEWEB)

    ALVIN,KENNETH F.; OBERKAMPF,WILLIAM L.; RUTHERFORD,BRIAN M.; DIEGERT,KATHLEEN V.

    2000-03-01

    This research effort focuses on methodology for quantifying the effects of model uncertainty and discretization error on computational modeling and simulation. The work is directed towards developing methodologies which treat model form assumptions within an overall framework for uncertainty quantification, for the purpose of developing estimates of total prediction uncertainty. The present effort consists of work in three areas: framework development for sources of uncertainty and error in the modeling and simulation process which impact model structure; model uncertainty assessment and propagation through Bayesian inference methods; and discretization error estimation within the context of non-deterministic analysis.

  2. Large-scale determinants of diversity across Spanish forest habitats: accounting for model uncertainty in compositional and structural indicators

    Energy Technology Data Exchange (ETDEWEB)

    Martin-Quller, E.; Torras, O.; Alberdi, I.; Solana, J.; Saura, S.

    2011-07-01

    An integral understanding of forest biodiversity requires the exploration of the many aspects it comprises and of the numerous potential determinants of their distribution. The landscape ecological approach provides a necessary complement to conventional local studies that focus on individual plots or forest ownerships. However, most previous landscape studies used equally-sized cells as units of analysis to identify the factors affecting forest biodiversity distribution. Stratification of the analysis by habitats with a relatively homogeneous forest composition might be more adequate to capture the underlying patterns associated with the formation and development of a particular ensemble of interacting forest species. Here we used a landscape perspective in order to improve our understanding of the influence of large-scale explanatory factors on forest biodiversity indicators in Spanish habitats, covering a wide latitudinal and altitudinal range. We considered six forest biodiversity indicators estimated from more than 30,000 field plots in the Spanish national forest inventory, distributed in 213 forest habitats over 16 Spanish provinces. We explored biodiversity response to various environmental (climate and topography) and landscape configuration (fragmentation and shape complexity) variables through multiple linear regression models (built and assessed through the Akaike Information Criterion). In particular, we took into account the inherent model uncertainty when dealing with a complex and large set of variables, and considered different plausible models and their probability of being the best candidate for the observed data. Our results showed that compositional indicators (species richness and diversity) were mostly explained by environmental factors. Models for structural indicators (standing deadwood and stand complexity) had the worst fits and selection uncertainties, but did show significant associations with some configuration metrics. In general

  3. Chemical model reduction under uncertainty

    KAUST Repository

    Malpica Galassi, Riccardo

    2017-03-06

    A general strategy for analysis and reduction of uncertain chemical kinetic models is presented, and its utility is illustrated in the context of ignition of hydrocarbon fuel–air mixtures. The strategy is based on a deterministic analysis and reduction method which employs computational singular perturbation analysis to generate simplified kinetic mechanisms, starting from a detailed reference mechanism. We model uncertain quantities in the reference mechanism, namely the Arrhenius rate parameters, as random variables with prescribed uncertainty factors. We propagate this uncertainty to obtain the probability of inclusion of each reaction in the simplified mechanism. We propose probabilistic error measures to compare predictions from the uncertain reference and simplified models, based on the comparison of the uncertain dynamics of the state variables, where the mixture entropy is chosen as progress variable. We employ the construction for the simplification of an uncertain mechanism in an n-butane–air mixture homogeneous ignition case, where a 176-species, 1111-reactions detailed kinetic model for the oxidation of n-butane is used with uncertainty factors assigned to each Arrhenius rate pre-exponential coefficient. This illustration is employed to highlight the utility of the construction, and the performance of a family of simplified models produced depending on chosen thresholds on importance and marginal probabilities of the reactions.
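
The notion of a per-reaction inclusion probability can be illustrated with a toy Monte Carlo sketch. The log-uniform sampling, the simple rate threshold, and all numbers below are assumptions for illustration only, not the paper's CSP-based importance criterion.

```python
import random

def inclusion_probability(nominal_rates, uncertainty_factors, threshold,
                          n_samples=5000, seed=1):
    """Probability that each reaction's sampled rate exceeds an importance
    threshold, with rates perturbed by log-uniform uncertainty factors
    (a toy stand-in for a probabilistic reaction-importance criterion)."""
    rng = random.Random(seed)
    counts = [0] * len(nominal_rates)
    for _ in range(n_samples):
        for i, (k, f) in enumerate(zip(nominal_rates, uncertainty_factors)):
            sample = k * f ** rng.uniform(-1.0, 1.0)  # log-uniform on [k/f, k*f]
            if sample >= threshold:
                counts[i] += 1
    return [c / n_samples for c in counts]

# three hypothetical reactions, each with an uncertainty factor of 2
probs = inclusion_probability([10.0, 1.0, 0.1], [2.0, 2.0, 2.0], threshold=1.0)
```

A fast reaction is always retained, a slow one never, and a borderline one is retained with probability near one half, which is the kind of marginal probability the thresholds in the abstract act on.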

  4. Handling Unquantifiable Uncertainties in Landslide Modelling

    Science.gov (United States)

    Almeida, S.; Holcombe, E.; Pianosi, F.; Wagener, T.

    2015-12-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, there is no agreement on what probability distribution should be used to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform adequately under a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use and combine several GSA methods including the Method of Morris, Regional Sensitivity Analysis and CART, as well as advanced visualization tools. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in a scenario discovery framework.
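
The Method of Morris mentioned above can be sketched with a small hand-rolled screening routine. The trajectory scheme below is a minimal version, and the linear toy "slope stability" function stands in for CHASM; both are illustrative assumptions.

```python
import numpy as np

def morris_elementary_effects(f, n_params, n_traj=20, delta=0.25, seed=0):
    """Method of Morris screening on the unit hypercube: returns mu*,
    the mean absolute elementary effect, for each input parameter."""
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(n_traj):
        x = rng.uniform(0, 1 - delta, size=n_params)  # trajectory start point
        fx = f(x)
        for i in rng.permutation(n_params):           # perturb one input at a time
            x_new = x.copy()
            x_new[i] += delta
            fx_new = f(x_new)
            effects[i].append((fx_new - fx) / delta)
            x, fx = x_new, fx_new
    return np.array([np.mean(np.abs(e)) for e in effects])

# toy surrogate: output dominated by the first of three inputs
mu_star = morris_elementary_effects(lambda x: 5.0 * x[0] + 0.5 * x[1] + 0.01 * x[2], 3)
```

For this linear toy function mu* recovers the coefficients exactly, ranking the first input as most influential, which is the screening information GSA methods provide before the more expensive analyses.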

  5. Uncertainty Quantification in Climate Modeling

    Science.gov (United States)

    Sargsyan, K.; Safta, C.; Berry, R.; Debusschere, B.; Najm, H.

    2011-12-01

    We address challenges that sensitivity analysis and uncertainty quantification methods face when dealing with complex computational models. In particular, climate models are computationally expensive and typically depend on a large number of input parameters. We consider the Community Land Model (CLM), which consists of a nested computational grid hierarchy designed to represent the spatial heterogeneity of the land surface. Each computational cell can be composed of multiple land types, and each land type can incorporate one or more sub-models describing the spatial and depth variability. Even for simulations at a regional scale, the computational cost of a single run is quite high and the number of parameters that control the model behavior is very large. Therefore, the parameter sensitivity analysis and uncertainty propagation face significant difficulties for climate models. This work employs several algorithmic avenues to address some of the challenges encountered by classical uncertainty quantification methodologies when dealing with expensive computational models, specifically focusing on the CLM as a primary application. First of all, since the available climate model predictions are extremely sparse due to the high computational cost of model runs, we adopt a Bayesian framework that effectively incorporates this lack-of-knowledge as a source of uncertainty, and produces robust predictions with quantified uncertainty even if the model runs are extremely sparse. In particular, we infer Polynomial Chaos spectral expansions that effectively encode the uncertain input-output relationship and allow efficient propagation of all sources of input uncertainties to outputs of interest. Secondly, the predictability analysis of climate models strongly suffers from the curse of dimensionality, i.e. the large number of input parameters. While single-parameter perturbation studies can be efficiently performed in a parallel fashion, the multivariate uncertainty analysis
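
A one-dimensional Polynomial Chaos fit of the kind described can be sketched with probabilists' Hermite polynomials. This minimal least-squares version with a toy quadratic target is an illustrative assumption, not the CLM workflow.

```python
import math
import numpy as np

def pce_fit(f, order=3, n_train=200, seed=0):
    """Least-squares polynomial chaos expansion with probabilists' Hermite
    polynomials He_k for a scalar function of one standard-normal input.
    Returns coefficients, and the mean and variance implied by them."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_train)
    # design matrix: columns are He_0..He_order evaluated at the samples
    A = np.column_stack([np.polynomial.hermite_e.hermeval(x, np.eye(order + 1)[k])
                         for k in range(order + 1)])
    coef, *_ = np.linalg.lstsq(A, f(x), rcond=None)
    mean = float(coef[0])                              # E[He_k] = 0 for k >= 1
    var = sum(coef[k] ** 2 * math.factorial(k)         # E[He_k^2] = k!
              for k in range(1, order + 1))
    return coef, mean, var

# toy target f(x) = x^2, for which E[f] = 1 and Var[f] = 2 exactly
coef, mean, var = pce_fit(lambda x: x * x)
```

Because x² lies in the span of He_0 and He_2, the fit reproduces the exact mean and variance, showing how a sparse set of model runs can propagate input uncertainty to output moments.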

  6. Uncertainty in spatially explicit animal dispersal models

    Science.gov (United States)

    Mooij, Wolf M.; DeAngelis, Donald L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three levels of complexity: (1) an event-based binomial model that considers only the occurrence of mortality or arrival, (2) a temporally explicit exponential model that employs mortality and arrival rates, and (3) a spatially explicit grid-walk model that simulates the movement of animals through an artificial landscape. Each model was fitted to the same set of field data. A first objective of the paper is to illustrate how the maximum-likelihood method can be used in all three cases to estimate the means and confidence limits for the relevant model parameters, given a particular set of data on dispersal survival. Using this framework we show that the structure of the uncertainty for all three models is strikingly similar. In fact, the results of our unified approach imply that spatially explicit dispersal models, which take advantage of information on landscape details, suffer less from uncertainty than do simpler models. Moreover, we show that the proposed strategy of model development safeguards one from error propagation in these more complex models. Finally, our approach shows that all models related to animal dispersal, ranging from simple to complex, can be related in a hierarchical fashion, so that the various approaches to modeling such dispersal can be viewed from a unified perspective.
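
For the event-based binomial model (level 1 above), the maximum-likelihood survival estimate and an approximate confidence interval can be sketched as follows. A Wald interval is used here for brevity, whereas the paper derives limits from the likelihood itself, and the counts are invented.

```python
import math

def binomial_survival_mle(n_survived, n_total):
    """ML estimate of dispersal survival probability under an event-based
    binomial model, with an approximate 95% Wald confidence interval."""
    p_hat = n_survived / n_total                      # MLE of survival probability
    se = math.sqrt(p_hat * (1 - p_hat) / n_total)     # asymptotic standard error
    return p_hat, (max(0.0, p_hat - 1.96 * se),
                   min(1.0, p_hat + 1.96 * se))

# invented field data: 36 of 60 tracked dispersers arrived alive
p, (lo, hi) = binomial_survival_mle(36, 60)
```

The exponential and grid-walk models in the abstract would replace the binomial likelihood with rate-based and simulation-based likelihoods respectively, maximized in the same way.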

  7. Optical Model and Cross Section Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Herman,M.W.; Pigni, M.T.; Dietrich, F.S.; Oblozinsky, P.

    2009-10-05

    Distinct minima and maxima in the neutron total cross section uncertainties were observed in model calculations using a spherical optical potential. We found this oscillating structure to be a general feature of quantum mechanical wave scattering. Specifically, we analyzed neutron interaction with 56Fe from 1 keV up to 65 MeV, and investigated the physical origin of the minima. We discuss their potential importance for practical applications as well as the implications for the uncertainties in total and absorption cross sections.

  8. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistical, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method used to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independently of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  9. Shall we upgrade one-dimensional secondary settler models used in WWTP simulators? - An assessment of model structure uncertainty and its propagation.

    Science.gov (United States)

    Plósz, Benedek Gy; De Clercq, Jeriffa; Nopens, Ingmar; Benedetti, Lorenzo; Vanrolleghem, Peter A

    2011-01-01

    In WWTP models, the accurate assessment of solids inventory in bioreactors equipped with solid-liquid separators, mostly described using one-dimensional (1-D) secondary settling tank (SST) models, is the most fundamental requirement of any calibration procedure. Scientific knowledge on characterising particulate organics in wastewater and on bacteria growth is well-established, whereas 1-D SST models and their impact on biomass concentration predictions are still poorly understood. A rigorous assessment of two 1-D SST models is thus presented: one based on hyperbolic (the widely used Takács-model) and one based on parabolic (the more recently presented Plósz-model) partial differential equations. The former model, using numerical approximation to yield realistic behaviour, is currently the most widely used by wastewater treatment process modellers. The latter is a convection-dispersion model that is solved in a numerically sound way. First, the explicit dispersion in the convection-dispersion model and the numerical dispersion for both SST models are calculated. Second, simulation results of effluent suspended solids concentration (XTSS,Eff), sludge recirculation stream (XTSS,RAS) and sludge blanket height (SBH) are used to demonstrate the distinct behaviour of the models. A thorough scenario analysis is carried out using SST feed flow rate, solids concentration, and overflow rate as degrees of freedom, spanning a broad loading spectrum. A comparison between the measurements and the simulation results demonstrates a considerably improved 1-D model realism using the convection-dispersion model in terms of SBH, XTSS,RAS and XTSS,Eff. Third, to assess the propagation of uncertainty derived from settler model structure to the biokinetic model, the impact of the SST model as sub-model in a plant-wide model on the general model performance is evaluated. A long-term simulation of a bulking event is conducted that spans temperature evolution throughout a summer

  10. Where does the uncertainty come from? Attributing Uncertainty in Conceptual Hydrologic Modelling

    Science.gov (United States)

    Abu Shoaib, S.; Marshall, L. A.; Sharma, A.

    2015-12-01

    Defining an appropriate forecasting model is a key phase in water resources planning and design. Quantification of uncertainty is an important step in the development and application of hydrologic models. In this study, we examine the dependency of hydrologic model uncertainty on the observed model inputs, defined model structure, parameter optimization identifiability and identified likelihood. We present here a new uncertainty metric, the Quantile Flow Deviation or QFD, to evaluate the relative uncertainty due to each of these sources under a range of catchment conditions. Through the metric, we may identify the potential spectrum of uncertainty and variability in model simulations. The QFD assesses uncertainty by estimating the deviation in flows at a given quantile across a range of scenarios. By using a quantile based metric, the change in uncertainty across individual percentiles can be assessed, thereby allowing uncertainty to be expressed as a function of time. The QFD method can be disaggregated to examine any part of the modelling process including the selection of certain model subroutines or forcing data. Case study results (including catchments in Australia and USA) suggest that model structure selection is vital irrespective of the flow percentile of interest or the catchment being studied. Examining the QFD across various quantiles additionally demonstrates that lower yielding catchments may have greater variation due to selected model structures. By incorporating multiple model structures, it is possible to assess (i) the relative importance of various sources of uncertainty, (ii) how these vary with the change in catchment location or hydrologic regime; and (iii) the impact of the length of available observations in uncertainty quantification.
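
A simplified stand-in for the Quantile Flow Deviation can be sketched as the spread of a flow quantile across model-structure scenarios. The exact QFD definition in the record may differ, and the three linearly spaced "flow series" below are invented.

```python
import numpy as np

def quantile_flow_deviation(flow_scenarios, quantiles=(0.1, 0.5, 0.9)):
    """For each quantile, the spread (max - min) of that flow quantile
    across scenarios -- a toy stand-in for the QFD metric."""
    flows = np.asarray(flow_scenarios)          # shape (n_scenarios, n_timesteps)
    q = np.quantile(flows, quantiles, axis=1)   # shape (n_quantiles, n_scenarios)
    return {p: float(q[i].max() - q[i].min()) for i, p in enumerate(quantiles)}

# three hypothetical model structures simulating the same catchment
scenarios = [np.linspace(1, 10, 100), np.linspace(1, 12, 100), np.linspace(2, 11, 100)]
qfd = quantile_flow_deviation(scenarios)
```

Reporting the deviation per quantile, rather than one aggregate number, is what lets the metric express uncertainty separately for low-flow and high-flow percentiles, as the abstract describes.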

  11. Systemic change increases model projection uncertainty

    Science.gov (United States)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floor; Faaij, André

    2014-05-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship resulting from e.g., climatic or societal changes, thereby overlooking a source of uncertainty. We define systemic change as a change in the system indicated by a system state change that cannot be simulated using a constant model structure. We have developed a method to detect systemic change, using a Bayesian data assimilation technique, the particle filter. The particle filter was used to update the prior knowledge about the model structure. In contrast to the traditional particle filter approach (e.g., Verstegen et al., 2014), we apply the filter separately for each point in time for which observations are available, obtaining the optimal model structure for each of the time periods in between. This allows us to create a time series of the evolution of the model structure. The Runs test (Wald and Wolfowitz, 1940), a stationarity test, is used to check whether variation in this time series can be attributed to randomness or not. If not, this indicates systemic change. The uncertainty that the systemic change adds to the existing model projection uncertainty can be determined by comparing model outcomes of a model with a stationary model structure and a model with a model structure changing according to the variation found in the time series. To test the systemic change detection methodology, we apply it to a land use change cellular automaton (CA) (Verstegen et al., 2012) and use observations of real land use from all years from 2004 to 2012 and associated uncertainty as observational data in the particle filter. A systemic change was detected for the period 2006 to 2008. In this period the influence on the location of sugar cane expansion of the driver sugar cane in
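
The Runs test (Wald and Wolfowitz, 1940) used above to check the model-structure time series for stationarity can be sketched as follows; the drifting toy series is an invented illustration.

```python
import math

def runs_test_z(series):
    """Wald-Wolfowitz runs test: z-score for randomness of the
    above/below-median sequence; |z| > 1.96 suggests non-randomness."""
    med = sorted(series)[len(series) // 2]
    signs = [x > med for x in series if x != med]   # drop ties with the median
    n1 = sum(signs)
    n2 = len(signs) - n1
    runs = 1 + sum(a != b for a, b in zip(signs, signs[1:]))
    mu = 2 * n1 * n2 / (n1 + n2) + 1
    var = (2 * n1 * n2 * (2 * n1 * n2 - n1 - n2)) / ((n1 + n2) ** 2 * (n1 + n2 - 1))
    return (runs - mu) / math.sqrt(var)

# a monotonically drifting parameter series yields few runs, hence a large |z|
z = runs_test_z([1, 2, 3, 4, 5, 6, 7, 8, 9, 10])
```

A significantly low run count, as here, is the signal the abstract interprets as variation in the model structure that cannot be attributed to randomness, i.e. systemic change.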

  12. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport-related benefits and underestimating investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes ...-based graphs which function as risk-related decision support for the appraised transport infrastructure project.
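
The combination of an Optimism Bias correction with Monte Carlo simulation can be sketched as below. The 20% benefit uplift, the chosen distributions and all figures are invented for illustration and are not the CBA-DK parameterisation.

```python
import random

def simulate_npv(n_draws=10000, seed=42):
    """Monte Carlo CBA sketch: benefits corrected for Optimism Bias,
    costs drawn from a right-skewed (lognormal) distribution."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_draws):
        benefits = rng.gauss(100.0, 10.0) * 0.8       # strip an assumed 20% optimistic uplift
        costs = 60.0 * rng.lognormvariate(0.0, 0.25)  # right-skewed cost-overrun risk
        npvs.append(benefits - costs)
    npvs.sort()
    return {"mean": sum(npvs) / n_draws,
            "p05": npvs[int(0.05 * n_draws)],
            "p95": npvs[int(0.95 * n_draws)]}

result = simulate_npv()
```

The resulting percentile band, rather than a single deterministic NPV, is the kind of risk-related decision support the graphs in the paper provide.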

  13. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Directory of Open Access Journals (Sweden)

    A. Valade

    2014-01-01

    Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested

  14. Modeling sugarcane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Caubel, A.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-06-01

    Agro-land surface models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugarcane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or to ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of partial ranked correlation coefficients is used to quantify the sensitivity of harvested biomass to input
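The screening step these abstracts describe, ranking parameters by partial ranked correlation coefficients (PRCC) computed from Monte Carlo samples, can be sketched as follows. The recipe is a standard PRCC construction and the toy data are invented; this is not the ORCHIDEE-STICS setup.

```python
import numpy as np

def prcc(X, y):
    """Partial ranked correlation coefficient of each parameter column in X
    with output y: rank-transform everything, then correlate the residuals
    after regressing out the other parameters (a standard PRCC recipe)."""
    def ranks(a):
        return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)
    Xr = ranks(np.asarray(X, dtype=float))
    yr = ranks(np.asarray(y, dtype=float).reshape(-1, 1)).ravel()
    n, k = Xr.shape
    out = []
    for j in range(k):
        # Regress the j-th ranked parameter and the ranked output on the others.
        others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
        rx = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
        out.append(float(np.corrcoef(rx, ry)[0, 1]))
    return out

# Toy screening: the output depends strongly on p1, weakly on p2.
rng = np.random.default_rng(42)
P = rng.uniform(size=(300, 2))
y = 5.0 * P[:, 0] + 0.5 * P[:, 1] + rng.normal(0.0, 0.5, 300)
c1, c2 = prcc(P, y)
```

In a screening of many parameters, those with PRCC magnitudes near zero would be fixed and only the dominant ones carried into the uncertainty attribution step.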

  15. Modeling sugar cane yield with a process-based model from site to continental scale: uncertainties arising from model structure and parameter values

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Huth, N.; Marin, F.; Martiné, J.-F.

    2014-01-01

    Agro-Land Surface Models (agro-LSM) have been developed from the integration of specific crop processes into large-scale generic land surface models that allow calculating the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum. When developing agro-LSM models, particular attention must be given to the effects of crop phenology and management on the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty of agro-LSM models is related to their usually large number of parameters. In this study, we quantify the parameter-values uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Réunion and Brazil. In ORCHIDEE-STICS, two models are chained: STICS, an agronomy model that calculates phenology and management, and ORCHIDEE, a land surface model that calculates biomass and other ecosystem variables forced by STICS' phenology. First, the parameters that dominate the uncertainty of simulated biomass at harvest date are determined through a screening of 67 different parameters of both STICS and ORCHIDEE on a multi-site basis. Secondly, the uncertainty of harvested biomass attributable to those most sensitive parameters is quantified and specifically attributed to either STICS (phenology, management) or ORCHIDEE (other ecosystem variables including biomass) through distinct Monte Carlo runs. The uncertainty on parameter values is constrained using observations by calibrating the model independently at seven sites. In a third step, a sensitivity analysis is carried out by varying the most sensitive parameters to investigate their effects at continental scale. A Monte Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used to quantify the sensitivity of harvested biomass to input

  16. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2009-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes...... to supplement Optimism Bias and the associated Reference Class Forecasting (RCF) technique with a new technique that makes use of a scenario-grid. We tentatively introduce and refer to this as Reference Scenario Forecasting (RSF). The final RSF output from the CBA-DK model consists of a set of scenario...

  17. Modelling of Transport Projects Uncertainties

    DEFF Research Database (Denmark)

    Salling, Kim Bang; Leleur, Steen

    2012-01-01

    This paper proposes a new way of handling the uncertainties present in transport decision making based on infrastructure appraisals. The paper suggests combining the principle of Optimism Bias, which depicts the historical tendency of overestimating transport related benefits and underestimating...... investment costs, with a quantitative risk analysis based on Monte Carlo simulation, and making use of a set of exploratory scenarios. The analysis is carried out by using the CBA-DK model representing the Danish standard approach to socio-economic cost-benefit analysis. Specifically, the paper proposes......-based graphs which function as risk-related decision support for the appraised transport infrastructure project. The presentation of RSF is demonstrated by using an appraisal case concerning a new airfield in the capital of Greenland, Nuuk....

  18. Uncertainty in tsunami sediment transport modeling

    Science.gov (United States)

    Jaffe, Bruce E.; Goto, Kazuhisa; Sugawara, Daisuke; Gelfenbaum, Guy R.; La Selle, SeanPaul M.

    2016-01-01

    Erosion and deposition from tsunamis record information about tsunami hydrodynamics and size that can be interpreted to improve tsunami hazard assessment. We explore sources and methods for quantifying uncertainty in tsunami sediment transport modeling. Uncertainty varies with tsunami, study site, available input data, sediment grain size, and model. Although uncertainty has the potential to be large, published case studies indicate that both forward and inverse tsunami sediment transport models perform well enough to be useful for deciphering tsunami characteristics, including size, from deposits. New techniques for quantifying uncertainty, such as Ensemble Kalman Filtering inversion, and more rigorous reporting of uncertainties will advance the science of tsunami sediment transport modeling. Uncertainty may be decreased with additional laboratory studies that increase our understanding of the semi-empirical parameters and physics of tsunami sediment transport, standardized benchmark tests to assess model performance, and development of hybrid modeling approaches to exploit the strengths of forward and inverse models.

  19. Bayesian Uncertainty Analyses Via Deterministic Model

    Science.gov (United States)

    Krzysztofowicz, R.

    2001-05-01

    Rational decision-making requires that the total uncertainty about a variate of interest (a predictand) be quantified in terms of a probability distribution, conditional on all available information and knowledge. Suppose the state-of-knowledge is embodied in a deterministic model, which is imperfect and outputs only an estimate of the predictand. Fundamentals are presented of three Bayesian approaches to producing a probability distribution of the predictand via any deterministic model. The Bayesian Processor of Output (BPO) quantifies the total uncertainty in terms of a posterior distribution, conditional on model output. The Bayesian Processor of Ensemble (BPE) quantifies the total uncertainty in terms of a posterior distribution, conditional on an ensemble of model output. The Bayesian Forecasting System (BFS) decomposes the total uncertainty into input uncertainty and model uncertainty, which are characterized independently and then integrated into a predictive distribution.
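In the simplest conjugate setting, the BPO idea, a posterior distribution for the predictand conditional on a single deterministic model output, reduces to a normal-normal update. The Gaussian forms and the numbers below are illustrative assumptions, not the machinery of the BFS itself.

```python
def bpo_normal(prior_mean, prior_var, model_output, model_error_var):
    """Posterior of the predictand given one deterministic model output,
    under the assumed conjugate setup: prior N(prior_mean, prior_var) and
    model_output | predictand ~ N(predictand, model_error_var)."""
    precision = 1.0 / prior_var + 1.0 / model_error_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var
                            + model_output / model_error_var)
    return post_mean, post_var

# Equal prior and model precisions: posterior mean lands halfway between
# the prior mean and the model estimate, with halved variance.
m, v = bpo_normal(prior_mean=10.0, prior_var=4.0,
                  model_output=14.0, model_error_var=4.0)
```

A sharper model (smaller `model_error_var`) pulls the posterior toward the model output; a vaguer prior has the same effect.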

  20. Uncertainty propagation within the UNEDF models

    CERN Document Server

    Haverinen, T

    2016-01-01

    The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result, they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties on binding energies for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.

  1. Uncertainty propagation within the UNEDF models

    Science.gov (United States)

    Haverinen, T.; Kortelainen, M.

    2017-04-01

    The parameters of nuclear energy density functionals have to be adjusted to experimental data. As a result, they carry a certain uncertainty, which then propagates to calculated values of observables. In the present work we quantify the statistical uncertainties of binding energies, proton quadrupole moments and proton matter radius for three UNEDF Skyrme energy density functionals by taking advantage of the knowledge of the model parameter uncertainties. We find that the uncertainty of UNEDF models increases rapidly when going towards proton- or neutron-rich nuclei. We also investigate the impact of each model parameter on the total error budget.
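To first order, the parameter-to-observable propagation described in these records is a sandwich product of the observable's Jacobian with the parameter covariance matrix. The sketch below shows the generic formula; the numbers are illustrative, not UNEDF values.

```python
import numpy as np

def propagate_uncertainty(jacobian, param_cov):
    """First-order (sandwich) propagation of parameter covariance to
    observables: Cov(O) = J C J^T, with J_ki = dO_k / dp_i."""
    J = np.asarray(jacobian, dtype=float)
    C = np.asarray(param_cov, dtype=float)
    return J @ C @ J.T

# Illustrative numbers: two uncorrelated parameters, one observable whose
# sensitivities to them are 1.0 and 2.0.
J = [[1.0, 2.0]]
C = [[0.04, 0.00],
     [0.00, 0.01]]
var = propagate_uncertainty(J, C)[0, 0]   # 1*0.04 + 4*0.01 = 0.08
```

The per-parameter terms of the sum (here 0.04 and 0.04) are exactly the "impact of each model parameter on the total error budget" that the abstract mentions, in the uncorrelated case.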

  2. Parameter and Uncertainty Estimation in Groundwater Modelling

    DEFF Research Database (Denmark)

    Jensen, Jacob Birk

    The data basis on which groundwater models are constructed is in general very incomplete, and this leads to uncertainty in model outcome. Groundwater models form the basis for many, often costly decisions and if these are to be made on solid grounds, the uncertainty attached to model results must...... be quantified. This study was motivated by the need to estimate the uncertainty involved in groundwater models. Chapter 2 presents an integrated surface/subsurface unstructured finite difference model that was developed and applied to a synthetic case study. The following two chapters concern calibration...... and uncertainty estimation. Essential issues relating to calibration are discussed. The classical regression methods are described; however, the main focus is on the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The next two chapters describe case studies in which the GLUE methodology...

  3. Intrinsic Uncertainties in Modeling Complex Systems.

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, Curtis S; Bramson, Aaron L.; Ames, Arlo L.

    2014-09-01

    Models are built to understand and predict the behaviors of both natural and artificial systems. Because it is always necessary to abstract away aspects of any non-trivial system being modeled, we know models can potentially leave out important, even critical elements. This reality of the modeling enterprise forces us to consider the prospective impacts of those effects completely left out of a model - either intentionally or unconsidered. Insensitivity to new structure is an indication of diminishing returns. In this work, we represent a hypothetical unknown effect on a validated model as a finite perturbation whose amplitude is constrained within a control region. We find robustly that without further constraints, no meaningful bounds can be placed on the amplitude of a perturbation outside of the control region. Thus, forecasting into unsampled regions is a very risky proposition. We also present inherent difficulties with proper time discretization of models and representing inherently discrete quantities. We point out potentially worrisome uncertainties, arising from mathematical formulation alone, which modelers can inadvertently introduce into models of complex systems. Acknowledgements This work has been funded under early-career LDRD project #170979, entitled "Quantifying Confidence in Complex Systems Models Having Structural Uncertainties", which ran from 04/2013 to 09/2014. We wish to express our gratitude to the many researchers at Sandia who contributed ideas to this work, as well as feedback on the manuscript. In particular, we would like to mention George Barr, Alexander Outkin, Walt Beyeler, Eric Vugrin, and Laura Swiler for providing invaluable advice and guidance through the course of the project. We would also like to thank Steven Kleban, Amanda Gonzales, Trevor Manzanares, and Sarah Burwell for their assistance in managing project tasks and resources.

  4. Return Predictability, Model Uncertainty, and Robust Investment

    DEFF Research Database (Denmark)

    Lukas, Manuel

    Stock return predictability is subject to great uncertainty. In this paper we use the model confidence set approach to quantify uncertainty about expected utility from investment, accounting for potential return predictability. For monthly US data and six representative return prediction models, we...

  5. Uncertainty in spatially explicit animal dispersal models

    NARCIS (Netherlands)

    Mooij, W.M.; DeAngelis, D.L.

    2003-01-01

    Uncertainty in estimates of survival of dispersing animals is a vexing difficulty in conservation biology. The current notion is that this uncertainty decreases the usefulness of spatially explicit population models in particular. We examined this problem by comparing dispersal models of three level

  6. Addressing structural and observational uncertainty in resource management.

    Science.gov (United States)

    Fackler, Paul; Pacifici, Krishna

    2014-01-15

    Most natural resource management and conservation problems are plagued with high levels of uncertainties, which make good decision making difficult. Although some kinds of uncertainties are easily incorporated into decision making, two types of uncertainty present more formidable difficulties. The first, structural uncertainty, represents our imperfect knowledge about how a managed system behaves. The second, observational uncertainty, arises because the state of the system must be inferred from imperfect monitoring systems. The former type of uncertainty has been addressed in ecology using Adaptive Management (AM) and the latter using the Partially Observable Markov Decision Processes (POMDP) framework. Here we present a unifying framework that extends standard POMDPs and encompasses both standard POMDPs and AM. The approach allows any system variable to be observed or not observed and uses any relevant observed variable to update beliefs about unknown variables and parameters. This extends standard AM, which only uses realizations of the state variable to update beliefs and extends standard POMDP by allowing more general stochastic dependence among the observable variables and the state variables. This framework enables both structural and observational uncertainty to be simultaneously modeled. We illustrate the features of the extended POMDP framework with an example.
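The belief-updating step that the extended POMDP framework shares with adaptive management can be illustrated with a toy two-hypothesis example. The states, observations and probabilities below are invented for illustration; they are not from the paper.

```python
def update_belief(belief, obs, likelihood):
    """One Bayesian belief update over competing model structures, the core
    step shared by AM and POMDPs: P(model | obs) is proportional to
    P(obs | model) * P(model). `likelihood[model][obs]` is an assumed
    observation model for the imperfect monitoring system."""
    post = {m: b * likelihood[m][obs] for m, b in belief.items()}
    z = sum(post.values())
    return {m: p / z for m, p in post.items()}

# Two structural hypotheses about a managed population, observed through
# an imperfect survey that only yields a coarse count category.
belief = {"declining": 0.5, "stable": 0.5}
likelihood = {"declining": {"low_count": 0.7, "high_count": 0.3},
              "stable":    {"low_count": 0.2, "high_count": 0.8}}
belief = update_belief(belief, "low_count", likelihood)
# belief now favors "declining": 0.35 / (0.35 + 0.10) = 7/9
```

In the unified framework of the abstract, any observed variable (not only the state) could feed such an update, and management actions would then be optimized against the resulting belief.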

  7. Modeling uncertainty in geographic information and analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Uncertainty modeling and data quality for spatial data and spatial analyses are important topics in geographic information science, together with space and time in geography, as well as spatial analysis. In the past two decades, much effort has been devoted to research on uncertainty modeling for spatial data and analyses. This paper presents our work in this research. In particular, four advances in the research are presented: (a) from determinedness- to uncertainty-based representation of geographic objects in GIS; (b) from uncertainty modeling for static data to dynamic spatial analyses; (c) from modeling uncertainty for spatial data to models; and (d) from error descriptions to quality control for spatial data.

  8. Uncertainty "escalation" and use of machine learning to forecast residual and data model uncertainties

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    some variant of the Monte Carlo simulation when values of parameters or inputs are sampled from the assumed distributions and the model is run multiple times to generate multiple outputs. This is the most widely used approach. The data generated by Monte Carlo analysis can be used to build a machine learning model which will be able to make predictions of model uncertainty for the future. This method is named MLUE (Machine Learning for Uncertainty Estimation) and is covered in [4,5]. With this in mind, one may consider the following framework based on the stepwise "building up" (or "escalation") of the model uncertainty: • first consider the residual uncertainty of an optimal model M (X, p*) • then add and consider the model uncertainty due to the parameter uncertainty (p) • then add and consider the model uncertainty due to the data (mainly, input) uncertainty (X) • then add and consider the structural uncertainty of the model M (X, p). The paper presents the details of this framework and examples of its application in hydrological forecasting. This study is partly supported by the FP7 European Project WeSenseIt Citizen Water Observatory (http://wesenseit.eu/). References [1] Koenker, R., and G. Bassett (1978). Regression quantiles. Econometrica, 46(1), 33-50, doi:10.2307/1913643. [2] D.L. Shrestha, D.P. Solomatine (2006). Machine learning approaches for estimation of prediction interval for the model output. Neural Networks J., 19(2), 225-235. [3] D.P. Solomatine, D.L. Shrestha (2009). A novel method to estimate model uncertainty using machine learning techniques. Water Resources Res. 45, W00B11. [4] D. L. Shrestha, N. Kayastha, and D. P. Solomatine. A novel approach to parameter uncertainty analysis of hydrological models using neural networks. Hydrol. Earth Syst. Sci., 13, 1235-1248, 2009. [5] F. Pianosi and L. Raso (2012). Dynamic modeling of predictive uncertainty by regression on absolute errors. WRR, 48, W03516. [6] Shrestha, D.L., Kayastha, N., Solomatine
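The MLUE idea, learning a map from model inputs to expected model error so that uncertainty can be forecast for new inputs, can be illustrated in miniature. Here a least-squares polynomial stands in for the neural networks of [4], and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration record (illustrative): a "model" whose absolute
# residual error grows with the input x.
x = rng.uniform(0.0, 10.0, 500)
abs_err = 0.2 * x + rng.normal(0.0, 0.05, 500) ** 2

# MLUE in miniature: learn a map from model inputs to expected |error|,
# here a degree-1 polynomial instead of a neural network.
coef = np.polyfit(x, abs_err, deg=1)
predict_uncertainty = np.poly1d(coef)

# Forecast the residual uncertainty for a new input (roughly 0.2 * 8).
u = predict_uncertainty(8.0)
```

The same fitted map could then feed the "escalation" framework above, supplying the residual-uncertainty layer on top of which parameter, input and structural uncertainties are added.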

  9. Multi-model ensemble hydrologic prediction and uncertainties analysis

    Directory of Open Access Journals (Sweden)

    S. Jiang

    2014-09-01

    Modelling uncertainties (i.e. input errors, parameter uncertainties and model structural errors) inevitably exist in hydrological prediction. A lot of recent attention has focused on these, of which input error modelling, parameter optimization and multi-model ensemble strategies are the three most popular methods to demonstrate the impacts of modelling uncertainties. In this paper the Xinanjiang model, the Hybrid rainfall–runoff model and the HYMOD model were applied to the Mishui Basin, south China, for daily streamflow ensemble simulation and uncertainty analysis. The three models were first calibrated by two parameter optimization algorithms, namely the Shuffled Complex Evolution method (SCE-UA) and the Shuffled Complex Evolution Metropolis method (SCEM-UA); next, the input uncertainty was accounted for by introducing a normally distributed error multiplier; then, the simulation sets calculated from the three models were combined by Bayesian model averaging (BMA). The results show that both parameter optimization algorithms generate good streamflow simulations; in particular, SCEM-UA can infer parameter uncertainty and give the posterior distribution of the parameters. When the precipitation input uncertainty is considered, the streamflow simulation precision does not improve very much, while the BMA combination not only improves the streamflow prediction precision but also gives quantitative uncertainty bounds for the simulation sets. The SCEM-UA calculated prediction interval is better than the SCE-UA calculated one. These results suggest that considering the model parameters' uncertainties and performing multi-model ensemble simulations are very practical for streamflow prediction and flood forecasting, from which more precise predictions and more reliable uncertainty bounds can be generated.
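The BMA combination step can be sketched as follows: posterior model weights, here simply taken proportional to exponentiated log-likelihoods under equal priors, mix the member predictions, and the total variance adds between-model spread to the within-model variances. All numbers are invented, not from the Mishui Basin study.

```python
import math

def bma_combine(preds, variances, log_likelihoods):
    """Bayesian model averaging of point predictions.

    Weights are posterior model probabilities (assumed proportional to
    exp(log-likelihood) with equal priors). Returns the BMA mean and the
    total variance = within-model variance + between-model spread.
    """
    m = max(log_likelihoods)                      # shift for numerical safety
    w = [math.exp(l - m) for l in log_likelihoods]
    s = sum(w)
    w = [wi / s for wi in w]
    mean = sum(wi * p for wi, p in zip(w, preds))
    var = sum(wi * (v + (p - mean) ** 2)
              for wi, p, v in zip(w, preds, variances))
    return mean, var

# Three hypothetical models forecasting a daily streamflow (m3/s):
mean, var = bma_combine([120.0, 150.0, 135.0], [25.0, 64.0, 36.0],
                        [-100.0, -102.0, -101.0])
```

The between-model term is what gives BMA intervals their extra width over any single member, which is how the combination can supply the "quantitative uncertainty bounds" the abstract reports.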

  10. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...

  11. The role of observational uncertainties in testing model hypotheses

    Science.gov (United States)

    Westerberg, I. K.; Birkel, C.

    2012-12-01

    Knowledge about hydrological processes and the spatial and temporal distribution of water resources is needed as a basis for managing water for hydropower, agriculture and flood-protection. Conceptual hydrological models may be used to infer knowledge on catchment functioning but are affected by uncertainties in the model representation of reality as well as in the observational data used to drive the model and to evaluate model performance. Therefore, meaningful hypothesis testing of the hydrological functioning of a catchment requires such uncertainties to be carefully estimated and accounted for in model calibration and evaluation. The aim of this study was to investigate the role of observational uncertainties in hypothesis testing, in particular whether it was possible to detect model-structural representations that were wrong in an important way given the uncertainties in the observational data. We studied the relatively data-scarce tropical Sarapiqui catchment in Costa Rica, Central America, where water resources play a vital part for hydropower production and livelihood. We tested several model structures of varying complexity as hypotheses about catchment functioning, but also hypotheses about the nature of the modelling errors. The tests were made within a learning framework for uncertainty estimation which enabled insights into data uncertainties, suitable model-structural representations and appropriate likelihoods. The observational uncertainty in discharge data was estimated from a rating-curve analysis and precipitation measurement errors through scenarios relating the error to, for example, canopy interception, wind-driven rain and the elevation gradient. The hypotheses were evaluated in a posterior analysis of the simulations where the performance of each simulation was analysed relative to the observational uncertainties for the entire hydrograph as well as for different aspects of the hydrograph (e.g. peak flows, recession periods, and base flow

  12. Uncertainties in modelling CH4 emissions from northern wetlands in glacial climates: effect of hydrological model and CH4 model structure

    Directory of Open Access Journals (Sweden)

    J. van Huissteden

    2009-07-01

    Methane (CH4) fluxes from northern wetlands may have influenced atmospheric CH4 concentrations at climate warming phases during the last 800 000 years and during the present global warming. Including these CH4 fluxes in earth system models is essential to understand feedbacks between climate and atmospheric composition. Attempts to model CH4 fluxes from wetlands have previously been undertaken using various approaches. Here, we test a process-based wetland CH4 flux model (PEATLAND-VU) which includes details of soil-atmosphere CH4 transport. The model has been used to simulate CH4 emissions from continental Europe in previous glacial climates and the current climate. This paper presents results regarding the sensitivity of modeling glacial terrestrial CH4 fluxes to (a) basic tuning parameters of the model, (b) different approaches in modeling of the water table, and (c) model structure. In order to test the model structure, PEATLAND-VU was compared to a simpler modeling approach based on wetland primary production estimated from a vegetation model (BIOME 3.5). The tuning parameters are the CH4 production rate from labile organic carbon and its temperature sensitivity. The modelled fluxes prove comparatively insensitive to hydrology representation, while sensitive to microbial parameters and model structure. Glacial climate emissions are also highly sensitive to assumptions about the extent of ice cover and exposed seafloor. Wetland expansion over low-relief exposed seafloor areas may have compensated for a decrease of wetland area due to continental ice cover.
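The two tuning parameters named here, the CH4 production rate from labile carbon and its temperature sensitivity, are commonly expressed with a Q10 factor. The sketch below is a generic formulation with illustrative values; it is not the PEATLAND-VU parameterization.

```python
def ch4_production(labile_c, r0, q10, soil_t, t_ref=10.0):
    """Generic CH4 production from labile organic carbon with Q10
    temperature sensitivity: rate scales by q10 for every 10 degC above
    the reference temperature. r0 and q10 below are illustrative values."""
    return labile_c * r0 * q10 ** ((soil_t - t_ref) / 10.0)

# One Q10 step above and below the 10 degC reference temperature:
warm = ch4_production(labile_c=1.0, r0=0.1, q10=3.0, soil_t=20.0)   # 0.3
cold = ch4_production(labile_c=1.0, r0=0.1, q10=3.0, soil_t=0.0)    # 0.1/3
```

The ninefold spread between `warm` and `cold` over a 20 degC range shows why the modelled fluxes are so sensitive to these microbial parameters, as the abstract reports.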

  13. Uncertainties in modeling CH4 emissions from northern wetlands in glacial climates: effect of hydrological model and CH4 model structure

    Directory of Open Access Journals (Sweden)

    J. van Huissteden

    2009-03-01

    Methane (CH4) fluxes from northern wetlands may have influenced atmospheric CH4 concentrations at climate warming phases during the last 800 000 years and at the present global warming. Including these CH4 fluxes in earth system models is essential to understand feedbacks between climate and atmospheric composition. Attempts to model CH4 fluxes from wetlands have been undertaken previously using various approaches. Here, we test a process-based wetland CH4 flux model (PEATLAND-VU) which includes details of soil-atmosphere CH4 transport. The model has been used to simulate CH4 emissions from continental Europe in different glacial climates and the present climate. This paper presents results on the sensitivity of modeling glacial terrestrial CH4 fluxes to basic tuning parameters of the model, to different approaches in modeling of the water table, and to model structure. For testing the model structure, PEATLAND-VU has been compared to a simpler modeling approach based on wetland primary production estimated from a vegetation model (BIOME). The tuning parameters are the CH4 production rate from labile organic carbon and its temperature sensitivity. The modelled fluxes prove comparatively insensitive to hydrology representation, and sensitive to microbial parameters and model structure. Glacial climate emissions are also highly sensitive to assumptions on the extent of ice cover and exposed seafloors. Wetland expansion on low-relief exposed seafloor areas may have compensated for a decrease of wetland area due to continental ice cover.

  14. Structure of the transport uncertainty in mesoscale inversions of CO2 sources and sinks using ensemble model simulations

    Directory of Open Access Journals (Sweden)

    J. Noilhan

    2009-06-01

    We study the characteristics of a statistical ensemble of mesoscale simulations in order to estimate the model error in the simulation of CO2 concentrations. The ensemble consists of ten members plus the reference simulation, built from the operational short-range forecast system PEARP and perturbed using the singular vector technique. We then used this ensemble of simulations as the initial and boundary conditions for mesoscale model (Méso-NH) simulations, which use CO2 fluxes from the ISBA-A-gs land surface model. The final ensemble represents the model dependence on the boundary conditions, conserving the physical properties of the dynamical schemes, but excluding the intrinsic error of the model. First, the variance of our ensemble is estimated over the domain, with associated spatial and temporal correlations. Second, we extract the signal from noisy horizontal correlations, due to the limited ensemble size, using diffusion equation modelling. The computational cost of such an ensemble limits the number of members (simulations), especially when running the carbon flux and the atmospheric models online. In theory, 50 to 100 members would be required to explore the overall sensitivity of the ensemble. The present diffusion model allows us to extract a significant part of the noisy error, making this study feasible with a limited number of simulations. Finally, we compute the diagonal and non-diagonal terms of the observation error covariance matrix and introduce them into our CO2 flux matrix inversion for 18 days of the 2005 intensive campaign CERES over the South West of France. Variances are based on model-data mismatch to ensure we treat model bias as well as ensemble dispersion, whereas spatial and temporal covariances are estimated with our method. The horizontal structure of the ensemble variance manifests the discontinuities of the mesoscale structures during the day, but remains locally driven during the night. On the vertical, surface layer
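A common way to fill the non-diagonal terms of an observation error covariance matrix, of the kind this abstract estimates from the ensemble, is an exponentially decaying spatial correlation. The functional form and numbers below are illustrative assumptions, not the structure fitted in the study.

```python
import numpy as np

def exp_covariance(coords, sigma, length_scale):
    """Observation-error covariance with exponentially decaying spatial
    correlation: R_ij = sigma_i * sigma_j * exp(-d_ij / L). The diagonal
    holds the (model-data mismatch) variances; off-diagonal terms encode
    spatial correlation between observation sites."""
    coords = np.asarray(coords, dtype=float)
    d = np.abs(coords[:, None] - coords[None, :])   # 1-D site distances, km
    corr = np.exp(-d / length_scale)
    s = np.asarray(sigma, dtype=float)
    return np.outer(s, s) * corr

# Three hypothetical towers at 0, 10 and 50 km, correlation length 25 km.
R = exp_covariance(coords=[0.0, 10.0, 50.0], sigma=[1.0, 1.0, 2.0],
                   length_scale=25.0)
```

Neglecting the off-diagonal terms (using only the diagonal of `R`) would overweight clustered observations in the flux inversion, which is exactly why the abstract introduces the full matrix.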

  15. Assessing damping uncertainty in space structures with fuzzy sets

    Science.gov (United States)

    Ross, Timothy J.; Hasselman, Timothy K.

    1991-01-01

    NASA has been interested in the development of methods for evaluating the predictive accuracy of structural dynamic models. This interest stems from the use of mathematical models in evaluating the structural integrity of all spacecraft prior to flight. Space structures are often too large and too weak to be tested fully assembled in a ground test lab. The predictive accuracy of a model depends on the nature and extent of its experimental verification. The further the test conditions depart from in-service conditions, the less accurate the model will be. Structural damping is known to be one source of uncertainty in models. The uncertainty in damping is explored in order to evaluate the accuracy of dynamic models. A simple mass-spring-dashpot system is used to illustrate a comparison among three methods for propagating uncertainty in structural dynamics models: the First Order Method, the Numerical Simulation Method, and the Fuzzy Set Method. The Fuzzy Set Method is shown to bound the range of possible responses and thus to provide a valuable limiting check on the First Order Method near resonant conditions. Fuzzy methods are a relatively inexpensive alternative to numerical simulation.
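The three propagation methods compared in this record can be illustrated on a single-degree-of-freedom oscillator with uncertain damping. The sketch below is a minimal illustration under invented assumptions (damping mean and standard deviation, excitation at resonance), not the paper's code; the fuzzy method is reduced to interval propagation at a single alpha-cut.

```python
import math
import random

def amplitude(zeta, r=1.0):
    """Steady-state magnification factor of an SDOF oscillator at
    frequency ratio r for damping ratio zeta."""
    return 1.0 / math.sqrt((1 - r**2)**2 + (2 * zeta * r)**2)

mu, sigma = 0.02, 0.005  # invented damping statistics

# First Order Method: linearize amplitude(zeta) about the mean.
h = 1e-6
dA = (amplitude(mu + h) - amplitude(mu - h)) / (2 * h)
fo_std = abs(dA) * sigma

# Numerical Simulation Method (Monte Carlo): sample zeta directly,
# clipping away unphysical non-positive damping draws.
random.seed(0)
samples = [amplitude(max(random.gauss(mu, sigma), 1e-4))
           for _ in range(20000)]
mc_mean = sum(samples) / len(samples)
mc_std = (sum((s - mc_mean)**2 for s in samples) / len(samples))**0.5

# Fuzzy Set Method, one alpha-cut: propagate the interval
# [mu - 2*sigma, mu + 2*sigma] through the monotone response.
lo, hi = amplitude(mu + 2 * sigma), amplitude(mu - 2 * sigma)

print(f"first-order std {fo_std:.2f} vs Monte Carlo std {mc_std:.2f}")
print(f"fuzzy bounds at resonance: [{lo:.2f}, {hi:.2f}]")
```

Near resonance the response is strongly nonlinear in damping, which is exactly where the fuzzy bounds act as a limiting check on the linearized first-order estimate.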

  16. Development of a Prototype Model-Form Uncertainty Knowledge Base

    Science.gov (United States)

    Green, Lawrence L.

    2016-01-01

    Uncertainties are generally classified as either aleatory or epistemic. Aleatory uncertainties are those attributed to random variation, either naturally or through manufacturing processes. Epistemic uncertainties are generally attributed to a lack of knowledge. One type of epistemic uncertainty is called model-form uncertainty. The term model-form refers to the fact that, among the choices to be made during a design or analysis process, there are different forms of the analysis, each of which gives different results for the same configuration at the same flight conditions. Examples of model-form uncertainties include the grid density, grid type, and solver type used within a computational fluid dynamics code, or the choice of the number and type of model elements within a structural analysis. The objectives of this work are to identify and quantify a representative set of model-form uncertainties and to make this information available to designers through an interactive knowledge base (KB). The KB can then be used during probabilistic design sessions to enable the possible reduction of uncertainties in the design process through resource investment. An extensive literature search has been conducted to identify and quantify typical model-form uncertainties present within aerospace design. An initial attempt has been made to assemble the results of this literature search into a searchable KB, usable in real time during probabilistic design sessions. A concept of operations and the basic structure of a model-form uncertainty KB are described. Key operations within the KB are illustrated. Current limitations of the KB, and possible workarounds, are explained.

  17. Structure of the transport uncertainty in mesoscale inversions of CO2 sources and sinks using ensemble model simulations

    Directory of Open Access Journals (Sweden)

    J. Noilhan

    2008-12-01

    Full Text Available We study the characteristics of a statistical ensemble of mesoscale simulations in order to estimate the model error in the simulation of CO2 concentrations. The ensemble consists of ten members and the reference simulation from the operational short-range forecast PEARP, perturbed using the Singular Vector (SV) technique. We then used this ensemble of simulations as the initial and boundary conditions for the mesoscale model simulations, here the atmospheric transport model Méso-NH, transporting CO2 fluxes from the ISBA-A-gs land surface model. The final ensemble represents the model's dependence on the boundary conditions, conserving the physical properties of the dynamical schemes. First, the variance of our ensemble is estimated over the domain, with associated spatial and temporal correlations. Second, we extract the signal from noisy horizontal correlations, due to the limited ensemble size, using diffusion equation modelling. Finally, we compute the diagonal and non-diagonal terms of the observation error covariance matrix and introduce them into our CO2 flux matrix inversion over 18 days of the 2005 intensive campaign CERES over the South West of France. On the horizontal plane, the variance of the ensemble follows the discontinuities of the mesoscale structures during the day, but remains locally driven during the night. On the vertical, surface-layer variance shows large correlations with the upper levels in the boundary layer (>0.6), down to 0.4 with the low free troposphere. Large temporal correlations were found during the afternoon (>0.5) for several hours, reduced during the night. The diffusion equation model extracted relevant error covariance signals in horizontal space, and showed reduced correlations over mountain areas and during the night over the continent. The posterior error reduction on the inverted CO2 fluxes, accounting for the model error correlations, finally illustrates the predominance of temporal over spatial correlations.

  18. Urban drainage models - making uncertainty analysis simple

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana;

    2012-01-01

    There is increasing awareness about uncertainties in modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a modif...... probability distributions (often used for sensitivity analyses) and prediction intervals. To demonstrate the new method, it is applied to a conceptual rainfall-runoff model using a dataset collected from Melbourne, Australia....

  19. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jakob Laigaard; Brincker, Rune; Rytter, Anders

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However the uncertainty of the parameters...... by a simulation study of a lightly damped single degree of freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...

  20. Photovoltaic System Modeling. Uncertainty and Sensitivity Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Martin, Curtis E. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    We report an uncertainty and sensitivity analysis for modeling AC energy from photovoltaic systems. Output from a PV system is predicted by a sequence of models. We quantify uncertainty in the output of each model using empirical distributions of each model's residuals. We propagate uncertainty through the sequence of models by sampling these distributions to obtain an empirical distribution of a PV system's output. We consider models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance; (3) predict cell temperature; (4) estimate DC voltage, current and power; (5) reduce DC power for losses due to inefficient maximum power point tracking or mismatch among modules; and (6) convert DC to AC power. Our analysis considers a notional PV system comprising an array of FirstSolar FS-387 modules and a 250 kW AC inverter; we use measured irradiance and weather at Albuquerque, NM. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. We found the uncertainty in the models for POA irradiance and effective irradiance to be the dominant contributors to uncertainty in predicted daily energy. Our analysis indicates that efforts to reduce the uncertainty in PV system output predictions may yield the greatest improvements by focusing on the POA and effective irradiance models.
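The residual-sampling scheme described in this record can be sketched as follows. The three-stage chain, the residual values, and the system coefficients below are simplified, invented stand-ins for the six models and the measured residual distributions used in the report.

```python
import random

random.seed(1)

def poa_irradiance(ghi):
    # Stage 1 (stand-in transposition model): plane-of-array from GHI.
    return 1.1 * ghi

def dc_power(poa, temp_c):
    # Stage 2 (stand-in performance model): array DC power in kW.
    return 250.0 * (poa / 1000.0) * (1.0 - 0.004 * (temp_c - 25.0))

def ac_power(dc):
    # Stage 3: inverter with 96% efficiency, clipped at 250 kW AC.
    return min(0.96 * dc, 250.0)

# Empirical residuals for each stage (invented stand-ins for the
# measured residual distributions in the report).
poa_resid = [-20.0, -5.0, 0.0, 5.0, 20.0]   # W/m^2
dc_resid = [-3.0, -1.0, 0.0, 1.0, 3.0]      # kW

def one_sample(ghi=800.0, temp_c=30.0):
    # Propagate uncertainty by adding a randomly drawn residual to
    # each model's output before feeding it to the next model.
    poa = poa_irradiance(ghi) + random.choice(poa_resid)
    dc = dc_power(poa, temp_c) + random.choice(dc_resid)
    return ac_power(dc)

outputs = sorted(one_sample() for _ in range(10000))
p05, p95 = outputs[500], outputs[9500]
print(f"AC power, 5th-95th percentile: {p05:.1f} to {p95:.1f} kW")
```

Repeating the draw-per-stage step is what turns per-model residual distributions into an empirical distribution of whole-system output.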

  1. Selective Maintenance Model Considering Time Uncertainty

    OpenAIRE

    Le Chen; Zhengping Shu; Yuan Li; Xuezhi Lv

    2012-01-01

    This study proposes a selective maintenance model for a weapon system during the mission interval. First, it gives relevant definitions and the operational process of the material support system. Then, it reviews current research on selective maintenance modeling. Finally, it establishes a numerical model for selecting corrective and preventive maintenance tasks, considering time uncertainty brought by the unpredictability of the maintenance procedure, indetermination of downtime for spares and difference of skil...

  2. Uncertainty calculation in transport models and forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Prato, Carlo Giacomo

    in a four-stage transport model related to different variable distributions (to be used in a Monte Carlo simulation procedure), assignment procedures and levels of congestion, at both the link and the network level. The analysis used as case study the Næstved model, referring to the Danish town of Næstved2...... the uncertainty propagation pattern over time specific for key model outputs becomes strategically important. 1 Manzo, S., Nielsen, O. A. & Prato, C. G. (2014). The Effects of uncertainty in speed-flow curve parameters on a large-scale model. Transportation Research Record, 1, 30-37. 2 Manzo, S., Nielsen, O. A...

  3. Uncertainty modeling process for semantic technology

    Directory of Open Access Journals (Sweden)

    Rommel N. Carvalho

    2016-08-01

    Full Text Available The ubiquity of uncertainty across application domains generates a need for principled support for uncertainty management in semantically aware systems. A probabilistic ontology provides constructs for representing uncertainty in domain ontologies. While the literature has been growing on formalisms for representing uncertainty in ontologies, there remains little guidance in the knowledge engineering literature for how to design probabilistic ontologies. To address the gap, this paper presents the Uncertainty Modeling Process for Semantic Technology (UMP-ST, a new methodology for modeling probabilistic ontologies. To explain how the methodology works and to verify that it can be applied to different scenarios, this paper describes step-by-step the construction of a proof-of-concept probabilistic ontology. The resulting domain model can be used to support identification of fraud in public procurements in Brazil. While the case study illustrates the development of a probabilistic ontology in the PR-OWL probabilistic ontology language, the methodology is applicable to any ontology formalism that properly integrates uncertainty with domain semantics.

  4. Uncertainty Quantification in Climate Modeling and Projection

    Energy Technology Data Exchange (ETDEWEB)

    Qian, Yun; Jackson, Charles; Giorgi, Filippo; Booth, Ben; Duan, Qingyun; Forest, Chris; Higdon, Dave; Hou, Z. Jason; Huerta, Gabriel

    2016-05-01

    The projection of future climate is one of the most complex problems undertaken by the scientific community. Although scientists have been striving to better understand the physical basis of the climate system and to improve climate models, the overall uncertainty in projections of future climate has not been significantly reduced (e.g., from the IPCC AR4 to AR5). With the rapid increase of complexity in Earth system models, reducing uncertainties in climate projections becomes extremely challenging. Since uncertainties always exist in climate models, interpreting the strengths and limitations of future climate projections is key to evaluating risks, and climate change information for use in Vulnerability, Impact, and Adaptation (VIA) studies should be provided with both well-characterized and well-quantified uncertainty. The workshop aimed at providing participants, many of them from developing countries, information on strategies to quantify the uncertainty in climate model projections and assess the reliability of climate change information for decision-making. The program included a mixture of lectures on fundamental concepts in Bayesian inference and sampling, applications, and hands-on computer laboratory exercises employing software packages for Bayesian inference, Markov Chain Monte Carlo methods, and global sensitivity analyses. The lectures covered a range of scientific issues underlying the evaluation of uncertainties in climate projections, such as the effects of uncertain initial and boundary conditions, uncertain physics, and limitations of observational records. Progress in quantitatively estimating uncertainties in hydrologic, land surface, and atmospheric models at both regional and global scales was also reviewed. The application of Uncertainty Quantification (UQ) concepts to coupled climate system models is still in its infancy. The Coupled Model Intercomparison Project (CMIP) multi-model ensemble currently represents the primary data for
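A minimal example of the Bayesian sampling machinery covered in the workshop lectures: a generic Metropolis sampler on an invented one-parameter toy posterior. This is a hedged illustration of the method, not any of the workshop's actual exercises, and all numbers are made up.

```python
import math
import random

random.seed(3)

def log_post(theta):
    # Toy posterior: a single uncertain model parameter with a
    # Gaussian likelihood centered at 3.0 and a flat prior on [0, 10].
    # (All numbers invented for illustration.)
    if not 0.0 <= theta <= 10.0:
        return -math.inf
    return -0.5 * ((theta - 3.0) / 1.5) ** 2

theta, chain = 5.0, []
for _ in range(20000):
    prop = theta + random.gauss(0.0, 1.0)     # symmetric random-walk proposal
    if math.log(random.random()) < log_post(prop) - log_post(theta):
        theta = prop                          # accept
    chain.append(theta)

burned = chain[5000:]                         # discard burn-in
mean = sum(burned) / len(burned)
print(f"posterior mean estimate: {mean:.2f}")
```

The same accept/reject loop underlies the MCMC software packages used in the hands-on laboratory exercises, just with far more elaborate likelihoods.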

  5. Uncertainty of GIA models across the Greenland

    Science.gov (United States)

    Ruggieri, Gabriella

    2013-04-01

    In recent years various remote sensing techniques have been employed to estimate the current mass balance of the Greenland ice sheet (GIS). In this regard, GRACE, laser and radar altimetry observations, employed to constrain the mass balance, treat glacial isostatic adjustment (GIA) as a source of noise. Several GIA models have been elaborated for Greenland, but they differ from each other in mantle viscosity profile and in the time history of ice melting. In this work we use the well-known ICE-5G (VM2) ice model by Peltier (2004) and two alternative scenarios of ice melting, ANU05 by Lambeck et al. (1998) and the new regional ice model HUY2 by Simpson et al. (2009), in order to assess the amplitude of the uncertainty related to the GIA predictions. In particular, we focus on rates of vertical displacement, sea surface variations and sea-level change at the regional scale. The GIA predictions are estimated using an improved version of the SELEN code that solves the sea-level equation for a spherical, self-gravitating, incompressible and viscoelastic Earth structure. The GIA uncertainty shows a highly variable geographic distribution across Greenland. Considering the spatial pattern of the GIA predictions related to the three ice models, the western sector of the Greenland Ice Sheet (GrIS) between Thule and Upernavik, and the area around Paamiut, show good agreement, while the northeast portion of Greenland is characterized by a large discrepancy among the GIA predictions inferred from the ice models tested in this work. These differences are ultimately the consequence of the different sets of global relative sea-level data and modern geodetic observations used by the authors to constrain the model parameters. Finally, the GPS Network project (GNET), recently installed around the periphery of the GrIS, is used as a tool to discuss the discrepancies among the GIA models. Comparing the recently available geodetic analyses, it appears that among the GPS sites the

  6. Model Uncertainty for Bilinear Hysteretic Systems

    DEFF Research Database (Denmark)

    1984-01-01

    is related to the concept of a failure surface (or limit state surface) in the n-dimensional basic variable space then model uncertainty is at least due to the neglected variables, the modelling of the failure surface and the computational technique used. A more precise definition is given in section 2...

  7. Coping with Uncertainty Modeling and Policy Issues

    CERN Document Server

    Marti, Kurt; Makowski, Marek

    2006-01-01

    Ongoing global changes bring fundamentally new scientific problems requiring new concepts and tools. The complexity of these new problems does not allow enough certainty to be achieved by increasing the resolution of models or by bringing in more links. This book presents new tools for modeling and management of uncertainty.

  8. Uncertainty in biology a computational modeling approach

    CERN Document Server

    Gomez-Cabrero, David

    2016-01-01

    Computational modeling of biomedical processes is gaining more and more weight in current research into the etiology of biomedical problems and potential treatment strategies. Computational modeling makes it possible to reduce, refine and replace animal experimentation, as well as to translate findings obtained in these experiments to the human setting. However, these biomedical problems are inherently complex, with a myriad of influencing factors, which strongly complicates the model building and validation process. This book addresses four main issues related to the building and validation of computational models of biomedical processes: Modeling establishment under uncertainty; Model selection and parameter fitting; Sensitivity analysis and model adaptation; Model predictions under uncertainty. In each of the abovementioned areas, the book discusses a number of key techniques by means of a general theoretical description followed by one or more practical examples. This book is intended for graduate stude...

  9. Modelling of data uncertainties on hybrid computers

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Anke (ed.)

    2016-06-15

    The codes d{sup 3}f and r{sup 3}t are well established for modelling density-driven flow and nuclide transport in the far field of repositories for hazardous material in deep geological formations. They are applicable in porous media as well as in fractured rock or mudstone, and for modelling salt and heat transport as well as a free groundwater surface. Development of the basic framework of d{sup 3}f and r{sup 3}t began more than 20 years ago. Since that time, significant advancements have taken place in the requirements for safety assessment as well as in computer hardware. The period of safety assessment for a repository of high-level radioactive waste was extended to 1 million years, and the complexity of the models is steadily growing. Concurrently, the demands on accuracy increase. Additionally, model and parameter uncertainties become more and more important for an increased understanding of prediction reliability. All this leads to a growing demand for computational power that requires a considerable software speed-up. An effective way to achieve this is the use of modern, hybrid computer architectures, which basically requires the set-up of new data structures and a corresponding code revision, but offers a potential speed-up by several orders of magnitude. The original codes d{sup 3}f and r{sup 3}t were applications of the software platform UG /BAS 94/ whose development began in the early nineteen-nineties. However, UG has recently been advanced to the C++ based, substantially revised version UG4 /VOG 13/. To benefit also in the future from state-of-the-art numerical algorithms and to use hybrid computer architectures, the codes d{sup 3}f and r{sup 3}t were transferred to this new code platform. Making use of the fact that coupling between different sets of equations is natively supported in UG4, d{sup 3}f and r{sup 3}t were combined into one conjoint code d{sup 3}f++.
A direct estimation of uncertainties for complex groundwater flow models with the

  10. Uncertainty in hydrological change modelling

    DEFF Research Database (Denmark)

    Seaby, Lauren Paige

    methodology for basin discharge and groundwater heads. The ensemble of 11 climate models varied in strength, significance, and sometimes in direction of the climate change signal. The more complex daily DBS correction methods were more accurate at transferring precipitation changes in mean as well...... as the variance, and improving the characterisation of day to day variation as well as heavy events. However, the most highly parameterised of the DBS methods were less robust under climate change conditions. The spatial characteristics of groundwater head and stream discharge were best represented by DBS methods...... applied at the grid scale. Flux and state hydrological outputs which integrate responses over time and space showed more sensitivity to precipitation mean spatial biases and less so on extremes. In the investigated catchments, the projected change of groundwater levels and basin discharge between current...

  11. Uncertainties in Surface Layer Modeling

    Science.gov (United States)

    Pendergrass, W.

    2015-12-01

    A central problem for micrometeorologists has been the relationship of air-surface exchange rates of momentum and heat to quantities that can be predicted with confidence. The flux-gradient profile developed through Monin-Obukhov Similarity Theory (MOST) provides an integration of the dimensionless wind shear expression, where the stability function ϕ is an empirically derived expression for stable and unstable atmospheric conditions. Empirically derived expressions are far from universally accepted (Garratt, 1992, Table A5). Regardless of what form of these relationships might be used, their significance over any short period of time is questionable, since all of these relationships between fluxes and gradients apply to averages that might rarely occur. It is well accepted that the assumptions of stationarity and homogeneity do not reflect the true chaotic nature of the processes that control the variables considered in these relationships, with the net consequence that the levels of predictability theoretically attainable might never be realized in practice. This matter is of direct relevance to modern prognostic models, which construct forecasts by assuming the universal applicability of relationships among averages for the lower atmosphere, which rarely maintains an average state. Under a Cooperative Research and Development Agreement between NOAA and Duke Energy Generation, NOAA/ATDD conducted atmospheric boundary layer (ABL) research using Duke renewable energy sites as research testbeds. One aspect of this research has been the evaluation of legacy flux-gradient formulations (the ϕ functions; see Monin and Obukhov, 1954) for the exchange of heat and momentum. At the Duke Energy Ocotillo site, NOAA/ATDD installed sonic anemometers reporting wind and temperature fluctuations at 10 Hz at eight elevations. From these observations, ϕM and ϕH were derived from a two-year database of mean and turbulent wind and temperature observations. From this extensive measurement database, using a
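One widely quoted choice for the empirical ϕ functions is the Businger-Dyer form, sketched below; the 16 and 5 coefficients are the commonly cited values, and the record does not say which of the alternatives catalogued in Garratt (1992) the study evaluated, so treat this as a representative example only.

```python
def phi_m(zeta):
    """Dimensionless wind shear, Businger-Dyer form; zeta = z/L."""
    if zeta >= 0.0:                        # stable conditions
        return 1.0 + 5.0 * zeta
    return (1.0 - 16.0 * zeta) ** -0.25    # unstable conditions

def phi_h(zeta):
    """Dimensionless temperature gradient, Businger-Dyer form."""
    if zeta >= 0.0:
        return 1.0 + 5.0 * zeta
    return (1.0 - 16.0 * zeta) ** -0.5

# Tabulate the stability functions across the stability range.
for zeta in (-1.0, -0.1, 0.0, 0.1, 1.0):
    print(f"z/L = {zeta:+.1f}  phi_m = {phi_m(zeta):.3f}  "
          f"phi_h = {phi_h(zeta):.3f}")
```

Comparing ϕM and ϕH derived from the sonic-anemometer database against closed forms like these is the kind of evaluation the record describes.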

  12. Resolving structural uncertainty in natural resources management using POMDP approaches

    Science.gov (United States)

    Williams, B.K.

    2011-01-01

    In recent years there has been a growing focus on the uncertainties of natural resources management, and the importance of accounting for uncertainty in assessing management effectiveness. This paper focuses on uncertainty in resource management in terms of discrete-state Markov decision processes (MDP) under structural uncertainty and partial observability. It describes the treatment of structural uncertainty with approaches developed for partially observable resource systems. In particular, I show how value iteration for partially observable MDPs (POMDP) can be extended to structurally uncertain MDPs. A key difference between these process classes is that structurally uncertain MDPs require the tracking of system state as well as a probability structure for the structural uncertainty, whereas POMDPs require only a probability structure for the observation uncertainty. The added complexity of the optimization problem under structural uncertainty is compensated by reduced dimensionality in the search for the optimal strategy. A solution algorithm for structurally uncertain processes is outlined for a simple example in conservation biology. By building on the conceptual framework developed for POMDPs, natural resource analysts and decision makers who confront structural uncertainties in natural resources can take advantage of the rapid growth in POMDP methods and approaches, and thereby produce better conservation strategies over a larger class of resource problems.
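The core of the structural-uncertainty treatment, tracking a probability over candidate model structures alongside the observed system state, reduces to a Bayes update after each observed transition. A toy sketch with two invented candidate transition models for a two-state resource system (not the paper's conservation-biology example):

```python
# Two candidate transition models (the structural uncertainty) for a
# two-state resource system; rows give P(next_state | state). Both
# models and the observed trajectory are invented for illustration.
model_a = {0: [0.8, 0.2], 1: [0.3, 0.7]}
model_b = {0: [0.5, 0.5], 1: [0.1, 0.9]}

def update_belief(belief, state, next_state):
    """Bayes update of P(model_a is the true structure) after
    observing a transition state -> next_state."""
    la = model_a[state][next_state]
    lb = model_b[state][next_state]
    return belief * la / (belief * la + (1 - belief) * lb)

belief = 0.5  # start indifferent between the two structures
for s, s_next in [(0, 0), (0, 0), (0, 1), (1, 1)]:
    belief = update_belief(belief, s, s_next)
print(f"P(model A) after four observed transitions: {belief:.3f}")
```

In the POMDP-style formulation, this belief becomes part of the information state over which value iteration searches for an optimal strategy.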

  13. Are models, uncertainty, and dispute resolution compatible?

    Science.gov (United States)

    Anderson, J. D.; Wilson, J. L.

    2013-12-01

    Models and their uncertainty often move from an objective use in planning and decision making into the regulatory environment, then sometimes on to dispute resolution through litigation or other legal forums. Through this last transition, whatever objectivity the models and uncertainty assessment may have once possessed becomes biased (or more biased), as each party chooses to exaggerate either the goodness of a model or its worthlessness, depending on which view is in its best interest. If worthlessness is desired, then what was uncertain becomes unknown, or even unknowable. If goodness is desired, then precision and accuracy are often exaggerated and uncertainty, if it is explicitly recognized, encompasses only some parameters or conceptual issues, ignores others, and may minimize the uncertainty that it accounts for. In dispute resolution, how well is the adversarial process able to deal with these biases? The challenge is that they are often cloaked in computer graphics and animations that appear to lend realism to what could be mostly fancy, or even a manufactured outcome. While junk science can be challenged through appropriate motions in federal court, and in most state courts, it is not unusual for biased or even incorrect modeling results, or conclusions based on incorrect results, to be permitted to be presented at trial. Courts allow opinions that are based on a "reasonable degree of scientific certainty," but when that 'certainty' is grossly exaggerated by an expert, one way or the other, how well do the courts determine that someone has stepped over the line? Trials are based on the adversary system of justice, so opposing and often irreconcilable views are commonly allowed, leaving it to the judge or jury to sort out the truth. Can advances in scientific theory and engineering practice, related to both modeling and uncertainty, help address this situation and better ensure that juries and judges see more objective modeling results, or at least see

  14. Uncertainty quantification and stochastic modeling with Matlab

    CERN Document Server

    Souza de Cursi, Eduardo

    2015-01-01

    Uncertainty Quantification (UQ) is a relatively new research area which describes the methods and approaches used to supply quantitative descriptions of the effects of uncertainty, variability and errors in simulation problems and models. It is rapidly becoming a field of increasing importance, with many real-world applications within statistics, mathematics, probability and engineering, but also within the natural sciences. Literature on the topic has up until now been largely based on polynomial chaos, which raises difficulties when considering different types of approximation and does no

  15. Uncertainty quantification for Markov chain models.

    Science.gov (United States)

    Meidani, Hadi; Ghanem, Roger

    2012-12-01

    Transition probabilities serve to parameterize Markov chains and control their evolution and associated decisions and controls. Uncertainties in these parameters can be associated with inherent fluctuations in the medium through which a chain evolves, or with insufficient data such that the inferential value of the chain is jeopardized. The behavior of Markov chains associated with such uncertainties is described using a probabilistic model for the transition matrices. The principle of maximum entropy is used to characterize the probability measure of the transition rates. The formalism is demonstrated on a Markov chain describing the spread of disease, and a number of quantities of interest, pertaining to different aspects of decision-making, are investigated.
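One simple way to realize a probability measure over transition matrices is to draw each row from a Dirichlet distribution; this is a stand-in for the maximum-entropy construction in the paper, not the paper's own formalism, and the SIR-like states and concentration parameters below are invented for illustration.

```python
import random

random.seed(2)

def sample_row(alpha):
    """Draw one transition-probability row from a Dirichlet
    distribution via normalized Gamma draws."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [x / total for x in g]

# SIR-like chain (states S, I, R); concentration parameters encode how
# strongly the data supports each nominal transition probability.
# All numbers are invented for illustration.
alpha = {
    "S": [8.0, 2.0, 0.01],    # mostly stay susceptible, sometimes infected
    "I": [0.01, 6.0, 4.0],    # stay infected or recover
    "R": [0.01, 0.01, 10.0],  # recovered state is nearly absorbing
}

def infected_after(P, steps=5, dist=(1.0, 0.0, 0.0)):
    # Push the state distribution through the sampled chain.
    for _ in range(steps):
        dist = tuple(sum(dist[i] * P[i][j] for i in range(3))
                     for j in range(3))
    return dist[1]

results = sorted(
    infected_after([sample_row(alpha[s]) for s in ("S", "I", "R")])
    for _ in range(2000)
)
print(f"infected fraction after 5 steps, 90% band: "
      f"{results[100]:.3f} to {results[1900]:.3f}")
```

Quantities of interest such as the fraction infected then inherit an uncertainty band from the randomness in the transition matrix itself, which is the kind of decision-relevant output the record describes.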

  16. On structural design optimization under uncertainty and risk

    Science.gov (United States)

    Teófilo Beck, André; José de Santana Gomes, Wellison

    2010-06-01

    In this paper, the effects of uncertainty and risk on structural design optimization are investigated by comparing results of Deterministic Design Optimization (DDO), Reliability-based Design Optimization (RBDO) and Reliability-based Risk Optimization (RBRO). DDO yields a structural topology (or shape) which is optimum in terms of mechanics, but does not explicitly address parameter uncertainties and their effects on structural safety. RBDO properly models safety-under-uncertainty, allowing the optimum structure to maintain an acceptable level of safety. Results, however, are dependent on the failure probability used as a constraint. Risk optimization (RBRO) increases the scope of the problem by addressing the competing goals of economy and safety. This is accomplished by quantifying the costs associated with construction, operation and maintenance, as well as the monetary consequences of failure. RBRO yields the optimum topology and the optimum point of balance between economy and safety. Results are compared for some example problems. The broader RBRO solution is found first, and the optimum results are used as constraints in DDO and RBDO. Results show that even when the optimum safety coefficients are used as a constraint in DDO, the formulation leads to optimum configurations which respect these design constraints and reduce manufacturing costs, but increase total expected costs (including the expected cost of failure). If the (optimum) system failure probability is used as a constraint in RBDO, the optimum solution reduces manufacturing costs but increases total expected costs. This happens when the costs associated with different failure modes are distinct.
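The RBRO trade-off, construction cost rising with safety while expected failure cost falls, can be sketched in one dimension over the reliability index β. The cost coefficients below are invented for illustration; real RBRO problems optimize topology and member dimensions, not β directly.

```python
import math

def failure_prob(beta):
    """Failure probability for reliability index beta: the standard
    normal tail, Pf = Phi(-beta)."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

def total_expected_cost(beta, c0=1.0, growth=0.1, c_fail=500.0):
    # Construction cost rises with safety; expected failure cost
    # falls. Cost coefficients are invented for illustration.
    return c0 * (1.0 + growth * beta) + c_fail * failure_prob(beta)

# Scan reliability indices to locate the RBRO balance point between
# economy (cheap construction) and safety (low expected failure cost).
betas = [1.0 + 0.01 * i for i in range(400)]
best = min(betas, key=total_expected_cost)
print(f"optimal reliability index: {best:.2f}, Pf = {failure_prob(best):.2e}")
```

The minimizer is the "optimum point of balance between economy and safety" the record refers to; fixing its failure probability as a constraint recovers the RBDO formulation.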

  17. Representing uncertainty on model analysis plots

    Science.gov (United States)

    Smith, Trevor I.

    2016-12-01

    Model analysis provides a mechanism for representing student learning as measured by standard multiple-choice surveys. The model plot contains information regarding both how likely students in a particular class are to choose the correct answer and how likely they are to choose an answer consistent with a well-documented conceptual model. Unfortunately, Bao's original presentation of the model plot did not include a way to represent uncertainty in these measurements. I present details of a method to add error bars to model plots by expanding the work of Sommer and Lindell. I also provide a template for generating model plots with error bars.

  18. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    Groundwater modeling plays an essential role in modern subsurface hydrology research. It’s generally recognized that simulations and predictions by groundwater models are associated with uncertainties that originate from various sources. The two major uncertainty sources are related to model...... parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...

  19. Uncertainty Quantification for Optical Model Parameters

    CERN Document Server

    Lovell, A E; Sarich, J; Wild, S M

    2016-01-01

    Although uncertainty quantification has been making its way into nuclear theory, these methods have yet to be explored in the context of reaction theory. For example, it is well known that different parameterizations of the optical potential can result in different cross sections, but these differences have not been systematically studied and quantified. The purpose of this work is to investigate the uncertainties in nuclear reactions that result from fitting a given model to elastic-scattering data, as well as to study how these uncertainties propagate to the inelastic and transfer channels. We use statistical methods to determine a best fit and create corresponding 95% confidence bands. A simple model of the process is fit to elastic-scattering data and used to predict either inelastic or transfer cross sections. In this initial work, we assume that our model is correct, and the only uncertainties come from the variation of the fit parameters. We study a number of reactions involving neutron and deuteron p...
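
    The fit-then-band workflow described above can be illustrated on a toy problem. This sketch uses a linear model with synthetic data in place of an optical potential (all data and the model form are invented for illustration): fit by least squares, then propagate the parameter covariance into an approximate 95% confidence band.

```python
import math, random

# Toy stand-in for "fit a model, then draw 95% confidence bands":
# fit y = a + b*x to noisy synthetic data by ordinary least squares.
rng = random.Random(0)
xs = [0.5 * i for i in range(20)]
ys = [2.0 + 0.8 * x + rng.gauss(0, 0.3) for x in xs]

n = len(xs)
xbar = sum(xs) / n
sxx = sum((x - xbar) ** 2 for x in xs)
b = sum((x - xbar) * y for x, y in zip(xs, ys)) / sxx   # slope estimate
a = sum(ys) / n - b * xbar                              # intercept estimate
s2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)

def band(x, z=1.96):
    """95% band for the fitted mean at x (large-sample normal approximation)."""
    se = math.sqrt(s2 * (1.0 / n + (x - xbar) ** 2 / sxx))
    mean = a + b * x
    return mean - z * se, mean + z * se
```

    The band widens away from the data centroid, which is the same qualitative behavior expected when extrapolating a fitted optical potential to unmeasured channels.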

  20. Uncertainty of Modal Parameters Estimated by ARMA Models

    DEFF Research Database (Denmark)

    Jensen, Jacob Laigaard; Brincker, Rune; Rytter, Anders

    1990-01-01

    In this paper the uncertainties of identified modal parameters such as eigenfrequencies and damping ratios are assessed. From the measured response of dynamically excited structures the modal parameters may be identified and provide important structural knowledge. However, the uncertainty of the parameters...... by a simulation study of a lightly damped single-degree-of-freedom system. Identification by ARMA models has been chosen as the system identification method. It is concluded that both the sampling interval and the number of sampled points may play a significant role with respect to the statistical errors. Furthermore...
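
    The core of the identification step can be sketched for the noise-free case. This is a simplified AR(2) special case of the ARMA identification discussed above, with invented system parameters: a sampled single-degree-of-freedom decay is fit by least squares, and the eigenfrequency and damping ratio are recovered from the discrete-time pole.

```python
import math, cmath

# Simplified, noise-free sketch: identify eigenfrequency and damping ratio
# of an SDOF system from its sampled free response via an AR(2) model.
f_true, zeta_true, dt = 2.0, 0.02, 0.01          # Hz, damping ratio, seconds
wn = 2 * math.pi * f_true
wd = wn * math.sqrt(1 - zeta_true ** 2)
x = [math.exp(-zeta_true * wn * n * dt) * math.cos(wd * n * dt)
     for n in range(500)]

# least-squares fit of x[n] = a1*x[n-1] + a2*x[n-2] via the normal equations
idx = range(2, len(x))
s11 = sum(x[n - 1] ** 2 for n in idx)
s22 = sum(x[n - 2] ** 2 for n in idx)
s12 = sum(x[n - 1] * x[n - 2] for n in idx)
b1 = sum(x[n] * x[n - 1] for n in idx)
b2 = sum(x[n] * x[n - 2] for n in idx)
det = s11 * s22 - s12 ** 2
a1 = (b1 * s22 - b2 * s12) / det
a2 = (b2 * s11 - b1 * s12) / det

# pole of z^2 - a1*z - a2 = 0 maps back to continuous-time modal parameters
pole = (a1 + cmath.sqrt(a1 ** 2 + 4 * a2)) / 2
log_pole = cmath.log(pole)
f_est = abs(log_pole) / (2 * math.pi * dt)       # identified eigenfrequency
zeta_est = -log_pole.real / abs(log_pole)        # identified damping ratio
```

    With measurement noise added, repeating this fit over many realizations yields the statistical spread of the estimates that the paper assesses as a function of sampling interval and record length.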

  1. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    The FP7 funded UncertWeb project aims to create the "uncertainty enabled model web". The central concept here is that geospatial models and data resources are exposed via standard web service interfaces, such as the Open Geospatial Consortium (OGC) suite of encodings and interface standards, allowing the creation of complex workflows combining both data and models. The focus of UncertWeb is on the issue of managing uncertainty in such workflows, and providing the standards, architecture, tools and software support necessary to realise the "uncertainty enabled model web". In this paper we summarise the developments in the first two years of UncertWeb, illustrating several key points with examples taken from the use case requirements that motivate the project. Firstly we address the issue of encoding specifications. We explain the usage of UncertML 2.0, a flexible encoding for representing uncertainty based on a probabilistic approach. This is designed to be used within existing standards such as Observations and Measurements (O&M) and data quality elements of ISO19115 / 19139 (geographic information metadata and encoding specifications) as well as more broadly outside the OGC domain. We show profiles of O&M that have been developed within UncertWeb and how UncertML 2.0 is used within these. We also show encodings based on NetCDF and discuss possible future directions for encodings in JSON. We then discuss the issues of workflow construction, considering discovery of resources (both data and models). We discuss why a brokering approach to service composition is necessary in a world where the web service interfaces remain relatively heterogeneous, including many non-OGC approaches, in particular the more mainstream SOAP and WSDL approaches. We discuss the trade-offs between delegating uncertainty management functions to the service interfaces themselves and integrating the functions in the workflow management system. We describe two utility services to address

  2. Uncertainty in Regional Air Quality Modeling

    Science.gov (United States)

    Digar, Antara

    Effective pollution mitigation is the key to successful air quality management. Although states invest millions of dollars to predict future air quality, the regulatory modeling and analysis process to inform pollution control strategy remains uncertain. Traditionally deterministic ‘bright-line’ tests are applied to evaluate the sufficiency of a control strategy to attain an air quality standard. A critical part of regulatory attainment demonstration is the prediction of future pollutant levels using photochemical air quality models. However, because models are uncertain, they yield a false sense of precision that pollutant response to emission controls is perfectly known and may eventually mislead the selection of control policies. These uncertainties in turn affect the health impact assessment of air pollution control strategies. This thesis explores beyond the conventional practice of deterministic attainment demonstration and presents novel approaches to yield probabilistic representations of pollutant response to emission controls by accounting for uncertainties in regional air quality planning. Computationally-efficient methods are developed and validated to characterize uncertainty in the prediction of secondary pollutant (ozone and particulate matter) sensitivities to precursor emissions in the presence of uncertainties in model assumptions and input parameters. We also introduce impact factors that enable identification of model inputs and scenarios that strongly influence pollutant concentrations and sensitivity to precursor emissions. We demonstrate how these probabilistic approaches could be applied to determine the likelihood that any control measure will yield regulatory attainment, or could be extended to evaluate probabilistic health benefits of emission controls, considering uncertainties in both air quality models and epidemiological concentration-response relationships. Finally, ground-level observations for pollutant (ozone) and precursor

  3. [Application of an uncertainty model for fibromyalgia].

    Science.gov (United States)

    Triviño Martínez, Ángeles; Solano Ruiz, M Carmen; Siles González, José

    2016-04-01

    To explore the experiences of women diagnosed with fibromyalgia, applying the Theory of Uncertainty proposed by M. Mishel. A qualitative study was conducted using a phenomenological approach, at a patients' association in the province of Alicante from June 2012 to November 2013. A total of 14 women diagnosed with fibromyalgia, aged between 45 and 65 years, participated in the study as volunteers. Information was generated through structured interviews, recorded and transcribed after a confidentiality pledge and informed consent. Content analysis was performed by extracting categories according to the proposed theory. The study patients perceive a high level of uncertainty related to the difficulty of dealing with symptoms, uncertainty about the diagnosis, and the complexity of treatment. Moreover, the ability to cope with the disease is influenced by social support, relationships with health professionals, and the help and information provided by patient associations. Health professionals must provide clear information on the pathology to fibromyalgia sufferers: the greater the patients' knowledge of their disease and the better the quality of the information provided, the less the anxiety and uncertainty in the experience of the disease. Likewise, patient associations should involve health professionals in order to avoid bias in information and to provide advice based on scientific evidence. Copyright © 2015 Elsevier España, S.L.U. All rights reserved.

  4. Uncertainty and Sensitivity in Surface Dynamics Modeling

    Science.gov (United States)

    Kettner, Albert J.; Syvitski, James P. M.

    2016-05-01

    The papers in this special issue on 'Uncertainty and Sensitivity in Surface Dynamics Modeling' stem from submissions following the 2014 annual meeting of the Community Surface Dynamics Modeling System (CSDMS). CSDMS facilitates a diverse community of experts (now in 68 countries) that collectively investigates the Earth's surface, the dynamic interface between lithosphere, hydrosphere, cryosphere, and atmosphere, by promoting, developing, supporting and disseminating integrated open-source software modules. By organizing more than 1500 researchers, CSDMS has the privilege of identifying community strengths and weaknesses in the practice of software development. We recognize, for example, that progress has been slow on identifying and quantifying uncertainty and sensitivity in numerical modeling of Earth's surface dynamics. This special issue is meant to raise awareness of these important subjects and highlight state-of-the-art progress.

  5. Physical and Model Uncertainty for Fatigue Design of Composite Material

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    The main aim of the present report is to establish stochastic models for the uncertainties related to fatigue design of composite materials. The uncertainties considered are the physical uncertainty related to the static and fatigue strength and the model uncertainty related to Miner's rule...

  6. Influence of model reduction on uncertainty of flood inundation predictions

    Science.gov (United States)

    Romanowicz, R. J.; Kiczko, A.; Osuch, M.

    2012-04-01

    Derivation of flood risk maps requires an estimation of the maximum inundation extent for a flood with an assumed probability of exceedance, e.g. a 100- or 500-year flood. The results of numerical simulations of flood wave propagation are used to overcome the lack of relevant observations. In practice, deterministic 1-D models are used for flow routing, giving a simplified image of the flood wave propagation process. The solution of a 1-D model depends on the simplifications to the model structure, the initial and boundary conditions, and the estimates of model parameters, which are usually identified by solving the inverse problem based on the available noisy observations. Therefore, there is a large uncertainty involved in the derivation of flood risk maps. In this study we examine the influence of model structure simplifications on estimates of flood extent for an urban river reach. As the study area we chose the Warsaw reach of the River Vistula, where nine bridges and several dikes are located. The aim of the study is to examine the influence of water structures on the derived model roughness parameters, with all the bridges and dikes taken into account, with a reduced number, and without any water infrastructure. The results indicate that roughness parameter values of a 1-D HEC-RAS model can be adjusted to compensate for the reduction in model structure. However, the price we pay is the model's robustness. Apart from this relatively simple question of reducing model structure, we also try to answer more fundamental questions regarding the relative importance of input, model structure simplification, parametric and rating curve uncertainty to the uncertainty of flood extent estimates. We apply pseudo-Bayesian methods of uncertainty estimation and Global Sensitivity Analysis as the main methodological tools. The results indicate that the uncertainties have a substantial influence on flood risk assessment. In the paper we present a simplified methodology allowing the influence of

  7. Quantifying uncertainty in stable isotope mixing models

    Science.gov (United States)

    Davis, Paul; Syme, James; Heikoop, Jeffrey; Fessenden-Rahn, Julianna; Perkins, George; Newman, Brent; Chrystal, Abbey E.; Hagerty, Shannon B.

    2015-05-01

    Mixing models are powerful tools for identifying biogeochemical sources and determining mixing fractions in a sample. However, identification of actual source contributors is often not simple, and source compositions typically vary or even overlap, significantly increasing model uncertainty in calculated mixing fractions. This study compares three probabilistic methods: Stable Isotope Analysis in R (SIAR), a pure Monte Carlo technique (PMC), and the Stable Isotope Reference Source (SIRS) mixing model, a new technique that estimates mixing in systems with more than three sources and/or uncertain source compositions. In this paper, we use nitrate stable isotope examples (δ15N and δ18O) but all methods tested are applicable to other tracers. In Phase I of a three-phase blind test, we compared methods for a set of six-source nitrate problems. PMC was unable to find solutions for two of the target water samples. The Bayesian method, SIAR, experienced anchoring problems, and SIRS calculated mixing fractions that most closely approximated the known mixing fractions. For that reason, SIRS was the only approach used in the next phase of testing. In Phase II, the problem was broadened so that any subset of the six sources could be a possible solution to the mixing problem. Results showed a high rate of Type I errors, where solutions included sources that were not contributing to the sample. In Phase III, some sources were eliminated based on assumed site knowledge and assumed nitrate concentrations, which substantially reduced mixing-fraction uncertainties and lowered the Type I error rate. These results demonstrate that valuable insights into stable isotope mixing problems result from probabilistic mixing model approaches like SIRS. The results also emphasize the importance of identifying a minimal set of potential sources and quantifying uncertainties in source isotopic composition, as well as demonstrating the value of additional information in reducing the uncertainty in calculated
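
    The pure Monte Carlo (PMC) idea compared above can be sketched with a small invented example. The three source signatures, the measured sample, and the tolerance below are all hypothetical illustration values, not the study's data: random fraction triples on the simplex are accepted when the mixture reproduces the sample within tolerance.

```python
import random

# Hedged PMC sketch with invented nitrate source signatures (d15N, d18O):
# accept fraction triples whose mixture matches the sample within `tol`.
sources = {"fertilizer": (0.0, 22.0), "soil": (5.0, 3.0), "manure": (12.0, 6.0)}
sample = (6.0, 7.0)          # hypothetical measured (d15N, d18O) of the water
tol = 0.5

rng = random.Random(1)
accepted = []
for _ in range(50000):
    u = sorted(rng.random() for _ in range(2))
    f = (u[0], u[1] - u[0], 1.0 - u[1])          # uniform draw on the simplex
    mix = tuple(sum(fi * s[k] for fi, s in zip(f, sources.values()))
                for k in range(2))
    if all(abs(m - t) <= tol for m, t in zip(mix, sample)):
        accepted.append(f)

# posterior-like summary: mean accepted fraction per source
mean_f = tuple(sum(a[i] for a in accepted) / len(accepted) for i in range(3))
```

    The spread of the accepted triples, not just their mean, is the uncertainty estimate; with overlapping or six-source problems the accepted set can become empty or multimodal, which is the failure mode the study reports for PMC.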

  8. Uncertainty modelling of critical column buckling for reinforced concrete buildings

    Indian Academy of Sciences (India)

    Kasim A Korkmaz; Fuat Demir; Hamide Tekeli

    2011-04-01

    Buckling is a critical issue for structural stability in structural design. In most buckling analyses, the applied loads and the structural and material properties are considered certain. However, in reality, these parameters are uncertain. Therefore, a prognostic solution is necessary and uncertainties have to be considered. Fuzzy logic algorithms can be a way to generate more dependable results. This study investigates the material uncertainties in column design and proposes an uncertainty model for critical column buckling in reinforced concrete buildings. A fuzzy logic algorithm was employed in the study. Lower and upper bounds of the elastic modulus, representing material properties, were defined to take uncertainties into account. The results show that uncertainties play an important role in stability analyses and should be considered in the design. The proposed approach is applicable to both future numerical and experimental research. According to the study results, the calculated buckling loads stay within the lower and upper bounds, while the load values differ for the same concrete strength when different code formulas are used.
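
    The bounding idea can be illustrated with a minimal interval sketch. The column geometry and the elastic-modulus band below are assumed illustration values (not the study's data), propagated through the classical Euler buckling load P_cr = π²EI/(kL)²:

```python
import math

# Minimal interval sketch: propagate an assumed lower/upper bound on the
# concrete elastic modulus through the Euler critical buckling load.
def buckling_load(E, I, L, k=1.0):
    """Euler critical load P_cr = pi^2 * E * I / (k*L)^2 (pinned-pinned k=1)."""
    return math.pi ** 2 * E * I / (k * L) ** 2

b = h = 0.30                       # assumed square cross-section, m
I = b * h ** 3 / 12                # second moment of area, m^4
L = 3.0                            # assumed storey height, m

E_low, E_high = 25e9, 33e9         # assumed uncertainty band around ~29 GPa
P_low = buckling_load(E_low, I, L)
P_high = buckling_load(E_high, I, L)
print(f"buckling load in [{P_low/1e6:.1f}, {P_high/1e6:.1f}] MN")
```

    A fuzzy treatment generalizes this by evaluating such intervals at several membership levels (alpha-cuts) rather than a single crisp band.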

  9. Representing Turbulence Model Uncertainty with Stochastic PDEs

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2012-11-01

    Validation of and uncertainty quantification for extrapolative predictions of RANS turbulence models are necessary to ensure that the models are not used outside of their domain of applicability and to properly inform decisions based on such predictions. In previous work, we have developed and calibrated statistical models for these purposes, but it has been found that incorporating all the knowledge of a domain expert--e.g., realizability, spatial smoothness, and known scalings--in such models is difficult. Here, we explore the use of stochastic PDEs for this purpose. The goal of this formulation is to pose the uncertainty model in a setting where it is easier for physical modelers to express what is known. To explore the approach, multiple stochastic models describing the error in the Reynolds stress are coupled with multiple deterministic turbulence models to make uncertain predictions of channel flow. These predictions are compared with DNS data to assess their credibility. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  10. Modeling and inverse problems in the presence of uncertainty

    CERN Document Server

    Banks, H T; Thompson, W Clayton

    2014-01-01

    Modeling and Inverse Problems in the Presence of Uncertainty collects recent research-including the authors' own substantial projects-on uncertainty propagation and quantification. It covers two sources of uncertainty: where uncertainty is present primarily due to measurement errors and where uncertainty is present due to the modeling formulation itself. After a useful review of relevant probability and statistical concepts, the book summarizes mathematical and statistical aspects of inverse problem methodology, including ordinary, weighted, and generalized least-squares formulations. It then

  11. Fault Detection under Fuzzy Model Uncertainty

    Institute of Scientific and Technical Information of China (English)

    Marek Kowal; Józef Korbicz

    2007-01-01

    The paper tackles the problem of robust fault detection using Takagi-Sugeno fuzzy models. A model-based strategy is employed to generate residuals in order to make a decision about the state of the process. Unfortunately, such a method is corrupted by model uncertainty due to the fact that in real applications there exists a model-reality mismatch. In order to ensure reliable fault detection the adaptive threshold technique is used to deal with the mentioned problem. The paper focuses also on fuzzy model design procedure. The bounded-error approach is applied to generating the rules for the model using available measurements. The proposed approach is applied to fault detection in the DC laboratory engine.

  12. Facets of Uncertainty in Digital Elevation and Slope Modeling

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jingxiong; LI Deren

    2005-01-01

    This paper investigates the differences that result from applying different approaches to uncertainty modeling and reports an experiment examining error estimation and propagation in elevation and slope, with the latter derived from the former. It is confirmed that significant differences exist between uncertainty descriptors, and that propagation of uncertainty to end products is immensely affected by the specification of source uncertainty.
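
    The elevation-to-slope propagation can be sketched with a toy Monte Carlo experiment. The grid, spacing, and elevation-error sigma below are invented illustration values: slope is computed by central differences on a 3x3 window, and perturbing the elevations shows how source uncertainty maps into slope uncertainty.

```python
import math, random

# Toy propagation sketch: elevation error (assumed sigma) -> slope spread.
dx = 10.0          # grid spacing, m (assumed)
sigma_z = 0.5      # assumed elevation error, m
z = [[100 + 0.5 * c * dx + 0.2 * r * dx for c in range(3)]
     for r in range(3)]                      # a tilted plane as the toy DEM

def slope_deg(z):
    """Slope from central differences on the 3x3 window, in degrees."""
    fx = (z[1][2] - z[1][0]) / (2 * dx)
    fy = (z[2][1] - z[0][1]) / (2 * dx)
    return math.degrees(math.atan(math.hypot(fx, fy)))

rng = random.Random(7)
trials = [slope_deg([[v + rng.gauss(0, sigma_z) for v in row] for row in z])
          for _ in range(5000)]
mean = sum(trials) / len(trials)
sd = (sum((t - mean) ** 2 for t in trials) / len(trials)) ** 0.5
```

    Swapping the Gaussian error model for a different uncertainty descriptor (e.g. spatially correlated error) changes `sd` markedly, which is the sensitivity to source-uncertainty specification the paper reports.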

  13. Structural uncertainty in air mass factor calculation for NO

    NARCIS (Netherlands)

    Lorente Delgado, Alba; Folkert Boersma, K.; Yu, Huan; Dörner, Steffen; Hilboll, Andreas; Richter, Andreas; Liu, Mengyao; Lamsal, Lok N.; Barkley, Michael; Smedt, De Isabelle; Roozendael, Van Michel; Wang, Yang; Wagner, Thomas; Beirle, Steffen; Lin, Jin Tai; Krotkov, Nickolay; Stammes, Piet; Wang, Ping; Eskes, Henk J.; Krol, Maarten

    2017-01-01

    Air mass factor (AMF) calculation is the largest source of uncertainty in NO2 and HCHO satellite retrievals in situations with enhanced trace gas concentrations in the lower troposphere. Structural uncertainty arises when different retrieval methodologies are applied within the scientific community

  14. Emotions That Associate With Uncertainty Lead to Structured Ideation

    NARCIS (Netherlands)

    Baas, Matthijs; de Dreu, Carsten; Nijstad, Bernard A.

    2012-01-01

    This study tested the role of emotion in structured ideation, a process in which newly generated ideas and insights closely follow previously generated ideas and insights. Emotions can be differentiated on a number of underlying dimensions, including uncertainty, and uncertainty can influence inform

  15. Using dynamical uncertainty models estimating uncertainty bounds on power plant performance prediction

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Stoustrup, Jakob; Mataji, B.

    2007-01-01

    Predicting the performance of large scale plants can be difficult due to model uncertainties etc., meaning that one can be almost certain that the prediction will diverge from the plant performance with time. In this paper output multiplicative uncertainty models are used as dynamical models of the prediction error. These proposed dynamical uncertainty models result in an upper and lower bound on the predicted performance of the plant. The dynamical uncertainty models are used to estimate the uncertainty of the predicted performance of a coal-fired power plant. The proposed scheme, which uses dynamical models, is applied to two different sets of measured plant data. The computed uncertainty bounds cover the measured plant output, while the nominal prediction is outside these uncertainty bounds for some samples in these examples.
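
    The bounding construction can be illustrated with invented numbers. All values below (nominal predictions, growing uncertainty factors, measurements) are assumptions for illustration: the nominal prediction is wrapped in multiplicative bounds whose width grows with the prediction horizon to reflect accumulating model error.

```python
# Sketch of output-multiplicative uncertainty bounds (invented numbers):
# y_hat[k]*(1 - delta[k]) .. y_hat[k]*(1 + delta[k]), delta growing with k.
y_hat = [300.0, 302.0, 305.0, 310.0, 318.0]    # nominal plant output, MW
delta = [0.01 + 0.005 * k for k in range(5)]   # assumed horizon-growing factor

bounds = [(y * (1 - d), y * (1 + d)) for y, d in zip(y_hat, delta)]
measured = [298.0, 304.0, 301.0, 315.0, 325.0]  # hypothetical measurements
covered = [lo <= m <= hi for (lo, hi), m in zip(bounds, measured)]
```

    In the paper the `delta` sequence comes from an identified dynamical model of the prediction error rather than a fixed schedule; the sketch only shows the bound-checking step.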

  16. Review of strategies for handling geological uncertainty in groundwater flow and transport modeling

    DEFF Research Database (Denmark)

    Refsgaard, Jens Christian; Christensen, Steen; Sonnenborg, Torben O.;

    2012-01-01

    The geologically related uncertainty in groundwater modeling originates from two main sources: geological structures and hydraulic parameter values within these structures. Within a geological structural element the parameter values will always exhibit local scale heterogeneity, which can be accounted for, but is often neglected, in assessments of prediction uncertainties. Strategies for assessing prediction uncertainty due to geologically related uncertainty may be divided into three main categories, accounting for uncertainty due to: (a) the geological structure; (b) effective model parameters; and (c) model parameters including local scale heterogeneity. The most common methodologies for uncertainty assessments within each of these categories, such as multiple modeling, Monte Carlo analysis, regression analysis and the moment equation approach, are briefly described with emphasis...

  17. Robust stabilization of general nonlinear systems with structural uncertainty

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    This paper deals with the robust stabilization and passivity of general nonlinear systems with structural uncertainty. By using a Lyapunov function, it verifies that under some conditions robust passivity implies zero-state detectability. Furthermore, it also implies robust stabilization for such nonlinear systems. We then establish a stabilization method for nonlinear systems with structural uncertainty. The smooth state feedback law can be constructed from the solution of an equation. Finally, it is worth noting that the main contribution of the paper is to establish the relation between robust passivity and feedback stabilization for general nonlinear systems with structural uncertainty. A simulation shows the effectiveness of the method.

  18. Model uncertainty and Bayesian model averaging in vector autoregressive processes

    NARCIS (Netherlands)

    R.W. Strachan (Rodney); H.K. van Dijk (Herman)

    2006-01-01

    textabstractEconomic forecasts and policy decisions are often informed by empirical analysis based on econometric models. However, inference based upon a single model, when several viable models exist, limits its usefulness. Taking account of model uncertainty, a Bayesian model averaging procedure i

  19. Uncertainty Quantification in Experimental Structural Dynamics Identification of Composite Material Structures

    DEFF Research Database (Denmark)

    Luczak, Marcin; Peeters, Bart; Kahsin, Maciej

    2014-01-01

    Aerospace and wind energy structures are extensively using components made of composite materials. Since these structures are subjected to dynamic environments with time-varying loading conditions, it is important to model their dynamic behavior and validate these models by means of vibration...... for uncertainty evaluation in experimentally estimated models. Investigated structures are plates, fuselage panels and helicopter main rotor blades as they represent different complexity levels ranging from coupon, through sub-component up to fully assembled structures made of composite materials. To evaluate...

  20. Uncertainty Assessment in Urban Storm Water Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    The object of this paper is to make an overall description of the author's PhD study, concerning uncertainties in numerical urban storm water drainage models. Initially an uncertainty localization and assessment of model inputs and parameters as well as uncertainties caused by different model...

  1. Uncertainty and the Conceptual Site Model

    Science.gov (United States)

    Price, V.; Nicholson, T. J.

    2007-12-01

    Our focus is on uncertainties in the underlying conceptual framework upon which all subsequent steps in numerical and/or analytical modeling efforts depend. Experienced environmental modelers recognize the value of selecting an optimal conceptual model from several competing site models, but usually do not formally explore possible alternative models, in part due to incomplete or missing site data, as well as relevant regional data for establishing boundary conditions. The value in and approach for developing alternative conceptual site models (CSM) is demonstrated by analysis of case histories. These studies are based on reported flow or transport modeling in which alternative site models are formulated using data that were not available to, or not used by, the original modelers. An important concept inherent to model abstraction of these alternative conceptual models is that it is "Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise." (Tukey, 1962) The case histories discussed here illustrate the value of formulating alternative models and evaluating them using site-specific data: (1) Charleston Naval Site where seismic characterization data allowed significant revision of the CSM and subsequent contaminant transport modeling; (2) Hanford 300-Area where surface- and ground-water interactions affecting the unsaturated zone suggested an alternative component to the site model; (3) Savannah River C-Area where a characterization report for a waste site within the modeled area was not available to the modelers, but provided significant new information requiring changes to the underlying geologic and hydrogeologic CSM's used; (4) Amargosa Desert Research Site (ADRS) where re-interpretation of resistivity sounding data and water-level data suggested an alternative geologic model. Simple 2-D spreadsheet modeling of the ADRS with the revised CSM provided an improved

  2. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem given the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable depends primarily on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accommodation of heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by appropriate link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method to model the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the

  3. [Proposal] Addressing structural uncertainty in a decision-making framework to inform scaup conservation planning

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — We are proposing to examine the role of harvest in annual and seasonal survival of lesser scaup, a key structural uncertainty in current population models identified...

  4. Assessing uncertainties in solute transport models: Upper Narew case study

    Science.gov (United States)

    Osuch, M.; Romanowicz, R.; Napiórkowski, J. J.

    2009-04-01

    This paper evaluates uncertainties in two solute transport models based on tracer experiment data from the Upper River Narew. Data Based Mechanistic and transient storage models were applied to Rhodamine WT tracer observations. We focus on the analysis of uncertainty and the sensitivity of model predictions to varying physical parameters, such as dispersion and channel geometry. An advection-dispersion model with dead zones (Transient Storage model) adequately describes the transport of pollutants in a single channel river with multiple storage. The applied transient storage model is deterministic; it assumes that observations are free of errors and the model structure perfectly describes the process of transport of conservative pollutants. In order to take into account the model and observation errors, an uncertainty analysis is required. In this study we used a combination of the Generalized Likelihood Uncertainty Estimation technique (GLUE) and the variance based Global Sensitivity Analysis (GSA). The combination is straightforward as the same samples (Sobol samples) were generated for GLUE analysis and for sensitivity assessment. Additionally, the results of the sensitivity analysis were used to specify the best parameter ranges and their prior distributions for the evaluation of predictive model uncertainty using the GLUE methodology. Apart from predictions of pollutant transport trajectories, two ecological indicators were also studied (time over the threshold concentration and maximum concentration). In particular, a sensitivity analysis of the length of "over the threshold" period shows an interesting multi-modal dependence on model parameters. This behavior is a result of the direct influence of parameters on different parts of the dynamic response of the system. As an alternative to the transient storage model, a Data Based Mechanistic approach was tested. 
Here, the model is identified and the parameters are estimated from available time series data using
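
    The GLUE stage described above can be sketched in a few lines of Python. This is an illustration, not the authors' code: a hypothetical exponential-decay surrogate (`toy_transport`) stands in for the transient storage model, plain pseudo-random uniform sampling stands in for the Sobol samples, and an informal exp(-SSE) likelihood measure is assumed.

```python
import math
import random

def toy_transport(dispersion, storage, times):
    # Hypothetical surrogate for a tracer model: concentration decays with
    # time, faster for high dispersion, damped by the storage-zone size.
    return [math.exp(-dispersion * t) / (1.0 + storage) for t in times]

def glue(observed, times, n_samples=5000, threshold=0.9, seed=1):
    """Minimal GLUE sketch: sample parameter sets, score each with an
    informal likelihood, keep the 'behavioral' sets above a threshold."""
    rng = random.Random(seed)
    behavioral = []
    for _ in range(n_samples):
        d = rng.uniform(0.1, 2.0)      # dispersion-like parameter
        s = rng.uniform(0.0, 1.0)      # storage-like parameter
        sim = toy_transport(d, s, times)
        sse = sum((a - b) ** 2 for a, b in zip(sim, observed))
        likelihood = math.exp(-sse)    # informal GLUE likelihood measure
        if likelihood >= threshold:
            behavioral.append(((d, s), likelihood, sim))
    return behavioral

def prediction_bounds(behavioral, i):
    # Envelope of the behavioral simulations at time index i.
    values = [sim[i] for _, _, sim in behavioral]
    return min(values), max(values)
```

    The behavioral sets then yield predictive envelopes at each time step, mirroring the predictive-uncertainty stage of the paper.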

  5. Advances in the study of uncertainty quantification of large-scale hydrological modeling system

    Institute of Scientific and Technical Information of China (English)

    SONG Xiaomeng; ZHAN Chesheng; KONG Fanzhe; XIA Jun

    2011-01-01

    The regional hydrological system is extremely complex because it is affected not only by physical factors but also by human dimensions, and hydrological models play a very important role in simulating this complex system. However, effective methods for analyzing model reliability and uncertainty have been lacking because of this complexity. The uncertainties in hydrological modeling arise from four important sources: uncertainties in input data and parameters, uncertainties in model structure, uncertainties in the analysis method, and uncertainties in the initial and boundary conditions. This paper systematically reviews recent advances in uncertainty analysis approaches for large-scale complex hydrological models, organized by uncertainty source. The shortcomings and insufficiencies of current uncertainty analyses for complex hydrological models are also pointed out. A new uncertainty quantification platform, PSUADE, and its uncertainty quantification methods are then introduced as a powerful tool and platform for the uncertainty analysis of large-scale complex hydrological models. Finally, some future perspectives on uncertainty quantification are put forward.

  6. Systematic Uncertainties in High-Energy Hadronic Interaction Models

    Science.gov (United States)

    Zha, M.; Knapp, J.; Ostapchenko, S.

    2003-07-01

    Hadronic interaction models for cosmic ray energies are uncertain since our knowledge of hadronic interactions is extrapolated from accelerator experiments at much lower energies. At present most high-energy models are based on Gribov-Regge theory of multi-Pomeron exchange, which provides a theoretical framework to evaluate cross-sections and particle production. While experimental data constrain some of the model parameters, others are not well determined and are therefore a source of systematic uncertainties. In this paper we evaluate the variation of results obtained with the QGSJET model, when modifying parameters relating to three major sources of uncertainty: the form of the parton structure function, the role of diffractive interactions, and the string hadronisation. Results on inelastic cross sections, on secondary particle production and on the air shower development are discussed.

  7. Management of California Oak Woodlands: Uncertainties and Modeling

    Science.gov (United States)

    Jay E. Noel; Richard P. Thompson

    1995-01-01

    A mathematical policy model of oak woodlands is presented. The model illustrates the policy uncertainties that exist in the management of oak woodlands. These uncertainties include: (1) selection of a policy criterion function, (2) woodland dynamics, (3) initial and final state of the woodland stock. The paper provides a review of each of the uncertainty issues. The...

  8. Representing and managing uncertainty in qualitative ecological models

    NARCIS (Netherlands)

    Nuttle, T.; Bredeweg, B.; Salles, P.; Neumann, M.

    2009-01-01

    Ecologists and decision makers need ways to understand systems, test ideas, and make predictions and explanations about systems. However, uncertainty about causes and effects of processes and parameter values is pervasive in models of ecological systems. Uncertainty associated with incomplete

  9. Gaze categorization under uncertainty: psychophysics and modeling.

    Science.gov (United States)

    Mareschal, Isabelle; Calder, Andrew J; Dadds, Mark R; Clifford, Colin W G

    2013-04-22

    The accurate perception of another person's gaze direction underlies most social interactions and provides important information about his or her future intentions. As a first step to measuring gaze perception, most experiments determine the range of gaze directions that observers judge as being direct: the cone of direct gaze. This measurement has revealed the flexibility of observers' perception of gaze and provides a useful benchmark against which to test clinical populations with abnormal gaze behavior. Here, we manipulated effective signal strength by adding noise to the eyes of synthetic face stimuli or removing face information. We sought to move beyond a descriptive account of gaze categorization by fitting a model to the data that relies on changing the uncertainty associated with an estimate of gaze direction as a function of the signal strength. This model accounts for all the data and provides useful insight into the visual processes underlying normal gaze perception.

  10. Verification and Uncertainty Reduction of Amchitka Underground Nuclear Testing Models

    Energy Technology Data Exchange (ETDEWEB)

    Ahmed Hassan; Jenny Chapman

    2006-02-01

    The modeling of the Amchitka underground nuclear tests, conducted in 2002, is verified, and uncertainty in model input parameters, as well as in predictions, has been reduced using newly collected data obtained by the summer 2004 field expedition of CRESP. Newly collected data that pertain to the groundwater model include magnetotelluric (MT) surveys conducted on the island to determine the subsurface salinity and porosity structure, and bathymetric surveys to produce bathymetric maps of the areas offshore from the Long Shot and Cannikin Sites. Analysis and interpretation of the MT data yielded information on the location of the transition zone, and porosity profiles showing porosity values decaying with depth. These new data sets are used to verify the original model in terms of model parameters, model structure, and model output. In addition, by using the new data along with the existing data (chemistry and head data), the uncertainty in model input and output is decreased by conditioning on all the available data. A Markov Chain Monte Carlo (MCMC) approach is adapted for developing new input parameter distributions conditioned on prior knowledge and new data. The MCMC approach is a form of Bayesian conditioning that is constructed in such a way that it produces samples of the model parameters that eventually converge to a stationary posterior distribution. The Bayesian MCMC approach enhances probabilistic assessment. Instead of simply propagating uncertainty forward from input parameters into model predictions (i.e., the traditional Monte Carlo approach), MCMC propagates uncertainty backward from data onto parameters, and then forward from parameters into predictions.
Comparisons between new data and the original model, and conditioning on all available data using MCMC method, yield the following results and conclusions: (1) Model structure is verified at Long Shot and Cannikin where the high-resolution bathymetric data collected by CRESP
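
    The backward-then-forward propagation idea behind MCMC conditioning can be illustrated with a minimal Metropolis sampler. This is a sketch with assumed toy numbers, not the Amchitka model: a Gaussian prior on a porosity-like parameter, N(0.30, 0.10), is conditioned on one hypothetical measurement of 0.20 with error 0.05; the conjugate posterior mean is 0.22.

```python
import math
import random

def metropolis(log_posterior, x0, n_steps=20000, step=0.05, seed=0):
    """Minimal Metropolis sampler: propose a Gaussian jump, accept with
    probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_posterior(x0)
    chain = []
    for _ in range(n_steps):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_posterior(cand)
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        chain.append(x)
    return chain

def log_posterior(x):
    # Assumed toy problem: Gaussian prior times Gaussian data likelihood.
    log_prior = -0.5 * ((x - 0.30) / 0.10) ** 2
    log_like = -0.5 * ((0.20 - x) / 0.05) ** 2
    return log_prior + log_like

chain = metropolis(log_posterior, x0=0.30)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])
```

    After burn-in the chain mean settles near the analytic posterior mean of 0.22, and pushing the sampled parameters back through a forward model gives the conditioned predictive distribution.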

  11. Concurrent Structural Fatigue Damage Prognosis Under Uncertainty

    Science.gov (United States)

    2014-04-30

    level is approximately 57% of the maximum loading without effects left by the overload. The fourth diamond symbol is slightly smaller than before (the...Rackwitz, R. and Fiessler, B., Structural Reliability Under Combined Random Load Sequences. Computers & Structures, 1978. 9(5): p. 484-494. 67. Porter, T.R...mechanically polished (one micron diamond suspension in the last step), which provides a mirror-like surface to facilitate the optical crack length

  12. A market model: uncertainty and reachable sets

    Directory of Open Access Journals (Sweden)

    Raczynski Stanislaw

    2015-01-01

    Full Text Available Uncertain parameters are always present in models that include a human factor. In marketing, uncertain consumer behavior makes it difficult to predict future events and elaborate good marketing strategies. Sometimes uncertainty is modeled using stochastic variables. Our approach is quite different: the dynamic market with uncertain parameters is treated using differential inclusions, which permits determination of the corresponding reachable sets. This is not a statistical analysis; we are looking for solutions of the differential inclusions. The purpose of the research is to find a way to obtain and visualise the reachable sets, in order to know the limits of the important marketing variables. The modeling method consists of defining the differential inclusion and finding its solution, using the differential inclusion solver developed by the author. As the result we obtain images of the reachable sets in which the main control parameter is the share of investment, a part of the revenue. As an additional result we can also define the optimal investment strategy. The conclusion is that the differential inclusion solver can be a useful tool in market model analysis.
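
    A reachable-set computation of this kind can be sketched numerically. The model below is a hypothetical one-variable market equation, not the author's solver: the state grows at rate (u - d)x, with the investment-share control u(t) free to vary in an interval, and the reachable set at the final time is approximated by sampling piecewise-constant controls with Euler steps.

```python
import random

def euler_endpoint(x0, controls, d, dt):
    # Integrate dx/dt = (u - d) * x with a piecewise-constant control u.
    x = x0
    for u in controls:
        x += (u - d) * x * dt
    return x

def sample_reachable(x0, u_min, u_max, d, dt, n_steps, n_traj=500, seed=0):
    """Approximate the reachable set of the differential inclusion
    dx/dt in {(u - d) x : u_min <= u <= u_max} by sampling random
    piecewise-constant controls."""
    rng = random.Random(seed)
    endpoints = []
    for _ in range(n_traj):
        controls = [rng.uniform(u_min, u_max) for _ in range(n_steps)]
        endpoints.append(euler_endpoint(x0, controls, d, dt))
    return min(endpoints), max(endpoints)
```

    For this monotone scalar system the exact extremes of the reachable set are attained by the constant extreme controls, so every sampled endpoint must fall between the trajectories driven by u_min and u_max throughout.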

  13. A framework for modeling uncertainty in regional climate change

    Science.gov (United States)

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  14. Uncertainty propagation in urban hydrology water quality modelling

    NARCIS (Netherlands)

    Torres Matallana, Arturo; Leopold, U.; Heuvelink, G.B.M.

    2016-01-01

    Uncertainty is often ignored in urban hydrology modelling. Engineering practice typically ignores uncertainties and uncertainty propagation. This can have large impacts, such as the wrong dimensioning of urban drainage systems and the inaccurate estimation of pollution in the environment caused by c

  15. Structural interpretation of seismic data and inherent uncertainties

    Science.gov (United States)

    Bond, Clare

    2013-04-01

    associated further interpretation and analysis of the techniques and strategies employed. This resource will be of use to undergraduate, post-graduate, industry and academic professionals seeking to improve their seismic interpretation skills, develop reasoning strategies for dealing with incomplete datasets, and for assessing the uncertainty in these interpretations. Bond, C.E. et al. (2012). 'What makes an expert effective at interpreting seismic images?' Geology, 40, 75-78. Bond, C. E. et al. (2011). 'When there isn't a right answer: interpretation and reasoning, key skills for 21st century geoscience'. International Journal of Science Education, 33, 629-652. Bond, C. E. et al. (2008). 'Structural models: Optimizing risk analysis by understanding conceptual uncertainty'. First Break, 26, 65-71. Bond, C. E. et al., (2007). 'What do you think this is?: "Conceptual uncertainty" In geoscience interpretation'. GSA Today, 17, 4-10.

  16. Uncertainty in a spatial evacuation model

    Science.gov (United States)

    Mohd Ibrahim, Azhar; Venkat, Ibrahim; Wilde, Philippe De

    2017-08-01

    Pedestrian movements in crowd motion can be perceived in terms of agents who basically exhibit patient or impatient behavior. We model crowd motion subject to exit congestion under uncertainty conditions in a continuous space and compare the proposed model via simulations with the classical social force model. During a typical emergency evacuation scenario, agents might not be able to perceive with certainty the strategies of opponents (other agents) owing to the dynamic changes entailed by the neighborhood of opponents. In such uncertain scenarios, agents will try to update their strategy based on their own rules or their intrinsic behavior. We study risk-seeking, risk-averse and risk-neutral behaviors of such agents via certain game theory notions. We found that risk-averse agents tend to achieve faster evacuation times whenever the time delay in conflicts is longer. The results of our simulations also comply with previous work and confirm that the evacuation time of agents becomes shorter once mutual cooperation among agents is achieved. Although the impatient strategy appears to be the rational strategy that might lead to faster evacuation times, our study scientifically shows that the more impatient the agents are, the slower the egress time.

  17. Identification and communication of uncertainties of phenomenological models in PSA

    Energy Technology Data Exchange (ETDEWEB)

    Pulkkinen, U.; Simola, K. [VTT Automation (Finland)

    2001-11-01

    This report aims at presenting a view on the uncertainty analysis of phenomenological models, with an emphasis on the identification and documentation of the various types of uncertainties and assumptions in the modelling of the phenomena. In an uncertainty analysis it is essential to include and document all unclear issues, in order to obtain maximal coverage of unresolved issues; this holds independently of the nature or type of the issues. The classification of uncertainties is needed in the decomposition of the problem and helps in the identification of means for uncertainty reduction. Further, enhanced documentation serves to evaluate the applicability of the results to various risk-informed applications. (au)

  18. Systemic change increases model projection uncertainty

    NARCIS (Netherlands)

    Verstegen, Judith; Karssenberg, Derek; van der Hilst, Floortje; Faaij, André

    2014-01-01

    Most spatio-temporal models are based on the assumption that the relationship between system state change and its explanatory processes is stationary. This means that model structure and parameterization are usually kept constant over time, ignoring potential systemic changes in this relationship re

  19. Uncertainty in surface water flood risk modelling

    Science.gov (United States)

    Butler, J. B.; Martin, D. N.; Roberts, E.; Domuah, R.

    2009-04-01

    uniform flow formulae (Manning's Equation) to direct flow over the model domain, sourcing water from the channel or sea so as to provide a detailed representation of river and coastal flood risk. The initial development step was to include spatially-distributed rainfall as a new source term within the model domain. This required optimisation to improve computational efficiency, given the ubiquity of 'wet' cells early on in the simulation. Collaboration with UK water companies has provided detailed drainage information, and from this a simplified representation of the drainage system has been included in the model via the inclusion of sinks and sources of water from the drainage network. This approach has clear advantages relative to a fully coupled method both in terms of reduced input data requirements and computational overhead. Further, given the difficulties associated with obtaining drainage information over large areas, tests were conducted to evaluate uncertainties associated with excluding drainage information and the impact that this has upon flood model predictions. This information can be used, for example, to inform insurance underwriting strategies and loss estimation as well as for emergency response and planning purposes. The Flowroute surface-water flood risk platform enables efficient mapping of areas sensitive to flooding from high-intensity rainfall events due to topography and drainage infrastructure. As such, the technology has widespread potential for use as a risk mapping tool by the UK Environment Agency, European Member States, water authorities, local governments and the insurance industry. Keywords: Surface water flooding, Model Uncertainty, Insurance Underwriting, Flood inundation modelling, Risk mapping.

  20. Uncertainty quantification for quantum chemical models of complex reaction networks.

    Science.gov (United States)

    Proppe, Jonny; Husch, Tamara; Simm, Gregor N; Reiher, Markus

    2016-12-22

    For the quantitative understanding of complex chemical reaction mechanisms, it is, in general, necessary to accurately determine the corresponding free energy surface and to solve the resulting continuous-time reaction rate equations for a continuous state space. For a general (complex) reaction network, it is computationally hard to fulfill these two requirements. However, it is possible to approximately address these challenges in a physically consistent way. On the one hand, it may be sufficient to consider approximate free energies if a reliable uncertainty measure can be provided. On the other hand, a highly resolved time evolution may not be necessary to still determine quantitative fluxes in a reaction network if one is interested in specific time scales. In this paper, we present discrete-time kinetic simulations in discrete state space taking free energy uncertainties into account. The method builds upon thermo-chemical data obtained from electronic structure calculations in a condensed-phase model. Our kinetic approach supports the analysis of general reaction networks spanning multiple time scales, which is here demonstrated for the example of the formose reaction. An important application of our approach is the detection of regions in a reaction network which require further investigation, given the uncertainties introduced by both approximate electronic structure methods and kinetic models. Such cases can then be studied in greater detail with more sophisticated first-principles calculations and kinetic simulations.

  1. Quantifying uncertainty in LCA-modelling of waste management systems.

    Science.gov (United States)

    Clavreul, Julie; Guyonnet, Dominique; Christensen, Thomas H

    2012-12-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining the selected methods: (Step 1) a sensitivity analysis evaluating the sensitivities of the results with respect to the input uncertainties, (Step 2) an uncertainty propagation providing appropriate tools for representing uncertainties and calculating the overall uncertainty of the model results, (Step 3) an uncertainty contribution analysis quantifying the contribution of each parameter uncertainty to the final uncertainty and (Step 4) as a new approach, a combined sensitivity analysis providing a visualisation of the shift in the ranking of different options due to variations of selected key parameters. This tiered approach optimises the resources available to LCA practitioners by only propagating the most influential uncertainties.
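
    Steps 2 and 3 of the framework (uncertainty propagation and uncertainty contribution analysis) can be sketched with Monte Carlo sampling. The linear impact model and the parameter distributions below are hypothetical placeholders for a waste-LCA inventory, not the paper's case study.

```python
import random
import statistics

def impact(mass, energy):
    # Hypothetical linear characterization model for a single impact score.
    return mass + 2.0 * energy

def propagate(n=20000, seed=0):
    """Step 2: propagate the parameter distributions to the model output."""
    rng = random.Random(seed)
    outputs = [impact(rng.gauss(10.0, 1.0), rng.gauss(5.0, 1.0))
               for _ in range(n)]
    return statistics.mean(outputs), statistics.stdev(outputs)

def contribution(n=20000, seed=0):
    """Step 3: one-at-a-time variance shares, holding the other input at
    its mean (a first-order contribution analysis)."""
    rng = random.Random(seed)
    var_mass = statistics.variance(
        [impact(rng.gauss(10.0, 1.0), 5.0) for _ in range(n)])
    var_energy = statistics.variance(
        [impact(10.0, rng.gauss(5.0, 1.0)) for _ in range(n)])
    total = var_mass + var_energy
    return var_mass / total, var_energy / total
```

    For this additive model the analytic shares are 1/5 for mass and 4/5 for energy, so the contribution analysis correctly flags the energy parameter as the one worth refining first.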

  2. Using CV-GLUE procedure in analysis of wetland model predictive uncertainty.

    Science.gov (United States)

    Huang, Chun-Wei; Lin, Yu-Pin; Chiang, Li-Chi; Wang, Yung-Chieh

    2014-07-01

    This study develops a procedure that is related to Generalized Likelihood Uncertainty Estimation (GLUE), called the CV-GLUE procedure, for assessing the predictive uncertainty that is associated with different model structures with varying degrees of complexity. The proposed procedure comprises model calibration, validation, and predictive uncertainty estimation in terms of a characteristic coefficient of variation (characteristic CV). The procedure first performed two-stage Monte-Carlo simulations to ensure predictive accuracy by obtaining behavior parameter sets, and then the estimation of CV-values of the model outcomes, which represent the predictive uncertainties for a model structure of interest with its associated behavior parameter sets. Three commonly used wetland models (the first-order K-C model, the plug flow with dispersion model, and the Wetland Water Quality Model; WWQM) were compared based on data that were collected from a free water surface constructed wetland with paddy cultivation in Taipei, Taiwan. The results show that the first-order K-C model, which is simpler than the other two models, has greater predictive uncertainty. This finding shows that predictive uncertainty does not necessarily increase with the complexity of the model structure because in this case, the more simplistic representation (first-order K-C model) of reality results in a higher uncertainty in the prediction made by the model. The CV-GLUE procedure is suggested to be a useful tool not only for designing constructed wetlands but also for other aspects of environmental management.
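
    The characteristic CV at the core of the procedure can be computed directly from behavioral-run outputs. This minimal sketch assumes the predictions of the behavioral parameter sets are already available as a matrix of runs by time steps; it illustrates the statistic, not the authors' implementation.

```python
import statistics

def characteristic_cv(behavioral_outputs):
    """Mean coefficient of variation across time steps, where each row is
    one behavioral run and each column one time step."""
    n_steps = len(behavioral_outputs[0])
    cvs = []
    for j in range(n_steps):
        col = [run[j] for run in behavioral_outputs]
        cvs.append(statistics.stdev(col) / statistics.mean(col))
    return sum(cvs) / len(cvs)
```

    A model whose behavioral runs spread more widely yields a larger characteristic CV, which is how the simpler first-order K-C model can show greater predictive uncertainty than the more complex alternatives.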

  3. Aerosol model selection and uncertainty modelling by adaptive MCMC technique

    Directory of Open Access Journals (Sweden)

    M. Laine

    2008-12-01

    Full Text Available We present a new technique for the model selection problem in atmospheric remote sensing. The technique is based on Monte Carlo sampling and allows model selection, calculation of model posterior probabilities, and model averaging in a Bayesian way.

    The algorithm developed here is called the Adaptive Automatic Reversible Jump Markov chain Monte Carlo (AARJ) method. It uses the Markov chain Monte Carlo (MCMC) technique and its extension, Reversible Jump MCMC. Both of these techniques have been used extensively in statistical parameter estimation problems in a wide area of applications since the late 1990s. The novel feature of our algorithm is that it is fully automatic and easy to use.

    We show how the AARJ algorithm can be implemented and used for model selection and averaging, and to directly incorporate the model uncertainty. We demonstrate the technique by applying it to the statistical inversion problem of gas profile retrieval of GOMOS instrument on board the ENVISAT satellite. Four simple models are used simultaneously to describe the dependence of the aerosol cross-sections on wavelength. During the AARJ estimation all the models are used and we obtain a probability distribution characterizing how probable each model is. By using model averaging, the uncertainty related to selecting the aerosol model can be taken into account in assessing the uncertainty of the estimates.
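
    The model-averaging step can be illustrated independently of the RJMCMC machinery. Given posterior model probabilities such as those AARJ produces (here assumed, as log evidences for toy aerosol models), the averaged retrieval and the between-model spread follow directly:

```python
import math

def model_average(predictions, log_evidences):
    """Bayesian model averaging sketch: weight each model's prediction by
    its normalized posterior probability (a softmax of log evidences)."""
    m = max(log_evidences)
    raw = [math.exp(le - m) for le in log_evidences]
    z = sum(raw)
    weights = [r / z for r in raw]
    mean = sum(w * p for w, p in zip(weights, predictions))
    # Between-model variance: the extra uncertainty due to model choice.
    var = sum(w * (p - mean) ** 2 for w, p in zip(weights, predictions))
    return mean, var, weights
```

    The between-model variance term is exactly the contribution that selecting a single aerosol model would hide, which is why averaging gives a more honest uncertainty for the retrieved estimates.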

  4. The Relationship Communication Structure - Uncertainty Avoidance

    Directory of Open Access Journals (Sweden)

    Doru Alexandru Pleşea

    2011-11-01

    Full Text Available As today's society heads towards digitalization, the virtual environment gains growing importance. Shaping the e-environment in accordance with the real environment, in order to favour the activities and processes that will take place there, requires thorough design. However, cultural attributes, reflected inherently by design, play a core part in how the information displayed on websites is perceived. The present paper aims to offer a perspective on transposing the proper communication structure into website design, from the cultural point of view and from the gender point of view, as it resulted from research on Romanian students from the Bucharest Academy of Economic Studies.

  5. How well can we forecast future model error and uncertainty by mining past model performance data

    Science.gov (United States)

    Solomatine, Dimitri

    2016-04-01

    ) method by Koenker and Bassett, in which linear regression is used to build predictive models for distribution quantiles [1]; (b) the UNEEC method [2,3,7], which takes into account the input variables influencing such uncertainty and uses more advanced (non-linear) machine learning methods (e.g. neural networks or the k-NN method); (c) the recent DUBRAE method (Dynamic Uncertainty Model By Regression on Absolute Error), which first corrects the model residual and then employs an autoregressive statistical model of the residuals for uncertainty prediction [5]. 2. Data uncertainty (parametric and/or input): in this case we study the propagation of uncertainty (typically represented probabilistically) from parameters or inputs to the model outputs. For real complex non-linear functions (models) implemented in software, various versions of Monte Carlo simulation are used: values of parameters or inputs are sampled from the assumed distributions and the model is run multiple times to generate multiple outputs. The data generated by Monte Carlo analysis can be used to build a machine learning model able to predict model uncertainty for the future. This method is named MLUE (Machine Learning for Uncertainty Estimation) and is covered in [4,6]. 3. Structural uncertainty stemming from inadequate model structure. The paper discusses the possibilities and experiences of building models able to forecast (rather than analyse) the residual and parametric uncertainty of hydrological models. References [1] Koenker, R., and G. Bassett (1978). Regression quantiles. Econometrica, 46(1), 33-50, doi:10.2307/1913643. [2] D.L. Shrestha, D.P. Solomatine (2006). Machine learning approaches for estimation of prediction interval for the model output. Neural Networks J., 19(2), 225-235. [3] D.P. Solomatine, D.L. Shrestha (2009). A novel method to estimate model uncertainty using machine learning techniques. Water Resources Res. 45, W00B11.
[4] D. L
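
    Approach (a) above rests on the pinball (check) loss of Koenker and Bassett: the τ-th empirical quantile of the model residuals minimizes it, which is what quantile regression exploits to build prediction intervals. A minimal stdlib sketch, with a made-up residual sample:

```python
def pinball_loss(q, residuals, tau):
    """Check loss: tau-weighted for residuals above the candidate quantile q,
    (1 - tau)-weighted below it."""
    return sum(tau * (r - q) if r >= q else (1.0 - tau) * (q - r)
               for r in residuals)

def empirical_quantile(residuals, tau):
    # The minimizer of the pinball loss over a sample is an order statistic.
    s = sorted(residuals)
    idx = max(0, min(len(s) - 1, int(tau * len(s))))
    return s[idx]
```

    Fitting, say, the 0.05 and 0.95 quantiles of past residuals by minimizing this loss yields a 90% prediction interval around future model outputs.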

  6. Aerodynamic Modeling with Heterogeneous Data Assimilation and Uncertainty Quantification Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Clear Science Corp. proposes to develop an aerodynamic modeling tool that assimilates data from different sources and facilitates uncertainty quantification. The...

  7. Output Consensus of Heterogeneous Linear Discrete-Time Multiagent Systems With Structural Uncertainties.

    Science.gov (United States)

    Li, Shaobao; Feng, Gang; Luo, Xiaoyuan; Guan, Xinping

    2015-12-01

    This paper investigates the output consensus problem of heterogeneous discrete-time multiagent systems with individual agents subject to structural uncertainties and different disturbances. A novel distributed control law based on internal reference models is first presented for output consensus of heterogeneous discrete-time multiagent systems without structural uncertainties, where internal reference models embedded in controllers are designed with the objective of reducing communication costs. Then based on the distributed internal reference models and the well-known internal model principle, a distributed control law is further presented for output consensus of heterogeneous discrete-time multiagent systems with structural uncertainties. It is shown in both cases that the consensus trajectory of the internal reference models determines the output trajectories of agents. Finally, numerical simulation results are provided to illustrate the effectiveness of the proposed control schemes.

  8. Orientation Uncertainty of Structures Measured in Cored Boreholes: Methodology and Case Study of Swedish Crystalline Rock

    Science.gov (United States)

    Stigsson, Martin

    2016-11-01

    Many engineering applications in fractured crystalline rocks use measured orientations of structures such as rock contact and fractures, and lineated objects such as foliation and rock stress, mapped in boreholes as their foundation. Although these measurements are afflicted with uncertainties, very few attempts to quantify their magnitudes and effects on the inferred orientations have been reported. Relying only on the specification of tool imprecision may considerably underestimate the actual uncertainty space. The present work identifies nine sources of uncertainties, develops inference models of their magnitudes, and points out possible implications for the inference on orientation models and thereby effects on downstream models. The uncertainty analysis in this work builds on a unique data set from site investigations, performed by the Swedish Nuclear Fuel and Waste Management Co. (SKB). During these investigations, more than 70 boreholes with a maximum depth of 1 km were drilled in crystalline rock with a cumulative length of more than 34 km including almost 200,000 single fracture intercepts. The work presented here therefore relies on fracture orientations. However, the techniques to infer the magnitude of orientation uncertainty may be applied to all types of structures and lineated objects in boreholes. The uncertainties are not solely detrimental, but can be valuable, provided that the reason for their presence is properly understood and the magnitudes correctly inferred.
The main findings of this work are as follows: (1) knowledge of the orientation uncertainty is crucial in order to be able to infer correct orientation model and parameters coupled to the fracture sets; (2) it is important to perform multiple measurements to be able to infer the actual uncertainty instead of relying on the theoretical uncertainty provided by the manufacturers; (3) it is important to use the most appropriate tool for the prevailing circumstances; and (4) the single most

  9. Geostatistical simulation of geological architecture and uncertainty propagation in groundwater modeling

    DEFF Research Database (Denmark)

    He, Xiulan

    parameters and model structures, which are the primary focuses of this PhD research. Parameter uncertainty was analyzed using an optimization tool (PEST: Parameter ESTimation) in combination with a random sampling method (LHS: Latin Hypercube Sampling). Model structure, namely geological architecture...... was analyzed using both a traditional two-point based geostatistical approach and multiple-point geostatistics (MPS). Our results documented that model structure is as important as model parameter regarding groundwater modeling uncertainty. Under certain circumstances the inaccuracy on model structure can...

  10. Imprecision and Uncertainty in the UFO Database Model.

    Science.gov (United States)

    Van Gyseghem, Nancy; De Caluwe, Rita

    1998-01-01

    Discusses how imprecision and uncertainty are dealt with in the UFO (Uncertainty and Fuzziness in an Object-oriented) database model. Such information is expressed by means of possibility distributions, and modeled by means of the proposed concept of "role objects." The role objects model uncertain, tentative information about objects, and thus…

  13. Estimating the magnitude of prediction uncertainties for the APLE model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analysis for the Annual P ...

  14. Impact of uncertainty in attributing modeled North American terrestrial carbon fluxes to anthropogenic forcings

    Science.gov (United States)

    Ricciuto, D. M.

    2015-12-01

Although much progress has been made in the past decade in constraining the net North American terrestrial carbon flux, considerable uncertainty remains in the sink magnitude and trend. Terrestrial carbon cycle models are increasing in spatial resolution, complexity and predictive skill, allowing for increased process-level understanding and attribution of net carbon fluxes to specific causes. Here we examine the various sources of uncertainty, including driver uncertainty, model parameter uncertainty, and structural uncertainty; the contribution of each type of uncertainty to the net sink; and the attribution of this sink to anthropogenic causes: increasing CO2 concentrations, nitrogen deposition, land use change, and changing climate. To examine driver and parameter uncertainty, model simulations are performed using the Community Land Model version 4.5 (CLM4.5) with literature-based parameter ranges and three different reanalysis meteorological forcing datasets. We also examine structural uncertainty through analysis of the Multiscale Terrestrial Model Intercomparison Project (MsTMIP). Identifying major sources of uncertainty can help to guide future observations, experiments, and model development activities.

  15. A framework for propagation of uncertainty contributed by parameterization, input data, model structure, and calibration/validation data in watershed modeling

    Science.gov (United States)

The progressive improvement of computer science and the development of auto-calibration techniques mean that calibration of simulation models is no longer a major challenge for watershed planning and management. Modelers now increasingly focus on challenges such as improved representation of watershed...

  16. Structural applications of metal foams considering material and geometrical uncertainty

    Science.gov (United States)

    Moradi, Mohammadreza

; convergence of estimates of the Sobol' decomposition with sample size using various sampling schemes; the possibility of model reduction guided by the results of the Sobol' decomposition. For the rest of the study, different structural applications of metal foam are investigated. In the first application, it is shown that metal foams have the potential to serve as hysteretic dampers in the braces of braced building frames. Using metal foams in the structural braces decreases dynamic responses such as roof drift, base shear and maximum moment in the columns. Optimum metal foam strengths are different for different earthquakes. In order to use metal foam in structural braces, metal foams need to have a stable cyclic response, which might be achievable for metal foams with high relative density. The second application is to improve the strength and ductility of a steel tube by filling it with steel foam. Steel tube beams and columns are able to provide significant strength for structures. They have an efficient shape with a large second moment of inertia, which leads to light elements with high bending strength. Steel foams with a high strength-to-weight ratio are used to fill the steel tube and improve its mechanical behavior. Linear eigenvalue and plastic collapse finite element (FE) analyses are performed on the steel foam filled tube under pure compression and three-point bending simulations. It is shown that the foam significantly improves the maximum strength and the energy absorption capacity of the steel tubes. Different configurations with different volumes of steel foam and composite behavior are investigated. It is demonstrated that there are some optimum configurations with more efficient behavior. If the composite action between the steel foam and the steel increases, the strength of the element will improve due to the change of the failure mode from local buckling to yielding.
Moreover, the Sobol' decomposition is used to investigate uncertainty in the strength and ductility of
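The Sobol' decomposition referred to above can be sketched with a pick-freeze Monte Carlo estimator of a first-order sensitivity index. The toy additive model below is a stand-in for the metal-foam response, chosen because its analytic first-order index (16/17 ≈ 0.94) makes the estimator easy to check:

```python
# Pick-freeze Monte Carlo estimator of the first-order Sobol' index S_i.
# Two independent sample matrices A and B are drawn; a third matrix reuses
# B with column i "frozen" to A's values, isolating the effect of X_i.
import random

def sobol_first_order(f, dim, i, n=20000, seed=1):
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    # B with column i taken from A (the "pick-freeze" matrix)
    BAi = [rb[:i] + [ra[i]] + rb[i + 1:] for ra, rb in zip(A, B)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    yBAi = [f(x) for x in BAi]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    # estimator of V_i = Var(E[Y | X_i])
    vi = sum(ya * (ybai - yb) for ya, yb, ybai in zip(yA, yB, yBAi)) / n
    return vi / var

# toy model Y = 4*X0 + X1 with X ~ U(0,1): analytic S_0 = 16/17
S1 = sobol_first_order(lambda x: 4 * x[0] + x[1], dim=2, i=0)
```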

  17. Calibration of Uncertainty Analysis of the SWAT Model Using Genetic Algorithms and Bayesian Model Averaging

    Science.gov (United States)

    In this paper, the Genetic Algorithms (GA) and Bayesian model averaging (BMA) were combined to simultaneously conduct calibration and uncertainty analysis for the Soil and Water Assessment Tool (SWAT). In this hybrid method, several SWAT models with different structures are first selected; next GA i...

  18. Modeling of uncertainties for wind turbine blade design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, Henrik Stensgaard

    2014-01-01

    Wind turbine blades are designed by a combination of tests and numerical calculations using finite element models of the blade. The blades are typically composite structures with laminates of glass-fiber and/or carbon-fibers glued together by a matrix material. This paper presents a framework...... for stochastic modelling of the load bearing capacity of wind turbine blades incorporating physical, model, measurement and statistical uncertainties at the different scales and also discusses the possibility to define numerical tests that can be included in the statistical basis. The stochastic modelling takes...... basis in the JCSS framework for modelling material properties, Bayesian statistical methods allowing prior / expert knowledge to be accounted for and the Maximum Likelihood Method. The stochastic framework is illustrated using simulated tests which represent examples relevant for wind turbine blades....

  19. Control Lyapunov Stabilization of Nonlinear Systems with Structural Uncertainty

    Institute of Scientific and Technical Information of China (English)

    CAI Xiu-shan; HAN Zheng-zhi; TANG Hou-jun

    2005-01-01

This paper deals with the global stabilization problem for nonlinear systems with structural uncertainty. Based on a control Lyapunov function, a sufficient and necessary condition for globally and asymptotically stabilizing the equilibrium of the closed-loop system is given. Moreover, an almost smooth state feedback control law is constructed. The simulation shows the effectiveness of the method.

  20. Uncertainty modelling of atmospheric dispersion by stochastic response surface method under aleatory and epistemic uncertainties

    Indian Academy of Sciences (India)

    Rituparna Chutia; Supahi Mahanta; D Datta

    2014-04-01

The parameters associated with an environmental dispersion model may include different kinds of variability, imprecision and uncertainty. More often, it is seen that available information is interpreted in a probabilistic sense. Probability theory is a well-established theory to measure such variability. However, not all available information, data or model parameters affected by variability, imprecision and uncertainty can be handled by traditional probability theory. Uncertainty or imprecision may occur due to incomplete information or data, measurement error, or data obtained from expert judgement or subjective interpretation of available data or information. Thus, model parameter data may be affected by subjective uncertainty. Traditional probability theory is inappropriate to represent subjective uncertainty. Possibility theory is used as a tool to describe parameters with insufficient knowledge. Based on polynomial chaos expansion, the stochastic response surface method is utilized in this article for the uncertainty propagation of an atmospheric dispersion model under consideration of both probabilistic and possibilistic information. The proposed method is demonstrated through a hypothetical case study of atmospheric dispersion.
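The possibilistic side of this record can be illustrated by propagating a triangular possibility distribution through a model via alpha-cuts. The one-parameter concentration expression below is a simplified stand-in for an atmospheric dispersion model, not the article's actual formulation:

```python
# Alpha-cut propagation of a triangular fuzzy parameter through a model.
# At each possibility level alpha, the input interval is mapped through f
# and the output interval (min, max) is recorded.
import math

def alpha_cut_triangular(low, mode, high, alpha):
    """Interval of a triangular possibility distribution at level alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def propagate(f, low, mode, high, levels=4):
    cuts = {}
    for k in range(levels + 1):
        alpha = k / levels
        a, b = alpha_cut_triangular(low, mode, high, alpha)
        # endpoint evaluation suffices only for monotone f; sample a grid instead
        grid = [a + (b - a) * j / 20 for j in range(21)]
        ys = [f(x) for x in grid]
        cuts[alpha] = (min(ys), max(ys))
    return cuts

# stand-in ground-level concentration vs. an uncertain dispersion width sigma
conc = lambda sigma: 1.0 / (2 * math.pi * sigma ** 2) * math.exp(-50.0 / (2 * sigma ** 2))
cuts = propagate(conc, 5.0, 10.0, 20.0)   # triangular fuzzy sigma (5, 10, 20)
```

The output cuts are nested: the alpha = 1 interval (the modal value) lies inside every lower-level interval, which is the possibilistic analogue of widening confidence bounds.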

  1. Uncertainties in modelling the climate impact of irrigation

    Science.gov (United States)

    de Vrese, Philipp; Hagemann, Stefan

    2017-04-01

Many issues related to the climate impact of irrigation are addressed in studies that apply a wide range of models. These involve uncertainties related to differences in the models' general structure and parametrizations on the one hand, and the need for simplifying assumptions with respect to the representation of irrigation on the other hand. To address these uncertainties, we used the Max Planck Institute for Meteorology's Earth System model, into which a simple irrigation scheme was implemented. In several simulations, we varied certain irrigation characteristics to estimate the resulting variations in irrigation's climate impact and found a large sensitivity with respect to the irrigation effectiveness. Here, the assumed effectiveness of the scheme is a combination of the target soil moisture and the degree to which water losses are accounted for. In general, the simulated impact of irrigation on the state of the land surface and the atmosphere is more than three times larger when assuming a low irrigation effectiveness compared to a high effectiveness. In an additional set of simulations, we varied certain aspects of the model's general structure, namely the land-surface-atmosphere coupling, to estimate the related uncertainties. Here we compared the impact of irrigation between simulations using parameter aggregation, a simple flux aggregation scheme, and a coupling scheme that also accounts for spatial heterogeneity within the lowest layers of the atmosphere. It was found that changes in the land-surface-atmosphere coupling not only affect the magnitude of the climate impacts but can even reverse their direction.

  2. Uncertainty in a monthly water balance model using the generalized likelihood uncertainty estimation methodology

    Indian Academy of Sciences (India)

    Diego Rivera; Yessica Rivas; Alex Godoy

    2015-02-01

Hydrological models are simplified representations of natural processes and are subject to errors. Uncertainty bounds are a commonly used way to assess the impact of input or model architecture uncertainty on model outputs. Different sets of parameters can have equally robust goodness-of-fit indicators, a phenomenon known as equifinality. We assessed the outputs of a lumped conceptual hydrological model applied to an agricultural watershed in central Chile under strong interannual variability (coefficient of variability of 25%) by using the equifinality concept and uncertainty bounds. The simulation period ran from January 1999 to December 2006. Equifinality and uncertainty bounds from the GLUE methodology (Generalized Likelihood Uncertainty Estimation) were used to identify parameter sets as potential representations of the system. The aim of this paper is to exploit the use of uncertainty bounds to differentiate behavioural parameter sets in a simple hydrological model. We then analyze the presence of equifinality in order to improve the identification of relevant hydrological processes. The water balance model for the Chillan River exhibits, at a first stage, equifinality. However, it was possible to narrow the range of the parameters and eventually identify a set of parameters representing the behaviour of the watershed (a behavioural model) in agreement with observational and soft data (calculation of areal precipitation over the watershed using an isohyetal map). The mean width of the uncertainty bound around the predicted runoff for the simulation period decreased from 50 to 20 m3 s−1 after fixing the parameter controlling the areal precipitation over the watershed. This decrement is equivalent to decreasing the ratio between simulated and observed discharge from 5.2 to 2.5.
Despite the criticisms of the GLUE methodology, such as its lack of statistical formality, it is identified as a useful tool assisting the modeller with the identification of critical parameters.
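The GLUE workflow described above can be sketched in a few lines: sample parameter sets, score each against observations with a likelihood measure (here the Nash-Sutcliffe efficiency), and retain the "behavioural" sets above a threshold. The linear model and synthetic observations are illustrative only:

```python
# Toy GLUE sketch: Monte Carlo sampling of parameters, a Nash-Sutcliffe
# likelihood measure, and a behavioural threshold. The surviving sets
# typically form a whole region of parameter space (equifinality).
import random

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect, <= 0 is worse than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

rng = random.Random(42)
t = list(range(20))
obs = [2.0 * ti + 1.0 for ti in t]                 # synthetic "observed" runoff
behavioural = []
for _ in range(2000):
    a, b = rng.uniform(0, 4), rng.uniform(-2, 4)   # sampled parameter set
    sim = [a * ti + b for ti in t]
    score = nse(obs, sim)
    if score > 0.9:                                # behavioural threshold
        behavioural.append(((a, b), score))
```

Prediction bounds would then be read off as likelihood-weighted quantiles of the behavioural ensemble's simulated discharges.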

  3. Monte Carlo analysis of uncertainty propagation in a stratospheric model. 2: Uncertainties due to reaction rates

    Science.gov (United States)

    Stolarski, R. S.; Butler, D. M.; Rundel, R. D.

    1977-01-01

A concise stratospheric model was used in a Monte Carlo analysis of the propagation of reaction rate uncertainties through the calculation of an ozone perturbation due to the addition of chlorine. Two thousand Monte Carlo cases were run with 55 reaction rates being varied. Excellent convergence was obtained in the output distributions because the model is sensitive to the uncertainties in only about 10 reactions. For a 1 ppbv chlorine perturbation added to a 1.5 ppbv chlorine background, the resultant 1 sigma uncertainty on the ozone perturbation is a factor of 1.69 on the high side and 1.80 on the low side. The corresponding 2 sigma factors are 2.86 and 3.23. Results are also given for the uncertainties, due to reaction rates, in the ambient concentrations of stratospheric species.
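The Monte Carlo propagation of reaction-rate uncertainties can be sketched with lognormal uncertainty factors, a common convention for rate coefficients. The three factors and the surrogate output below are hypothetical stand-ins for the stratospheric model's ozone response:

```python
# Monte Carlo propagation of lognormal rate uncertainties: each rate k_i
# is multiplied by f_i**z with z ~ N(0,1), and the 1-sigma multiplicative
# uncertainty factor of the output is read off the log-space spread.
import math, random, statistics

rng = random.Random(0)
rate_factors = [1.3, 1.5, 2.0]          # hypothetical 1-sigma uncertainty factors

def perturbed_output(rng):
    # surrogate model output = k1 * k2 / k3 (nominal rates normalised to 1)
    ks = [f ** rng.gauss(0.0, 1.0) for f in rate_factors]
    return ks[0] * ks[1] / ks[2]

logs = [math.log(perturbed_output(rng)) for _ in range(5000)]
sigma = statistics.stdev(logs)
factor = math.exp(sigma)                 # 1-sigma multiplicative uncertainty
```

For this multiplicative surrogate the log-variances simply add, so the analytic factor is exp(sqrt(ln²1.3 + ln²1.5 + ln²2)) ≈ 2.33; a nonlinear model like the ozone calculation must be sampled as above.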

  4. Assessment of model uncertainty during the river export modelling of pesticides and transformation products

    Science.gov (United States)

    Gassmann, Matthias; Olsson, Oliver; Kümmerer, Klaus

    2013-04-01

The modelling of organic pollutants in the environment is burdened by a load of uncertainties. Not only parameter values are uncertain, but often also the mass and timing of pesticide application. Introducing transformation products (TPs) into modelling adds further uncertainty, arising from the dependence of these substances on their parent compounds and from the introduction of new model parameters. The purpose of this study was the investigation of the behaviour of a parsimonious catchment-scale model for the assessment of river concentrations of the insecticide Chlorpyrifos (CP) and two of its TPs, Chlorpyrifos Oxon (CPO) and 3,5,6-trichloro-2-pyridinol (TCP), under the influence of uncertain input parameter values. Parameter uncertainty and pesticide application uncertainty in particular were investigated by Global Sensitivity Analysis (GSA) and the Generalized Likelihood Uncertainty Estimation (GLUE) method, based on Monte-Carlo sampling. GSA revealed that half-lives and sorption parameters, as well as half-lives and transformation parameters, were correlated to each other. This means that the concepts of modelling sorption and degradation/transformation were correlated, so it may be difficult in modelling studies to optimize parameter values for these modules. Furthermore, we could show that erroneous pesticide application mass and timing were compensated during Monte-Carlo sampling by changing the half-life of CP. However, the introduction of TCP into the calculation of the objective function was able to enhance the identifiability of pesticide application mass. The GLUE analysis showed that CP and TCP were modelled successfully, but CPO modelling failed, with high uncertainty and insensitive parameters. We assumed a structural error of the model which was especially important for CPO assessment. This shows that there is the possibility that a chemical and some of its TPs can be modelled successfully by a specific model structure, but for other TPs, the model

  5. On the uncertainty of phenological responses to climate change and its implication for terrestrial biosphere models

    Directory of Open Access Journals (Sweden)

    M. Migliavacca

    2012-01-01

Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere. Land surface models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we analyzed the Harvard Forest phenology record to investigate and characterize the sources of uncertainty in phenological forecasts and the subsequent impacts on model forecasts of carbon and water cycling in the future. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species with 12 phenological models of different complexity to predict leaf bud-burst.

    The evaluation of different phenological models indicated support for spring warming models with photoperiod limitations and, though to a lesser extent, to chilling models based on the alternating model structure.

We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using 2 different IPCC climate scenarios (A1fi vs. B1, i.e. high CO2 emissions vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% CI: 2.4 day century−1 for scenario B1 and 4.5 day century−1 for A1fi), whereas driver uncertainty was the largest (up to 8.4 day century−1) in the simulated trends. The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied somewhat among models (±7.7 day century−1 for A1fi, ±3.6 day century−1 for B1). The forecast sensitivity of bud-burst to

  6. Uncertainty models applied to the substation planning

    Energy Technology Data Exchange (ETDEWEB)

    Fontoura Filho, Roberto N. [ELETROBRAS, Rio de Janeiro, RJ (Brazil); Aires, Joao Carlos O.; Tortelly, Debora L.S. [Light Servicos de Eletricidade S.A., Rio de Janeiro, RJ (Brazil)

    1994-12-31

The selection of reinforcements for a power system expansion becomes a difficult task in an environment of uncertainties. These uncertainties can be classified according to their sources as exogenous and endogenous. The endogenous uncertainty is associated with the elements of the generation, transmission and distribution systems. The exogenous uncertainty is associated with external aspects, such as the financial resources, the time spent to build the installations, the equipment price and the load level. The load uncertainty is extremely sensitive to the behaviour of the economic conditions. Although it is impossible to remove the uncertainty completely, the endogenous uncertainty can be conveniently treated and the exogenous uncertainty can be compensated. This paper describes an uncertainty treatment methodology and a practical application to a group of substations belonging to LIGHT company, the Rio de Janeiro electric utility. The equipment performance uncertainty is treated by adopting a probabilistic approach. The uncertainty associated with the load increase is considered by using technical analysis of scenarios and choice criteria based on Decision Theory. In this paper, the Savage Method and the Fuzzy Set Method were used in order to select the best middle-term reinforcements plan. (author) 7 refs., 4 figs., 6 tabs.

  7. Estimated Frequency Domain Model Uncertainties used in Robust Controller Design

    DEFF Research Database (Denmark)

Tøffner-Clausen, S.; Andersen, Palle; Stoustrup, Jakob

    1994-01-01

This paper deals with the combination of system identification and robust controller design. Recent results on estimation of frequency domain model uncertainty are...

  8. Committee of machine learning predictors of hydrological models uncertainty

    Science.gov (United States)

    Kayastha, Nagendra; Solomatine, Dimitri

    2014-05-01

In prediction of uncertainty based on machine learning methods, the results of various sampling schemes, namely Monte Carlo sampling (MCS), generalized likelihood uncertainty estimation (GLUE), Markov chain Monte Carlo (MCMC), the shuffled complex evolution metropolis algorithm (SCEMUA), differential evolution adaptive metropolis (DREAM), particle swarm optimization (PSO) and adaptive cluster covering (ACCO) [1], are used to build predictive models. These models predict the uncertainty (quantiles of the pdf) of a deterministic output from a hydrological model [2]. Inputs to these models are specially identified representative variables (past precipitation events and flows). The trained machine learning models are then employed to predict the model output uncertainty that is specific to the new input data. For each sampling scheme, three machine learning methods, namely artificial neural networks, model trees and locally weighted regression, are applied to predict output uncertainties. The problem here is that different sampling algorithms result in different data sets used to train different machine learning models, which leads to several models (21 predictive uncertainty models). There is no clear evidence which model is the best, since there is no basis for comparison. A solution could be to form a committee of all models and to use a dynamic averaging scheme to generate the final output [3]. This approach is applied to estimate the uncertainty of streamflow simulations from a conceptual hydrological model, HBV, in the Nzoia catchment in Kenya. [1] N. Kayastha, D. L. Shrestha and D. P. Solomatine. Experiments with several methods of parameter uncertainty estimation in hydrological modeling. Proc. 9th Intern. Conf. on Hydroinformatics, Tianjin, China, September 2010. [2] D. L. Shrestha, N. Kayastha, D. P. Solomatine and R. Price. Encapsulation of parametric uncertainty statistics by various predictive machine learning models: MLUE method, Journal of Hydroinformatics, in press
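The committee idea proposed in this record can be sketched with a simple weighting scheme. The "models" below are stand-in callables and the inverse-error weights are one plausible static choice, not necessarily the dynamic averaging scheme of [3]:

```python
# Committee of predictors: combine several models' outputs by weights
# derived from their (hypothetical) validation errors.
def committee_predict(models, errors, x):
    """Weight each model by 1/error (normalised) and average the predictions."""
    weights = [1.0 / e for e in errors]
    total = sum(weights)
    return sum(w / total * m(x) for w, m in zip(weights, models))

# stand-in predictors of an uncertainty quantile, plus their validation RMSEs
models = [lambda x: 1.1 * x, lambda x: 0.9 * x, lambda x: 1.5 * x]
errors = [0.1, 0.2, 1.0]
q = committee_predict(models, errors, 10.0)   # weighted toward the low-error models
```

A dynamic scheme would recompute the weights per input, e.g. from each model's recent local error, rather than keeping them fixed.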

  9. Model development and data uncertainty integration

    Energy Technology Data Exchange (ETDEWEB)

    Swinhoe, Martyn Thomas [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-12-02

    The effect of data uncertainties is discussed, with the epithermal neutron multiplicity counter as an illustrative example. Simulation using MCNP6, cross section perturbations and correlations are addressed, along with the effect of the 240Pu spontaneous fission neutron spectrum, the effect of P(ν) for 240Pu spontaneous fission, and the effect of spontaneous fission and (α,n) intensity. The effect of nuclear data is the product of the initial uncertainty and the sensitivity -- both need to be estimated. In conclusion, a multi-parameter variation method has been demonstrated, the most significant parameters are the basic emission rates of spontaneous fission and (α,n) processes, and uncertainties and important data depend on the analysis technique chosen.

  10. Spatial uncertainty model for visual features using a Kinect™ sensor.

    Science.gov (United States)

    Park, Jae-Han; Shin, Yong-Deuk; Bae, Ji-Hun; Baeg, Moon-Hong

    2012-01-01

    This study proposes a mathematical uncertainty model for the spatial measurement of visual features using Kinect™ sensors. This model can provide qualitative and quantitative analysis for the utilization of Kinect™ sensors as 3D perception sensors. In order to achieve this objective, we derived the propagation relationship of the uncertainties between the disparity image space and the real Cartesian space with the mapping function between the two spaces. Using this propagation relationship, we obtained the mathematical model for the covariance matrix of the measurement error, which represents the uncertainty for spatial position of visual features from Kinect™ sensors. In order to derive the quantitative model of spatial uncertainty for visual features, we estimated the covariance matrix in the disparity image space using collected visual feature data. Further, we computed the spatial uncertainty information by applying the covariance matrix in the disparity image space and the calibrated sensor parameters to the proposed mathematical model. This spatial uncertainty model was verified by comparing the uncertainty ellipsoids for spatial covariance matrices and the distribution of scattered matching visual features. We expect that this spatial uncertainty model and its analyses will be useful in various Kinect™ sensor applications.
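The covariance propagation described in this record follows the standard first-order (Jacobian) pattern, which can be sketched for the pinhole-stereo approximation z = f·B/d; the intrinsic parameters and disparity-space covariance below are illustrative values, not calibrated Kinect parameters:

```python
# First-order propagation of (u, v, d) measurement covariance to Cartesian
# (x, y, z) space: Sigma_xyz = J * Sigma_uvd * J^T, with J the Jacobian of
# the disparity-to-Cartesian mapping, here computed numerically.
def to_cartesian(u, v, d, f=580.0, b=0.075, cx=320.0, cy=240.0):
    """Pinhole-stereo back-projection: depth from disparity, then x, y."""
    z = f * b / d
    return [(u - cx) * z / f, (v - cy) * z / f, z]

def numeric_jacobian(fun, p, eps=1e-4):
    base = fun(*p)
    cols = []
    for i in range(len(p)):
        q = list(p)
        q[i] += eps
        shifted = fun(*q)
        cols.append([(s - b0) / eps for s, b0 in zip(shifted, base)])
    return [list(row) for row in zip(*cols)]   # rows = outputs, cols = inputs

def propagate_cov(p, S):
    J = numeric_jacobian(to_cartesian, p)
    JS = [[sum(J[i][k] * S[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
    return [[sum(JS[i][k] * J[j][k] for k in range(3)) for j in range(3)] for i in range(3)]

# illustrative (u, v, d) covariance in pixels^2
S_disp = [[0.25, 0.0, 0.0], [0.0, 0.25, 0.0], [0.0, 0.0, 0.5]]
C = propagate_cov((400.0, 300.0, 20.0), S_disp)   # 3x3 Cartesian covariance [m^2]
```

Because z depends on 1/d, the propagated depth variance grows rapidly at small disparities, which is the qualitative behaviour the record's uncertainty ellipsoids capture.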

  12. Workshop on Model Uncertainty and its Statistical Implications

    CERN Document Server

    1988-01-01

    In this book problems related to the choice of models in such diverse fields as regression, covariance structure, time series analysis and multinomial experiments are discussed. The emphasis is on the statistical implications for model assessment when the assessment is done with the same data that generated the model. This is a problem of long standing, notorious for its difficulty. Some contributors discuss this problem in an illuminating way. Others, and this is a truly novel feature, investigate systematically whether sample re-use methods like the bootstrap can be used to assess the quality of estimators or predictors in a reliable way given the initial model uncertainty. The book should prove to be valuable for advanced practitioners and statistical methodologists alike.

  13. Uncertainty Categorization, Modeling, and Management for Regional Water Supply Planning

    Science.gov (United States)

    Fletcher, S.; Strzepek, K. M.; AlSaati, A.; Alhassan, A.

    2016-12-01

    Many water planners face increased pressure on water supply systems from growing demands, variability in supply and a changing climate. Short-term variation in water availability and demand; long-term uncertainty in climate, groundwater storage, and sectoral competition for water; and varying stakeholder perspectives on the impacts of water shortages make it difficult to assess the necessity of expensive infrastructure investments. We categorize these uncertainties on two dimensions: whether they are the result of stochastic variation or epistemic uncertainty, and whether the uncertainties can be described probabilistically or are deep uncertainties whose likelihood is unknown. We develop a decision framework that combines simulation for probabilistic uncertainty, sensitivity analysis for deep uncertainty and Bayesian decision analysis for uncertainties that are reduced over time with additional information. We apply this framework to two contrasting case studies - drought preparedness in Melbourne, Australia and fossil groundwater depletion in Riyadh, Saudi Arabia - to assess the impacts of different types of uncertainty on infrastructure decisions. Melbourne's water supply system relies on surface water, which is impacted by natural variation in rainfall, and a market-based system for managing water rights. Our results show that small, flexible investment increases can mitigate shortage risk considerably at reduced cost. Riyadh, by contrast, relies primarily on desalination for municipal use and fossil groundwater for agriculture, and a centralized planner makes allocation decisions. Poor regional groundwater measurement makes it difficult to know when groundwater pumping will become uneconomical, resulting in epistemic uncertainty. However, collecting more data can reduce the uncertainty, suggesting the need for different uncertainty modeling and management strategies in Riyadh than in Melbourne. We will categorize the two systems and propose appropriate

  14. Investigating the Propagation of Meteorological Model Uncertainty for Tracer Modeling

    Science.gov (United States)

    Lopez-Coto, I.; Ghosh, S.; Karion, A.; Martin, C.; Mueller, K. L.; Prasad, K.; Whetstone, J. R.

    2016-12-01

The North-East Corridor project aims to use a top-down inversion method to quantify sources of Greenhouse Gas (GHG) emissions in the urban areas of Washington DC and Baltimore at approximately 1 km2 resolution. The aim of this project is to help establish reliable measurement methods for quantifying and validating GHG emissions independently of the inventory methods typically used to guide mitigation efforts. Since inversion methods depend strongly on atmospheric transport modeling, analyzing the uncertainties in the meteorological fields and their propagation through the sensitivities of observations to surface fluxes (footprints) is a fundamental step. To this end, six configurations of the Weather Research and Forecasting Model (WRF-ARW) version 3.8 were used to generate an ensemble of meteorological simulations. Specifically, we used 4 planetary boundary layer parameterizations (YSU, MYNN2, BOULAC, QNSE), 2 sources of initial and boundary conditions (NARR and HRRR) and 1 configuration including the building energy parameterization (BEP) urban canopy model. The simulations were compared with more than 150 meteorological surface stations, a wind profiler and radiosondes for one month (February 2016) to account for the uncertainties and the ensemble spread for wind speed, direction and mixing height. In addition, we used the Stochastic Time-Inverted Lagrangian Transport model (STILT) to derive the sensitivity of 12 hypothetical observations to surface emissions (footprints) with each WRF configuration. The footprints and integrated sensitivities were compared and the resulting uncertainties estimated.

  15. Operationalising uncertainty in data and models for integrated water resources management.

    Science.gov (United States)

    Blind, M W; Refsgaard, J C

    2007-01-01

Key sources of uncertainty of importance for water resources management are (1) uncertainty in data; (2) uncertainty related to hydrological models (parameter values, model technique, model structure); and (3) uncertainty related to the context and the framing of the decision-making process. The European funded project 'Harmonised techniques and representative river basin data for assessment and use of uncertainty information in integrated water management (HarmoniRiB)' has resulted in a range of tools and methods to assess such uncertainties, focusing on items (1) and (2). The project also engaged in a number of discussions surrounding uncertainty and risk assessment in support of decision-making in water management. Based on the project's results and experiences, and on the subsequent discussions, a number of conclusions can be drawn on the future needs for successful adoption of uncertainty analysis in decision support. These conclusions range from additional scientific research on specific uncertainties and dedicated guidelines for operational use to capacity building at all levels. The purpose of this paper is to elaborate on these conclusions and to anchor them in the broad objective of making uncertainty and risk assessment an essential and natural part of future decision-making processes.

  16. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-03-01

    Full Text Available This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow-dominated catchment in the Chilean Andes is used as a case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the model representations. The likelihood of the simulated glacier mass balance and snow cover is used for further assessing model credibility. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.
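
    The possibilistic workflow described — score each parameter set, normalize the scores into a possibility distribution, then derive prediction bounds from an alpha-cut — can be sketched as follows; the scores and discharges are hypothetical placeholders, not outputs of the snowmelt model in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sets = 500

# Hypothetical performance score for each sampled parameter set of a
# runoff model (higher = better fit to observations).
score = rng.uniform(0.0, 1.0, size=n_sets)

# Possibility of each parameter set: rescale so the best set has
# possibility 1 (a common normalization in possibilistic calibration).
possibility = score / score.max()

# Hypothetical discharge simulated by each parameter set at one time step.
discharge = 10.0 + 3.0 * rng.standard_normal(n_sets)

# Prediction uncertainty bounds from an alpha-cut: keep the simulations
# whose possibility exceeds alpha and report their discharge range.
alpha = 0.5
cut = discharge[possibility >= alpha]
lower, upper = float(cut.min()), float(cut.max())
print(lower, upper)
```

    Adding further criteria (e.g. a glacier mass balance check) amounts to lowering the possibility of parameter sets that fail them, which narrows the alpha-cut bounds — the effect reported in the abstract.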

  17. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  18. Modeling uncertainty in requirements engineering decision support

    Science.gov (United States)

    Feather, Martin S.; Maynard-Zhang, Pedrito; Kiper, James D.

    2005-01-01

    One inherent characteristic of requirements engineering is a lack of certainty during this early phase of a project. Nevertheless, decisions about requirements must be made in spite of this uncertainty. Here we describe the context in which we are exploring this, and some initial work to support elicitation of uncertain requirements and to deal with the combination of such information from multiple stakeholders.

  19. Assessing Uncertainty of Interspecies Correlation Estimation Models for Aromatic Compounds

    Science.gov (United States)

    We developed Interspecies Correlation Estimation (ICE) models for aromatic compounds containing 1 to 4 benzene rings to assess uncertainty in toxicity extrapolation in two data compilation approaches. ICE models are mathematical relationships between surrogate and predicted test ...

  20. Reservoir management under geological uncertainty using fast model update

    NARCIS (Netherlands)

    Hanea, R.; Evensen, G.; Hustoft, L.; Ek, T.; Chitu, A.; Wilschut, F.

    2015-01-01

    Statoil is implementing "Fast Model Update (FMU)," an integrated and automated workflow for reservoir modeling and characterization. FMU connects all steps and disciplines from seismic depth conversion to prediction and reservoir management taking into account relevant reservoir uncertainty. FMU del

  1. Possibilistic uncertainty analysis of a conceptual model of snowmelt runoff

    Directory of Open Access Journals (Sweden)

    A. P. Jacquin

    2010-08-01

    Full Text Available This study presents the analysis of predictive uncertainty of a conceptual type snowmelt runoff model. The method applied uses possibilistic rather than probabilistic calculus for the evaluation of predictive uncertainty. Possibility theory is an information theory meant to model uncertainties caused by imprecise or incomplete knowledge about a real system rather than by randomness. A snow-dominated catchment in the Chilean Andes is used as a case study. Predictive uncertainty arising from parameter uncertainties of the watershed model is assessed. Model performance is evaluated according to several criteria, in order to define the possibility distribution of the parameter vector. The plausibility of the simulated glacier mass balance and snow cover is used for further constraining the model representations. Possibility distributions of the discharge estimates and prediction uncertainty bounds are subsequently derived. The results of the study indicate that the use of additional information allows a reduction of predictive uncertainty. In particular, the assessment of the simulated glacier mass balance and snow cover helps to reduce the width of the uncertainty bounds without a significant increment in the number of unbounded observations.

  2. Protein flexibility: coordinate uncertainties and interpretation of structural differences

    Energy Technology Data Exchange (ETDEWEB)

    Rashin, Alexander A., E-mail: alexander-rashin@hotmail.com [BioChemComp Inc., 543 Sagamore Avenue, Teaneck, NJ 07666 (United States); LH Baker Center for Bioinformatics and Department of Biochemistry, Biophysics and Molecular Biology, 112 Office and Lab Building, Iowa State University, Ames, IA 50011-3020 (United States); Rashin, Abraham H. L. [BioChemComp Inc., 543 Sagamore Avenue, Teaneck, NJ 07666 (United States); Rutgers, The State University of New Jersey, 22371 BPO WAY, Piscataway, NJ 08854-8123 (United States); Jernigan, Robert L. [LH Baker Center for Bioinformatics and Department of Biochemistry, Biophysics and Molecular Biology, 112 Office and Lab Building, Iowa State University, Ames, IA 50011-3020 (United States); BioChemComp Inc., 543 Sagamore Avenue, Teaneck, NJ 07666 (United States)

    2009-11-01

    Criteria for the interpretability of coordinate differences and a new method for identifying rigid-body motions and nonrigid deformations in protein conformational changes are developed and applied to functionally induced and crystallization-induced conformational changes. Valid interpretations of conformational movements in protein structures determined by X-ray crystallography require that the movement magnitudes exceed their uncertainty threshold. Here, it is shown that such thresholds can be obtained from the distance difference matrices (DDMs) of 1014 pairs of independently determined structures of bovine ribonuclease A and sperm whale myoglobin, with no explanations provided for reportedly minor coordinate differences. The smallest magnitudes of reportedly functional motions are just above these thresholds. Uncertainty thresholds can provide objective criteria that distinguish between true conformational changes and apparent ‘noise’, showing that some previous interpretations of protein coordinate changes attributed to external conditions or mutations may be doubtful or erroneous. The use of uncertainty thresholds, DDMs, the newly introduced CDDMs (contact distance difference matrices) and a novel simple rotation algorithm allows a more meaningful classification and description of protein motions, distinguishing between various rigid-fragment motions and nonrigid conformational deformations. It is also shown that half of 75 pairs of identical molecules, each from the same asymmetric crystallographic cell, exhibit coordinate differences that range from just outside the coordinate uncertainty threshold to the full magnitude of large functional movements. Thus, crystallization might often induce protein conformational changes that are comparable to those related to or induced by the protein function.
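
    A distance difference matrix (DDM) of the kind used here is straightforward to compute; this sketch uses random toy coordinates (not actual PDB structures) and shows that a rigid-body translation produces a numerically zero DDM, i.e. no internal deformation:

```python
import numpy as np

def distance_matrix(coords):
    """Pairwise distance matrix for an (N, 3) array of atom coordinates."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def ddm(coords_a, coords_b):
    """Distance difference matrix between two conformations of one chain."""
    return distance_matrix(coords_a) - distance_matrix(coords_b)

# Toy coordinates: the second "conformation" is a rigid-body translation,
# so all internal distances are unchanged and the DDM vanishes.
rng = np.random.default_rng(2)
a = rng.normal(size=(50, 3))
b = a + np.array([1.0, -2.0, 0.5])
print(np.abs(ddm(a, b)).max())  # ~0: rigid motion, no internal deformation
```

    In the paper's setting, DDM entries between independently determined structures that exceed the empirically derived uncertainty threshold flag genuine conformational change; this sketch shows only the matrix construction.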

  3. Quantifying structural uncertainty on fault networks using a marked point process within a Bayesian framework

    Science.gov (United States)

    Aydin, Orhun; Caers, Jef Karel

    2017-08-01

    Faults are one of the building blocks for subsurface modeling studies. Incomplete observations of subsurface fault networks lead to uncertainty pertaining to location, geometry and existence of faults. In practice, gaps in incomplete fault network observations are filled based on tectonic knowledge and the interpreter's intuition pertaining to fault relationships. Modeling fault network uncertainty with realistic models that represent tectonic knowledge is still a challenge. Although methods that address specific sources of fault network uncertainty and complexities of fault modeling exist, a unifying framework is still lacking. In this paper, we propose a rigorous approach to quantify fault network uncertainty. Fault pattern and intensity information are expressed by means of a marked point process, the marked Strauss point process. Fault network information is constrained to fault surface observations (complete or partial) within a Bayesian framework. A structural prior model is defined to quantitatively express fault patterns, geometries and relationships within the Bayesian framework. Structural relationships between faults, in particular fault abutting relations, are represented with a level-set based approach. A Markov chain Monte Carlo sampler is used to sample posterior fault network realizations that reflect tectonic knowledge and honor fault observations. We apply the methodology to a field study from the Nankai Trough and Kumano Basin. The target for uncertainty quantification is a deep site with attenuated seismic data, with only partially visible faults and many faults missing from the survey or interpretation. A structural prior model is built from shallow analog sites that are believed to have undergone similar tectonics compared to the site of study. Fault network uncertainty for the field is quantified with fault network realizations that are conditioned to structural rules, tectonic information and partially observed fault surfaces. We show the proposed
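
    As an illustration of the core ingredient, the following is a minimal birth-death Metropolis-Hastings sampler for an (unmarked) Strauss point process on the unit square; the parameters are hypothetical, and the sketch omits the marks, fault-surface conditioning and level-set machinery of the paper:

```python
import numpy as np

rng = np.random.default_rng(8)

# Hypothetical Strauss parameters: intensity beta, interaction gamma < 1
# (inhibition between nearby points), and interaction range r.
beta, gamma, r = 50.0, 0.3, 0.1

def n_close_pairs(pts, r):
    if len(pts) < 2:
        return 0
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    return int((d[np.triu_indices(len(pts), k=1)] < r).sum())

def log_density(pts):
    # Unnormalized Strauss density: beta^n(x) * gamma^s(x)
    return len(pts) * np.log(beta) + n_close_pairs(pts, r) * np.log(gamma)

# Birth-death Metropolis-Hastings sampler for the point pattern
pts = rng.random((10, 2))
for _ in range(5000):
    if rng.random() < 0.5:  # birth: propose adding a uniform point
        new = np.vstack([pts, rng.random((1, 2))])
        log_acc = log_density(new) - log_density(pts) - np.log(len(new))
    else:                   # death: propose removing a uniform point
        if len(pts) == 0:
            continue
        new = np.delete(pts, rng.integers(len(pts)), axis=0)
        log_acc = log_density(new) - log_density(pts) + np.log(len(pts))
    if np.log(rng.random()) < log_acc:
        pts = new

print(len(pts), n_close_pairs(pts, r))  # inhibition keeps close pairs rare
```

    Conditioning such a sampler on observed fault surfaces, as in the paper, amounts to restricting the accepted states to those honoring the observations.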

  4. A Stochastic Nonlinear Water Wave Model for Efficient Uncertainty Quantification

    CERN Document Server

    Bigoni, Daniele; Eskilsson, Claes

    2014-01-01

    A major challenge in next-generation industrial applications is to improve numerical analysis by quantifying uncertainties in predictions. In this work we present a stochastic formulation of a fully nonlinear and dispersive potential flow water wave model for the probabilistic description of the evolution of waves. This model is discretized using the Stochastic Collocation Method (SCM), which provides an approximate surrogate of the model. This can be used to accurately and efficiently estimate the probability distribution of the unknown time-dependent stochastic solution after the forward propagation of uncertainties. We revisit experimental benchmarks often used for validation of deterministic water wave models. We do this using a fully nonlinear and dispersive model and show how uncertainty in the model input can influence the model output. Based on numerical experiments and assumed uncertainties in boundary data, our analysis reveals that some of the known discrepancies from deterministic simulation in compa...
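
    The idea behind the Stochastic Collocation Method — evaluate the deterministic model only at quadrature nodes of the input distribution and combine those few evaluations into statistics of the output — can be illustrated with a scalar toy model (the tanh function below is a hypothetical stand-in for the wave model, not part of the paper):

```python
import numpy as np

def model(amplitude):
    # Stand-in for an expensive nonlinear wave-model output (hypothetical)
    return np.tanh(amplitude) ** 2

# Uncertain input: wave amplitude ~ N(1.0, 0.2^2)
mu, sigma = 1.0, 0.2

# Stochastic collocation: evaluate the model only at Gauss-HermiteE nodes
# of the input distribution and combine with the quadrature weights.
nodes, weights = np.polynomial.hermite_e.hermegauss(7)
weights = weights / np.sqrt(2.0 * np.pi)  # normalize to the N(0,1) pdf
sc_mean = float(np.sum(weights * model(mu + sigma * nodes)))

# Brute-force Monte Carlo reference (many more model evaluations)
rng = np.random.default_rng(3)
mc_mean = float(model(rng.normal(mu, sigma, 200_000)).mean())
print(sc_mean, mc_mean)  # 7 model runs reproduce the Monte Carlo mean
```

    The efficiency gain is exactly this: a handful of collocated model runs replaces many thousands of Monte Carlo evaluations of the full solver.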

  5. Development of Property Models with Uncertainty Estimate for Process Design under Uncertainty

    DEFF Research Database (Denmark)

    Hukkerikar, Amol; Sarup, Bent; Abildskov, Jens

    the results of uncertainty analysis to predict the uncertainties in process design. For parameter estimation, large data-sets of experimentally measured property values for a wide range of pure compounds are taken from the CAPEC database. Classical frequentist approach i.e., least square method is adopted...... parameter, octanol/water partition coefficient, aqueous solubility, acentric factor, and liquid molar volume at 298 K. The performance of property models for these properties with the revised set of model parameters is highlighted through a set of compounds not considered in the regression step...... sensitive properties for each unit operation are also identified. This analysis can be used to reduce the uncertainties in property estimates for the properties of critical importance (by performing additional experiments to get better experimental data and better model parameter values). Thus...

  6. Urban drainage models simplifying uncertainty analysis for practitioners

    DEFF Research Database (Denmark)

    Vezzaro, Luca; Mikkelsen, Peter Steen; Deletic, Ana

    2013-01-01

    There is increasing awareness about uncertainties in the modelling of urban drainage systems and, as such, many new methods for uncertainty analyses have been developed. Despite this, all available methods have limitations which restrict their widespread application among practitioners. Here, a m...

  7. On the uncertainty of phenological responses to climate change, and implications for a terrestrial biosphere model

    Directory of Open Access Journals (Sweden)

    M. Migliavacca

    2012-06-01

    Full Text Available Phenology, the timing of recurring life cycle events, controls numerous land surface feedbacks to the climate system through the regulation of exchanges of carbon, water and energy between the biosphere and atmosphere.

    Terrestrial biosphere models, however, are known to have systematic errors in the simulation of spring phenology, which potentially could propagate to uncertainty in modeled responses to future climate change. Here, we used the Harvard Forest phenology record to investigate and characterize sources of uncertainty in predicting phenology, and the subsequent impacts on model forecasts of carbon and water cycling. Using a model-data fusion approach, we combined information from 20 yr of phenological observations of 11 North American woody species, with 12 leaf bud-burst models that varied in complexity.

    Akaike's Information Criterion indicated support for spring warming models with photoperiod limitations and, to a lesser extent, models that included chilling requirements.

    We assessed three different sources of uncertainty in phenological forecasts: parameter uncertainty, model uncertainty, and driver uncertainty. The latter was characterized by running the models to 2099 using two different IPCC climate scenarios (A1fi vs. B1, i.e. high vs. low CO2 emissions). Parameter uncertainty was the smallest (average 95% confidence interval: 2.4 days century⁻¹ for scenario B1 and 4.5 days century⁻¹ for A1fi), whereas driver uncertainty was the largest (up to 8.4 days century⁻¹ in the simulated trends). The uncertainty related to model structure is also large, and the predicted bud-burst trends as well as the shape of the smoothed projections varied among models (±7.7 days century⁻¹ for A1fi, ±3.6 days century⁻¹ for B1). The forecast sensitivity of bud-burst to temperature (i.e. days bud-burst advanced per

  8. Uncertainty and error in complex plasma chemistry models

    Science.gov (United States)

    Turner, Miles M.

    2015-06-01

    Chemistry models that include dozens of species and hundreds to thousands of reactions are common in low-temperature plasma physics. The rate constants used in such models are uncertain, because they are obtained from some combination of experiments and approximate theories. Since the predictions of these models are a function of the rate constants, these predictions must also be uncertain. However, systematic investigations of the influence of uncertain rate constants on model predictions are rare to non-existent. In this work we examine a particular chemistry model, for helium-oxygen plasmas. This chemistry is of topical interest because of its relevance to biomedical applications of atmospheric pressure plasmas. We trace the primary sources for every rate constant in the model, and hence associate an error bar (or equivalently, an uncertainty) with each. We then use a Monte Carlo procedure to quantify the uncertainty in predicted plasma species densities caused by the uncertainty in the rate constants. Under the conditions investigated, the range of uncertainty in most species densities is a factor of two to five. However, the uncertainty can vary strongly for different species, over time, and with other plasma conditions. There are extreme (pathological) cases where the uncertainty is more than a factor of ten. One should therefore be cautious in drawing any conclusion from plasma chemistry modelling, without first ensuring that the conclusion in question survives an examination of the related uncertainty.
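
    The Monte Carlo procedure described — assign each rate constant an uncertainty, sample, and propagate through the chemistry — can be sketched with a toy production/loss balance; the rate constants and factor-of-two error bars below are illustrative, not values from the helium-oxygen reaction set:

```python
import numpy as np

rng = np.random.default_rng(4)
n_draws = 20_000

# Two hypothetical rate constants with a factor-of-two (1-sigma)
# uncertainty, modelled as log-normal scatter about nominal values.
k_prod_nom, k_loss_nom = 1e-16, 1e-10   # production / loss (arbitrary units)
sigma_ln = np.log(2.0)
k_prod = k_prod_nom * np.exp(rng.normal(0.0, sigma_ln, n_draws))
k_loss = k_loss_nom * np.exp(rng.normal(0.0, sigma_ln, n_draws))

# Toy steady-state density of a minor species from production/loss balance.
density = k_prod / k_loss * 1e19

lo, hi = np.percentile(density, [2.5, 97.5])
print(hi / lo)  # width of the 95% interval induced by rate uncertainty alone
```

    Even with only two uncertain rates, the predicted density spans a wide interval, which mirrors the paper's warning against drawing conclusions from a single nominal-parameter run.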

  9. IAEA CRP on HTGR Uncertainties in Modeling: Assessment of Phase I Lattice to Core Model Uncertainties

    Energy Technology Data Exchange (ETDEWEB)

    Rouxelin, Pascal Nicolas [Idaho National Lab. (INL), Idaho Falls, ID (United States); Strydom, Gerhard [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Best-estimate plus uncertainty analysis of reactors is replacing the traditional conservative (stacked uncertainty) method for safety and licensing analysis. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied. High temperature gas cooled reactors (HTGRs) have several features that require techniques not used in light-water reactor analysis (e.g., coated-particle design and large graphite quantities at high temperatures). The International Atomic Energy Agency has therefore launched the Coordinated Research Project on HTGR Uncertainty Analysis in Modeling to study uncertainty propagation in the HTGR analysis chain. The benchmark problem defined for the prismatic design is represented by the General Atomics Modular HTGR 350. The main focus of this report is the compilation and discussion of the results obtained for various permutations of Exercise I-2c and the use of the cross section data in Exercise II-1a of the prismatic benchmark, which are defined as the last and first steps of the lattice and core simulation phases, respectively. The report summarizes the Idaho National Laboratory (INL) best-estimate results obtained for Exercise I-2a (fresh single-fuel block), Exercise I-2b (depleted single-fuel block), and Exercise I-2c (super cell), in addition to the first results of an investigation into the cross section generation effects for the super-cell problem. The two-dimensional deterministic code New ESC based Weighting Transport (NEWT), included in the Standardized Computer Analyses for Licensing Evaluation (SCALE) 6.1.2 package, was used for the cross section evaluation, and the results obtained were compared with those of the three-dimensional stochastic SCALE module KENO VI. The NEWT cross section libraries were generated for several permutations of the current benchmark super-cell geometry and were then provided as input to the Phase II core calculation of the stand-alone neutronics Exercise

  10. Uncertainties in stellar evolution models: convective overshoot

    CERN Document Server

    Bressan, Alessandro; Marigo, Paola; Rosenfield, Philip; Tang, Jing

    2014-01-01

    In spite of the great effort made in the last decades to improve our understanding of stellar evolution, significant uncertainties remain due to our poor knowledge of some complex physical processes that require an empirical calibration, such as the efficiency of the interior mixing related to convective overshoot. Here we review the impact of convective overshoot on the evolution of stars during the main Hydrogen and Helium burning phases.

  11. Uncertainties in Stellar Evolution Models: Convective Overshoot

    Science.gov (United States)

    Bressan, Alessandro; Girardi, Léo; Marigo, Paola; Rosenfield, Philip; Tang, Jing

    In spite of the great effort made in the last decades to improve our understanding of stellar evolution, significant uncertainties remain due to our poor knowledge of some complex physical processes that require an empirical calibration, such as the efficiency of the interior mixing related to convective overshoot. Here we review the impact of convective overshoot on the evolution of stars during the main Hydrogen and Helium burning phases.

  12. Modeling Uncertainty when Estimating IT Projects Costs

    OpenAIRE

    Winter, Michel; Mirbel, Isabelle; Crescenzo, Pierre

    2014-01-01

    In the current economic context, optimizing projects' cost is an obligation for a company to remain competitive in its market. Introducing statistical uncertainty in cost estimation is a good way to tackle the risk of going too far while minimizing the project budget: it allows the company to determine the best possible trade-off between estimated cost and acceptable risk. In this paper, we present new statistical estimators derived from the way IT companies estimate the projects' costs. In t...

  13. Uncertainties in environmental radiological assessment models and their implications

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, F.O.; Miller, C.W.

    1983-01-01

    Environmental radiological assessments rely heavily on the use of mathematical models. The predictions of these models are inherently uncertain because these models are inexact representations of real systems. The major sources of this uncertainty are related to biases in model formulation and parameter estimation. The best approach for estimating the actual extent of over- or underprediction is model validation, a procedure that requires testing over the range of the intended realm of model application. Other approaches discussed are the use of screening procedures, sensitivity and stochastic analyses, and model comparison. The magnitude of uncertainty in model predictions is a function of the questions asked of the model and the specific radionuclides and exposure pathways of dominant importance. Estimates are made of the relative magnitude of uncertainty for situations requiring predictions of individual and collective risks for both chronic and acute releases of radionuclides. It is concluded that models developed as research tools should be distinguished from models developed for assessment applications. Furthermore, increased model complexity does not necessarily guarantee increased accuracy. To improve the realism of assessment modeling, stochastic procedures are recommended that translate uncertain parameter estimates into a distribution of predicted values. These procedures also permit the importance of model parameters to be ranked according to their relative contribution to the overall predicted uncertainty. Although confidence in model predictions can be improved through site-specific parameter estimation and increased model validation, risk factors and internal dosimetry models will probably remain important contributors to the amount of uncertainty that is irreducible.
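
    The stochastic procedure recommended here — translate uncertain parameter estimates into a distribution of predicted values, then rank parameters by their contribution to the overall spread — can be sketched for a hypothetical multiplicative transfer chain (parameter names and sigmas are illustrative, not values from any assessment model):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000

# Three uncertain transfer parameters of a hypothetical dose model, each
# sampled from an assumed log-normal distribution (sigmas are illustrative).
deposition = rng.lognormal(mean=0.0, sigma=0.5, size=n)
transfer = rng.lognormal(mean=0.0, sigma=0.9, size=n)
dose_coef = rng.lognormal(mean=0.0, sigma=0.2, size=n)

# Multiplicative chain model: the prediction becomes a distribution,
# not a single number.
dose = deposition * transfer * dose_coef

# Rank parameters by contribution to the spread: squared correlation of
# log-parameter with log-output (exact variance shares for this model).
r2 = {}
for name, p in [("deposition", deposition), ("transfer", transfer),
                ("dose_coef", dose_coef)]:
    r = np.corrcoef(np.log(p), np.log(dose))[0, 1]
    r2[name] = r ** 2
print(sorted(r2.items(), key=lambda kv: -kv[1]))
```

    The ranking immediately identifies which parameter's uncertainty dominates the prediction, i.e. where additional experiments would most reduce the overall uncertainty.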

  14. Structural validation as an input into seismic depth conversion to decrease assigned structural uncertainty

    Science.gov (United States)

    Totake, Yukitsugu; Butler, Robert W. H.; Bond, Clare E.

    2017-02-01

    While the interpretation of seismic reflection imagery is powerful and well established for evaluating subsurface structures, it is never perfectly accurate. Structural validation techniques are widely used to geometrically test geological interpretations of seismic reflection data. Commonly these techniques are performed on depth sections converted from seismic time-based data using velocity models. Velocity model choices in seismic depth conversion have an impact on the final depth image and hence the structural geometry of interpretations. The impact of these choices in depth conversion on structural validation is rarely examined. Here we explore how multiple versions of a depth section, converted using different velocity models, influence the performance of structural validations for a fold-thrust structure from the deep water Niger Delta. The example illustrates that a range of kinematic models can validate the depth-converted profiles regardless of the depth conversion choice; such validations are thus poor diagnostic tools. Area-depth-strain (ADS) analysis can constrain the choice both of a kinematic model and the depth conversion, provided the seismic data allow the detachment level and excess areas to be recognised. Incorporation of ADS analysis within an interpretation-depth conversion workflow helps reduce assigned uncertainty in depth conversion, the seismic interpretation, and in the implicit geological model.

  15. A Peep into the Uncertainty-Complexity-Relevance Modeling Trilemma through Global Sensitivity and Uncertainty Analysis

    Science.gov (United States)

    Munoz-Carpena, R.; Muller, S. J.; Chu, M.; Kiker, G. A.; Perz, S. G.

    2014-12-01

    Model complexity resulting from the need to integrate environmental system components cannot be overstated. In particular, additional emphasis is urgently needed on rational approaches to guide decision making through the uncertainties surrounding the integrated system across decision-relevant scales. However, in spite of the difficulties that the consideration of modeling uncertainty represents for the decision process, it should not be avoided, or the value and science behind the models will be undermined. These two issues, i.e. the need for coupled models that can answer the pertinent questions and the need for models that do so with sufficient certainty, are the key indicators of a model's relevance. Model relevance is inextricably linked with model complexity. Although model complexity has grown greatly in recent years, there has been little work to rigorously characterize the threshold of relevance in integrated and complex models. Formally assessing the relevance of the model in the face of increasing complexity would be valuable because there is growing unease among developers and users of complex models about the cumulative effects of various sources of uncertainty on model outputs. In particular, this issue has prompted doubt over whether the considerable effort going into further elaborating complex models will in fact yield the expected payback. New approaches have been proposed recently to evaluate the uncertainty-complexity-relevance modeling trilemma (Muller, Muñoz-Carpena and Kiker, 2011) by incorporating state-of-the-art global sensitivity and uncertainty analysis (GSA/UA) in every step of the model development so as to quantify not only the uncertainty introduced by the addition of new environmental components, but the effect that these new components have over existing components (interactions, non-linear responses).
Outputs from the analysis can also be used to quantify system resilience (stability, alternative states, thresholds or tipping

  16. Uncertainty Analysis of Multi-Model Flood Forecasts

    Directory of Open Access Journals (Sweden)

    Erich J. Plate

    2015-12-01

    Full Text Available This paper demonstrates, by means of a systematic uncertainty analysis, that the use of outputs from more than one model can significantly improve conditional forecasts of discharges or water stages, provided the models are structurally different. Discharge forecasts from two models and the actual forecasted discharge are assumed to form a three-dimensional joint probability density distribution (jpdf), calibrated on long time series of data. The jpdf is decomposed into conditional probability density distributions (cpdf) by means of Bayes' formula, as suggested and explored by Krzysztofowicz in a series of papers. In this paper his approach is simplified to optimize conditional forecasts for any set of two forecast models. Its application is demonstrated by means of models developed in a study of flood forecasting for station Stung Treng on the middle reach of the Mekong River in South-East Asia. Four different forecast models were used and pairwise combined: forecast with no model, with a persistence model, with a regression model, and with a rainfall-runoff model. Working with cpdfs requires determination of dependency among variables, for which linear regressions are required, as was done by Krzysztofowicz. His Bayesian approach based on transforming observed probability distributions of discharges and forecasts into normal distributions is also explored. Results obtained with his method for normal prior and likelihood distributions are identical to results from direct multiple regressions. Furthermore, it is shown that in the present case forecast accuracy is only marginally improved if Weibull-distributed basic data were converted into normally distributed variables.
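
    The central claim — that conditioning on two structurally different forecasts improves on either alone — can be reproduced with synthetic data and the multiple-regression shortcut noted in the abstract (all series below are simulated, not Mekong data):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 5_000

# Synthetic "truth" and two structurally different forecasts, each seeing
# the truth through its own independent error (all series are simulated).
truth = 1000.0 + 300.0 * rng.standard_normal(n)
f1 = truth + 120.0 * rng.standard_normal(n)  # e.g. a persistence model
f2 = truth + 150.0 * rng.standard_normal(n)  # e.g. a rainfall-runoff model

# Conditional (combined) forecast: least-squares regression of the truth
# on both forecasts.
X = np.column_stack([np.ones(n), f1, f2])
beta, *_ = np.linalg.lstsq(X, truth, rcond=None)
combined = X @ beta

def rmse(f):
    return float(np.sqrt(np.mean((f - truth) ** 2)))

print(rmse(f1), rmse(f2), rmse(combined))  # the combination beats either model
```

    The improvement comes precisely from the structural difference: because the two error series are independent, their regression combination cancels part of each.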

  17. Bayesian methods for model uncertainty analysis with application to future sea level rise

    Energy Technology Data Exchange (ETDEWEB)

    Patwardhan, A.; Small, M.J. (Carnegie Mellon Univ., Pittsburgh, PA (United States))

    1992-12-01

    This paper addresses the use of data for identifying and characterizing uncertainties in model parameters and predictions. The Bayesian Monte Carlo method is formally presented and elaborated, and applied to the analysis of the uncertainty in a predictive model for global mean sea level change. The method uses observations of output variables, made with an assumed error structure, to determine a posterior distribution of model outputs. This is used to derive a posterior distribution for the model parameters. Results demonstrate the resolution of the uncertainty that is obtained as a result of the Bayesian analysis and also indicate the key contributors to the uncertainty in the sea level rise model. While the technique is illustrated with a simple, preliminary model, the analysis provides an iterative framework for model refinement. The methodology developed in this paper provides a mechanism for the incorporation of ongoing data collection and research in decision-making for problems involving uncertain environmental change.
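
    The Bayesian Monte Carlo recipe — draw parameters from the prior, weight each draw by the likelihood of the observed output, and read off posterior summaries — can be sketched in a few lines; the linear "model" and all numbers are placeholders, not the sea-level model analyzed in the paper:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Prior over one uncertain model parameter (all numbers are placeholders).
theta = rng.normal(2.0, 1.0, size=n)

def model(theta):
    # Stand-in for the predictive model (linear for illustration only)
    return 3.0 * theta

# One observation of the model output with known measurement error.
y_obs, sigma_obs = 7.5, 1.5

# Bayesian Monte Carlo: weight each prior draw by the likelihood of the
# observation given that draw, then form posterior summaries.
w = np.exp(-0.5 * ((model(theta) - y_obs) / sigma_obs) ** 2)
w /= w.sum()

prior_mean = float(theta.mean())
posterior_mean = float(np.sum(w * theta))
print(prior_mean, posterior_mean)  # posterior pulled toward y_obs / 3 = 2.5
```

    The same weights, applied to model outputs rather than parameters, give the posterior distribution of predictions, which is how the method resolves uncertainty as new observations accumulate.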

  18. Explicitly integrating parameter, input, and structure uncertainties into Bayesian Neural Networks for probabilistic hydrologic forecasting

    KAUST Repository

    Zhang, Xuesong

    2011-11-01

    Estimating the uncertainty of hydrologic forecasts is valuable to water resources and other relevant decision making processes. Recently, Bayesian Neural Networks (BNNs) have proved to be powerful tools for quantifying uncertainty in streamflow forecasting. In this study, we propose a Markov Chain Monte Carlo (MCMC) framework (BNN-PIS) to incorporate the uncertainties associated with parameters, inputs, and structures into BNNs. This framework allows the structure of the neural networks to change by removing or adding connections between neurons and enables scaling of input data by using rainfall multipliers. The results show that the new BNNs outperform BNNs that only consider uncertainties associated with parameters and model structures. Critical evaluation of the posterior distributions of neural network weights, number of effective connections, rainfall multipliers, and hyper-parameters shows that the assumptions held in our BNNs are not well supported. Further understanding of characteristics of and interactions among different uncertainty sources is expected to enhance the application of neural networks for uncertainty analysis of hydrologic forecasting. © 2011 Elsevier B.V.

  19. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the 'most likely' dispersion scenario...... of the meteorological model results. These uncertainties stem from e.g. limits in meteorological obser-vations used to initialise meteorological forecast series. By perturbing the initial state of an NWP model run in agreement with the available observa-tional data, an ensemble of meteorological forecasts is produced....... However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can be utilised also for atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent uncertainties...

  20. Meteorological Uncertainty of atmospheric Dispersion model results (MUD)

    DEFF Research Database (Denmark)

    Havskov Sørensen, Jens; Amstrup, Bjarne; Feddersen, Henrik

    The MUD project addresses assessment of uncertainties of atmospheric dispersion model predictions, as well as possibilities for optimum presentation to decision makers. Previously, it has not been possible to estimate such uncertainties quantitatively, but merely to calculate the ‘most likely...... uncertainties of the meteorological model results. These uncertainties stem from e.g. limits in meteorological observations used to initialise meteorological forecast series. By perturbing e.g. the initial state of an NWP model run in agreement with the available observational data, an ensemble......’ dispersion scenario. However, recent developments in numerical weather prediction (NWP) include probabilistic forecasting techniques, which can also be utilised for long-range atmospheric dispersion models. The ensemble statistical methods developed and applied to NWP models aim at describing the inherent...

  1. Uncertainty analysis of fluvial outcrop data for stochastic reservoir modelling

    Energy Technology Data Exchange (ETDEWEB)

    Martinius, A.W. [Statoil Research Centre, Trondheim (Norway); Naess, A. [Statoil Exploration and Production, Stjoerdal (Norway)

    2005-07-01

    Uncertainty analysis and reduction are a crucial part of stochastic reservoir modelling and fluid flow simulation studies. Outcrop analogue studies are often employed to define reservoir model parameters, but the analysis of uncertainties associated with sedimentological information is often neglected. In order to define the uncertainty inherent in outcrop data more accurately, this paper presents geometrical and dimensional data from individual point bars and braid bars from part of the low net:gross outcropping Tortola fluvial system (Spain), which have been subjected to a quantitative and qualitative assessment. Four types of primary outcrop uncertainties are discussed: (1) the definition of the conceptual depositional model; (2) the number of observations on sandstone body dimensions; (3) the accuracy and representativeness of observed three-dimensional (3D) sandstone body size data; and (4) sandstone body orientation. Uncertainties related to the depositional model are the most difficult to quantify but can be appreciated qualitatively if processes of deposition related to scales of time and the general lack of information are considered. Application of the N0 measure is suggested to assess quantitatively whether a statistically sufficient number of dimensional observations has been obtained to reduce uncertainty to an acceptable level. The third type of uncertainty is evaluated in a qualitative sense and determined by accurate facies analysis. The orientation of sandstone bodies is shown to influence spatial connectivity. As a result, an insufficient number or quality of observations may have important consequences for estimated connected volumes. This study thus provides improved estimates for reservoir modelling. (author)

  2. Structural uncertainty in air mass factor calculation for NO2 and HCHO satellite retrievals

    Science.gov (United States)

    Lorente, Alba; Folkert Boersma, K.; Yu, Huan; Dörner, Steffen; Hilboll, Andreas; Richter, Andreas; Liu, Mengyao; Lamsal, Lok N.; Barkley, Michael; De Smedt, Isabelle; Van Roozendael, Michel; Wang, Yang; Wagner, Thomas; Beirle, Steffen; Lin, Jin-Tai; Krotkov, Nickolay; Stammes, Piet; Wang, Ping; Eskes, Henk J.; Krol, Maarten

    2017-03-01

    Air mass factor (AMF) calculation is the largest source of uncertainty in NO2 and HCHO satellite retrievals in situations with enhanced trace gas concentrations in the lower troposphere. Structural uncertainty arises when different retrieval methodologies are applied within the scientific community to the same satellite observations. Here, we address the issue of AMF structural uncertainty via a detailed comparison of AMF calculation methods that are structurally different between seven retrieval groups for measurements from the Ozone Monitoring Instrument (OMI). We estimate the escalation of structural uncertainty in every sub-step of the AMF calculation process. This goes beyond the algorithm uncertainty estimates provided in state-of-the-art retrievals, which address the theoretical propagation of uncertainties for one particular retrieval algorithm only. We find that top-of-atmosphere reflectances simulated by four radiative transfer models (RTMs) (DAK, McArtim, SCIATRAN and VLIDORT) agree within 1.5 %. We find that different retrieval groups agree well in the calculations of altitude-resolved AMFs from different RTMs (to within 3 %), and in the tropospheric AMFs (to within 6 %), as long as identical ancillary data (surface albedo, terrain height, cloud parameters and trace gas profile) and cloud and aerosol correction procedures are used. Structural uncertainty increases sharply when retrieval groups use their preferred ancillary data and cloud and aerosol corrections. On average, we estimate the AMF structural uncertainty to be 42 % over polluted regions and 31 % over unpolluted regions, mostly driven by substantial differences in the a priori trace gas profiles, surface albedo and cloud parameters. Sensitivity studies for one particular algorithm indicate that different cloud correction approaches result in substantial AMF differences in polluted conditions (5 to 40 % depending on cloud fraction and cloud pressure, and 11 % on average) even for low
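The altitude-resolved (box) AMFs and a priori trace gas profile mentioned in this record combine into a tropospheric AMF as a profile-weighted average; the layer values below are invented for illustration:

```python
import numpy as np

def tropospheric_amf(box_amf, partial_columns):
    """Tropospheric AMF = a priori profile-weighted mean of box (altitude-resolved) AMFs."""
    m = np.asarray(box_amf, dtype=float)
    x = np.asarray(partial_columns, dtype=float)
    return float(np.sum(m * x) / np.sum(x))

# Illustrative values: box AMFs decrease toward the surface (lower measurement
# sensitivity in the boundary layer); the polluted profile puts most NO2 there.
box_amf = [0.6, 0.9, 1.3, 1.7, 2.0]             # surface -> top layer
profile_polluted = [5.0, 2.0, 1.0, 0.5, 0.2]    # partial columns, arbitrary units
profile_clean = [0.5, 0.5, 0.5, 0.5, 0.5]       # well-mixed

amf_polluted = tropospheric_amf(box_amf, profile_polluted)
amf_clean = tropospheric_amf(box_amf, profile_clean)
print(f"AMF polluted: {amf_polluted:.2f}, clean: {amf_clean:.2f}")
```

The polluted case yields a lower AMF because more of the column sits where sensitivity is low, which is why a priori profile differences between groups translate directly into structural uncertainty.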

  3. Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations

    CERN Document Server

    Krauss, L M; White, M; Krauss, Lawrence M.; Gates, Evalyn; White, Martin

    1993-01-01

    We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.

  4. Modelling theoretical uncertainties in phenomenological analyses for particle physics

    CERN Document Server

    Charles, Jérôme; Niess, Valentin; Silva, Luiz Vale

    2016-01-01

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding $p$-values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive $p$-value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavour p...

  5. Modeling theoretical uncertainties in phenomenological analyses for particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Charles, Jerome [CNRS, Aix-Marseille Univ, Universite de Toulon, CPT UMR 7332, Marseille Cedex 9 (France); Descotes-Genon, Sebastien [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Niess, Valentin [CNRS/IN2P3, UMR 6533, Laboratoire de Physique Corpusculaire, Aubiere Cedex (France); Silva, Luiz Vale [CNRS, Univ. Paris-Sud, Universite Paris-Saclay, Laboratoire de Physique Theorique (UMR 8627), Orsay Cedex (France); Univ. Paris-Sud, CNRS/IN2P3, Universite Paris-Saclay, Groupe de Physique Theorique, Institut de Physique Nucleaire, Orsay Cedex (France); J. Stefan Institute, Jamova 39, P. O. Box 3000, Ljubljana (Slovenia)

    2017-04-15

    The determination of the fundamental parameters of the Standard Model (and its extensions) is often limited by the presence of statistical and theoretical uncertainties. We present several models for the latter uncertainties (random, nuisance, external) in the frequentist framework, and we derive the corresponding p values. In the case of the nuisance approach where theoretical uncertainties are modeled as biases, we highlight the important, but arbitrary, issue of the range of variation chosen for the bias parameters. We introduce the concept of adaptive p value, which is obtained by adjusting the range of variation for the bias according to the significance considered, and which allows us to tackle metrology and exclusion tests with a single and well-defined unified tool, which exhibits interesting frequentist properties. We discuss how the determination of fundamental parameters is impacted by the model chosen for theoretical uncertainties, illustrating several issues with examples from quark flavor physics. (orig.)
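The nuisance ("bias") treatment described above can be sketched as a p-value maximized over an admissible bias range; the adaptive variant in the paper additionally ties this range to the significance considered. All numbers below are hypothetical:

```python
from math import erf, sqrt

def p_value_gauss(x, mu, sigma):
    """Two-sided Gaussian p-value for an observation x given prediction mu +- sigma."""
    return 1.0 - erf(abs(x - mu) / (sigma * sqrt(2.0)))

def p_value_nuisance(x, mu, sigma, delta):
    """Nuisance treatment of a theory uncertainty: report the most conservative
    p-value over a bias in [-delta, +delta]; the supremum is attained at the
    admissible bias closest to the observation."""
    best_bias = min(max(x - mu, -delta), delta)
    return p_value_gauss(x, mu + best_bias, sigma)

# Hypothetical numbers: observed 3.0 vs prediction 1.0 +- 0.5 (statistical),
# with a theoretical bias allowed to range over +-1.0
p_plain = p_value_gauss(3.0, 1.0, 0.5)
p_bias = p_value_nuisance(3.0, 1.0, 0.5, 1.0)
print(f"p without bias: {p_plain:.2e}, with bias range: {p_bias:.3f}")
```

This makes concrete the "important, but arbitrary" role of the chosen bias range: widening delta monotonically weakens any claimed discrepancy.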

  6. An educational model for ensemble streamflow simulation and uncertainty analysis

    National Research Council Canada - National Science Library

    AghaKouchak, A; Nakhjiri, N; Habib, E

    2013-01-01

    ...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...

  7. Solar Neutrino Data, Solar Model Uncertainties and Neutrino Oscillations

    OpenAIRE

    1992-01-01

    We incorporate all existing solar neutrino flux measurements and take solar model flux uncertainties into account in deriving global fits to parameter space for the MSW and vacuum solutions of the solar neutrino problem.

  8. Why style matters - uncertainty and structural interpretation in thrust belts.

    Science.gov (United States)

    Butler, Rob; Bond, Clare; Watkins, Hannah

    2016-04-01

    Structural complexity together with challenging seismic imaging make for significant uncertainty in developing geometric interpretations of fold and thrust belts. Here we examine these issues and develop more realistic approaches to building interpretations. At all scales, the best tests of the internal consistency of individual interpretations come from structural restoration (section balancing), provided allowance is made for heterogeneity in stratigraphy and strain. However, many existing balancing approaches give misleading perceptions of interpretational risk - both on the scale of individual fold-thrust (trap) structures and in regional cross-sections. At the trap scale, idealised models are widely cited - fault-bend-fold, fault-propagation folding and trishear. These make entirely arbitrary choices for fault localisation and layer-by-layer deformation: precise relationships between faults and fold geometry are generally invalidated by real-world conditions of stratigraphic variation and distributed strain. Furthermore, subsurface predictions made using these idealisations for hydrocarbon exploration commonly fail the test of drilling. Rarely acknowledged, the geometric reliability of seismic images depends on the assigned seismic velocity model, which in turn relies on geological interpretation. Thus iterative approaches are required between geology and geophysics. The portfolio of commonly cited outcrop analogues is strongly biased to examples that simply conform to idealised models - apparently abnormal structures are rarely described - or even photographed! Insight can come from gravity-driven deep-water fold-belts where part of the spectrum of fold-thrust complexity is resolved through seismic imaging. This imagery shows deformation complexity in fold forelimbs and backlimbs. However, the applicability of these weakly lithified systems to well-lithified successions (e.g. carbonates) of many foreland thrust belts remains conjectural. Examples of

  9. Uncertainty and sensitivity analysis for photovoltaic system modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Clifford W.; Pohl, Andrew Phillip; Jordan, Dirk

    2013-12-01

    We report an uncertainty and sensitivity analysis for modeling DC energy from photovoltaic systems. We consider two systems, each comprising a single module using either crystalline silicon or CdTe cells, and located either at Albuquerque, NM, or Golden, CO. Output from a PV system is predicted by a sequence of models. Uncertainty in the output of each model is quantified by empirical distributions of each model's residuals. We sample these distributions to propagate uncertainty through the sequence of models to obtain an empirical distribution for each PV system's output. We considered models that: (1) translate measured global horizontal, direct and global diffuse irradiance to plane-of-array irradiance; (2) estimate effective irradiance from plane-of-array irradiance; (3) predict cell temperature; and (4) estimate DC voltage, current and power. We found the uncertainty in PV system output to be relatively small, on the order of 1% for daily energy. Four alternative models were considered for the POA irradiance modeling step; we did not find the choice among these models to be of great significance. However, we observed that the POA irradiance model introduced a bias of upwards of 5% of daily energy, which translates directly to a systematic difference in predicted energy. Sensitivity analyses relate uncertainty in the PV system output to uncertainty arising from each model. We found the residuals arising from the POA irradiance and the effective irradiance models to be the dominant contributors to residuals for daily energy, for either technology or location considered. This analysis indicates that efforts to reduce the uncertainty in PV system output should focus on improvements to the POA and effective irradiance models.
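The propagation scheme in this record - sample each model's empirical residual distribution and push the draws through the model chain - can be sketched as follows. The step names and residual spreads are made up; in practice the residuals would come from comparing each sub-model against measurements:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical residuals (model minus measurement) collected for each step of
# the model chain, expressed as relative errors of daily energy.
residuals = {
    "poa_irradiance":       rng.normal(0.0, 0.02, 200),
    "effective_irradiance": rng.normal(0.0, 0.015, 200),
    "cell_temperature":     rng.normal(0.0, 0.005, 200),
    "dc_power":             rng.normal(0.0, 0.005, 200),
}

def propagate(residuals, base_energy=10.0, n=10000):
    """Draw one residual per step per realization and compound multiplicatively."""
    energy = np.full(n, base_energy)
    for r in residuals.values():
        energy *= 1.0 + rng.choice(r, size=n)
    return energy

energy = propagate(residuals)
rel_sd = float(energy.std() / energy.mean())
print(f"relative uncertainty of daily energy: {rel_sd:.1%}")
```

With these assumed spreads the two irradiance steps dominate the total, mirroring the record's conclusion that POA and effective irradiance are the main contributors.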

  10. Optimal design under uncertainty of a passive defense structure against snow avalanches: from a general Bayesian framework to a simple analytical model

    Directory of Open Access Journals (Sweden)

    N. Eckert

    2008-10-01

    For snow avalanches, passive defense structures are generally designed by considering high return period events. In this paper, taking inspiration from other natural hazards, an alternative method based on the maximization of the economic benefit of the defense structure is proposed. A general Bayesian framework is described first. Special attention is given to the problem of taking the poor local information into account in the decision-making process; therefore, simplifying assumptions are made. The avalanche hazard is represented by a Peak Over Threshold (POT) model. The influence of the dam is quantified in terms of runout distance reduction with a simple relation derived from small-scale experiments using granular media. The costs corresponding to dam construction and the damage to the element at risk are roughly evaluated for each dam height-hazard value pair, with damage evaluation corresponding to the maximal expected loss. Both the classical and the Bayesian risk functions can then be computed analytically. The results are illustrated with a case study from the French avalanche database. A sensitivity analysis is performed, and modelling assumptions are discussed in addition to possible further developments.
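The risk-based design idea in this record - choose the dam height that minimizes construction cost plus expected avalanche damage over the structure's lifetime - can be sketched numerically. The cost coefficients and the exponential runout-exceedance relation below are assumed for illustration only, not the paper's calibrated POT model:

```python
import numpy as np

def expected_total_cost(height, c0=100.0, c1=50.0, damage=100.0, rate=1.0, horizon=50):
    """Construction cost rises linearly with dam height; expected damage falls
    because a higher dam is less likely to be overtopped or outrun (assumed
    exponential exceedance, in the spirit of a POT tail)."""
    construction = c0 + c1 * height
    p_hit = np.exp(-height / 2.0)     # yearly chance the avalanche still reaches the asset
    return construction + horizon * rate * p_hit * damage

heights = np.linspace(0.0, 20.0, 201)
costs = [expected_total_cost(h) for h in heights]
h_opt = float(heights[int(np.argmin(costs))])
print(f"risk-optimal dam height: {h_opt:.1f} m")
```

The optimum balances the marginal construction cost against the marginal reduction in expected loss, rather than targeting a fixed return period.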

  11. Calibration under uncertainty for finite element models of masonry monuments

    Energy Technology Data Exchange (ETDEWEB)

    Atamturktur, Sezer,; Hemez, Francois,; Unal, Cetin

    2010-02-01

    Historical unreinforced masonry buildings often include features such as load bearing unreinforced masonry vaults and their supporting framework of piers, fill, buttresses, and walls. The masonry vaults of such buildings are among the most vulnerable structural components and certainly among the most challenging to analyze. The versatility of finite element (FE) analyses in incorporating various constitutive laws, as well as practically all geometric configurations, has resulted in the widespread use of the FE method for the analysis of complex unreinforced masonry structures over the last three decades. However, an FE model is only as accurate as its input parameters, and there are two fundamental challenges while defining FE model input parameters: (1) material properties and (2) support conditions. The difficulties in defining these two aspects of the FE model arise from the lack of knowledge in the common engineering understanding of masonry behavior. As a result, engineers are unable to define these FE model input parameters with certainty, and, inevitably, uncertainties are introduced to the FE model.

  12. Robust Stability and Performance for Linear Systems with Structured and Unstructured Uncertainties

    Science.gov (United States)

    1990-06-01

    Yedavalli, R.K., "Perturbation Bounds for Robust Stability in Linear...," IEEE Transactions on Automatic Control, vol. AC-30, pp. 577-579, June 1985; Zhou, K. and Khargonekar, P., "Stability Robustness Bounds for Linear State Space Models with Structured Uncertainty," IEEE Transactions on Automatic Control.

  13. Model and parameter uncertainty in IDF relationships under climate change

    Science.gov (United States)

    Chandra, Rupa; Saha, Ujjwal; Mujumdar, P. P.

    2015-05-01

    Quantifying the distributional behavior of extreme events is crucial in hydrologic design. Intensity-Duration-Frequency (IDF) relationships are used extensively in engineering, especially in urban hydrology, to obtain the return level of an extreme rainfall event for a specified return period and duration. Major sources of uncertainty in IDF relationships are insufficient quantity and quality of data, which lead to parameter uncertainty in the distribution fitted to the data, and the use of multiple GCMs. It is important to study these uncertainties and propagate them into the future for accurate assessment of future return levels. The objective of this study is to quantify the uncertainties arising from the parameters of the distribution fitted to the data and from the multiple GCMs using a Bayesian approach. The posterior distribution of the parameters is obtained from Bayes' rule, and the parameters are transformed to obtain return levels for a specified return period. The Markov Chain Monte Carlo (MCMC) method with the Metropolis-Hastings algorithm is used to obtain the posterior distribution of the parameters. Twenty-six CMIP5 GCMs along with four RCP scenarios are considered for studying the effects of climate change and for obtaining projected IDF relationships for the case study of Bangalore city in India. GCM uncertainty due to the use of multiple GCMs is treated using the Reliability Ensemble Averaging (REA) technique along with the parameter uncertainty. Scale invariance theory is employed for obtaining short-duration return levels from daily data. It is observed that the uncertainty in short-duration rainfall return levels is high compared to that for longer durations. Further, it is observed that parameter uncertainty is large compared to model uncertainty.
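The Bayesian return-level workflow in this record can be sketched with a deliberately simplified setup: a Gumbel distribution (GEV with zero shape parameter) fitted to synthetic annual maxima by a plain Metropolis-Hastings sampler, with each posterior draw transformed into a 100-year return level. Priors, proposal scales and data are all illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
annual_max = rng.gumbel(loc=60.0, scale=15.0, size=40)   # synthetic annual max rainfall (mm)

def log_post(mu, beta, x):
    """Gumbel log-likelihood with a flat prior (subject to beta > 0)."""
    if beta <= 0:
        return -np.inf
    z = (x - mu) / beta
    return float(np.sum(-z - np.exp(-z)) - len(x) * np.log(beta))

mu, beta = 50.0, 10.0
lp = log_post(mu, beta, annual_max)
levels = []
for i in range(20000):
    mu_n, beta_n = mu + rng.normal(0, 2.0), beta + rng.normal(0, 1.0)
    lp_n = log_post(mu_n, beta_n, annual_max)
    if np.log(rng.uniform()) < lp_n - lp:                # Metropolis-Hastings accept
        mu, beta, lp = mu_n, beta_n, lp_n
    if i >= 10000:                                       # keep post-burn-in samples
        T = 100.0                                        # return period (years)
        levels.append(mu - beta * np.log(-np.log(1.0 - 1.0 / T)))

lo, hi = np.percentile(levels, [2.5, 97.5])
print(f"100-yr return level: {np.mean(levels):.0f} mm (95% CI {lo:.0f}-{hi:.0f})")
```

Propagating the full posterior, rather than a point estimate, is what yields the credible interval on the return level; the study layers GCM uncertainty (via REA) on top of this parameter uncertainty.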

  14. Uncertainty analysis for a field-scale P loss model

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study we assessed the effect of model input error on predic...

  15. Assessment of parametric uncertainty for groundwater reactive transport modeling,

    Science.gov (United States)

    Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun

    2014-01-01

    The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with a Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, the predictive performance of the formal generalized likelihood function is superior to that of the least squares regression and Bayesian methods with a Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(ZS)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and the Morris- and DREAM(ZS)-based global sensitivity analyses yield almost identical rankings of parameter importance. The uncertainty analysis may help select appropriate likelihood
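The contrast this record draws between a Gaussian likelihood and a generalized likelihood can be illustrated with a much-simplified version of the latter: a heteroscedastic error standard deviation plus lag-1 (AR(1)) decorrelation of the residuals. The full Schoups and Vrugt (2010) scheme also treats skew and kurtosis; normalization constants are omitted and the data are synthetic:

```python
import numpy as np

def log_lik_gaussian(resid):
    """Standard i.i.d. Gaussian likelihood with constant variance (constants omitted)."""
    s = resid.std()
    return float(np.sum(-0.5 * (resid / s) ** 2 - np.log(s)))

def log_lik_generalized(resid, sim, sigma0=0.1, sigma1=0.1, phi=0.8):
    """Simplified generalized likelihood: error sd grows with the simulated value
    (heteroscedasticity) and residuals are AR(1)-decorrelated before scoring."""
    sigma = sigma0 + sigma1 * np.abs(sim)            # heteroscedastic error sd
    eta = resid / sigma                              # standardized residuals
    innov = eta[1:] - phi * eta[:-1]                 # remove lag-1 correlation
    return float(np.sum(-0.5 * innov ** 2) - np.sum(np.log(sigma)))

rng = np.random.default_rng(3)
sim = np.linspace(1.0, 10.0, 200)
# synthetic residuals that genuinely are heteroscedastic and AR(1)-correlated
eta = np.zeros(200)
for t in range(1, 200):
    eta[t] = 0.8 * eta[t - 1] + rng.normal(0, 0.6)
resid = eta * (0.1 + 0.1 * sim)

ll_gauss = log_lik_gaussian(resid)
ll_gen = log_lik_generalized(resid, sim)
print(f"Gaussian: {ll_gauss:.1f}, generalized: {ll_gen:.1f}")
```

When the residuals really are heteroscedastic and correlated, the generalized form scores them far better, which is the record's rationale for abandoning the Gaussian assumption.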

  16. Uncertainty Quantification and Validation for RANS Turbulence Models

    Science.gov (United States)

    Oliver, Todd; Moser, Robert

    2011-11-01

    Uncertainty quantification and validation procedures for RANS turbulence models are developed and applied. The procedures used here rely on a Bayesian view of probability. In particular, the uncertainty quantification methodology requires stochastic model development, model calibration, and model comparison, all of which are pursued using tools from Bayesian statistics. Model validation is also pursued in a probabilistic framework. The ideas and processes are demonstrated on a channel flow example. Specifically, a set of RANS models--including Baldwin-Lomax, Spalart-Allmaras, k-ε, k-ω, and v2-f--and uncertainty representations are analyzed using DNS data for fully-developed channel flow. Predictions of various quantities of interest and the validity (or invalidity) of the various models for making those predictions will be examined. This work is supported by the Department of Energy [National Nuclear Security Administration] under Award Number [DE-FC52-08NA28615].

  17. Probabilistic Structural Evaluation of Uncertainties in Radiator Sandwich Panel Design

    Science.gov (United States)

    Kuguoglu, Latife; Ludwiczak, Damian

    2006-01-01

    The Jupiter Icy Moons Orbiter (JIMO) Space System is part of NASA's Prometheus Program. As part of the JIMO engineering team at NASA Glenn Research Center, the structural design of the JIMO Heat Rejection Subsystem (HRS) is evaluated. An initial goal of this study was to perform sensitivity analyses to determine the relative importance of the input variables on the structural responses of the radiator panel. The desire was to let the sensitivity analysis information identify the important parameters; the probabilistic analysis methods illustrated here support this objective. The probabilistic structural performance evaluation of an HRS radiator sandwich panel was performed. The radiator panel's structural performance was assessed in the presence of uncertainties in the loading, fabrication process variables, and material properties. The stress and displacement contours of the deterministic structural analysis at mean probability were computed and the results presented. This was followed by a probabilistic evaluation to determine the effect of the primitive variables on the radiator panel's structural performance. Based on uncertainties in material properties, structural geometry and loading, the results of the displacement and stress analysis were used as an input file for the probabilistic analysis of the panel. The sensitivity of the structural responses, such as maximum displacement, maximum tensile and compressive stresses of the facesheet in the x and y directions, and maximum von Mises stresses of the tube, to the loading and design variables was determined under the boundary condition where all edges of the radiator panel are pinned. Based on this study, design-critical material and geometric parameters of the considered sandwich panel are identified.

  18. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-08-01

    Model evaluation is often performed at few locations due to the lack of spatially distributed data. Since the quantification of model sensitivities and uncertainties can be performed independently from ground truth measurements, these analyses are suitable to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainties of a physically based mountain permafrost model are quantified within an artificial topography. The setting consists of different elevations and exposures combined with six ground types characterized by porosity and hydraulic properties. The analyses are performed for a combination of all factors, which allows for quantification of the variability of model sensitivities and uncertainties within a whole modeling domain. We found that model sensitivities and uncertainties vary strongly depending on different input factors such as topography or different soil types. The analysis shows that model evaluation performed at single locations may not be representative for the whole modeling domain. For example, the sensitivity of modeled mean annual ground temperature to ground albedo ranges between 0.5 and 4 °C depending on elevation, aspect and the ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to shorter duration of the snow cover. The sensitivity in the hydraulic properties changes considerably for different ground types: rock or clay, for instance, are not sensitive to uncertainties in the hydraulic properties, while for gravel or peat, accurate estimates of the hydraulic properties significantly improve modeled ground temperatures. The discretization of ground, snow and time have an impact on modeled mean annual ground temperature (MAGT) that cannot be neglected (more than 1 °C for several
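The "combination of all factors" design described in this record amounts to a factorial sensitivity sweep. The sketch below evaluates the sensitivity of a toy mean annual ground temperature function to an albedo change for every elevation/aspect combination; all coefficients (lapse rate, solar gain, snow-cover fraction) are invented for illustration:

```python
from itertools import product

def magt(elevation, aspect, albedo):
    """Toy mean annual ground temperature (deg C). Assumed behavior: air temperature
    lapses with elevation; solar gain is strongest on south slopes and acts only
    while the ground is snow-free (snow cover is shorter at lower elevations)."""
    t_air = 12.0 - 0.0065 * elevation
    snow_free = max(0.2, min(1.0, 1.2 - elevation / 4000.0))
    solar_gain = {"south": 3.0, "flat": 1.5, "north": 0.5}[aspect]
    return t_air + snow_free * solar_gain * (1.0 - albedo)

# sensitivity = MAGT response to lowering ground albedo from 0.4 to 0.2,
# evaluated for every elevation/aspect combination in the artificial topography
sensitivity = {}
for elev, asp in product([1500, 2500, 3500], ["north", "flat", "south"]):
    sensitivity[(elev, asp)] = round(magt(elev, asp, 0.2) - magt(elev, asp, 0.4), 3)

print(sensitivity)
```

Even this toy reproduces the record's qualitative findings: albedo sensitivity is largest on south-facing slopes and grows toward lower elevations, so a single-site evaluation would not represent the whole domain.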

  19. UNCERTAINTY SUPPLY CHAIN MODEL AND TRANSPORT IN ITS DEPLOYMENTS

    Directory of Open Access Journals (Sweden)

    Fabiana Lucena Oliveira

    2014-05-01

    This article discusses the Uncertainty Supply Chain Model and proposes a matrix matching transportation modes to the chains for which they are best suited. From a detailed analysis of the uncertainty matrix, the transportation modes best suited to the management of these chains are suggested, so that transport optimally supports the gains proposed by the original model, particularly when supply chains are distant from suppliers of raw materials and/or supplies. We analyze in detail Agile Supply Chains, which result from the Uncertainty Supply Chain Model, with special attention to the Manaus Industrial Center. This research was done at the Manaus Industrial Pole, a model of industrial agglomeration based in Manaus, State of Amazonas (Brazil), which comprises different supply chains and strategies sharing the same infrastructure for transport, handling, storage and clearance, and uses inbound logistics for suppliers of raw material. The state of the art covers supply chain management, the uncertainty supply chain model, agile supply chains, the Manaus Industrial Center (MIC) and Brazilian legislation, as a business case, and presents the concepts and features of each. The main goal is to present and discuss how transport is able to support the Uncertainty Supply Chain Model, in order to complete the management model. The results obtained confirm the hypothesis that integrated logistics processes are able to guarantee attractiveness for industrial agglomerations, and open discussion of cases where suppliers are far from the manufacturing center.

  20. Uncertainty quantification of squeal instability via surrogate modelling

    Science.gov (United States)

    Nobari, Amir; Ouyang, Huajiang; Bannister, Paul

    2015-08-01

    One of the major issues that car manufacturers are facing is the noise and vibration of brake systems. Of the different sorts of noise and vibration a brake system may generate, squeal, an irritating high-frequency noise, costs manufacturers significantly. Despite considerable research that has been conducted on brake squeal, the root cause of squeal is still not fully understood. The most common assumption, however, is mode-coupling. Complex eigenvalue analysis is the most widely used approach to the analysis of brake squeal problems. One of the major drawbacks of this technique, nevertheless, is that the effects of variability and uncertainty are not included in the results. Uncertainty and variability are two inseparable parts of any brake system. Uncertainty is mainly caused by friction, contact, wear and thermal effects, while variability mostly stems from the manufacturing process, material properties and component geometries. Evaluating the effects of uncertainty and variability in the complex eigenvalue analysis improves the predictability of noise propensity and helps produce a more robust design. The biggest hurdle in the uncertainty analysis of brake systems is the computational cost and time. Most uncertainty analysis techniques rely on the results of many deterministic analyses. A full finite element model of a brake system typically consists of millions of degrees-of-freedom and many load cases. The running time of such models is so long that the automotive industry is reluctant to do many deterministic analyses. This paper, instead, proposes an efficient method of uncertainty propagation via surrogate modelling. A surrogate model of a brake system is constructed in order to reproduce the outputs of the large-scale finite element model and overcome the issue of computational workloads. The probability distribution of the real part of an unstable mode can then be obtained by using the surrogate model with a massive saving of
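The three-step surrogate workflow in this record - a few expensive deterministic runs, a cheap surrogate fit, then Monte Carlo on the surrogate only - can be sketched as follows. A simple quadratic stands in for the large FE complex-eigenvalue model, and all numbers (design points, distribution of the friction coefficient) are invented:

```python
import numpy as np

def expensive_fe_model(mu):
    """Stand-in for the large FE complex-eigenvalue run: returns the real part
    of the least stable mode as a function of friction coefficient mu (made up)."""
    return 40.0 * (mu - 0.35) ** 2 + 12.0 * (mu - 0.35) - 1.0

# Step 1: a handful of deterministic "FE" runs at design points
mu_design = np.linspace(0.2, 0.6, 9)
y_design = expensive_fe_model(mu_design)

# Step 2: a cheap surrogate (quadratic fit) replaces the FE model
surrogate = np.poly1d(np.polyfit(mu_design, y_design, 2))

# Step 3: Monte Carlo on the surrogate only (100k samples, no FE cost)
rng = np.random.default_rng(7)
mu_samples = rng.normal(0.4, 0.05, 100000)
p_unstable = float(np.mean(surrogate(mu_samples) > 0.0))
print(f"P(real part > 0) ~ {p_unstable:.2f}")
```

A positive real part of the eigenvalue marks a potentially squealing (unstable) mode, so the Monte Carlo fraction above plays the role of a squeal-propensity estimate.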

  1. The cascade of uncertainty in modeling the impacts of climate change on Europe's forests

    Science.gov (United States)

    Reyer, Christopher; Lasch-Born, Petra; Suckow, Felicitas; Gutsch, Martin

    2015-04-01

    Projecting the impacts of global change on forest ecosystems is a cornerstone for designing sustainable forest management strategies and paramount for assessing the potential of Europe's forests to contribute to the EU bioeconomy. Research on climate change impacts on forests relies to a large extent on model applications along a model chain, from Integrated Assessment Models to General and Regional Circulation Models that provide important driving variables for forest models, and on to decision support systems that synthesize the findings of more detailed forest models to inform forest managers. At each step in the model chain, model-specific uncertainties about, amongst others, parameter values, input data or model structure accumulate, leading to a cascade of uncertainty. For example, climate change impacts on forests strongly depend on the in- or exclusion of CO2 effects, or on the use of an ensemble of climate models rather than reliance on one particular climate model. In the past, these uncertainties have not, or only partly, been considered in studies of climate change impacts on forests. This has left managers and decision-makers in doubt of how robust the projected impacts on forest ecosystems are. We deal with this cascade of uncertainty in a structured way, and the objective of this presentation is to assess how different types of uncertainties affect projections of the effects of climate change on forest ecosystems. To address this objective we synthesized a large body of scientific literature on modeled productivity changes and the effects of extreme events on plant processes. Furthermore, we apply the process-based forest growth model 4C to forest stands all over Europe and assess how different climate models, emission scenarios and assumptions about the parameters and structure of 4C affect the uncertainty of the model projections. We show that there are consistent regional changes in forest productivity such as an increase in NPP in cold and wet regions while

  2. Impact of inherent meteorology uncertainty on air quality model predictions

    Science.gov (United States)

    It is well established that there are a number of different classifications and sources of uncertainties in environmental modeling systems. Air quality models rely on two key inputs, namely, meteorology and emissions. When using air quality models for decision making, it is impor...

  3. Quantification of Modelling Uncertainties in Turbulent Flow Simulations

    NARCIS (Netherlands)

    Edeling, W.N.

    2015-01-01

    The goal of this thesis is to make predictive simulations with Reynolds-Averaged Navier-Stokes (RANS) turbulence models, i.e. simulations with a systematic treatment of model and data uncertainties and their propagation through a computational model to produce predictions of quantities of interest

  5. Uncertainty quantification in Rothermel's Model using an efficient sampling method

    Science.gov (United States)

    Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick

    2007-01-01

    The purpose of the present work is to quantify parametric uncertainty in Rothermel’s wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...

  6. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Science.gov (United States)

    Sikorska, A. E.; Scheidegger, A.; Banasik, K.; Rieckermann, J.

    2012-04-01

    Urbanization and the resulting land-use change strongly affect the water cycle and runoff-processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute most to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
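    The autoregressive error description used above can be illustrated with a discretized AR(1) sketch. The function name, parameter values and synthetic residuals below are hypothetical stand-ins, not the authors' implementation of the continuous-time model.

```python
import numpy as np

def ar1_loglik(residuals, phi, sigma):
    """Gaussian log-likelihood of a residual series under a stationary AR(1)
    error model e_t = phi * e_{t-1} + innovation (conditional on e_0)."""
    innov = residuals[1:] - phi * residuals[:-1]
    var = sigma ** 2 * (1.0 - phi ** 2)      # innovation variance
    n = innov.size
    return -0.5 * n * np.log(2.0 * np.pi * var) - 0.5 * np.sum(innov ** 2) / var

# Synthetic autocorrelated "model residuals"
rng = np.random.default_rng(1)
e = np.zeros(500)
for t in range(1, 500):
    e[t] = 0.8 * e[t - 1] + rng.normal(0.0, 0.5)

# The AR(1) description fits correlated residuals far better than the
# white-noise assumption of traditional uncertainty analysis
print(ar1_loglik(e, 0.8, e.std()), ar1_loglik(e, 0.0, e.std()))
```

    Setting `phi = 0` recovers the white-measurement-noise assumption, which is why an explicit autoregressive term is needed before prediction intervals can be statistically reliable.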

  7. Modelling the Epistemic Uncertainty in the Vulnerability Assessment Component of an Earthquake Loss Model

    Science.gov (United States)

    Crowley, H.; Modica, A.

    2009-04-01

    Loss estimates have been shown in various studies to be highly sensitive to the methodology employed, the seismicity and ground-motion models, the vulnerability functions, and assumed replacement costs (e.g. Crowley et al., 2005; Molina and Lindholm, 2005; Grossi, 2000). It is clear that future loss models should explicitly account for these epistemic uncertainties. Indeed, a cause of frequent concern in the insurance and reinsurance industries is precisely the fact that for certain regions and perils, available commercial catastrophe models often yield significantly different loss estimates. Of equal relevance to many users is the fact that updates of the models sometimes lead to very significant changes in the losses compared to the previous version of the software. In order to model the epistemic uncertainties that are inherent in loss models, a number of different approaches for the hazard, vulnerability, exposure and loss components should be clearly and transparently applied, with the shortcomings and benefits of each method clearly exposed by the developers, such that the end-users can begin to compare the results and the uncertainty in these results from different models. This paper looks at an application of a logic-tree type methodology to model the epistemic uncertainty in the vulnerability component of a loss model for Tunisia. Unlike other countries which have been subjected to damaging earthquakes, there has not been a significant effort to undertake vulnerability studies for the building stock in Tunisia. Hence, when presented with the need to produce a loss model for a country like Tunisia, a number of different approaches can and should be applied to model the vulnerability. These include empirical procedures which utilise observed damage data, and mechanics-based methods where both the structural characteristics and response of the buildings are analytically modelled. Some preliminary applications of the methodology are presented and discussed
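    The logic-tree idea can be sketched as a weighted combination of vulnerability branches. The branch names, weights and damage ratios below are purely illustrative, not values from the Tunisia model.

```python
# Two hypothetical vulnerability branches for the same hazard level: an
# empirical (observed-damage) model and a mechanics-based analytical model.
# Branch weights encode relative credibility and must sum to one.
branches = [
    {"method": "empirical",  "weight": 0.4, "mean_damage_ratio": 0.25},
    {"method": "analytical", "weight": 0.6, "mean_damage_ratio": 0.15},
]
assert abs(sum(b["weight"] for b in branches) - 1.0) < 1e-9

# Weighted best estimate; the branch-to-branch spread is one simple
# expression of the epistemic uncertainty the logic tree carries
best = sum(b["weight"] * b["mean_damage_ratio"] for b in branches)
spread = (max(b["mean_damage_ratio"] for b in branches)
          - min(b["mean_damage_ratio"] for b in branches))
print(f"best estimate {best:.3f}, epistemic spread {spread:.3f}")
```

    Reporting the spread alongside the weighted mean is what lets end-users compare results, and the uncertainty in those results, across competing models.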

  8. Immersive Data Comprehension: Visualizing Uncertainty in Measurable Models

    Directory of Open Access Journals (Sweden)

    Pere Brunet

    2015-09-01

    Full Text Available Recent advances in 3D scanning technologies have opened new possibilities in a broad range of applications including cultural heritage, medicine, civil engineering and urban planning. Virtual Reality systems can provide new tools to professionals who want to understand acquired 3D models. In this paper, we review the concept of data comprehension with an emphasis on visualization and inspection tools on immersive setups. We claim that in most application fields, data comprehension requires model measurements which in turn should be based on the explicit visualization of uncertainty. As 3D digital representations are not faithful, information on their fidelity at local level should be included in the model itself as uncertainty bounds. We propose the concept of Measurable 3D Models as digital models that explicitly encode local uncertainty bounds related to their quality. We claim that professionals and experts can strongly benefit from immersive interaction through new specific, fidelity-aware measurement tools which can facilitate 3D data comprehension. Since noise and processing errors are ubiquitous in acquired datasets, we discuss the estimation, representation and visualization of data uncertainty. We show that, based on typical user requirements in Cultural Heritage and other domains, application-oriented measuring tools in 3D models must consider uncertainty and local error bounds. We also discuss the requirements of immersive interaction tools for the comprehension of huge 3D and nD datasets acquired from real objects.

  9. Numerical daemons in hydrological modeling: Effects on uncertainty assessment, sensitivity analysis and model predictions

    Science.gov (United States)

    Kavetski, D.; Clark, M. P.; Fenicia, F.

    2011-12-01

    Hydrologists often face sources of uncertainty that dwarf those normally encountered in many engineering and scientific disciplines. Especially when representing large scale integrated systems, internal heterogeneities such as stream networks, preferential flowpaths, vegetation, etc., are necessarily represented with a considerable degree of lumping. The inputs to these models are themselves often the products of sparse observational networks. Given the simplifications inherent in environmental models, especially lumped conceptual models, does it really matter how they are implemented? At the same time, given the complexities usually found in the response surfaces of hydrological models, increasingly sophisticated analysis methodologies are being proposed for sensitivity analysis, parameter calibration and uncertainty assessment. Quite remarkably, rather than being caused by the model structure/equations themselves, in many cases model analysis complexities are consequences of seemingly trivial aspects of the model implementation - often, literally, whether the start-of-step or end-of-step fluxes are used! The extent of problems can be staggering, including (i) degraded performance of parameter optimization and uncertainty analysis algorithms, (ii) erroneous and/or misleading conclusions of sensitivity analysis, parameter inference and model interpretations and, finally, (iii) poor reliability of a calibrated model in predictive applications. While the often nontrivial behavior of numerical approximations has long been recognized in applied mathematics and in physically-oriented fields of environmental sciences, it remains a problematic issue in many environmental modeling applications. Perhaps detailed attention to numerics is only warranted for complicated engineering models? Would not numerical errors be an insignificant component of total uncertainty when typical data and model approximations are present? Is this really a serious issue beyond some rare isolated
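    The start-of-step versus end-of-step flux issue is easy to reproduce on a one-parameter linear reservoir. The parameter values below are arbitrary, chosen only to exaggerate the discrepancy at a coarse time step.

```python
import math

def simulate(k=0.5, s0=100.0, dt=1.0, n=10, scheme="explicit"):
    """Drain a linear reservoir dS/dt = -k*S, evaluating the outflow flux
    from start-of-step (explicit Euler) or end-of-step (implicit Euler)
    storage."""
    s = s0
    for _ in range(n):
        if scheme == "explicit":
            s -= k * s * dt          # flux uses start-of-step storage
        else:
            s /= 1.0 + k * dt        # flux uses end-of-step storage
    return s

exact = 100.0 * math.exp(-0.5 * 10)  # analytical solution after 10 steps
print(simulate(scheme="explicit"), exact, simulate(scheme="implicit"))
```

    With this step size the two seemingly trivial implementation choices bracket the analytical solution and differ from each other by more than an order of magnitude, which is exactly the kind of implementation artifact the abstract warns about.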

  10. Effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model output

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    This study analyses the effect of precipitation spatial distribution uncertainty on the uncertainty bounds of a snowmelt runoff model's discharge estimates. Prediction uncertainty bounds are derived using the Generalized Likelihood Uncertainty Estimation (GLUE) methodology. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment glaciers. Precipitation amounts at each elevation zone i are estimated as the product between observed precipitation (at a single station within the catchment) and a precipitation factor FPi. Thus, these factors provide a simplified representation of the spatial variation of precipitation, specifically the shape of the functional relationship between precipitation and height. In the absence of information about appropriate values of the precipitation factors FPi, these are estimated through standard calibration procedures. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. Monte Carlo samples of the model output are obtained by randomly varying the model parameters within their feasible ranges. In the first experiment, the precipitation factors FPi are considered unknown and thus included in the sampling process. The total number of unknown parameters in this case is 16. In the second experiment, precipitation factors FPi are estimated a priori, by means of a long term water balance between observed discharge at the catchment outlet, evapotranspiration estimates and observed precipitation. In this case, the number of unknown parameters reduces to 11. The feasible ranges assigned to the precipitation factors in the first experiment are slightly wider than the range of fixed precipitation factors used in the second experiment. The mean squared error of the Box-Cox transformed discharge during the calibration period is used for the evaluation of the
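    The GLUE workflow described above can be sketched on a toy model; the linear model, parameter ranges, behavioural threshold and likelihood measure below are illustrative stand-ins, not the snowmelt model or the Box-Cox-based criterion of the study.

```python
import numpy as np

# Toy stand-in for the watershed model: observations from y = 2x + 1 + noise,
# with unknown parameters theta = (a, b)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)
obs = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)

# GLUE step 1: uniform Monte Carlo sampling over feasible parameter ranges
samples = rng.uniform([0.0, 0.0], [4.0, 2.0], size=(5000, 2))
sims = samples[:, 0:1] * x + samples[:, 1:2]          # (5000, 20) outputs

# GLUE step 2: informal likelihood and behavioural threshold (best 10% by MSE)
mse = ((sims - obs) ** 2).mean(axis=1)
behavioural = mse < np.quantile(mse, 0.1)

# GLUE step 3: prediction uncertainty bounds from behavioural simulations
lower = np.percentile(sims[behavioural], 5, axis=0)
upper = np.percentile(sims[behavioural], 95, axis=0)
coverage = np.mean((obs >= lower) & (obs <= upper))
print(f"{behavioural.sum()} behavioural sets, coverage {coverage:.2f}")
```

    Fixing some parameters a priori, as in the second experiment of the study, simply shrinks the sampled parameter space before step 1.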

  11. Uncertainty Consideration in Watershed Scale Models

    Science.gov (United States)

    Watershed scale hydrologic and water quality models have been used with increasing frequency to devise alternative pollution control strategies. With recent reenactment of the 1972 Clean Water Act’s TMDL (total maximum daily load) component, some of the watershed scale models are being recommended ...

  12. Multiphysics modeling and uncertainty quantification for an active composite reflector

    Science.gov (United States)

    Peterson, Lee D.; Bradford, S. C.; Schiermeier, John E.; Agnes, Gregory S.; Basinger, Scott A.

    2013-09-01

    A multiphysics, high resolution simulation of an actively controlled, composite reflector panel is developed to extrapolate from ground test results to flight performance. The subject test article has previously demonstrated sub-micron corrected shape in a controlled laboratory thermal load. This paper develops a model of the on-orbit performance of the panel under realistic thermal loads, with an active heater control system, and performs an uncertainty quantification of the predicted response. The primary contribution of this paper is the first reported application of the Sandia developed Sierra mechanics simulation tools to a spacecraft multiphysics simulation of a closed-loop system, including uncertainty quantification. The simulation was developed so as to have sufficient resolution to capture the residual panel shape error that remains after the thermal and mechanical control loops are closed. An uncertainty quantification analysis was performed to assess the predicted tolerance in the closed-loop wavefront error. Key tools used for the uncertainty quantification are also described.

  13. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insights into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.

  14. On the Impact of Uncertainty in Initial Conditions of Hydrologic Models on Prediction

    Science.gov (United States)

    Razavi, S.; Sheikholeslami, R.

    2015-12-01

    Determining the initial conditions for predictive models remains a challenge due to the uncertainty in measurement/identification of the state variables at the scale of interest. However, the characterization of uncertainty in initial conditions has arguably attracted less attention compared with other sources of uncertainty in hydrologic modelling (e.g., parameter, data, and structural uncertainty). This is perhaps because it is commonly believed that: (1) hydrologic systems (relatively rapidly) forget their initial conditions over time, and (2) other sources of uncertainty (e.g., in data) are dominant. This presentation revisits the basic principles of the theory of nonlinear dynamical systems in the context of hydrologic systems. Through simple example case studies, we demonstrate how and under what circumstances different hydrologic processes represent a range of attracting limit sets in their evolution trajectory in state space over time, including fixed points, limit cycles (periodic behaviour), torus (quasi-periodic behaviour), and strange attractors (chaotic behaviour). Furthermore, the propagation (or dissipation) of uncertainty in initial conditions of several hydrologic models through time, under any of the possible attracting limit sets, is investigated. This study highlights that there are definite situations in hydrology where uncertainty in initial conditions remains of significance. The results and insights gained have important implications for hydrologic modelling under non-stationarity in climate and environment.
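    The dependence on attractor type can be seen in a few lines using the logistic map, a generic toy dynamical system (not one of the hydrologic models discussed): near a fixed point the system forgets its initial condition, while on a strange attractor the same perturbation is amplified.

```python
def iterate(x0, r, n=50):
    """Iterate the logistic map x <- r*x*(1-x) n times from x0."""
    x = x0
    for _ in range(n):
        x = r * x * (1.0 - x)
    return x

# Fixed-point regime (r = 2.5): two nearby initial states converge
a, b = iterate(0.20, 2.5), iterate(0.21, 2.5)

# Chaotic regime (r = 3.9): the same 0.01 perturbation grows
c, d = iterate(0.20, 3.9), iterate(0.21, 3.9)
print(abs(a - b), abs(c - d))
```

    In the fixed-point regime the initial 0.01 offset decays geometrically toward zero, whereas in the chaotic regime it grows until it saturates at the size of the attractor, which is exactly why uncertainty in initial conditions can remain significant.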

  15. An Algebraic Graphical Model for Decision with Uncertainties, Feasibilities, and Utilities

    CERN Document Server

    Pralet, C.; Verfaillie, G.; DOI: 10.1613/jair.2151

    2011-01-01

    Numerous formalisms and dedicated algorithms have been designed in the last decades to model and solve decision making problems. Some formalisms, such as constraint networks, can express "simple" decision problems, while others are designed to take into account uncertainties, unfeasible decisions, and utilities. Even in a single formalism, several variants are often proposed to model different types of uncertainty (probability, possibility...) or utility (additive or not). In this article, we introduce an algebraic graphical model that encompasses a large number of such formalisms: (1) we first adapt previous structures from Friedman, Chu and Halpern for representing uncertainty, utility, and expected utility in order to deal with generic forms of sequential decision making; (2) on these structures, we then introduce composite graphical models that express information via variables linked by "local" functions, thanks to conditional independence; (3) on these graphical models, we finally define a simple class ...

  16. Integration of inaccurate data into model building and uncertainty assessment

    Energy Technology Data Exchange (ETDEWEB)

    Coleou, Thierry

    1998-12-31

    Model building can be seen as integrating numerous measurements and mapping through data points considered as exact. As the exact data set is usually sparse, using additional non-exact data improves the modelling and reduces the uncertainties. Several examples of non-exact data are discussed and a methodology to honor them in a single pass, along with the exact data is presented. This automatic procedure is valid for both "base case" model building and stochastic simulations for uncertainty analysis. 5 refs., 3 figs.

  17. Enhancing uncertainty tolerance in modelling creep of ligaments.

    Science.gov (United States)

    Reda Taha, M M; Lucero, J

    2006-09-01

    The difficulty in performing biomechanical tests and the scarcity of biomechanical experimental databases necessitate extending the current knowledge base to allow efficient modelling using limited data sets. This study suggests a framework to reduce uncertainties in biomechanical systems using limited data sets. The study also shows how sparse data and epistemic input can be exploited using fuzzy logic to represent biomechanical relations. An example application to model collagen fibre recruitment in the medial collateral ligaments during time-dependent deformation under cyclic loading (creep) is presented. The study suggests a quality metric that can be employed to observe and enhance uncertainty tolerance in the modelling process.

  18. Spatial uncertainty assessment in modelling reference evapotranspiration at regional scale

    Directory of Open Access Journals (Sweden)

    G. Buttafuoco

    2010-07-01

    Full Text Available Evapotranspiration is one of the major components of the water balance and has been identified as a key factor in hydrological modelling. For this reason, several methods have been developed to calculate the reference evapotranspiration (ET0). In modelling reference evapotranspiration it is inevitable that both model and data input will present some uncertainty. Whatever model is used, the errors in the input will propagate to the output of the calculated ET0. Neglecting information about estimation uncertainty, however, may lead to improper decision-making and water resources management. One geostatistical approach to spatial analysis is stochastic simulation, which draws alternative, equally probable realizations of a regionalized variable. Differences between the realizations provide a measure of spatial uncertainty and allow an error propagation analysis to be carried out. Among the evapotranspiration models, the Hargreaves-Samani model was used.

    The aim of this paper was to assess spatial uncertainty of a monthly reference evapotranspiration model resulting from the uncertainties in the input attributes (mainly temperature) at regional scale. A case study was presented for Calabria region (southern Italy). Temperature data were jointly simulated by conditional turning bands simulation with elevation as external drift and 500 realizations were generated.

    The ET0 was then estimated for each set of the 500 realizations of the input variables, and the ensemble of the model outputs was used to infer the reference evapotranspiration probability distribution function. This approach allowed delineation of the areas characterized by greater uncertainty, and improvement of supplementary sampling strategies and ET0 value predictions.
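    The propagation step can be sketched as follows. The Hargreaves-Samani form is standard, but the temperature statistics and radiation value below are placeholders (not Calabrian data), and plain random draws stand in for the 500 conditional turning-bands realizations.

```python
import numpy as np

def hargreaves_samani(tmax, tmin, ra):
    """Daily reference evapotranspiration ET0 (mm/day) from the
    Hargreaves-Samani equation; ra is extraterrestrial radiation expressed
    as its evaporation equivalent (mm/day)."""
    tmean = 0.5 * (tmax + tmin)
    return 0.0023 * ra * (tmean + 17.8) * np.sqrt(tmax - tmin)

# Evaluate ET0 on an ensemble of 500 temperature realizations and summarise
# the resulting output distribution
rng = np.random.default_rng(42)
tmax = rng.normal(28.0, 1.5, 500)   # hypothetical summer Tmax realizations
tmin = rng.normal(16.0, 1.5, 500)
et0 = hargreaves_samani(tmax, tmin, ra=16.0)
print(f"ET0 mean {et0.mean():.2f} mm/day, 90% interval "
      f"[{np.percentile(et0, 5):.2f}, {np.percentile(et0, 95):.2f}] mm/day")
```

    The spread of the ensemble output, rather than a single ET0 value, is what delineates the areas of greater uncertainty.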

  19. Impact of rainfall temporal resolution on urban water quality modelling performance and uncertainties.

    Science.gov (United States)

    Manz, Bastian Johann; Rodríguez, Juan Pablo; Maksimović, Cedo; McIntyre, Neil

    2013-01-01

    A key control on the response of an urban drainage model is how well the observed rainfall records represent the real rainfall variability. Particularly in urban catchments with fast response flow regimes, the selection of temporal resolution in rainfall data collection is critical. Furthermore, the impact of the rainfall variability on the model response is amplified for water quality estimates, as uncertainty in rainfall intensity affects both the rainfall-runoff and pollutant wash-off sub-models, thus compounding uncertainties. A modelling study was designed to investigate the impact of altering rainfall temporal resolution on the magnitude and behaviour of uncertainties associated with the hydrological modelling compared with water quality modelling. The case study was an 85-ha combined sewer sub-catchment in Bogotá (Colombia). Water quality estimates showed greater sensitivity to the inter-event variability in rainfall hyetograph characteristics than to changes in the rainfall input temporal resolution. Overall, uncertainties from the water quality model were two- to five-fold those of the hydrological model. However, owing to the intrinsic scarcity of observations in urban water quality modelling, total model output uncertainties, especially from the water quality model, were too large to make recommendations for particular model structures or parameter values with respect to rainfall temporal resolution.

  20. Improving uncertainty estimation in urban hydrological modeling by statistically describing bias

    Directory of Open Access Journals (Sweden)

    D. Del Giudice

    2013-10-01

    Full Text Available Hydrodynamic models are useful tools for urban water management. Unfortunately, it is still challenging to obtain accurate results and plausible uncertainty estimates when using these models. In particular, with the currently applied statistical techniques, flow predictions are usually overconfident and biased. In this study, we present a flexible and relatively efficient methodology (i) to obtain more reliable hydrological simulations in terms of coverage of validation data by the uncertainty bands and (ii) to separate prediction uncertainty into its components. Our approach acknowledges that urban drainage predictions are biased. This is mostly due to input errors and structural deficits of the model. We address this issue by describing model bias in a Bayesian framework. The bias becomes an autoregressive term additional to white measurement noise, the only error type accounted for in traditional uncertainty analysis. To allow for bigger discrepancies during wet weather, we make the variance of bias dependent on the input (rainfall) and/or output (runoff) of the system. Specifically, we present a structured approach to select, among five variants, the optimal bias description for a given urban or natural case study. We tested the methodology in a small monitored stormwater system described with a parsimonious model. Our results clearly show that flow simulations are much more reliable when bias is accounted for than when it is neglected. Furthermore, our probabilistic predictions can discriminate between three uncertainty contributions: parametric uncertainty, bias, and measurement errors. In our case study, the best performing bias description is the output-dependent bias using a log-sinh transformation of data and model results. The limitations of the framework presented are some ambiguity due to the subjective choice of priors for bias parameters and its inability to address the causes of model discrepancies.
Further research should focus on
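    The log-sinh transformation named as the best-performing variant has a standard closed form with an exact inverse; the parameter values below are arbitrary, not the ones calibrated in the study.

```python
import numpy as np

def log_sinh(q, a, b):
    """Log-sinh transform z = log(sinh(a + b*q)) / b. Approximately
    logarithmic for small q and linear for large q, which stabilises the
    variance of heteroscedastic flow errors."""
    return np.log(np.sinh(a + b * q)) / b

def inv_log_sinh(z, a, b):
    """Exact inverse of log_sinh."""
    return (np.arcsinh(np.exp(b * z)) - a) / b

q = np.array([0.1, 1.0, 10.0, 100.0])   # arbitrary flow values
z = log_sinh(q, a=0.1, b=0.05)
print(z, inv_log_sinh(z, a=0.1, b=0.05))
```

    Fitting the error model in transformed space and back-transforming the prediction bands is what lets the bands widen during wet weather without violating the Gaussian assumptions.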

  1. Assessing the relative importance of parameter and forcing uncertainty and their interactions in conceptual hydrological model simulations

    Science.gov (United States)

    Mockler, E. M.; Chun, K. P.; Sapriza-Azuri, G.; Bruen, M.; Wheater, H. S.

    2016-11-01

    Predictions of river flow dynamics provide vital information for many aspects of water management including water resource planning, climate adaptation, and flood and drought assessments. Many of the subjective choices that modellers make including model and criteria selection can have a significant impact on the magnitude and distribution of the output uncertainty. Hydrological modellers are tasked with understanding and minimising the uncertainty surrounding streamflow predictions before communicating the overall uncertainty to decision makers. Parameter uncertainty in conceptual rainfall-runoff models has been widely investigated, and model structural uncertainty and forcing data have been receiving increasing attention. This study aimed to assess uncertainties in streamflow predictions due to forcing data and the identification of behavioural parameter sets in 31 Irish catchments. By combining stochastic rainfall ensembles and multiple parameter sets for three conceptual rainfall-runoff models, an analysis of variance model was used to decompose the total uncertainty in streamflow simulations into contributions from (i) forcing data, (ii) identification of model parameters and (iii) interactions between the two. The analysis illustrates that, for our subjective choices, hydrological model selection had a greater contribution to overall uncertainty, while performance criteria selection influenced the relative intra-annual uncertainties in streamflow predictions. Uncertainties in streamflow predictions due to the method of determining parameters were relatively lower for wetter catchments, and more evenly distributed throughout the year when the Nash-Sutcliffe Efficiency of logarithmic values of flow (lnNSE) was the evaluation criterion.
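    The variance decomposition step can be sketched with a two-way ANOVA on a synthetic ensemble; the ensemble sizes and effect magnitudes below are invented for illustration, not taken from the 31 Irish catchments.

```python
import numpy as np

# Hypothetical ensemble: 10 rainfall realisations x 8 behavioural parameter
# sets, one annual streamflow value per combination
rng = np.random.default_rng(7)
n_f, n_p = 10, 8
forcing_eff = rng.normal(0.0, 2.0, n_f)[:, None]
param_eff = rng.normal(0.0, 1.0, n_p)[None, :]
q = 100.0 + forcing_eff + param_eff + rng.normal(0.0, 0.3, (n_f, n_p))

# Two-way ANOVA: split total sum of squares into main effects + interaction
grand = q.mean()
ss_f = n_p * ((q.mean(axis=1) - grand) ** 2).sum()   # forcing main effect
ss_p = n_f * ((q.mean(axis=0) - grand) ** 2).sum()   # parameter main effect
ss_t = ((q - grand) ** 2).sum()
ss_i = ss_t - ss_f - ss_p                            # interaction/residual
for name, ss in [("forcing", ss_f), ("parameters", ss_p),
                 ("interaction", ss_i)]:
    print(f"{name}: {100.0 * ss / ss_t:.1f}% of total variance")
```

    The three shares correspond to contributions (i)-(iii) in the abstract; repeating the decomposition per month would expose the intra-annual pattern the authors report.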

  2. Land cover uncertainty generates substantial uncertainty in earth system model carbon and climate projections

    Science.gov (United States)

    Di Vittorio, Alan; Mao, Jiafu; Shi, Xiaoying

    2017-04-01

    Several climate adaptation and mitigation strategies incorporate Land Use and Land Cover Change (LULCC) to address global carbon balance and climate. However, LULCC is not consistent across the CMIP5 model simulations because only the land use input is harmonized. The associated LULCC uncertainty generates uncertainty in regional and global carbon and climate dynamics that obfuscates the evaluation of whether such strategies are effective in meeting their goals. For example, the integrated Earth System Model (iESM) overestimates 2004 atmospheric CO2 concentration by 14 ppmv, and we explore the contribution of historical LULCC uncertainty to this bias in relation to the effects of CO2 fertilization, climate change, and nitrogen deposition on terrestrial carbon. Using identical land use input, a chronologically referenced LULCC that accounts for pasture, as opposed to the default year-2000 referenced LULCC, increases this bias to 20 ppmv because more forest needs to be cleared for land use. Assuming maximum forest retention for all land conversion reduces the new bias to 19 ppmv, while minimum forest retention increases the new bias to 24 ppmv. There is a 33 Pg land carbon uncertainty range due to maximizing versus minimizing forest area, which is 80% of the estimated 41 PgC gain in land carbon due to CO2 fertilization combined with climate change from 1850-2004 and 150% of the estimated 22 PgC gain due to nitrogen deposition. These results demonstrate that LULCC accuracy and uncertainty are critical for estimating the carbon cycle, and also that LULCC may be an important lever for constraining global carbon estimates. Furthermore, different land conversion assumptions can generate local differences of over 1.0 °C between the two forest retention cases with less than 5% difference in tree cover within a grid cell. Whether these temperature differences are positive or negative depends more on region than on latitude. Sensible heat appears to be more sensitive than

  3. Estimation and uncertainty of reversible Markov models

    CERN Document Server

    Trendelkamp-Schroer, Benjamin; Paul, Fabian; Noé, Frank

    2015-01-01

    Reversibility is a key concept in the theory of Markov models, simplified kinetic models for the conformation dynamics of molecules. The analysis and interpretation of the transition matrix encoding the kinetic properties of the model relies heavily on the reversibility property. The estimation of a reversible transition matrix from simulation data is therefore crucial to the successful application of the previously developed theory. In this work we discuss methods for the maximum likelihood estimation of transition matrices from finite simulation data and present a new algorithm for the estimation when reversibility with respect to a given stationary vector is desired. We also develop new methods for the Bayesian posterior inference of reversible transition matrices with and without a given stationary vector, taking into account the need for a suitable prior distribution preserving the meta-stable features of the observed process during posterior inference.
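    A minimal reversible estimator, obtained by symmetrizing the transition-count matrix, illustrates the detailed-balance property; this is a simple consistent scheme, not the maximum-likelihood or Bayesian algorithms developed in the paper, and the count matrix is hypothetical.

```python
import numpy as np

def reversible_estimate(counts):
    """Reversible transition-matrix estimate from symmetrized counts.
    Detailed balance pi_i * T_ij == pi_j * T_ji holds by construction."""
    s = counts + counts.T
    T = s / s.sum(axis=1, keepdims=True)
    pi = s.sum(axis=1) / s.sum()          # stationary distribution
    return T, pi

# Hypothetical transition counts between three conformational states
C = np.array([[90.0, 10.0, 0.0],
              [5.0, 80.0, 15.0],
              [0.0, 20.0, 70.0]])
T, pi = reversible_estimate(C)
flux = pi[:, None] * T                    # detailed-balance flux matrix
print(np.allclose(flux, flux.T))
```

    The symmetry of the flux matrix is exactly the reversibility property that the spectral analysis and interpretation of metastable states rely on.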

  4. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, suc...

  5. Model uncertainty and systematic risk in US banking

    NARCIS (Netherlands)

    Baele, L.T.M.; De Bruyckere, Valerie; De Jonghe, O.G.; Vander Vennet, Rudi

    2015-01-01

    This paper uses Bayesian Model Averaging (BMA) to examine the driving factors of equity returns of US Bank Holding Companies. An advantage of BMA over OLS is that it accounts for the considerable uncertainty about the correct set (model) of bank risk factors. We find that out of a broad set of 12 risk fa
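    The core of BMA can be sketched with BIC-based posterior model weights, a common approximation; the data and factor names below are synthetic, not the paper's bank data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic returns driven by factor x1 only; x2 is a spurious candidate.
n = 500
x1, x2 = rng.normal(size=(2, n))
y = 1.5 * x1 + rng.normal(scale=0.5, size=n)

def bic(X, y):
    """BIC of an OLS fit under a Gaussian likelihood."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return n * np.log(resid @ resid / n) + X.shape[1] * np.log(n)

# Candidate factor sets, as in BMA over models of risk factors.
models = {"x1": np.column_stack([x1]),
          "x1+x2": np.column_stack([x1, x2])}
b = np.array([bic(X, y) for X in models.values()])

# Posterior model probabilities ~ exp(-BIC/2), assuming equal model priors.
w = np.exp(-(b - b.min()) / 2)
w /= w.sum()
print(dict(zip(models, np.round(w, 3))))
```

    Predictions are then averaged over models with these weights, rather than conditioning on a single selected model.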

  6. River meander modeling and confronting uncertainty.

    Energy Technology Data Exchange (ETDEWEB)

    Posner, Ari J. (University of Arizona Tucson, AZ)

    2011-05-01

    This study examines the meandering phenomenon as it occurs in media throughout terrestrial, glacial, atmospheric, and aquatic environments. Analysis of the minimum energy principle, along with theories of Coriolis forces (and random walks to explain the meandering phenomenon), found that these theories apply at different temporal and spatial scales. Coriolis forces might induce topological changes resulting in meandering planforms. The minimum energy principle might explain how these forces combine to limit the sinuosity to depth and width ratios that are common throughout various media. The study then compares the first-order analytical solutions for the flow field by Ikeda et al. (1981) and Johannesson and Parker (1989b). The linear bank erosion model of Ikeda et al. was implemented to predict the rate of bank erosion, in which the bank erosion coefficient is treated as a stochastic variable that varies with physical properties of the bank (e.g., cohesiveness, stratigraphy, or vegetation density). The developed model was used to predict the evolution of meandering planforms. The modeling results were then analyzed and compared to the observed data. Since the migration of a meandering channel consists of downstream translation, lateral expansion, and downstream or upstream rotations, several measures are formulated to determine which of the resulting planforms is closest to the experimentally measured one. Results from the deterministic model depend strongly on the calibrated erosion coefficient. Since field measurements are always limited, the stochastic model yielded more realistic predictions of meandering planform evolution. Due to the random nature of the bank erosion coefficient, the meandering planform evolution is a stochastic process that can only be accurately predicted by a stochastic model.

  7. Uncertainty Visualization in Forward and Inverse Cardiac Models.

    Science.gov (United States)

    Burton, Brett M; Erem, Burak; Potter, Kristin; Rosen, Paul; Johnson, Chris R; Brooks, Dana H; Macleod, Rob S

    2013-01-01

    Quantification and visualization of uncertainty in cardiac forward and inverse problems with complex geometries is subject to various challenges. Specific to visualization is the observation that occlusion and clutter obscure important regions of interest, making visual assessment difficult. In order to overcome these limitations in uncertainty visualization, we have developed and implemented a collection of novel approaches. To highlight the utility of these techniques, we evaluated the uncertainty associated with two examples of modeling myocardial activity. In one case we studied cardiac potentials during the repolarization phase as a function of variability in tissue conductivities of the ischemic heart (forward case). In a second case, we evaluated uncertainty in reconstructed activation times on the epicardium resulting from variation in the control parameter of Tikhonov regularization (inverse case). To overcome difficulties associated with uncertainty visualization, we applied linked-view windows and interactive animation to the two respective cases. Through dimensionality reduction and superimposed mean and standard deviation measures over time, we were able to display key features in large ensembles of data and highlight regions of interest where larger uncertainties exist.

  8. Estimation of a multivariate mean under model selection uncertainty

    Directory of Open Access Journals (Sweden)

    Georges Nguefack-Tsague

    2014-05-01

    Full Text Available Model selection uncertainty would occur if we selected a model based on one data set and subsequently applied it for statistical inferences, because the "correct" model would not be selected with certainty. When the selection and inference are based on the same dataset, some additional problems arise due to the correlation of the two stages (selection and inference). In this paper model selection uncertainty is considered and model averaging is proposed. The proposal is related to the theory of James and Stein for estimating more than three parameters from independent normal observations. We suggest that a model averaging scheme taking into account the selection procedure could be more appropriate than model selection alone. Some properties of this model averaging estimator are investigated; in particular, we show using Stein's results that it is a minimax estimator and can outperform Stein-type estimators.
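    The James-Stein connection can be demonstrated with a short simulation of the positive-part James-Stein estimator (the zero true mean and all parameter values are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
p, sigma, trials = 10, 1.0, 2000
theta = np.zeros(p)   # true mean vector (illustrative choice)

mse_mle = mse_js = 0.0
for _ in range(trials):
    x = rng.normal(theta, sigma)   # one noisy observation per coordinate
    # Positive-part James-Stein shrinkage towards the origin.
    shrink = max(0.0, 1.0 - (p - 2) * sigma**2 / (x @ x))
    mse_mle += np.sum((x - theta) ** 2)
    mse_js += np.sum((shrink * x - theta) ** 2)

mse_mle /= trials
mse_js /= trials
print(f"MSE(MLE) = {mse_mle:.2f}, MSE(James-Stein) = {mse_js:.2f}")
```

    For p >= 3 the James-Stein estimator dominates the componentwise MLE in total squared error, which is what makes averaging attractive relative to committing to a single selected model.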

  9. Uncertainty Analysis in Population-Based Disease Microsimulation Models

    Directory of Open Access Journals (Sweden)

    Behnam Sharif

    2012-01-01

    Full Text Available Objective. Uncertainty analysis (UA) is an important part of simulation model validation. However, literature is imprecise as to how UA should be performed in the context of population-based microsimulation (PMS) models. In this expository paper, we discuss a practical approach to UA for such models. Methods. By adapting common concepts from published UA guidelines, we developed a comprehensive, step-by-step approach to UA in PMS models, including sample size calculation to reduce the computational time. As an illustration, we performed UA for POHEM-OA, a microsimulation model of osteoarthritis (OA) in Canada. Results. The resulting sample size of the simulated population was 500,000 and the number of Monte Carlo (MC) runs was 785 for 12-hour computational time. The estimated 95% uncertainty intervals for the prevalence of OA in Canada in 2021 were 0.09 to 0.18 for men and 0.15 to 0.23 for women. The uncertainty surrounding the sex-specific prevalence of OA increased over time. Conclusion. The proposed approach to UA considers the challenges specific to PMS models, such as selection of parameters and calculation of MC runs and population size to reduce computational burden. Our example of UA shows that the proposed approach is feasible. Estimation of uncertainty intervals should become a standard practice in the reporting of results from PMS models.
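    The uncertainty intervals behind such results reduce to percentile computations over replicate MC outputs; a minimal sketch (all numbers hypothetical, not POHEM-OA output):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical outputs of 785 Monte Carlo runs of a microsimulation model:
# simulated prevalence of a condition in each replicate.
prevalence = rng.normal(loc=0.19, scale=0.02, size=785)

lo, hi = np.percentile(prevalence, [2.5, 97.5])
print(f"95% uncertainty interval: [{lo:.3f}, {hi:.3f}]")

# The Monte Carlo standard error of the mean shrinks as 1/sqrt(n_runs), so
# the number of runs needed for a target precision follows directly.
target_se = 0.001
n_needed = int(np.ceil((prevalence.std(ddof=1) / target_se) ** 2))
print("runs needed for SE <= 0.001:", n_needed)
```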

  10. Hazard Response Modeling Uncertainty (A Quantitative Method)

    Science.gov (United States)

    1988-10-01

    (Abstract not legible in the source scan. The recoverable fragment defines C_p as the concentration predicted by some component or model, and the variance of the ratio of observed to predicted concentration, defined as var(Model I), as the quantitative measure of model uncertainty.)

  11. Deterministic Method for Obtaining Nominal and Uncertainty Models of CD Drives

    DEFF Research Database (Denmark)

    Vidal, Enrique Sanchez; Stoustrup, Jakob; Andersen, Palle;

    2002-01-01

    In this paper a deterministic method for obtaining the nominal and uncertainty models of the focus loop in a CD-player is presented, based on parameter identification and measurements in the focus loops of 12 actual CD drives that differ by having worst-case behaviors with respect to various properties. The method provides a systematic way to derive a nominal average model as well as a structured multiplicative input uncertainty model, and it is demonstrated how to apply mu-theory to design a controller based on the models obtained that meets certain robust performance criteria.

  12. Uncertainty Models for Knowledge-Based Systems

    Science.gov (United States)

    1991-08-01

    (Abstract replaced in the source scan by fragmentary references, including: Carnap, R. (1958). Introduction to Symbolic Logic and its Applications. Dover, N.Y.; Carnap, R. (1959). The Logical Syntax of Language. Littlefield, Adams and Co., Paterson, New Jersey; Carnap, R. (1960). Meaning and Necessity, a Study in Semantics and Modal Logic. Phoenix Books, Univ. of Chicago.)

  13. Bayesian uncertainty assessment of rainfall-runoff models for small urban basins - the influence of the rating curve

    Science.gov (United States)

    Sikorska, A. E.; Scheidegger, A.; Banasik, K.; Rieckermann, J.

    2012-04-01

    Keywords: uncertainty assessment, rating curve uncertainties, Bayesian inference, rainfall-runoff models, small urban basins. In hydrological flood forecasting, the problem of quantitative assessment of predictive uncertainties has been widely recognized. Despite several important findings in recent years, which helped to distinguish the uncertainty contributions from input uncertainty (e.g., due to poor rainfall data), model structure deficits, parameter uncertainties and measurement errors, uncertainty analysis still remains a challenging task. This is especially true for small urbanized basins, where monitoring data are often poor. Among other things, measurement errors have generally been assumed to be significantly smaller than the other sources of uncertainty. It has also been shown that input error and model structure deficits contribute more to the predictive uncertainties than uncertainties regarding the model parameters (Sikorska et al., 2011). These assumptions, however, are only correct when the modeled output is directly measurable in the system. Unfortunately, river discharge usually cannot be directly measured but is converted from the measured water stage with a rating curve method. The uncertainty introduced by the rating curve was shown in recent studies (Di Baldassarre et al., 2011) to be potentially significant in flood forecasting. This is especially true when extrapolating a rating curve above the measured level, which is often the case in (urban) flooding. In this work, we therefore investigated how flood predictions for small urban basins are affected by the uncertainties associated with the rating curve. To this aim, we augmented the model structure of a conceptual rainfall-runoff model to include the applied rating curve. This enabled us not only to directly model measurable water levels instead of discharges, but also to propagate the uncertainty of the rating curve through the model.
To compare the importance of the rating curve to the
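    The rating-curve effect described above can be sketched by propagating uncertain power-law coefficients through to discharge by Monte Carlo (all parameter values hypothetical):

```python
import numpy as np

rng = np.random.default_rng(7)

# Power-law rating curve Q = a * (h - h0)^b with uncertain coefficients
# (values chosen for illustration only).
a_mean, a_sd = 12.0, 1.5      # scale coefficient [m^3/s at unit effective stage]
b_mean, b_sd = 1.6, 0.1       # exponent
h0 = 0.2                      # stage of zero flow [m]

h = 2.5                       # observed water stage [m], above the gauged range
a = rng.normal(a_mean, a_sd, 10000)
b = rng.normal(b_mean, b_sd, 10000)
Q = a * (h - h0) ** b         # ensemble of discharges consistent with the curve

q_lo, q_med, q_hi = np.percentile(Q, [2.5, 50, 97.5])
print(f"discharge ~ {q_med:.1f} m3/s, 95% interval [{q_lo:.1f}, {q_hi:.1f}]")
```

    The width of the interval grows as the stage moves further above the gauged range, which is the extrapolation effect highlighted in the abstract.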

  14. Three Dimensional Vapor Intrusion Modeling: Model Validation and Uncertainty Analysis

    Science.gov (United States)

    Akbariyeh, S.; Patterson, B.; Rakoczy, A.; Li, Y.

    2013-12-01

    Volatile organic chemicals (VOCs), such as chlorinated solvents and petroleum hydrocarbons, are prevalent groundwater contaminants due to their improper disposal and accidental spillage. In addition to contaminating groundwater, VOCs may partition into the overlying vadose zone and enter buildings through gaps and cracks in foundation slabs or basement walls, a process termed vapor intrusion. Vapor intrusion of VOCs has been recognized as a detrimental source of human exposure to potentially carcinogenic or toxic compounds. The simulation of vapor intrusion from a subsurface source has been the focus of many studies aiming to better understand the process and guide field investigation. While multiple analytical and numerical models have been developed to simulate the vapor intrusion process, detailed validation of these models against well-controlled experiments is still lacking, due to the complexity and uncertainties associated with site characterization and with soil gas flux and indoor air concentration measurements. In this work, we present an effort to validate a three-dimensional vapor intrusion model based on a well-controlled experimental quantification of the vapor intrusion pathways into a slab-on-ground building under varying environmental conditions. Finally, a probabilistic approach based on Monte Carlo simulations is implemented to determine the probability distribution of indoor air concentration based on the most uncertain input parameters.

  15. Model for predicting mountain wave field uncertainties

    Science.gov (United States)

    Damiens, Florentin; Lott, François; Millet, Christophe; Plougonven, Riwal

    2017-04-01

    Studying the propagation of acoustic waves through the troposphere requires knowledge of wind speed and temperature gradients from the ground up to about 10-20 km. Typical planetary boundary-layer flows are known to present vertical low-level shears that can interact with mountain waves, thereby triggering small-scale disturbances. Resolving these fluctuations for long-range propagation problems is, however, not feasible because of computer memory/time restrictions, and thus they need to be parameterized. When the disturbances are small enough, these fluctuations can be described by linear equations. Previous works by the co-authors have shown that the critical layer dynamics that occur near the ground produce large horizontal flows and buoyancy disturbances that result in intense downslope winds and gravity wave breaking. While these phenomena manifest almost systematically for high Richardson numbers and when the boundary layer depth is relatively small compared to the mountain height, the process by which static stability affects downslope winds remains unclear. In the present work, new linear mountain gravity wave solutions are tested against numerical predictions obtained with the Weather Research and Forecasting (WRF) model. For Richardson numbers typically larger than unity, the mesoscale model is used to quantify the effect of neglected nonlinear terms on downslope winds and mountain wave patterns. At these regimes, the large downslope winds transport warm air, a so-called "Foehn" effect that can impact sound propagation properties. The sensitivity of small-scale disturbances to the Richardson number is quantified using two-dimensional spectral analysis. It is shown through a pilot study of subgrid-scale fluctuations of boundary layer flows over realistic mountains that the cross-spectrum of the mountain wave field is made up of the same components found in WRF simulations. The impact of each individual component on acoustic wave propagation is discussed in terms of

  16. Stochastic modelling of landfill leachate and biogas production incorporating waste heterogeneity. Model formulation and uncertainty analysis.

    Science.gov (United States)

    Zacharof, A I; Butler, A P

    2004-01-01

    A mathematical model simulating the hydrological and biochemical processes occurring in landfilled waste is presented and demonstrated. The model combines biochemical and hydrological models into an integrated representation of the landfill environment. Waste decomposition is modelled using traditional biochemical waste decomposition pathways combined with a simplified methodology for representing the rate of decomposition. Water flow through the waste is represented using a statistical velocity model capable of representing the effects of waste heterogeneity on leachate flow through the waste. Given the limitations in data capture from landfill sites, significant emphasis is placed on improving parameter identification and reducing parameter requirements. A sensitivity analysis is performed, highlighting the model's response to changes in input variables. A model test run is also presented, demonstrating the model capabilities. A parameter perturbation model sensitivity analysis was also performed. This has been able to show that although the model is sensitive to certain key parameters, its overall intuitive response provides a good basis for making reasonable predictions of the future state of the landfill system. Finally, due to the high uncertainty associated with landfill data, a tool for handling input data uncertainty is incorporated in the model's structure. It is concluded that the model can be used as a reasonable tool for modelling landfill processes and that further work should be undertaken to assess the model's performance.

  17. Uncertainty Quantification for Large-Scale Ice Sheet Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [Univ. of Texas, Austin, TX (United States)

    2016-02-05

    This report summarizes our work to develop advanced forward and inverse solvers and uncertainty quantification capabilities for a nonlinear 3D full Stokes continental-scale ice sheet flow model. The components include: (1) forward solver: a new state-of-the-art parallel adaptive scalable high-order-accurate mass-conservative Newton-based 3D nonlinear full Stokes ice sheet flow simulator; (2) inverse solver: a new adjoint-based inexact Newton method for solution of deterministic inverse problems governed by the above 3D nonlinear full Stokes ice flow model; and (3) uncertainty quantification: a novel Hessian-based Bayesian method for quantifying uncertainties in the inverse ice sheet flow solution and propagating them forward into predictions of quantities of interest such as ice mass flux to the ocean.
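    The Hessian-based Bayesian idea in component (3) can be illustrated on a toy linear-Gaussian inverse problem, where the Laplace approximation is exact (the operator G, the data, and the noise/prior scales below are hypothetical stand-ins for the ice-sheet quantities):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy inverse problem: recover m from noisy data d = G m + noise,
# with Gaussian prior N(0, t^2 I) and noise N(0, s^2 I).
G = rng.normal(size=(20, 3))
m_true = np.array([1.0, -2.0, 0.5])
s, t = 0.1, 10.0
d = G @ m_true + rng.normal(scale=s, size=20)

# Hessian of the negative log-posterior (exact here, since it is quadratic).
H = G.T @ G / s**2 + np.eye(3) / t**2
m_map = np.linalg.solve(H, G.T @ d / s**2)   # MAP estimate
cov = np.linalg.inv(H)                       # Laplace posterior covariance
sd = np.sqrt(np.diag(cov))                   # marginal uncertainties
print("MAP:", np.round(m_map, 3), "+/-", np.round(sd, 3))
```

    In the ice-sheet setting the Hessian is never formed explicitly; its action is applied matrix-free with adjoints, but the structure of the uncertainty estimate is the same.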

  18. Flight Dynamics and Control of Elastic Hypersonic Vehicles Uncertainty Modeling

    Science.gov (United States)

    Chavez, Frank R.; Schmidt, David K.

    1994-01-01

    It has been shown previously that hypersonic air-breathing aircraft exhibit strong aeroelastic/aeropropulsive dynamic interactions. To investigate these, especially from the perspective of the vehicle dynamics and control, analytical expressions for key stability derivatives were derived, and an analysis of the dynamics was performed. In this paper, the important issue of model uncertainty, and the appropriate forms for representing this uncertainty, is addressed. It is shown that the methods suggested in the literature for analyzing the robustness of multivariable feedback systems, which as a prerequisite to their application assume particular forms of model uncertainty, can be difficult to apply on real atmospheric flight vehicles. Also, the extent to which available methods are conservative is demonstrated for this class of vehicle dynamics.

  19. Improved Wave-vessel Transfer Functions by Uncertainty Modelling

    DEFF Research Database (Denmark)

    Nielsen, Ulrik Dam; Fønss Bach, Kasper; Iseki, Toshio

    2016-01-01

    This paper deals with uncertainty modelling of wave-vessel transfer functions used to calculate or predict wave-induced responses of a ship in a seaway. Although transfer functions, in theory, can be calculated to exactly reflect the behaviour of the ship when exposed to waves, uncertainty in input...... variables, notably speed, draft and relative wave heading, often compromises results. In this study, uncertainty modelling is applied to improve theoretically calculated transfer functions, so they better fit the corresponding experimental, full-scale ones. Based on a vast amount of full-scale measurement data......, it is shown that uncertainty modelling can be successfully used to improve the accuracy (and reliability) of theoretical transfer functions....

  20. Uncertainty analysis in dissolved oxygen modeling in streams.

    Science.gov (United States)

    Hamed, Maged M; El-Beshry, Manar Z

    2004-08-01

    Uncertainty analysis in surface water quality modeling is an important issue. This paper presents a method based on the first-order reliability method (FORM) to assess the exceedance probability of a target dissolved oxygen concentration in a stream, using a Streeter-Phelps prototype model. Basic uncertainty in the input parameters is considered by representing them as random variables with prescribed probability distributions. Results obtained from FORM analysis compared well with those of the Monte Carlo simulation method. The analysis also presents the stochastic sensitivity of the probabilistic outcome in the form of uncertainty importance factors, and shows how they change with changing simulation time. Furthermore, a parametric sensitivity analysis was conducted to show the effect of selection of different probability distribution functions for the three most important parameters on the design point, exceedance probability, and importance factors.
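    As a crude Monte Carlo counterpart to the FORM exceedance probability, the Streeter-Phelps oxygen deficit can be sampled directly (the distributions and rate constants below are illustrative, not the paper's calibrated values):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 50000

# Uncertain Streeter-Phelps parameters (hypothetical distributions).
kd = rng.lognormal(np.log(0.35), 0.2, n)   # deoxygenation rate [1/day]
ka = rng.lognormal(np.log(0.70), 0.2, n)   # reaeration rate [1/day]
L0 = rng.normal(15.0, 2.0, n)              # initial BOD [mg/L]
D0, t, DO_sat = 1.0, 2.0, 9.0              # initial deficit, travel time, saturation

# Streeter-Phelps deficit at travel time t.
D = kd * L0 / (ka - kd) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)
DO = DO_sat - D                            # dissolved oxygen concentration [mg/L]

# Probability of violating a 5 mg/L DO target (the exceedance probability
# that FORM approximates analytically).
p_exceed = np.mean(DO < 5.0)
print(f"P(DO < 5 mg/L) = {p_exceed:.3f}")
```

    FORM replaces this sampling with a search for the most probable failure point, which is cheaper but approximate; the paper's comparison of the two is the standard validation step.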

  1. Spectral optimization and uncertainty quantification in combustion modeling

    Science.gov (United States)

    Sheen, David Allan

    Reliable simulations of reacting flow systems require a well-characterized, detailed chemical model as a foundation. Accuracy of such a model can be assured, in principle, by a multi-parameter optimization against a set of experimental data. However, the inherent uncertainties in the rate evaluations and experimental data leave a model still characterized by some finite kinetic rate parameter space. Without a careful analysis of how this uncertainty space propagates into the model's predictions, those predictions can at best be trusted only qualitatively. In this work, the Method of Uncertainty Minimization using Polynomial Chaos Expansions is proposed to quantify these uncertainties. In this method, the uncertainty in the rate parameters of the as-compiled model is quantified. Then, the model is subjected to a rigorous multi-parameter optimization, as well as a consistency-screening process. Lastly, the uncertainty of the optimized model is calculated using an inverse spectral optimization technique, and then propagated into a range of simulation conditions. An as-compiled, detailed H2/CO/C1-C4 kinetic model is combined with a set of ethylene combustion data to serve as an example. The idea that the hydrocarbon oxidation model should be understood and developed in a hierarchical fashion has been a major driving force in kinetics research for decades. How this hierarchical strategy works at a quantitative level, however, has never been addressed. In this work, we use ethylene and propane combustion as examples and explore the question of hierarchical model development quantitatively. The Method of Uncertainty Minimization using Polynomial Chaos Expansions is utilized to quantify the amount of information that a particular combustion experiment, and thereby each data set, contributes to the model. This knowledge is applied to explore the relationships among the combustion chemistry of hydrogen/carbon monoxide, ethylene, and larger alkanes. 
Frequently, new data will

  2. Quantifying uncertainty in partially specified biological models: how can optimal control theory help us?

    Science.gov (United States)

    Adamson, M W; Morozov, A Y; Kuzenkov, O A

    2016-09-01

    Mathematical models in biology are highly simplified representations of a complex underlying reality, and there is always a high degree of uncertainty with regard to model function specification. This uncertainty becomes critical for models in which the use of different functions fitting the same dataset can yield substantially different predictions, a property known as structural sensitivity. Thus, even if the model is purely deterministic, the uncertainty in the model functions carries through into uncertainty in model predictions, and new frameworks are required to tackle this fundamental problem. Here, we consider a framework that uses partially specified models in which some functions are not represented by a specific form. The main idea is to project an infinite-dimensional function space into a low-dimensional space taking into account biological constraints. The key question of how to carry out this projection has so far remained a serious mathematical challenge and hindered the use of partially specified models. Here, we propose and demonstrate a potentially powerful technique to perform such a projection by using optimal control theory to construct functions with the specified global properties. This approach opens up the prospect of a flexible and easy-to-use method to perform uncertainty analysis of biological models.

  3. Linear models in the mathematics of uncertainty

    CERN Document Server

    Mordeson, John N; Clark, Terry D; Pham, Alex; Redmond, Michael A

    2013-01-01

    The purpose of this book is to present new mathematical techniques for modeling global issues. These mathematical techniques are used to determine linear equations between a dependent variable and one or more independent variables in cases where standard techniques such as linear regression are not suitable. In this book, we examine cases where the number of data points is small (effects of nuclear warfare), where the experiment is not repeatable (the breakup of the former Soviet Union), and where the data is derived from expert opinion (how conservative is a political party). In all these cases the data  is difficult to measure and an assumption of randomness and/or statistical validity is questionable.  We apply our methods to real world issues in international relations such as  nuclear deterrence, smart power, and cooperative threat reduction. We next apply our methods to issues in comparative politics such as successful democratization, quality of life, economic freedom, political stability, and fail...

  4. Spatial uncertainty of a geoid undulation model in Guayaquil, Ecuador

    Science.gov (United States)

    Chicaiza, E. G.; Leiva, C. A.; Arranz, J. J.; Buenańo, X. E.

    2017-06-01

    Geostatistics is a discipline that deals with the statistical analysis of regionalized variables. In this case study, geostatistics is used to estimate geoid undulation in the rural area of Guayaquil town in Ecuador. The geostatistical approach was chosen because it provides the estimation error of the prediction map. Open-source statistical software R, mainly the geoR, gstat and RGeostats libraries, was used. Exploratory data analysis (EDA), trend and structural analysis were carried out. An automatic model fitting by Iterative Least Squares and other fitting procedures were employed to fit the variogram. Finally, Kriging using the gravity anomaly of Bouguer as external drift and Universal Kriging were used to get a detailed map of geoid undulation. The estimation uncertainty reached the interval [-0.5; +0.5] m for errors and a maximum estimation standard deviation of 2 mm in relation to the method of interpolation applied. The error distribution of the geoid undulation map obtained in this study provides a better result than Earth gravitational models publicly available for the study area, according to the comparison with independent validation points. The main goal of this paper is to confirm the feasibility of combining geoid undulations from Global Navigation Satellite Systems and levelling field measurements with geostatistical techniques for use in high-accuracy engineering projects.
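    The kriging step can be sketched for a single prediction point with an exponential covariance (ordinary kriging only; the Bouguer-anomaly external drift used in the paper is omitted, and the station data are hypothetical):

```python
import numpy as np

# Exponential covariance model (sill and range are illustrative).
def exp_cov(h, sill=1.0, range_par=5.0):
    return sill * np.exp(-h / range_par)

pts = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0], [5.0, 5.0]])  # stations [km]
z = np.array([24.1, 24.6, 23.9, 25.0])   # geoid undulation [m], hypothetical
x0 = np.array([1.0, 1.0])                # prediction location

d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)   # station-station distances
d0 = np.linalg.norm(pts - x0, axis=1)                     # station-target distances

# Ordinary kriging system: covariances plus the unbiasedness constraint.
n = len(z)
A = np.ones((n + 1, n + 1)); A[-1, -1] = 0.0
A[:n, :n] = exp_cov(d)
b = np.append(exp_cov(d0), 1.0)

lam = np.linalg.solve(A, b)          # kriging weights + Lagrange multiplier
z_hat = lam[:n] @ z                  # kriged estimate
var = exp_cov(0.0) - lam @ b         # kriging (estimation) variance
print(f"estimate {z_hat:.2f} m, std {np.sqrt(var):.3f} m")
```

    The kriging variance is exactly the per-cell estimation error surface that motivated the geostatistical approach in the abstract.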

  5. Spatial uncertainty of a geoid undulation model in Guayaquil, Ecuador

    Directory of Open Access Journals (Sweden)

    Chicaiza E.G.

    2017-06-01

    Full Text Available Geostatistics is a discipline that deals with the statistical analysis of regionalized variables. In this case study, geostatistics is used to estimate geoid undulation in the rural area of Guayaquil town in Ecuador. The geostatistical approach was chosen because it provides the estimation error of the prediction map. Open-source statistical software R, mainly the geoR, gstat and RGeostats libraries, was used. Exploratory data analysis (EDA), trend and structural analysis were carried out. An automatic model fitting by Iterative Least Squares and other fitting procedures were employed to fit the variogram. Finally, Kriging using the gravity anomaly of Bouguer as external drift and Universal Kriging were used to get a detailed map of geoid undulation. The estimation uncertainty reached the interval [-0.5; +0.5] m for errors and a maximum estimation standard deviation of 2 mm in relation to the method of interpolation applied. The error distribution of the geoid undulation map obtained in this study provides a better result than Earth gravitational models publicly available for the study area, according to the comparison with independent validation points. The main goal of this paper is to confirm the feasibility of combining geoid undulations from Global Navigation Satellite Systems and levelling field measurements with geostatistical techniques for use in high-accuracy engineering projects.

  6. Nuclear uncertainties in the spin-dependent structure functions for direct dark matter detection

    CERN Document Server

    Cerdeno, David G; Huh, Ji-Haeng; Peiro, Miguel

    2012-01-01

    We study the effect that uncertainties in the nuclear spin-dependent structure functions have in the determination of the dark matter (DM) parameters in a direct detection experiment. We show that different nuclear models that describe the spin-dependent structure function of specific target nuclei can lead to variations in the reconstructed values of the DM mass and scattering cross-section. We propose a parametrization of the spin structure functions that allows us to treat these uncertainties as variations of three parameters, with a central value and deviation that depend on the specific nucleus. The method is illustrated for germanium and xenon detectors with an exposure of 300 kg yr, assuming a hypothetical detection of DM and studying a series of benchmark points for the DM properties. We find that the effect of these uncertainties can be similar in amplitude to that of astrophysical uncertainties, especially in those cases where the spin-dependent contribution to the elastic scattering cross-section i...

  7. A Generalized Statistical Uncertainty Model for Satellite Precipitation Products

    Science.gov (United States)

    Sarachi, S.

    2013-12-01

    A mixture model of the Generalized Normal Distribution and the Gamma distribution (GND-G) is used to model the joint probability distribution of satellite-based and stage IV radar rainfall under a given spatial and temporal resolution (e.g. 1°x1° and daily rainfall). The distribution parameters of GND-G are extended across various rainfall rates and spatial and temporal resolutions. In the study, GND-G is used to describe the uncertainty of the estimates from the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN) algorithm. The stage IV-based multi-sensor precipitation estimates (MPE) are used as reference measurements. The study area for constructing the uncertainty model covers a 15°×15° box of 0.25°×0.25° cells over the eastern United States for summer 2004 to 2009. Cells are aggregated in space and time to obtain data with different resolutions for the construction of the model's parameter space. Results show that GND-G fits the reference precipitation data better than other statistical uncertainty models, such as the Gaussian and Gamma distributions. The impact of precipitation uncertainty on stream flow is further demonstrated by Monte Carlo simulation of precipitation forcing in the hydrologic model. The NWS DMIP2 Illinois River basin south of Siloam is selected for this case study. The data cover the time period of 2006 to 2008. The uncertainty range of stream flow resulting from the GND-G precipitation distributions is calculated and will be discussed.

  8. Experimental Active Vibration Control in Truss Structures Considering Uncertainties in System Parameters

    Directory of Open Access Journals (Sweden)

    Douglas Domingues Bueno

    2008-01-01

    Full Text Available This paper deals with the study of algorithms for robust active vibration control in flexible structures considering uncertainties in system parameters. This has become an area of enormous interest, mainly due to the many demands for optimal performance in mechanical systems such as aircraft, aerospace, and automotive structures. An important and difficult problem in designing active vibration control is obtaining a representative dynamic model. Generally, this model can be obtained using the finite element method (FEM) or an identification method using experimental data. Actuators and sensors may affect the dynamic properties of the structure; for instance, the electromechanical coupling of piezoelectric material must be considered in the FEM formulation for flexible and lightly damped structures. The nonlinearities and uncertainties involved in these structures make this a difficult task, mainly for complex structures such as spatial truss structures. On the other hand, by using an identification method, it is possible to obtain a dynamic model represented through a state space realization that accounts for this coupling. This paper proposes an experimental methodology for vibration control in a 3D truss structure using PZT wafer stacks and a robust control algorithm solved by linear matrix inequalities.

  9. A Simplified Model of Choice Behavior under Uncertainty

    OpenAIRE

    Lin, Ching-Hung; Lin, Yu-Kai; Song, Tzu-Jiun; Huang, Jong-Tsun; Chiu, Yao-Chu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated that m...

  10. A simplified model of choice behavior under uncertainty

    OpenAIRE

    Ching-Hung Lin; Yu-Kai Lin; Tzu-Jiun Song; Jong-Tsun Huang; Yao-Chu Chiu

    2016-01-01

    The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the pr...

  11. Uncertainty Quantification in Control Problems for Flocking Models

    Directory of Open Access Journals (Sweden)

    Giacomo Albi

    2015-01-01

    Full Text Available The optimal control of flocking models with random inputs is investigated from a numerical point of view. The effect of uncertainty in the interaction parameters is studied for a Cucker-Smale type model using a generalized polynomial chaos (gPC) approach. Numerical evidence of threshold effects in the alignment dynamics due to the random parameters is given. The use of a selective model predictive control permits steering of the system towards the desired state even in unstable regimes.
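
    The gPC idea can be illustrated on a scalar toy problem: expand a function of a uniform random parameter in Legendre polynomials (the orthogonal family for uniform inputs) and read the mean and variance off the expansion coefficients. The function exp(θ) below is a stand-in response, not the Cucker-Smale dynamics.

```python
import numpy as np
from numpy.polynomial import legendre

# Toy gPC expansion: f(theta) = exp(theta) with theta ~ Uniform(-1, 1).
nodes, weights = legendre.leggauss(20)   # Gauss-Legendre quadrature rule
f = np.exp(nodes)

K = 8
coeffs = np.empty(K)
for k in range(K):
    Pk = legendre.Legendre.basis(k)(nodes)
    # c_k = (2k+1)/2 * integral_{-1}^{1} f(x) P_k(x) dx
    coeffs[k] = (2 * k + 1) / 2 * np.sum(weights * f * Pk)

# Mean and variance follow directly from the expansion coefficients.
mean = coeffs[0]
var = np.sum(coeffs[1:] ** 2 / (2 * np.arange(1, K) + 1))
print(mean, var)  # analytic values: sinh(1) and sinh(2)/2 - sinh(1)**2
```

    In the flocking setting, the same projection would be applied to each state variable of the dynamics, yielding statistics of the alignment behavior without brute-force sampling.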

  12. A GLUE uncertainty analysis of a drying model of pharmaceutical granules

    DEFF Research Database (Denmark)

    Mortier, Séverine Thérèse F.C.; Van Hoey, Stijn; Cierkens, Katrijn;

    2013-01-01

    A shift from batch processing towards continuous processing is of interest in the pharmaceutical industry. However, this transition requires detailed knowledge and process understanding of all consecutive unit operations in a continuous manufacturing line to design adequate control strategies...... uncertainty) originating from uncertainty in input data, model parameters, model structure, boundary conditions and software. In this paper, the model prediction uncertainty is evaluated for a model describing the continuous drying of single pharmaceutical wet granules in a six-segmented fluidized bed drying...... unit, which is part of the full continuous from-powder-to-tablet manufacturing line (Consigma™, GEA Pharma Systems). A validated model describing the drying behaviour of a single pharmaceutical granule in two consecutive phases is used. First of all, the effect of the assumptions at the particle level...

  13. Prediction Uncertainty Analyses for the Combined Physically-Based and Data-Driven Models

    Science.gov (United States)

    Demissie, Y. K.; Valocchi, A. J.; Minsker, B. S.; Bailey, B. A.

    2007-12-01

    The unavoidable simplification associated with physically-based mathematical models can result in biased parameter estimates and correlated model calibration errors, which in turn affect the accuracy of model predictions and the corresponding uncertainty analyses. In this work, a physically-based groundwater model (MODFLOW) and error-correcting artificial neural networks (ANN) are used in a complementary fashion to obtain an improved prediction (i.e., a prediction with reduced bias and error correlation). The associated prediction uncertainty of the coupled MODFLOW-ANN model is then assessed using three alternative methods. The first method estimates the combined model confidence and prediction intervals using first-order least-squares regression approximation theory. The second method uses Monte Carlo and bootstrap techniques for MODFLOW and ANN, respectively, to construct the combined model confidence and prediction intervals. The third method relies on a Bayesian approach that uses analytical or Monte Carlo methods to derive the intervals. The performance of these approaches is compared with the Generalized Likelihood Uncertainty Estimation (GLUE) and Calibration-Constrained Monte Carlo (CCMC) intervals of the MODFLOW predictions alone. The results are demonstrated for a hypothetical case study developed based on a phytoremediation site at Argonne National Laboratory. This case study comprises structural, parameter, and measurement uncertainties. The preliminary results indicate that the three proposed approaches yield comparable confidence and prediction intervals, thus making the computationally efficient first-order least-squares regression approach attractive for estimating the coupled model uncertainty. These results will be compared with GLUE and CCMC results.

  14. Space Surveillance Network Scheduling Under Uncertainty: Models and Benefits

    Science.gov (United States)

    Valicka, C.; Garcia, D.; Staid, A.; Watson, J.; Rintoul, M.; Hackebeil, G.; Ntaimo, L.

    2016-09-01

    Advances in space technologies continue to reduce the cost of placing satellites in orbit. With more entities operating space vehicles, the number of orbiting vehicles and debris has reached unprecedented levels, and it continues to grow. Sensor operators responsible for maintaining the space catalog and providing space situational awareness face increasingly complex and demanding scheduling requirements. Despite these trends, a lack of advanced tools continues to prevent sensor planners and operators from fully utilizing space surveillance resources. One key challenge involves optimally selecting sensors from a network of varying capabilities for missions with differing requirements. Another open challenge, the primary focus of our work, is building robust schedules that effectively plan for uncertainties associated with weather, ad hoc collections, and other target uncertainties. Existing tools and techniques are not amenable to rigorous analysis of schedule optimality and do not adequately address these challenges. Building on prior research, we have developed stochastic mixed-integer linear optimization models to address uncertainty due to weather's effect on collection quality. By making use of the open-source Pyomo optimization modeling software, we have posed and solved sensor network scheduling models addressing both forms of uncertainty. We present herein models that allow for concurrent scheduling of collections with the same sensor configuration and for proactively scheduling against uncertain ad hoc collections. The suitability of stochastic mixed-integer linear optimization for building sensor network schedules under different run-time constraints will be discussed.
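
    A minimal illustration of the scenario-based idea, far simpler than the stochastic mixed-integer models the abstract describes: enumerate one-to-one sensor-target assignments and pick the one with the best expected collection quality over discrete weather scenarios. The sensors, targets, qualities, and probabilities below are all hypothetical.

```python
import itertools

# Toy scenario-based scheduling over two weather scenarios.
sensors = ["s1", "s2"]
targets = ["t1", "t2"]
prob = [0.7, 0.3]  # scenario probabilities (e.g., clear vs. cloudy)
quality = {        # quality[(sensor, target)][scenario], all values made up
    ("s1", "t1"): [0.9, 0.2], ("s1", "t2"): [0.6, 0.5],
    ("s2", "t1"): [0.7, 0.6], ("s2", "t2"): [0.8, 0.1],
}

def expected_quality(assignment):
    """Probability-weighted total quality of a full assignment."""
    return sum(prob[w] * quality[pair][w]
               for pair in assignment for w in range(len(prob)))

# Enumeration is feasible only because this instance is tiny; realistic
# networks require the stochastic mixed-integer programs from the paper.
candidates = [list(zip(sensors, perm)) for perm in itertools.permutations(targets)]
best = max(candidates, key=expected_quality)
print(best, expected_quality(best))
```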

  15. Reducing uncertainty based on model fitness: Application to a ...

    African Journals Online (AJOL)

    2015-01-07

    Jan 7, 2015 ... 2Hydrology and Water Quality, Agricultural and Biological Engineering ... This general methodology is applied to a reservoir model of the Okavango ... Global sensitivity and uncertainty analysis (GSA/UA) system- ... and weighing risks between decisions (Saltelli et al., 2008). ...... resources and support.

  16. Model parameter uncertainty analysis for an annual field-scale P loss model

    Science.gov (United States)

    Bolster, Carl H.; Vadas, Peter A.; Boykin, Debbie

    2016-08-01

    Phosphorus (P) fate and transport models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. Because all models are simplifications of complex systems, there will exist an inherent amount of uncertainty associated with their predictions. It is therefore important that efforts be directed at identifying, quantifying, and communicating the different sources of model uncertainties. In this study, we conducted an uncertainty analysis with the Annual P Loss Estimator (APLE) model. Our analysis included calculating parameter uncertainties and confidence and prediction intervals for five internal regression equations in APLE. We also estimated uncertainties of the model input variables based on values reported in the literature. We then predicted P loss for a suite of fields under different management and climatic conditions while accounting for uncertainties in the model parameters and inputs, and compared the relative contributions of these two sources of uncertainty to the overall uncertainty associated with predictions of P loss. Both the overall magnitude of the prediction uncertainties and the relative contributions of the two sources of uncertainty varied depending on management practices and field characteristics. This was due to differences in the number of model input variables and the uncertainties in the regression equations associated with each P loss pathway. Inspection of the uncertainties in the five regression equations brought attention to a previously unrecognized limitation with the equation used to partition surface-applied fertilizer P between leaching and runoff losses. As a result, an alternate equation was identified that provided similar predictions with much less uncertainty. Our results demonstrate how a thorough uncertainty and model residual analysis can be used to identify limitations of a model. Such insight can then be used to guide future data collection and model
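
    The comparison of parameter versus input contributions can be sketched with a Monte Carlo experiment on a generic regression y = b0 + b1·x; the coefficient covariance, input statistics, and sample size below are illustrative, not APLE's.

```python
import numpy as np

rng = np.random.default_rng(5)

# Illustrative regression-based loss estimate with uncertain parameters
# (b0, b1) and an uncertain input x. All numbers are made up for the sketch.
b_mean = np.array([0.2, 1.5])
b_cov = np.array([[0.01, 0.0],
                  [0.0, 0.04]])
x_mean, x_sd = 3.0, 0.5
N = 50_000

b = rng.multivariate_normal(b_mean, b_cov, N)  # parameter draws
x = rng.normal(x_mean, x_sd, N)                # input draws

y_param_only = b[:, 0] + b[:, 1] * x_mean      # parameter uncertainty alone
y_input_only = b_mean[0] + b_mean[1] * x       # input uncertainty alone
y_both = b[:, 0] + b[:, 1] * x                 # both sources combined

# Comparing the three variances shows each source's relative contribution.
print(y_param_only.var(), y_input_only.var(), y_both.var())
```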

  17. Uncertainty Modeling Based on Bayesian Network in Ontology Mapping

    Institute of Scientific and Technical Information of China (English)

    LI Yuhua; LIU Tao; SUN Xiaolin

    2006-01-01

    How to deal with uncertainty is crucial in exact concept mapping between ontologies. This paper presents a new framework for modeling uncertainty in ontologies based on Bayesian networks (BNs). In our approach, the Web Ontology Language (OWL) is extended with probabilistic markups for attaching probability information; the source and target ontologies (expressed in the extended OWL) are translated into Bayesian networks, and the mapping between the two ontologies is extracted by constructing the conditional probability tables (CPTs) of the BN using an improved algorithm, named I-IPFP, based on the iterative proportional fitting procedure (IPFP). The basic idea of this framework and the algorithm are validated by positive results from computer experiments.

  18. Evaluating the uncertainty of input quantities in measurement models

    Science.gov (United States)

    Possolo, Antonio; Elster, Clemens

    2014-06-01

    The Guide to the Expression of Uncertainty in Measurement (GUM) gives guidance about how values and uncertainties should be assigned to the input quantities that appear in measurement models. This contribution offers a concrete proposal for how that guidance may be updated in light of the advances in the evaluation and expression of measurement uncertainty that were made in the course of the twenty years that have elapsed since the publication of the GUM, and also considering situations that the GUM does not yet contemplate. Our motivation is the ongoing conversation about a new edition of the GUM. While generally we favour a Bayesian approach to uncertainty evaluation, we also recognize the value that other approaches may bring to the problems considered here, and focus on methods for uncertainty evaluation and propagation that are widely applicable, including to cases that the GUM has not yet addressed. In addition to Bayesian methods, we discuss maximum-likelihood estimation, robust statistical methods, and measurement models where values of nominal properties play the same role that input quantities play in traditional models. We illustrate these general-purpose techniques in concrete examples, employing data sets that are realistic but that also are of conveniently small sizes. The supplementary material available online lists the R computer code that we have used to produce these examples (stacks.iop.org/Met/51/3/339/mmedia). Although we strive to stay close to clause 4 of the GUM, which addresses the evaluation of uncertainty for input quantities, we depart from it as we review the classes of measurement models that we believe are generally useful in contemporary measurement science. We also considerably expand and update the treatment that the GUM gives to Type B evaluations of uncertainty: reviewing the state-of-the-art, disciplined approach to the elicitation of expert knowledge, and its encapsulation in probability distributions that are usable in

  19. The impact of model and rainfall forcing errors on characterizing soil moisture uncertainty in land surface modeling

    Directory of Open Access Journals (Sweden)

    V. Maggioni

    2012-10-01

    Full Text Available The contribution of rainfall forcing errors relative to model (structural and parameter) uncertainty in the prediction of soil moisture is investigated by integrating the NASA Catchment Land Surface Model (CLSM), forced with hydro-meteorological data, in the Oklahoma region. Rainfall-forcing uncertainty is introduced using a stochastic error model that generates ensemble rainfall fields from satellite rainfall products. The ensemble satellite rain fields are propagated through CLSM to produce soil moisture ensembles. Errors in CLSM are modeled with two different approaches: either by perturbing model parameters (representing model parameter uncertainty) or by adding randomly generated noise (representing model structure and parameter uncertainty) to the model prognostic variables. Our findings highlight that the method currently used in the NASA GEOS-5 Land Data Assimilation System to perturb CLSM variables poorly describes the uncertainty in the predicted soil moisture, even when combined with rainfall model perturbations. On the other hand, by adding model parameter perturbations to rainfall forcing perturbations, a better characterization of uncertainty in soil moisture simulations is observed. Specifically, an analysis of the rank histograms shows that the most consistent ensemble of soil moisture is obtained by combining rainfall and model parameter perturbations. When rainfall forcing and model prognostic perturbations are added, the rank histogram shows a U-shape at the domain average scale, which corresponds to a lack of variability in the forecast ensemble. The more accurate estimation of the soil moisture prediction uncertainty obtained by combining rainfall and parameter perturbations is encouraging for the application of this approach in ensemble data assimilation systems.
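
    The rank-histogram diagnostic used above can be sketched as follows: rank each observation within its forecast ensemble and histogram the ranks; a consistent ensemble gives a flat histogram, while an under-dispersed one piles counts into the extreme ranks, producing the U-shape. The truth/ensemble statistics below are synthetic and illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic rank-histogram diagnostic. The ensemble is deliberately
# under-dispersed: its spread (0.5) is smaller than the forecast error (1.0),
# so observations often fall outside the ensemble envelope.
n_time, n_ens = 2000, 20
truth = rng.normal(0.0, 1.0, n_time)
ens_mean = truth + rng.normal(0.0, 1.0, n_time)              # forecast error
members = ens_mean[:, None] + rng.normal(0.0, 0.5, (n_time, n_ens))

# Rank of each observation within its ensemble (0 .. n_ens).
ranks = np.sum(members < truth[:, None], axis=1)
hist = np.bincount(ranks, minlength=n_ens + 1)

# Under-dispersion concentrates counts in the two extreme ranks (U-shape).
print(hist[0] + hist[-1], hist.sum())
```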

  20. Managing uncertainty in collaborative robotics engineering projects: The influence of task structure and peer interaction

    Science.gov (United States)

    Jordan, Michelle

    Uncertainty is ubiquitous in life, and learning is an activity particularly likely to be fraught with uncertainty. Previous research suggests that students and teachers struggle in their attempts to manage the psychological experience of uncertainty and that students often fail to experience uncertainty when uncertainty may be warranted. Yet, few educational researchers have explicitly and systematically observed what students do, their behaviors and strategies, as they attempt to manage the uncertainty they experience during academic tasks. In this study I investigated how students in one fifth grade class managed uncertainty they experienced while engaged in collaborative robotics engineering projects, focusing particularly on how uncertainty management was influenced by task structure and students' interactions with their peer collaborators. The study was initiated at the beginning of instruction related to robotics engineering and preceded through the completion of several long-term collaborative robotics projects, one of which was a design project. I relied primarily on naturalistic observation of group sessions, semi-structured interviews, and collection of artifacts. My data analysis was inductive and interpretive, using qualitative discourse analysis techniques and methods of grounded theory. Three theoretical frameworks influenced the conception and design of this study: community of practice, distributed cognition, and complex adaptive systems theory. Uncertainty was a pervasive experience for the students collaborating in this instructional context. Students experienced uncertainty related to the project activity and uncertainty related to the social system as they collaborated to fulfill the requirements of their robotics engineering projects. They managed their uncertainty through a diverse set of tactics for reducing, ignoring, maintaining, and increasing uncertainty. Students experienced uncertainty from more different sources and used more and

  1. Reliable Estimation of Prediction Uncertainty for Physicochemical Property Models.

    Science.gov (United States)

    Proppe, Jonny; Reiher, Markus

    2017-07-11

    One of the major challenges in computational science is to determine the uncertainty of a virtual measurement, that is, the prediction of an observable based on calculations. As highly accurate first-principles calculations are in general infeasible for most physical systems, one usually resorts to parametric property models of observables, which require calibration by incorporating reference data. The resulting predictions and their uncertainties are sensitive to systematic errors such as inconsistent reference data, parametric model assumptions, or inadequate computational methods. Here, we discuss the calibration of property models in the light of bootstrapping, a sampling method that can be employed for identifying systematic errors and for reliable estimation of the prediction uncertainty. We apply bootstrapping to assess a linear property model linking the (57)Fe Mössbauer isomer shift to the contact electron density at the iron nucleus for a diverse set of 44 molecular iron compounds. The contact electron density is calculated with 12 density functionals across Jacob's ladder (PWLDA, BP86, BLYP, PW91, PBE, M06-L, TPSS, B3LYP, B3PW91, PBE0, M06, TPSSh). We provide systematic-error diagnostics and reliable, locally resolved uncertainties for isomer-shift predictions. Pure and hybrid density functionals yield average prediction uncertainties of 0.06-0.08 mm s(-1) and 0.04-0.05 mm s(-1), respectively, the latter being close to the average experimental uncertainty of 0.02 mm s(-1). Furthermore, we show that both model parameters and prediction uncertainty depend significantly on the composition and number of reference data points. Accordingly, we suggest that rankings of density functionals based on performance measures (e.g., the squared coefficient of correlation, r(2), or the root-mean-square error, RMSE) should not be inferred from a single data set. This study presents the first statistically rigorous calibration analysis for theoretical M
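
    The nonparametric bootstrap used for calibration can be sketched as follows: resample the reference set with replacement, refit the linear property model each time, and take the spread of the refitted predictions at a query point as the prediction uncertainty. The data below are a synthetic stand-in, not the Mössbauer reference set, and the slope, intercept, and noise level are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: a linear relation between a calculated quantity
# x (e.g., contact electron density) and an observable y (e.g., isomer shift).
n = 44
x = rng.uniform(0.0, 1.0, n)
y = 1.2 - 0.8 * x + rng.normal(0.0, 0.05, n)

# Nonparametric bootstrap: refit the property model on resampled reference
# data and take the spread of predictions at a query point x0.
B, x0 = 2000, 0.5
preds = np.empty(B)
for b in range(B):
    idx = rng.integers(0, n, n)                 # resample with replacement
    slope, intercept = np.polyfit(x[idx], y[idx], 1)
    preds[b] = slope * x0 + intercept

print(preds.mean(), preds.std())  # bootstrap mean and prediction spread
```

    Repeating the loop for query points across the calibration range would give the locally resolved uncertainties the abstract mentions.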

  2. Combined Estimation of Hydrogeologic Conceptual Model and Parameter Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, Philip D.; Ye, Ming; Neuman, Shlomo P.; Cantrell, Kirk J.

    2004-03-01

    The objective of the research described in this report is the development and application of a methodology for comprehensively assessing the hydrogeologic uncertainties involved in dose assessment, including uncertainties associated with conceptual models, parameters, and scenarios. This report describes and applies a statistical method to quantitatively estimate the combined uncertainty in model predictions arising from conceptual model and parameter uncertainties. The method relies on model averaging to combine the predictions of a set of alternative models. Implementation is driven by the available data. When there is minimal site-specific data the method can be carried out with prior parameter estimates based on generic data and subjective prior model probabilities. For sites with observations of system behavior (and optionally data characterizing model parameters), the method uses model calibration to update the prior parameter estimates and model probabilities based on the correspondence between model predictions and site observations. The set of model alternatives can contain both simplified and complex models, with the requirement that all models be based on the same set of data. The method was applied to the geostatistical modeling of air permeability at a fractured rock site. Seven alternative variogram models of log air permeability were considered to represent data from single-hole pneumatic injection tests in six boreholes at the site. Unbiased maximum likelihood estimates of variogram and drift parameters were obtained for each model. Standard information criteria provided an ambiguous ranking of the models, which would not justify selecting one of them and discarding all others as is commonly done in practice. Instead, some of the models were eliminated based on their negligibly small updated probabilities and the rest were used to project the measured log permeabilities by kriging onto a rock volume containing the six boreholes. These four
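
    Model averaging with posterior model probabilities can be summarized in a few lines: the averaged prediction is the probability-weighted mean of the alternative models' predictions, and the total variance adds a between-model term to the within-model variances. All numbers below are hypothetical.

```python
import numpy as np

# Hypothetical posterior model probabilities and per-model predictions
# (mean, variance) for one quantity of interest after calibration.
p = np.array([0.5, 0.3, 0.2])        # posterior model probabilities
mu = np.array([2.0, 2.4, 1.8])       # predictive means of the alternatives
var = np.array([0.10, 0.15, 0.20])   # predictive variances of the alternatives

# Model-averaged prediction and its total variance: the within-model term
# plus a between-model term reflecting conceptual-model uncertainty.
mu_bma = np.sum(p * mu)
var_bma = np.sum(p * var) + np.sum(p * (mu - mu_bma) ** 2)
print(mu_bma, var_bma)
```

    The between-model term is exactly what is lost when a single "best" model is selected and the rest discarded, which is the practice the report argues against.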

  3. A simplified model of choice behavior under uncertainty

    Directory of Open Access Journals (Sweden)

    Ching-Hung Lin

    2016-08-01

    Full Text Available The Iowa Gambling Task (IGT) has been standardized as a clinical assessment tool (Bechara, 2007). Nonetheless, numerous research groups have attempted to modify IGT models to optimize parameters for predicting the choice behavior of normal controls and patients. A decade ago, most researchers considered the expected utility (EU) model (Busemeyer and Stout, 2002) to be the optimal model for predicting choice behavior under uncertainty. However, in recent years, studies have demonstrated the prospect utility (PU) models (Ahn et al., 2008) to be more effective than the EU models in the IGT. Nevertheless, after some preliminary tests, we propose that the Ahn et al. (2008) PU model is not optimal due to some incompatible results between our behavioral and modeling data. This study aims to modify the Ahn et al. (2008) PU model into a simplified model, and collected 145 subjects' IGT performance as the benchmark data for comparison. In our simplified PU model, the best goodness-of-fit was found mostly as α approached zero. More specifically, we retested the key parameters α, λ, and A in the PU model. Notably, the power of influence of the parameters α, λ, and A has a hierarchical order in terms of manipulating the goodness-of-fit in the PU model. Additionally, we found that the parameters λ and A may be ineffective when the parameter α is close to zero in the PU model. The present simplified model demonstrated that decision makers mostly adopted a gain-stay-loss-shift strategy rather than foreseeing the long-term outcome. However, there are still other behavioral variables that are not well revealed under these dynamic uncertainty situations. Therefore, the optimal behavioral models may not yet have been found. In short, the best model for predicting choice behavior under dynamic-uncertainty situations should be further evaluated.
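
    A sketch of the loss-averse power utility that PU models of this kind typically build on (a common functional form; the paper's full likelihood and learning rule are not reproduced here): as α approaches zero, every gain's utility collapses toward +1 and every loss's toward -λ, so only the gain/loss sign drives choices, consistent with the gain-stay-loss-shift interpretation.

```python
import numpy as np

def prospect_utility(x, alpha, lam):
    """Loss-averse power utility: |x|**alpha for gains, -lam * |x|**alpha for
    losses. A common PU building block; parameter names follow the abstract
    (alpha, lambda), not the paper's full model."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, np.abs(x) ** alpha, -lam * np.abs(x) ** alpha)

payoffs = np.array([100.0, -250.0, 50.0])
# With a moderate alpha, payoff magnitudes still matter:
print(prospect_utility(payoffs, alpha=0.5, lam=2.0))
# As alpha approaches zero, utilities flatten toward +1 for gains and
# -lam for losses, leaving only the gain/loss sign to drive choices.
print(prospect_utility(payoffs, alpha=0.01, lam=2.0))
```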

  4. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2012-06-01

    Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. The HBV-Ensemble can be used for in-class lab practices and homework assignments, and assessment of students' understanding of hydrological processes. Using this model, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration, and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for hydrological processes, but also for teaching uncertainty analysis, parameter estimation, ensemble simulation, and model sensitivity.

  5. Developing Uncertainty Models for Robust Flutter Analysis Using Ground Vibration Test Data

    Science.gov (United States)

    Potter, Starr; Lind, Rick; Kehoe, Michael W. (Technical Monitor)

    2001-01-01

    A ground vibration test can be used to obtain information about structural dynamics that is important for flutter analysis. Traditionally, this information, such as the natural frequencies of modes, is used to update analytical models used to predict flutter speeds. The ground vibration test can also be used to obtain uncertainty models, such as natural frequencies and their associated variations, that can update analytical models for the purpose of predicting robust flutter speeds. Analyzing test data using the ∞-norm, rather than the traditional 2-norm, is shown to lead to a minimum-size uncertainty description and, consequently, a least-conservative robust flutter speed. This approach is demonstrated using ground vibration test data for the Aerostructures Test Wing. Different norms are used to formulate uncertainty models and their associated robust flutter speeds to evaluate which norm is least conservative.

  6. Extended Range Hydrological Predictions: Uncertainty Associated with Model Parametrization

    Science.gov (United States)

    Joseph, J.; Ghosh, S.; Sahai, A. K.

    2016-12-01

    The better understanding of various atmospheric processes has led to improved predictions of meteorological conditions at various temporal scales, ranging from the short term, covering a period of up to 2 days, to the long term, covering a period of more than 10 days. Accurate prediction of hydrological variables can be made using these predicted meteorological conditions, which is helpful for proper management of water resources. Extended range hydrological simulation involves the prediction of hydrological variables for a period of more than 10 days. The main sources of uncertainty in hydrological predictions include uncertainty in the initial conditions, the meteorological forcing, and the model parametrization. In the present study, the Extended Range Prediction for the Indian monsoon developed by the Indian Institute of Tropical Meteorology (IITM), Pune, is used as meteorological forcing for the Variable Infiltration Capacity (VIC) model. Sensitive hydrological parameters, as identified from the literature, along with a few vegetation parameters, are assumed to be uncertain, and 1000 random values are generated within their prescribed ranges. Uncertainty bands are generated by performing Monte Carlo simulations (MCS) for the generated parameter sets and the observed meteorological forcings. Basins with minimal human intervention within the Indian Peninsular region are identified, and validation of the results is carried out using observed gauge discharge. Further, uncertainty bands are generated for the extended range hydrological predictions by performing MCS for the same set of parameters and the extended range meteorological predictions. The results demonstrate the uncertainty associated with model parametrization for extended range hydrological simulations. Keywords: Extended Range Prediction, Variable Infiltration Capacity model, Monte Carlo Simulation.
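
    The Monte Carlo band procedure can be sketched with a toy linear-reservoir model standing in for VIC: draw parameter values from a prescribed range, simulate an ensemble of streamflow series, and take percentile bands across the ensemble. The forcing, parameter range, and reservoir model are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(rain, k, s0=10.0):
    """Toy linear-reservoir runoff model (a stand-in for VIC): the storage
    drains at a fraction k per time step."""
    s, q = s0, []
    for r in rain:
        s += r
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

rain = rng.gamma(2.0, 2.0, 60)        # synthetic 60-step rainfall forcing
ks = rng.uniform(0.1, 0.5, 1000)      # 1000 draws of the uncertain parameter

# Monte Carlo ensemble of streamflow simulations, one per parameter draw.
ensemble = np.stack([simulate(rain, k) for k in ks])

# 5th-95th percentile uncertainty band of the predicted streamflow.
lo, hi = np.percentile(ensemble, [5, 95], axis=0)
print(lo.shape, hi.shape)
```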

  7. Formal modeling of a system of chemical reactions under uncertainty.

    Science.gov (United States)

    Ghosh, Krishnendu; Schlipf, John

    2014-10-01

    We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
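
    The interval-approximation idea can be illustrated with interval arithmetic on a single mass-action rate law; the bounds on the rate constant and concentrations below are hypothetical, and the midpoint product corresponds to the midpoint approximation.

```python
# Interval-arithmetic sketch for an imprecise mass-action rate v = k*[A]*[B].
def imul(a, b):
    """Product of two intervals given as (lo, hi) pairs."""
    products = (a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1])
    return (min(products), max(products))

k = (0.8, 1.2)   # hypothetical rate constant bounds
A = (0.5, 0.7)   # hypothetical concentration bounds for species A
B = (1.0, 1.5)   # hypothetical concentration bounds for species B

v = imul(imul(k, A), B)                            # interval approximation
mid = (sum(k) / 2) * (sum(A) / 2) * (sum(B) / 2)   # midpoint approximation
print(v, mid)
```

    The midpoint value is a single representative rate, while the interval guarantees an enclosure of every rate consistent with the imprecise data; a model checker can then query temporal-logic properties against either abstraction.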

  8. Comparative Analysis of Uncertainties in Urban Surface Runoff Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Schaarup-Jensen, Kjeld

    2007-01-01

    In the present paper a comparison between three different surface runoff models, in the numerical urban drainage tool MOUSE, is conducted. Analysing parameter uncertainty, it is shown that the models are very sensitive with regards to the choice of hydrological parameters, when combined overflow...... analysis, further research in improved parameter assessment for surface runoff models is needed....... volumes are compared - especially when the models are uncalibrated. The occurrences of flooding and surcharge are highly dependent on both hydrological and hydrodynamic parameters. Thus, the conclusion of the paper is that if the use of model simulations is to be a reliable tool for drainage system...

  9. Modeling Heterogeneity in Networks using Uncertainty Quantification Tools

    CERN Document Server

    Rajendran, Karthikeyan; Siettos, Constantinos I; Laing, Carlo R; Kevrekidis, Ioannis G

    2015-01-01

    Using the dynamics of information propagation on a network as our illustrative example, we present and discuss a systematic approach to quantifying heterogeneity and its propagation that borrows established tools from Uncertainty Quantification. The crucial assumption underlying this mathematical and computational "technology transfer" is that the evolving states of the nodes in a network quickly become correlated with the corresponding node "identities": features of the nodes imparted by the network structure (e.g. the node degree, the node clustering coefficient). The node dynamics thus depend on heterogeneous (rather than uncertain) parameters, whose distribution over the network results from the network structure. Knowing these distributions allows us to obtain an efficient coarse-grained representation of the network state in terms of the expansion coefficients in suitable orthogonal polynomials. This representation is closely related to mathematical/computational tools for uncertainty quantification (th...
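
    A minimal version of the degree-based coarse-graining: when node states are (noisily) a smooth function of node degree, a handful of polynomial coefficients in the degree summarizes the whole network state. Ordinary least squares is used here rather than polynomials orthogonal with respect to the degree distribution, and the state-degree relation is synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic network snapshot: node states that are (noisily) a smooth
# function of node degree, mimicking states that have become correlated
# with node identity. Coefficients and noise level are made up.
n_nodes = 500
degrees = rng.integers(1, 50, n_nodes).astype(float)
states = 1.0 + 0.05 * degrees + 0.002 * degrees**2 + rng.normal(0.0, 0.01, n_nodes)

# Coarse-grained representation: a few polynomial coefficients in the
# degree instead of 500 individual node states.
coeffs = np.polyfit(degrees, states, deg=2)
recon = np.polyval(coeffs, degrees)
print(coeffs.shape, float(np.max(np.abs(recon - states))))
```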

  10. The effect of uncertainty and systematic errors in hydrological modelling

    Science.gov (United States)

    Steinsland, I.; Engeland, K.; Johansen, S. S.; Øverleir-Petersen, A.; Kolberg, S. A.

    2014-12-01

    The aims of hydrological model identification and calibration are to find the best possible set of process parametrizations and parameter values that transform inputs (e.g. precipitation and temperature) to outputs (e.g. streamflow). These models enable us to make predictions of streamflow. Several sources of uncertainty have the potential to hamper robust model calibration and identification. In order to grasp the interaction between model parameters, inputs and streamflow, it is important to account for both systematic and random errors in inputs (e.g. precipitation and temperatures) and streamflows. By random errors we mean errors that are independent from time step to time step, whereas by systematic errors we mean errors that persist for a longer period. Both random and systematic errors are important in the observation and interpolation of precipitation and temperature inputs. Important random errors come from the measurements themselves and from the network of gauges. Important systematic errors originate from the under-catch in precipitation gauges and from unknown spatial trends that are approximated in the interpolation. For streamflow observations, the water level recordings might give random errors, whereas the rating curve contributes mainly a systematic error. In this study we want to answer the question "What is the effect of random and systematic errors in inputs and observed streamflow on estimated model parameters and streamflow predictions?". To answer it, we systematically test the effect of including uncertainties in inputs and streamflow during model calibration and simulation in the distributed HBV model operating on daily time steps for the Osali catchment in Norway. The case study is based on observations whose uncertainty has been carefully quantified, and increased uncertainties and systematic errors are introduced realistically, for example by removing a precipitation gauge from the network. We find that the systematic errors in

  11. A Multi-Model Approach for Uncertainty Propagation and Model Calibration in CFD Applications

    CERN Document Server

    Wang, Jian-xun; Xiao, Heng

    2015-01-01

    Proper quantification and propagation of uncertainties in computational simulations are of critical importance. This issue is especially challenging for CFD applications. A particular obstacle to uncertainty quantification in CFD problems is the large model discrepancy associated with the CFD models used for uncertainty propagation. Neglecting or improperly representing the model discrepancies leads to inaccurate and distorted uncertainty distributions for the Quantities of Interest. High-fidelity models, being accurate yet expensive, can accommodate only a small ensemble of simulations and thus lead to large interpolation errors and/or sampling errors; low-fidelity models can propagate a large ensemble, but can introduce large modeling errors. In this work, we propose a multi-model strategy to account for the influences of model discrepancies in uncertainty propagation and to reduce their impact on the predictions. Specifically, we take advantage of CFD models of multiple fidelities to estimate the model ...
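
    One simple version of such a multi-model idea (a sketch under invented toy models, not the authors' method) is to estimate an additive discrepancy of the low-fidelity model from a handful of high-fidelity runs, then apply that correction to a large, cheap low-fidelity ensemble:

```python
import numpy as np

rng = np.random.default_rng(0)

def high_fidelity(x):   # expensive "truth" (hypothetical stand-in)
    return np.sin(x) + 0.1 * x

def low_fidelity(x):    # cheap, biased model
    return np.sin(x)

# A few expensive runs estimate the low-fidelity discrepancy ...
x_hf = np.linspace(0.0, 3.0, 5)
discrepancy = np.polyfit(x_hf, high_fidelity(x_hf) - low_fidelity(x_hf), deg=1)

# ... which then corrects a large, cheap uncertainty-propagation ensemble.
x_ensemble = rng.uniform(0.0, 3.0, 10_000)
corrected = low_fidelity(x_ensemble) + np.polyval(discrepancy, x_ensemble)
```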

  12. Assessment of errors and uncertainty patterns in GIA modeling

    DEFF Research Database (Denmark)

    Barletta, Valentina Roberta; Spada, G.

    During the last decade many efforts have been devoted to the assessment of global sea level rise and to the determination of the mass balance of continental ice sheets. In this context, the important role of glacial-isostatic adjustment (GIA) has been clearly recognized. Yet, in many cases only one...... "preferred" GIA model has been used, without any consideration of the possible errors involved. Lacking a rigorous assessment of systematic errors in GIA modeling, the reliability of the results is uncertain. GIA sensitivity and uncertainties associated with the viscosity models have been explored......, such as time-evolving shorelines and paleo-coastlines. In this study we quantify these uncertainties and their propagation in GIA response using a Monte Carlo approach to obtain spatio-temporal patterns of GIA errors. A direct application is the error estimates in ice mass balance in Antarctica and Greenland...

  13. RANS turbulence model form uncertainty quantification for wind engineering flows

    Science.gov (United States)

    Gorle, Catherine; Zeoli, Stephanie; Bricteux, Laurent

    2016-11-01

    Reynolds-averaged Navier-Stokes simulations with linear eddy-viscosity turbulence models are commonly used for modeling wind engineering flows, but the use of the results for critical design decisions is hindered by the limited capability of the models to correctly predict bluff body flows. A turbulence model form uncertainty quantification (UQ) method to define confidence intervals for the results could remove this limitation, and promising results were obtained in a previous study of the flow in downtown Oklahoma City. The objective of the present study is to further investigate the validity of these results by considering the simplified test case of the flow around a wall-mounted cube. DNS data is used to determine: 1. whether the marker, which identifies regions that deviate from parallel shear flow, is a good indicator for the regions where the turbulence model fails, and 2. which Reynolds stress perturbations, in terms of the tensor magnitude and the eigenvalues and eigenvectors of the normalized anisotropy tensor, can capture the uncertainty in the flow field. A comparison of confidence intervals obtained with the UQ method and the DNS solution indicates that the uncertainty in the velocity field can be captured correctly in a large portion of the flow field.
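
    The Reynolds-stress perturbations mentioned above act on the eigenvalues of the normalized anisotropy tensor. A minimal sketch (the perturbation magnitude and the one-component limiting state chosen here are illustrative assumptions):

```python
import numpy as np

def perturb_reynolds_stress(R, delta=0.3):
    """Nudge the eigenvalues of the normalized anisotropy tensor toward
    the 1-component limiting state, keeping k (half the trace) fixed."""
    k = 0.5 * np.trace(R)                        # turbulent kinetic energy
    b = R / (2.0 * k) - np.eye(3) / 3.0          # normalized anisotropy
    lam, V = np.linalg.eigh(b)
    lam_1c = np.array([-1.0 / 3.0, -1.0 / 3.0, 2.0 / 3.0])  # 1-comp. limit
    lam_new = (1.0 - delta) * lam + delta * lam_1c
    b_new = V @ np.diag(lam_new) @ V.T
    return 2.0 * k * (b_new + np.eye(3) / 3.0)

R = np.array([[2.0, 0.3, 0.0],
              [0.3, 1.0, 0.1],
              [0.0, 0.1, 0.5]])
R_perturbed = perturb_reynolds_stress(R)
```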

  14. Effects of input uncertainty on cross-scale crop modeling

    Science.gov (United States)

    Waha, Katharina; Huth, Neil; Carberry, Peter

    2014-05-01

    The quality of data on climate, soils and agricultural management in the tropics is in general low, or data are scarce, leading to uncertainty in process-based modeling of cropping systems. Process-based crop models are common tools for simulating crop yields and crop production in climate change impact studies, studies on mitigation and adaptation options or food security studies. Crop modelers are concerned about input data accuracy as this, together with an adequate representation of plant physiology processes and choice of model parameters, is key for a reliable simulation. For example, assuming an error in measurements of air temperature, radiation and precipitation of ± 0.2°C, ± 2 % and ± 3 % respectively, Fodor & Kovacs (2005) estimate that this translates into an uncertainty of 5-7 % in yield and biomass simulations. In our study we seek to answer the following questions: (1) are there important uncertainties in the spatial variability of simulated crop yields on the grid-cell level displayed on maps, (2) are there important uncertainties in the temporal variability of simulated crop yields on the aggregated, national level displayed in time series, and (3) how does the accuracy of different soil, climate and management information influence the simulated crop yields in two crop models designed for use at different spatial scales? The study will help to determine whether more detailed information improves the simulations and to advise model users on the uncertainty related to input data. We analyse the performance of the point-scale crop model APSIM (Keating et al., 2003) and the global-scale crop model LPJmL (Bondeau et al., 2007) with different climate information (monthly and daily) and soil conditions (global soil map and African soil map) under different agricultural management (uniform and variable sowing dates) for the low-input maize-growing areas in Burkina Faso/West Africa. We test the models' response to different levels of input
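
    The kind of input-error propagation quoted from Fodor & Kovacs (2005) can be mimicked with a Monte Carlo sketch. The crop response function below is a made-up stand-in, not APSIM or LPJmL; only the error magnitudes come from the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

def toy_yield(temperature, radiation, precipitation):
    # Hypothetical stand-in for a crop model: yield responds smoothly
    # to its season-mean weather inputs.
    return 0.5 * radiation + 20.0 * precipitation - 2.0 * (temperature - 25.0) ** 2

# Nominal inputs, perturbed by the measurement errors quoted above.
t0, r0, p0 = 27.0, 18.0, 3.0
n = 20_000
t = t0 + rng.uniform(-0.2, 0.2, n)             # +/- 0.2 deg C
r = r0 * (1 + rng.uniform(-0.02, 0.02, n))     # +/- 2 %
p = p0 * (1 + rng.uniform(-0.03, 0.03, n))     # +/- 3 %

yields = toy_yield(t, r, p)
relative_spread = yields.std() / yields.mean()
```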

  15. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2013-08-01

    Full Text Available The assessment of climate change impacts on the risk for pesticide leaching needs careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-west Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate model projections of the regional climate model RCA3 were available as driven by different combinations of global climate models (GCM, greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO-model were generated by scaling a reference climate data set (1970–1999 for an important agricultural production area in south-west Sweden based on monthly change factors for 2070–2099. 30 yr simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes of pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios could provide robust probabilistic estimates of future pesticide losses and assessments of changes in pesticide leaching risks.
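
    The monthly change-factor (delta-change) scaling described above looks roughly like this; the factor values and the 360-day toy calendar are invented for illustration, not taken from the RCA3 projections:

```python
import numpy as np

rng = np.random.default_rng(3)
months = np.repeat(np.arange(1, 13), 30)            # toy 360-day calendar
reference = rng.gamma(0.6, 6.0, size=months.size)   # reference daily precip

# One multiplicative change factor per calendar month (invented values).
change_factors = np.array([1.15, 1.10, 1.05, 1.00, 0.95, 0.90,
                           0.85, 0.90, 0.95, 1.00, 1.05, 1.10])
scaled = reference * change_factors[months - 1]
```

    Running the fate model with each acceptable parameter set under each scaled climate yields the two-way ensemble the abstract advocates.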

  16. Modelling pesticide leaching under climate change: parameter vs. climate input uncertainty

    Directory of Open Access Journals (Sweden)

    K. Steffens

    2014-02-01

    Full Text Available Assessing climate change impacts on pesticide leaching requires careful consideration of different sources of uncertainty. We investigated the uncertainty related to climate scenario input and its importance relative to parameter uncertainty of the pesticide leaching model. The pesticide fate model MACRO was calibrated against a comprehensive one-year field data set for a well-structured clay soil in south-western Sweden. We obtained an ensemble of 56 acceptable parameter sets that represented the parameter uncertainty. Nine different climate model projections of the regional climate model RCA3 were available as driven by different combinations of global climate models (GCM, greenhouse gas emission scenarios and initial states of the GCM. The future time series of weather data used to drive the MACRO model were generated by scaling a reference climate data set (1970–1999 for an important agricultural production area in south-western Sweden based on monthly change factors for 2070–2099. 30 yr simulations were performed for different combinations of pesticide properties and application seasons. Our analysis showed that both the magnitude and the direction of predicted change in pesticide leaching from present to future depended strongly on the particular climate scenario. The effect of parameter uncertainty was of major importance for simulating absolute pesticide losses, whereas the climate uncertainty was relatively more important for predictions of changes of pesticide losses from present to future. The climate uncertainty should be accounted for by applying an ensemble of different climate scenarios. The aggregated ensemble prediction based on both acceptable parameterizations and different climate scenarios has the potential to provide robust probabilistic estimates of future pesticide losses.

  17. Dealing with unquantifiable uncertainties in landslide modelling for urban risk reduction in developing countries

    Science.gov (United States)

    Almeida, Susana; Holcombe, Liz; Pianosi, Francesca; Wagener, Thorsten

    2016-04-01

    Landslides have many negative economic and societal impacts, including the potential for significant loss of life and damage to infrastructure. Slope stability assessment can be used to guide decisions about the management of landslide risk, but its usefulness can be challenged by high levels of uncertainty in predicting landslide occurrence. Prediction uncertainty may be associated with the choice of model that is used to assess slope stability, the quality of the available input data, or a lack of knowledge of how future climatic and socio-economic changes may affect future landslide risk. While some of these uncertainties can be characterised by relatively well-defined probability distributions, for other uncertainties, such as those linked to climate change, no probability distribution is available to characterise them. This latter type of uncertainty, often referred to as deep uncertainty, means that robust policies need to be developed that are expected to perform acceptably well over a wide range of future conditions. In our study the impact of deep uncertainty on slope stability predictions is assessed in a quantitative and structured manner using Global Sensitivity Analysis (GSA) and the Combined Hydrology and Stability Model (CHASM). In particular, we use several GSA methods including the Method of Morris, Regional Sensitivity Analysis and Classification and Regression Trees (CART), as well as advanced visualization tools, to assess the combination of conditions that may lead to slope failure. Our example application is a slope in the Caribbean, an area that is naturally susceptible to landslides due to a combination of high rainfall rates during the hurricane season, steep slopes, and highly weathered residual soils. Rapid unplanned urbanisation and changing climate may further exacerbate landslide risk in the future. 
Our example shows how we can gain useful information in the presence of deep uncertainty by combining physically based models with GSA in
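
    Of the GSA methods named, Regional Sensitivity Analysis is the easiest to sketch: split a Monte Carlo sample into behavioral and non-behavioral runs and compare the conditional parameter distributions. The toy factor-of-safety function below is an assumption for illustration, not CHASM:

```python
import numpy as np

rng = np.random.default_rng(7)

def ks_distance(a, b):
    """Kolmogorov-Smirnov distance between two empirical CDFs."""
    grid = np.sort(np.concatenate([a, b]))
    cdf = lambda x: np.searchsorted(np.sort(x), grid, side="right") / x.size
    return np.max(np.abs(cdf(a) - cdf(b)))

# Toy stability model: factor of safety falls with slope angle and rises
# with soil cohesion; the third parameter is inert by design.
n = 5_000
slope = rng.uniform(20.0, 45.0, n)
cohesion = rng.uniform(5.0, 25.0, n)
inert = rng.uniform(0.0, 1.0, n)
fos = 1.8 - 0.03 * slope + 0.04 * cohesion

behavioral = fos < 1.0          # "failure" region of the parameter space
sensitivity = {name: ks_distance(p[behavioral], p[~behavioral])
               for name, p in [("slope", slope), ("cohesion", cohesion),
                               ("inert", inert)]}
```

    A large distance flags a parameter whose value discriminates failure from stability; the inert parameter scores near zero.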

  18. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
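
    In evidence theory, input uncertainty is carried by focal elements, here intervals with masses. A minimal sketch (model, intervals and masses all invented) of propagating them through a monotone model and reading off belief and plausibility for an output event:

```python
# Hypothetical monotone model and focal elements (interval, mass) pairs.
f = lambda x: x ** 2 + 1.0

focal_elements = [((0.0, 2.0), 0.5),
                  ((1.0, 3.0), 0.3),
                  ((2.0, 4.0), 0.2)]

def bel_pl(y0):
    """Belief and plausibility of the event y <= y0 for monotone f:
    an interval's image lies wholly (Bel) or partly (Pl) below y0."""
    bel = sum(m for (lo, hi), m in focal_elements if f(hi) <= y0)
    pl = sum(m for (lo, hi), m in focal_elements if f(lo) <= y0)
    return bel, pl

bel, pl = bel_pl(y0=6.0)
```

    The gap between Bel and Pl is exactly the less restrictive specification of uncertainty that the abstract credits to evidence theory; for non-monotone models the interval images would be found by sampling or optimization within each focal element.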

  19. Robust model-reference control for descriptor linear systems subject to parameter uncertainties

    Institute of Scientific and Technical Information of China (English)

    Guangren DUAN; Biao ZHANG

    2007-01-01

    Robust model-reference control for descriptor linear systems with structural parameter uncertainties is investigated. A sufficient condition for the existence of a model-reference zero-error asymptotic tracking controller is given. It is shown that the robust model-reference control problem can be decomposed into two subproblems: a robust state feedback stabilization problem for descriptor systems subject to parameter uncertainties and a robust compensation problem. The latter aims to find three coefficient matrices which satisfy four matrix equations and simultaneously minimize the effect of the uncertainties on the tracking error. Based on a complete parametric solution to a class of generalized Sylvester matrix equations, the robust compensation problem is converted into a minimization problem with quadratic cost and linear constraints. A numerical example demonstrates the effectiveness of the proposed approach.
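
    As a plain-NumPy illustration of the Sylvester-equation machinery involved (the ordinary equation A X + X B = Q, not the generalized form used in the paper, with arbitrary example matrices), vectorization reduces it to a linear solve:

```python
import numpy as np

def solve_sylvester(A, B, Q):
    """Solve A X + X B = Q via (I (x) A + B^T (x) I) vec(X) = vec(Q),
    where vec stacks columns."""
    n, m = A.shape[0], B.shape[0]
    K = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
    x = np.linalg.solve(K, Q.flatten(order="F"))
    return x.reshape((n, m), order="F")

A = np.array([[3.0, 1.0], [0.0, 2.0]])
B = np.array([[1.0, 0.5], [0.0, 1.0]])
Q = np.array([[4.0, 2.0], [1.0, 3.0]])
X = solve_sylvester(A, B, Q)
```

    A unique solution exists here because no eigenvalue of A is the negative of an eigenvalue of B.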

  20. The role of swift relationship and institutional structures in uncertainty reduction

    NARCIS (Netherlands)

    Huang, Q.; Ou, Carol; Davison, R.M.

    2016-01-01

    Uncertainty has been regarded as the most prominent barrier in e-commerce. However, how communication between buyers and sellers contributes to a reduction in uncertainty is under-investigated. Integrating uncertainty reduction theory and relational contract theory, we develop a model that explains how

  1. Assimilating multi-source uncertainties of a parsimonious conceptual hydrological model using hierarchical Bayesian modeling

    Science.gov (United States)

    Wei Wu; James Clark; James Vose

    2010-01-01

    Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model – GR4J – by coherently assimilating the uncertainties from the...

  2. Assessing and propagating uncertainty in model inputs in corsim

    Energy Technology Data Exchange (ETDEWEB)

    Molina, G.; Bayarri, M. J.; Berger, J. O.

    2001-07-01

    CORSIM is a large simulator for vehicular traffic, and is being studied with respect to its ability to successfully model and predict behavior of traffic in a 36 block section of Chicago. Inputs to the simulator include information about street configuration, driver behavior, traffic light timing, turning probabilities at each corner and distributions of traffic ingress into the system. This work is described in more detail in the article Fast Simulators for Assessment and Propagation of Model Uncertainty also in these proceedings. The focus of this conference poster is on the computational aspects of this problem. In particular, we address the description of the full conditional distributions needed for implementation of the MCMC algorithm and, in particular, how the constraints can be incorporated; details concerning the run time and convergence of the MCMC algorithm; and utilisation of the MCMC output for prediction and uncertainty analysis concerning the CORSIM computer model. As this last is the ultimate goal, it is worth emphasizing that the incorporation of all uncertainty concerning inputs can significantly affect the model predictions. (Author)

  3. Transforming Information into Models: A Discussion of Uncertainty and Its Treatment

    Science.gov (United States)

    Gupta, H. V.

    2005-12-01

    Uncertainty and insufficiency of information are unavoidable in modeling. The treatment of uncertainty has received a surge in attention as (a) decision makers push for better quantification of the accuracy and precision of environmental model predictions, (b) interest grows in proper methods for merging information & data with models, and (c) scientists push to better represent what is (and is not) well understood about the environmental systems we study. As recognized by this session, there is a critical need to better understand and reflect the nature of "model error" in hydrologic prediction and decision making. This is an interesting challenge, particularly if one argues that there is no such thing as a "true" model. This talk will discuss the different kinds of uncertainties associated with different components of any model, and propose a structural basis for a theory of model evaluation. While it seems common to view models as explicit statements of what we (think we) know, I find it more interesting and productive to view models as explicit statements of the uncertainty in our knowledge. A philosophical shift towards this complementary approach has the potential to open new ways forward.

  4. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Directory of Open Access Journals (Sweden)

    A. E. Sikorska

    2011-12-01

    Full Text Available Urbanization and the resulting land-use change strongly affect the water cycle and runoff-processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced by 150% with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute mostly to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.

  5. Bayesian uncertainty assessment of flood predictions in ungauged urban basins for conceptual rainfall-runoff models

    Directory of Open Access Journals (Sweden)

    A. E. Sikorska

    2012-04-01

    Full Text Available Urbanization and the resulting land-use change strongly affect the water cycle and runoff processes in watersheds. Unfortunately, small urban watersheds, which are most affected by urban sprawl, are mostly ungauged. This makes it intrinsically difficult to assess the consequences of urbanization. Most of all, it is unclear how to reliably assess the predictive uncertainty given the structural deficits of the applied models. In this study, we therefore investigate the uncertainty of flood predictions in ungauged urban basins from structurally uncertain rainfall-runoff models. To this end, we suggest a procedure to explicitly account for input uncertainty and model structure deficits using Bayesian statistics with a continuous-time autoregressive error model. In addition, we propose a concise procedure to derive prior parameter distributions from base data and successfully apply the methodology to an urban catchment in Warsaw, Poland. Based on our results, we are able to demonstrate that the autoregressive error model greatly helps to meet the statistical assumptions and to compute reliable prediction intervals. In our study, we found that predicted peak flows were up to 7 times higher than observations. This was reduced to 5 times with Bayesian updating, using only a few discharge measurements. In addition, our analysis suggests that imprecise rainfall information and model structure deficits contribute mostly to the total prediction uncertainty. In the future, flood predictions in ungauged basins will become more important due to ongoing urbanization as well as anthropogenic and climatic changes. Thus, providing reliable measures of uncertainty is crucial to support decision making.
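
    On a fixed daily step, a continuous-time autoregressive error model reduces to AR(1) residuals. A self-contained sketch of simulating, fitting and whitening such residuals (synthetic numbers, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(11)

# AR(1) residuals between observed and simulated flow:
# eta_t = phi * eta_{t-1} + eps_t, with iid innovations eps_t.
phi_true, n = 0.8, 5_000
eps = rng.normal(0.0, 1.0, n)
eta = np.zeros(n)
for t in range(1, n):
    eta[t] = phi_true * eta[t - 1] + eps[t]

phi_hat = np.sum(eta[1:] * eta[:-1]) / np.sum(eta[:-1] ** 2)  # OLS estimate
innovations = eta[1:] - phi_hat * eta[:-1]                    # whitened
```

    After whitening, the innovations should be close to iid, which is what makes the likelihood-based prediction intervals in the abstract statistically reliable.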

  6. Propagating Uncertainties from Source Model Estimations to Coulomb Stress Changes

    Science.gov (United States)

    Baumann, C.; Jonsson, S.; Woessner, J.

    2009-12-01

    Multiple studies have shown that static stress changes due to permanent fault displacement trigger earthquakes on the causative and on nearby faults. Calculations of static stress changes in previous studies have been based on fault parameters without considering any source model uncertainties or with crude assumptions about fault model errors based on available different source models. In this study, we investigate the influence of fault model parameter uncertainties on Coulomb Failure Stress change (ΔCFS) calculations by propagating the uncertainties from the fault estimation process to the Coulomb Failure stress changes. We use 2500 sets of correlated model parameters determined for the June 2000 Mw = 5.8 Kleifarvatn earthquake, southwest Iceland, which were estimated by using a repeated optimization procedure and multiple data sets that had been modified by synthetic noise. The model parameters show that the event was predominantly a right-lateral strike-slip earthquake on a north-south striking fault. The variability of the sets of models represents the posterior probability density distribution for the Kleifarvatn source model. First we investigate the influence of individual source model parameters on the ΔCFS calculations. We show through a correlation analysis that for this event, changes in dip, east location, strike, width and in part north location have stronger impact on the Coulomb failure stress changes than changes in fault length, depth, dip-slip and strike-slip. Second we find that the accuracy of Coulomb failure stress changes appears to increase with increasing distance from the fault. The absolute value of the standard deviation decays rapidly with distance within about 5-6 km around the fault from about 3-3.5 MPa down to a few Pa, implying that the influence of parameter changes decrease with increasing distance. This is underlined by the coefficient of variation CV, defined as the ratio of the standard deviation of the Coulomb stress

  7. Structural system identification: Structural dynamics model validation

    Energy Technology Data Exchange (ETDEWEB)

    Red-Horse, J.R.

    1997-04-01

    Structural system identification is concerned with the development of systematic procedures and tools for developing predictive analytical models based on a physical structure's dynamic response characteristics. It is a multidisciplinary process that involves the ability (1) to define high fidelity physics-based analysis models, (2) to acquire accurate test-derived information for physical specimens using diagnostic experiments, (3) to validate the numerical simulation model by reconciling differences that inevitably exist between the analysis model and the experimental data, and (4) to quantify uncertainties in the final system models and subsequent numerical simulations. The goal of this project was to develop structural system identification techniques and software suitable for both research and production applications in code and model validation.

  8. Uncertainty Analysis of Integrated Navigation Model for Underwater Vehicle

    Directory of Open Access Journals (Sweden)

    Zhang Tao

    2013-02-01

    Full Text Available In this study, to reduce the information uncertainty of an integrated navigation model for underwater vehicles, we present a multi-sensor information fusion algorithm based on evidence theory. The algorithm performs attribute reduction with rough set theory in order to obtain a simplified ELMAN neural network and improve the basic probability assignment. It then uses improved D-S evidence theory to handle inaccurate and fuzzy information and make the final decision. A simulation example shows the feasibility and effectiveness of the algorithm.
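
    The D-S (Dempster-Shafer) combination step can be sketched with Dempster's rule over a two-element frame; the mass values below are invented for illustration:

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: multiply masses of intersecting focal sets and
    renormalize by the total non-conflicting mass."""
    raw, conflict = {}, 0.0
    for (s1, v1), (s2, v2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            raw[inter] = raw.get(inter, 0.0) + v1 * v2
        else:
            conflict += v1 * v2
    return {s: v / (1.0 - conflict) for s, v in raw.items()}

A, B = frozenset("A"), frozenset("B")
theta = A | B                      # the frame of discernment
m1 = {A: 0.6, B: 0.3, theta: 0.1}  # sensor 1 (invented masses)
m2 = {A: 0.5, B: 0.2, theta: 0.3}  # sensor 2 (invented masses)
m = combine(m1, m2)
```

    Two sensors that individually lean toward hypothesis A reinforce each other, so the combined mass on A exceeds either input.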

  9. Energy and Uncertainty: Models and Algorithms for Complex Energy Systems

    OpenAIRE

    2014-01-01

    The problem of controlling energy systems (generation, transmission, storage, investment) introduces a number of optimization problems which need to be solved in the presence of different types of uncertainty. We highlight several of these applications, using a simple energy storage problem as a case application. Using this setting, we describe a modeling framework based around five fundamental dimensions which is more natural than the standard canonical form widely used in the reinforcement ...

  10. Hydrological model uncertainty due to spatial evapotranspiration estimation methods

    Science.gov (United States)

    Yu, Xuan; Lamačová, Anna; Duffy, Christopher; Krám, Pavel; Hruška, Jakub

    2016-05-01

    Evapotranspiration (ET) continues to be a difficult process to estimate in seasonal and long-term water balances in catchment models. Approaches to estimate ET typically use vegetation parameters (e.g., leaf area index [LAI], interception capacity) obtained from field observation, remote sensing data, national or global land cover products, and/or simulated by ecosystem models. In this study we attempt to quantify the uncertainty that spatial evapotranspiration estimation introduces into hydrological simulations when the age of the forest is not precisely known. The Penn State Integrated Hydrologic Model (PIHM) was implemented for the Lysina headwater catchment, located 50°03‧N, 12°40‧E in the western part of the Czech Republic. The spatial forest patterns were digitized from forest age maps made available by the Czech Forest Administration. Two ET methods were implemented in the catchment model: the Biome-BGC forest growth sub-model (1-way coupled to PIHM) and with the fixed-seasonal LAI method. From these two approaches simulation scenarios were developed. We combined the estimated spatial forest age maps and two ET estimation methods to drive PIHM. A set of spatial hydrologic regime and streamflow regime indices were calculated from the modeling results for each method. Intercomparison of the hydrological responses to the spatial vegetation patterns suggested considerable variation in soil moisture and recharge and a small uncertainty in the groundwater table elevation and streamflow. The hydrologic modeling with ET estimated by Biome-BGC generated less uncertainty due to the plant physiology-based method. The implication of this research is that overall hydrologic variability induced by uncertain management practices was reduced by implementing vegetation models in the catchment models.

  11. Sensitivities and uncertainties of modeled ground temperatures in mountain environments

    Directory of Open Access Journals (Sweden)

    S. Gubler

    2013-02-01

    Full Text Available Before operational use or for decision making, models must be validated, and the degree of trust in model outputs should be quantified. Often, model validation is performed at single locations due to the lack of spatially-distributed data. Since the analysis of parametric model uncertainties can be performed independently of observations, it is a suitable method to test the influence of environmental variability on model evaluation. In this study, the sensitivities and uncertainty of a physically-based mountain permafrost model are quantified within an artificial topography consisting of different elevations and exposures combined with six ground types characterized by their hydraulic properties. The analyses performed for all combinations of topographic factors and ground types allowed us to quantify the variability of model sensitivity and uncertainty within mountain regions. We found that modeled snow duration considerably influences the mean annual ground temperature (MAGT). The melt-out day of snow (MD) is determined by processes determining snow accumulation and melting. Parameters such as the temperature and precipitation lapse rate and the snow correction factor have therefore a great impact on modeled MAGT. Ground albedo changes MAGT from 0.5 to 4°C depending on elevation, aspect and ground type. South-exposed inclined locations are more sensitive to changes in ground albedo than north-exposed slopes since they receive more solar radiation. The sensitivity to ground albedo increases with decreasing elevation due to shorter snow cover. Snow albedo and other parameters determining the amount of reflected solar radiation are important, changing MAGT at different depths by more than 1°C. Parameters influencing the turbulent fluxes, such as the roughness length or the dew temperature, are more sensitive at low elevation sites due to higher air temperatures and decreased solar radiation. Modeling the individual terms of the energy

  12. Probabilistic uncertainty quantification of wavelet-transform-based structural health monitoring features

    Science.gov (United States)

    Sarrafi, Aral; Mao, Zhu

    2016-04-01

    In the application of Structural Health Monitoring (SHM), processing the online-acquired data plays a very important role. Among the available tools, the wavelet transform is outstanding: compared to the Fourier transform, it handles nonstationary behaviors in time series in an adaptive fashion. When dealing with time-variant data, there are uncertainties from numerous sources inherent to the feature estimation, such as measurement noise, operational and environmental variability, hardware limitation, etc. The corruption from uncertainty makes the data interpretation ambiguous and thereby dramatically degrades the decision quality with regard to the occurrence, location, severity, and extent of damage. This paper derives a probabilistic model that quantifies the uncertainty of the wavelet-transform feature as a random variable, with its variance derived analytically. By the central limit theorem, a Gaussian probability density function characterizes the distribution, and this has been validated via Monte Carlo testing. By fully characterizing the uncertainty, damage detection implementations may be facilitated with quantified false-alarm and miss-catch rates.
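
    The analytic-variance idea can be illustrated for any fixed-scale wavelet coefficient, which is linear in the measured signal: iid measurement noise of variance sigma^2 gives a coefficient variance of sigma^2 * sum(psi^2). A Monte Carlo check under assumed toy settings (real-valued Morlet-like template, one scale; not the paper's derivation):

```python
import numpy as np

rng = np.random.default_rng(5)

n, sigma = 256, 0.3
t = np.arange(n) - n / 2.0
scale = 10.0
# Real-valued Morlet-like wavelet template at a single, fixed scale.
psi = np.cos(5.0 * t / scale) * np.exp(-0.5 * (t / scale) ** 2) / np.sqrt(scale)

signal = np.sin(2.0 * np.pi * np.arange(n) / 32.0)
# 20,000 noisy realizations; each row yields one wavelet coefficient.
coeffs = (signal + sigma * rng.normal(size=(20_000, n))) @ psi

analytic_var = sigma ** 2 * np.sum(psi ** 2)   # variance of a linear feature
empirical_var = coeffs.var()
```

    The empirical spread of the feature matches the analytic value, which is what permits the quantified false-alarm rates the abstract describes.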

  13. Stochastic reduced order models for inverse problems under uncertainty.

    Science.gov (United States)

    Warner, James E; Aquino, Wilkins; Grigoriu, Mircea D

    2015-03-01

    This work presents a novel methodology for solving inverse problems under uncertainty using stochastic reduced order models (SROMs). Given statistical information about an observed state variable in a system, unknown parameters are estimated probabilistically through the solution of a model-constrained, stochastic optimization problem. The point of departure and crux of the proposed framework is the representation of a random quantity using an SROM: a low-dimensional, discrete approximation to a continuous random element that permits efficient and non-intrusive stochastic computations. Characterizing the uncertainties with SROMs transforms the stochastic optimization problem into a deterministic one. The non-intrusive nature of SROMs facilitates efficient gradient computations for random vector unknowns and relies entirely on calls to existing deterministic solvers. Furthermore, the method is naturally extended to handle multiple sources of uncertainty in cases where state variable data, system parameters, and boundary conditions are all considered random. The new and widely-applicable SROM framework is formulated for a general stochastic optimization problem in terms of an abstract objective function and constraining model. For demonstration purposes, however, we study its performance in the specific case of inverse identification of random material parameters in elastodynamics. We demonstrate the ability to efficiently recover random shear moduli given material displacement statistics as input data. We also show that the approach remains effective for the case where the loading in the problem is random as well.
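    The idea of an SROM can be sketched in a few lines: fix a small set of sample locations and optimize their probabilities so that low-order moments and the CDF of the target random variable are matched. The following is a minimal illustration for a standard normal target (the sample grid and objective weights are assumptions, not the authors' formulation):

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    # Target: continuous random element (standard normal), to be replaced by a
    # low-dimensional discrete approximation {(x_k, p_k)}.
    m = 10
    x = norm.ppf(np.linspace(0.05, 0.95, m))          # fixed sample locations
    grid = np.linspace(-2, 2, 41)                     # CDF matching points

    def srom_error(p):
        # Mismatch in mean, second moment and CDF between SROM and target
        mean_err = (p @ x - 0.0) ** 2
        var_err = (p @ x**2 - 1.0) ** 2
        cdf_model = np.array([p[x <= g].sum() for g in grid])
        cdf_err = np.mean((cdf_model - norm.cdf(grid)) ** 2)
        return mean_err + var_err + cdf_err

    p0 = np.full(m, 1.0 / m)
    res = minimize(srom_error, p0, method="SLSQP",
                   bounds=[(0.0, 1.0)] * m,
                   constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
    p = res.x
    print(p @ x, p @ x**2)   # SROM mean and second moment, near 0 and 1
    ```

    Once built, expectations over the continuous random element reduce to small weighted sums over the SROM samples, which is what makes downstream stochastic optimization deterministic and cheap.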

  14. Economic-mathematical methods and models under uncertainty

    CERN Document Server

    Aliyev, A G

    2013-01-01

    Brief Information on Finite-Dimensional Vector Space and its Application in Economics; Bases of Piecewise-Linear Economic-Mathematical Models with Regard to Influence of Unaccounted Factors in Finite-Dimensional Vector Space; Piecewise Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence in Three-Dimensional Vector Space; Piecewise-Linear Economic-Mathematical Models with Regard to Unaccounted Factors Influence on a Plane; Bases of Software for Computer Simulation and Multivariant Prediction of Economic Even at Uncertainty Conditions on the Base of N-Comp

  15. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subjected to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation...... uncertainties can be implemented in probabilistic reliability assessments....

  16. Bird-landscape relations in the Chihuahuan Desert: Coping with uncertainties about predictive models

    Science.gov (United States)

    Gutzwiller, K.J.; Barrow, W.C.

    2001-01-01

    During the springs of 1995-1997, we studied birds and landscapes in the Chihuahuan Desert along part of the Texas-Mexico border. Our objectives were to assess bird-landscape relations and their interannual consistency and to identify ways to cope with associated uncertainties that undermine confidence in using such relations in conservation decision processes. Bird distributions were often significantly associated with landscape features, and many bird-landscape models were valid and useful for predictive purposes. Differences in early spring rainfall appeared to influence bird abundance, but there was no evidence that annual differences in bird abundance affected model consistency. Model consistency for richness (42%) was higher than mean model consistency for 26 focal species (mean 30%, range 0-67%), suggesting that relations involving individual species are, on average, more subject to factors that cause variation than are richness-landscape relations. Consistency of bird-landscape relations may be influenced by such factors as plant succession, exotic species invasion, bird species' tolerances for environmental variation, habitat occupancy patterns, and variation in food density or weather. The low model consistency that we observed for most species indicates the high variation in bird-landscape relations that managers and other decision makers may encounter. The uncertainty of interannual variation in bird-landscape relations can be reduced by using projections of bird distributions from different annual models to determine the likely range of temporal and spatial variation in a species' distribution. Stochastic simulation models can be used to incorporate the uncertainty of random environmental variation into predictions of bird distributions based on bird-landscape relations and to provide probabilistic projections with which managers can weigh the costs and benefits of various decisions. Uncertainty about the true structure of bird-landscape relations

  17. A model of mechanical contacts in hearing aids for uncertainty analysis

    DEFF Research Database (Denmark)

    Creixell Mediante, Ester; Brunskog, Jonas; Jensen, Jakob Søndergaard;

    2015-01-01

    Modelling the contact between assembled parts is a key point in the design of complex structures. Uncertainties at the joint parameters arise as a result of randomness in physical properties such as contact surface, load distribution or geometric details. This is a challenge of concern in the hearing aid field, where the small lightweight structures present vibration modes at frequencies within the hearing range. To approach this issue, a model of contacts based on lumped elements is suggested. The joint parameters are the stiffness of a series of spring elements placed along the contact...

  18. Decision making for stable inspection planning of deteriorating structures based on constraint reliability and uncertainties

    Directory of Open Access Journals (Sweden)

    Jalal alsarraf

    2014-06-01

    Full Text Available Life-time cost minimization is considered the optimal criterion for planning the inspection, repair and maintenance of structures. However, most of the probabilities and cost items involved in the cost analysis inevitably contain uncertainties in actual cases. Errors induced by such uncertainties may undermine the appropriateness of an inspection plan. In this study, a cost minimization method with a reliability constraint is developed in order to obtain inspection plans that are stable against estimation errors in the parameters. In the analysis, the life-time cost optimization is carried out under the constraint that the failure probabilities of the members are kept below the respective target values allowed for the members. First, initial target failure probabilities are assumed for each member. Then, the robustness of the inspection plan is investigated by adjusting the parameters within the range of their uncertainties. The initial values of the target failure probabilities are altered until an acceptable result is obtained. The applicability of the proposed method is examined for a structure with several uncertain parameters. A sequential cost minimization method is employed to optimize the life-time cost. It is made clear that with this approach the stability of the life-time cost is maintained without losing the benefit of the cost minimization method. Keywords: Crack, Fatigue, Inspection, Numerical Model, Reliability, Structure.
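    A toy version of reliability-constrained cost minimization can illustrate the structure of the problem: pick the cheapest inspection interval among those whose life-time failure probability stays below the target. Everything below (cost figures, the crack-growth surrogate, the target probability) is hypothetical:

    ```python
    import numpy as np

    C_INSP, C_REPAIR, C_FAIL = 10.0, 50.0, 5000.0
    LIFETIME = 50.0
    P_TARGET = 1e-2                      # target life-time failure probability

    def failure_probability(interval):
        # Assumed crack-growth surrogate: longer intervals let damage grow
        return 1.0 - np.exp(-0.002 * interval**1.5)

    def expected_cost(interval):
        # Inspections (with an assumed 10% chance of a repair each time)
        # plus the expected cost of failure
        n_insp = LIFETIME / interval
        return n_insp * (C_INSP + 0.1 * C_REPAIR) + failure_probability(interval) * C_FAIL

    intervals = np.linspace(1.0, 25.0, 200)
    feasible = [i for i in intervals if failure_probability(i) <= P_TARGET]
    best = min(feasible, key=expected_cost)
    print(best, expected_cost(best), failure_probability(best))
    ```

    Perturbing the surrogate's coefficients within their uncertainty range and re-running the search is the kind of robustness check the abstract describes: a stable plan is one whose optimum does not move much.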

  19. Parameter and uncertainty estimation for mechanistic, spatially explicit epidemiological models

    Science.gov (United States)

    Finger, Flavio; Schaefli, Bettina; Bertuzzo, Enrico; Mari, Lorenzo; Rinaldo, Andrea

    2014-05-01

    Epidemiological models can be a crucially important tool for decision-making during disease outbreaks. The range of possible applications spans from real-time forecasting and allocation of health-care resources to testing alternative intervention mechanisms such as vaccines, antibiotics or the improvement of sanitary conditions. Our spatially explicit, mechanistic models for cholera epidemics have been successfully applied to several epidemics, including the one that struck Haiti in late 2010 and is still ongoing. Calibration and parameter estimation of such models represents a major challenge because of properties unusual in traditional geoscientific domains such as hydrology. Firstly, the epidemiological data available might be subject to high uncertainties due to error-prone diagnosis as well as manual (and possibly incomplete) data collection. Secondly, long-term time-series of epidemiological data are often unavailable. Finally, the spatially explicit character of the models requires the comparison of several time-series of model outputs with their real-world counterparts, which calls for an appropriate weighting scheme. It follows that the usual assumption of a homoscedastic Gaussian error distribution, used in combination with classical calibration techniques based on Markov chain Monte Carlo algorithms, is likely to be violated, whereas the construction of an appropriate formal likelihood function seems close to impossible. Alternative calibration methods, which allow for accurate estimation of total model uncertainty, particularly regarding the envisaged use of the models for decision-making, are thus needed. Here we present the most recent developments regarding methods for parameter and uncertainty estimation to be used with our mechanistic, spatially explicit models for cholera epidemics, based on informal measures of goodness of fit.
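    The calibration strategy sketched above, an informal goodness-of-fit measure in place of a formal likelihood, can be illustrated with a rejection-sampling scheme on a toy SIR model (all parameter values and the log-RMSE score are assumptions for illustration, not the authors' cholera model):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def sir(beta, gamma, days=60, n=1e5, i0=10):
        # Simple discrete-time SIR; returns the daily new-case series
        s, i, cases = n - i0, i0, []
        for _ in range(days):
            new_inf = beta * s * i / n
            s, i = s - new_inf, i + new_inf - gamma * i
            cases.append(new_inf)
        return np.array(cases)

    # Synthetic "observed" epidemic with noisy, error-prone case counts
    obs = sir(0.35, 0.15) * rng.lognormal(0.0, 0.2, 60)

    # Rejection sampling against an informal goodness-of-fit measure
    # (RMSE of log-transformed cases), avoiding a formal likelihood
    samples = []
    for _ in range(5000):
        beta, gamma = rng.uniform(0.1, 0.6), rng.uniform(0.05, 0.3)
        score = np.sqrt(np.mean((np.log1p(sir(beta, gamma)) - np.log1p(obs))**2))
        samples.append((score, beta, gamma))
    samples.sort()
    post = np.array([(b, g) for _, b, g in samples[:250]])   # keep best 5%
    print(post.mean(axis=0))   # approximate posterior means of (beta, gamma)
    ```

    The spread of the retained parameter sets is a direct, if informal, measure of total calibration uncertainty.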

  20. Usage of ensemble geothermal models to consider geological uncertainties

    Science.gov (United States)

    Rühaak, Wolfram; Steiner, Sarah; Welsch, Bastian; Sass, Ingo

    2015-04-01

    The usage of geothermal energy, for instance by borehole heat exchangers (BHE), is a promising concept for a sustainable supply of heat for buildings. BHE are closed pipe systems in which a fluid is circulating. Heat from the surrounding rocks is transferred to the fluid purely by conduction. The fluid carries the heat to the surface, where it can be utilized. Larger arrays of BHE typically require prior numerical modelling. Motivations are the design of the system (number and depth of the required BHE) but also regulatory reasons. Such regulatory operating permissions, especially, often require models that are as realistic as possible. Although such realistic models are feasible in many cases with today's codes and computer resources, they are often expensive in terms of time and effort. A particular problem is the limited knowledge about the accuracy of the achieved results. An issue which is often neglected while dealing with highly complex models is the quantification of parameter uncertainties arising from the natural heterogeneity of the geological subsurface. Experience has shown that these heterogeneities can lead to wrong forecasts. Variations in the technical realization, and especially in the operational parameters (which are mainly a consequence of the regional climate), can also lead to strong variations in the simulation results. Instead of one very detailed single forecast model, one should consider running numerous simpler models. By varying parameters, the presumed subsurface uncertainties, but also the uncertainties in the presumed operational parameters, can be reflected. Finally, not a single result should be reported, but instead the range of possible solutions and their respective probabilities. In meteorology such an approach is well known as ensemble modeling. The concept is demonstrated on a real-world data set and discussed.
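    A minimal version of such an ensemble study might look as follows, with a crude infinite line-source approximation standing in for a full BHE simulation (all parameter distributions and values are assumed for illustration):

    ```python
    import numpy as np
    from scipy.special import exp1

    rng = np.random.default_rng(2)

    # Instead of one detailed forecast, run many simple line-source models
    # with parameters drawn from their presumed uncertainty ranges.
    def line_source_dT(q, k, r=0.06, t=30 * 24 * 3600, a=1e-6):
        # Infinite line source approximation of borehole-wall temperature change
        return q / (4 * np.pi * k) * exp1(r**2 / (4 * a * t))

    n = 2000
    q = rng.normal(40.0, 5.0, n)     # heat extraction rate, W/m (operational)
    k = rng.uniform(1.8, 3.2, n)     # thermal conductivity, W/(m K) (subsurface)
    ensemble = line_source_dT(q, k)

    # Report the range of outcomes and their probabilities, not a single value
    p5, p50, p95 = np.percentile(ensemble, [5, 50, 95])
    print(p5, p50, p95)
    ```

    The reported percentiles are exactly the "range of possible solutions and their respective probabilities" the abstract argues for.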

  1. Estimation of Model and Parameter Uncertainty For A Distributed Rainfall-runoff Model

    Science.gov (United States)

    Engeland, K.

    The distributed rainfall-runoff model Ecomag is applied as a regional model for nine catchments in the NOPEX area in Sweden. Ecomag calculates streamflow at a daily time resolution. The posterior distribution of the model parameters is conditioned on the observed streamflow in all nine catchments, and calculated using Bayesian statistics. The distribution is estimated by Markov chain Monte Carlo (MCMC). The Bayesian method requires a definition of the likelihood of the parameters. Two alternative formulations are used. The first formulation is a subjectively chosen objective function describing the goodness of fit between the simulated and observed streamflow, as used in the GLUE framework. The second formulation is to use a more statistically correct likelihood function that describes the simulation errors. The simulation error is defined as the difference between log-transformed observed and simulated streamflows. A statistical model for the simulation errors is constructed. Some parameters are dependent on the catchment, while others depend on climate. The statistical and the hydrological parameters are estimated simultaneously. Confidence intervals for the simulated streamflow, due to the uncertainty of the Ecomag parameters, are compared for the two likelihood functions. Confidence intervals based on the statistical model for the simulation errors are also calculated. The results indicate that the parameter uncertainty depends on the formulation of the likelihood function. The subjectively chosen likelihood function gives relatively wide confidence intervals, whereas the 'statistical' likelihood function gives narrower confidence intervals. The statistical model for the simulation errors indicates that the structural errors of the model are at least as important as the parameter uncertainty.
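    The 'statistical' likelihood formulation, Gaussian errors on log-transformed flows with the statistical and hydrological parameters estimated jointly, can be sketched with a Metropolis sampler on a toy runoff model (the model, priors and step sizes are assumptions, not Ecomag):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Toy model: runoff is a fraction c of rainfall; the simulation error
    # eps = log(Q_obs) - log(Q_sim) is modeled as iid Gaussian(0, sigma^2).
    precip = rng.gamma(2.0, 5.0, 200)                 # synthetic daily rainfall
    c_true, sigma_true = 0.4, 0.3
    q_obs = c_true * precip * np.exp(sigma_true * rng.standard_normal(200))

    def log_post(c, sigma):
        # Flat priors on bounded ranges; Gaussian likelihood on log-flows
        if not (0.0 < c < 1.0 and 0.0 < sigma < 2.0):
            return -np.inf
        eps = np.log(q_obs) - np.log(c * precip)
        return -200 * np.log(sigma) - 0.5 * np.sum(eps**2) / sigma**2

    # Metropolis over the hydrological (c) and statistical (sigma) parameters
    chain, theta = [], np.array([0.5, 0.5])
    lp = log_post(*theta)
    for _ in range(20000):
        prop = theta + rng.normal(0, [0.02, 0.02])
        lp_prop = log_post(*prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    chain = np.array(chain)[5000:]                    # discard burn-in
    print(chain.mean(axis=0))                         # posterior means of (c, sigma)
    ```

    The posterior width of `sigma` is what the abstract uses to argue that structural error is at least as important as parameter uncertainty.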

  2. Integrating fire with hydrological projections: model evaluation to identify uncertainties and tradeoffs in model complexity

    Science.gov (United States)

    Kennedy, M.; McKenzie, D.

    2013-12-01

    outcomes with respect to both. This two-stage model evaluation against multiple criteria and for more than one landscape demonstrates that a relatively simple model of fire spread can be sufficiently robust to simulate fire regimes for varying ecosystems and time periods. A careful model evaluation allows for the identification of model uncertainties, which are then reduced by improvements to the model structure. When integrating a fire spread model with a hydrological model for watershed projections, it is insufficient to determine the adequacy of the fire spread module independently of the hydrological model. The integration of the two models should be assessed as rigorously as the individual modules.

  3. Infiltration under snow cover: Modeling approaches and predictive uncertainty

    Science.gov (United States)

    Meeks, Jessica; Moeck, Christian; Brunner, Philip; Hunkeler, Daniel

    2017-03-01

    Groundwater recharge from snowmelt represents a temporal redistribution of precipitation. This is extremely important because the rate and timing of snowpack drainage has substantial consequences for aquifer recharge patterns, which in turn affect groundwater availability throughout the rest of the year. The modeling methods developed to estimate drainage from a snowpack, which typically rely on temporally-dense point measurements or temporally-limited, spatially-dispersed calibration data, range in complexity from the simple degree-day method to more complex and physically-based energy balance approaches. While this gamut of snowmelt models is routinely used to aid water resource management, a comparison of the models' predictive uncertainties had previously not been done. Therefore, we established a snowmelt model calibration dataset that is both temporally dense and represents the integrated snowmelt infiltration signal for the Vers Chez le Brandt research catchment, which functions as a rather unique natural lysimeter. We then evaluated the uncertainty associated with the predictions of a degree-day, a modified degree-day and an energy balance snowmelt model using the null-space Monte Carlo approach. All three melt models underestimate total snowpack drainage, underestimate the rate of early and midwinter drainage and overestimate spring snowmelt rates. The actual rate of snowpack water loss is more constant over the course of the entire winter season than the snowmelt models would imply, indicating that mid-winter melt can contribute as significantly as springtime snowmelt to groundwater recharge in low alpine settings. Further, actual groundwater recharge could be between 2 and 31% greater than snowmelt models suggest, over the total winter season. This study shows that snowmelt model predictions can have considerable uncertainty, which may be reduced by the inclusion of more data that allows for the use of more complex approaches such as the energy balance
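    The simple end of the model spectrum mentioned above, the degree-day method, fits in a few lines: melt is proportional to positive air temperature via a degree-day factor (all parameter values and the synthetic temperature series are assumed for illustration):

    ```python
    import numpy as np

    def degree_day_melt(temps_c, swe0, ddf=3.0):
        """temps_c: daily mean air temperature (deg C); swe0: initial snow water
        equivalent (mm); ddf: degree-day factor (mm per deg C per day)."""
        swe, drainage = swe0, []
        for t in temps_c:
            melt = min(swe, ddf * max(t, 0.0))   # melt limited by remaining snow
            swe -= melt
            drainage.append(melt)
        return np.array(drainage), swe

    # Synthetic winter-to-spring temperature series with a warming trend
    days = np.arange(120)
    temps = -5.0 + 10.0 * np.sin(np.pi * days / 240)
    drainage, swe_left = degree_day_melt(temps, swe0=300.0)
    print(drainage.sum(), swe_left)   # total modeled drainage, remaining SWE
    ```

    Note the structural bias this form builds in: no melt is possible while temperatures stay below freezing, so all drainage is pushed toward spring, which is exactly the mid-winter underestimation the study reports.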

  4. Eye tracker uncertainty analysis and modelling in real time

    Science.gov (United States)

    Fornaser, A.; De Cecco, M.; Leuci, M.; Conci, N.; Daldoss, M.; Armanini, A.; Maule, L.; De Natale, F.; Da Lio, M.

    2017-01-01

    Eye-tracking techniques have been developed over several decades for applications ranging from the military to education, entertainment and clinics. The existing systems fall in general into two categories: precise but intrusive, or comfortable but less accurate. The idea of this work is to calibrate an eye tracker of the second category. In particular, we estimated the uncertainty both in nominal and in variable operating conditions. We took into consideration different influencing factors such as head movement and rotation, the eyes detected, target position on the screen, illumination and objects in front of the eyes. Results proved that the 2D uncertainty can be modelled as a circular confidence interval, since there are no stable principal directions in either the systematic or the repeatability effects. This confidence region was also modelled as a function of the current working conditions. In this way the uncertainty becomes a function of the operating conditions estimated in real time, opening the field to new applications that reconfigure the human-machine interface accordingly. Examples range from reshaping option buttons and dynamically adjusting local zoom, to optimizing speed to regulate interface responsiveness and accounting for the uncertainty associated with a particular interaction. Furthermore, in the analysis of visual scanning patterns, the resulting Point of Regard maps would be associated with proper confidence levels, allowing accurate conclusions to be drawn. We conducted an experimental campaign to estimate and validate the overall modelling procedure, obtaining valid results in 86% of the cases.
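    The circular confidence interval described above can be estimated directly from gaze samples. The sketch below separates a systematic (accuracy) component from a 95% repeatability radius, on synthetic data with an assumed offset and isotropic scatter:

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical gaze samples around a fixation target (pixels): a systematic
    # offset plus isotropic repeatability scatter, as in the circular model.
    target = np.array([960.0, 540.0])
    gaze = target + np.array([12.0, -8.0]) + rng.normal(0.0, 25.0, (500, 2))

    # Circular confidence interval: radius about the mean gaze point that
    # covers 95% of the samples (no stable principal direction assumed)
    center = gaze.mean(axis=0)
    radii = np.linalg.norm(gaze - center, axis=1)
    r95 = np.quantile(radii, 0.95)

    systematic = np.linalg.norm(center - target)   # accuracy (bias) component
    print(systematic, r95)
    ```

    Re-estimating `r95` per operating condition (head pose, illumination, screen region) gives the condition-dependent uncertainty map the paper describes.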

  5. Evaluation of Trapped Radiation Model Uncertainties for Spacecraft Design

    Science.gov (United States)

    Armstrong, T. W.; Colborn, B. L.

    2000-01-01

    The standard AP8 and AE8 models for predicting trapped proton and electron environments have been compared with several sets of flight data to evaluate model uncertainties. Model comparisons are made with flux, dose, and activation measurements made on various U.S. low-Earth-orbit satellites (APEX, CRRES, DMSP, LDEF, NOAA) and Space Shuttle flights, on Russian satellites (Photon-8, Cosmos-1887, Cosmos-2044), and on the Russian Mir Space Station. This report gives a summary of the model-data comparisons; detailed results are given in a companion report. Results from the model comparisons with flight data show, for example, that the AP8 model underpredicts the trapped proton flux at low altitudes by a factor of about two (independent of proton energy and solar cycle conditions), and that the AE8 model overpredicts the flux in the outer electron belt by an order of magnitude or more.

  6. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly-parameterized inverse problems. The framework implements several types of linear (first-order second-moment, or FOSM) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example Jupyter notebooks available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of the parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
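    In the linear Gaussian case, the FOSM machinery that frameworks like pyEMU automate reduces to a Schur-complement update of the prior parameter covariance. A bare numpy sketch (a random Jacobian and diagonal covariances, assumed for illustration; this is the underlying math, not pyEMU's API):

    ```python
    import numpy as np

    # Given a Jacobian J of model outputs w.r.t. parameters, prior parameter
    # covariance C_p and observation noise covariance C_o, the posterior
    # (Schur complement) parameter covariance is
    #   C_post = C_p - C_p J^T (J C_p J^T + C_o)^(-1) J C_p
    rng = np.random.default_rng(5)
    n_par, n_obs = 6, 15
    J = rng.standard_normal((n_obs, n_par))
    C_p = np.diag(np.full(n_par, 2.0))          # prior parameter variances
    C_o = np.diag(np.full(n_obs, 0.5))          # observation noise variances

    S = J @ C_p @ J.T + C_o
    C_post = C_p - C_p @ J.T @ np.linalg.solve(S, J @ C_p)

    # Forecast uncertainty for a prediction with sensitivity vector y:
    y = rng.standard_normal(n_par)
    var_prior = y @ C_p @ y
    var_post = y @ C_post @ y                   # data worth: variance reduction
    print(var_prior, var_post)
    ```

    Because nothing here requires re-running the model, this kind of analysis can indeed be done before parameter estimation, e.g. to rank which observations most reduce forecast variance.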

  7. Characterization and modeling of uncertainty intended for a secured MANET

    Directory of Open Access Journals (Sweden)

    Md. Amir Khusru Akhtar

    2013-08-01

    Full Text Available Mobile ad-hoc networks (MANETs) have remained chaotic for decades due to their dynamic and heuristic nature. They exhibit several forms of uncertainty, such as vagueness and imprecision. Vagueness can be framed in terms of linguistic assumptions, such as grading and classification for acceptance. Imprecision, on the other hand, can be associated with countable or non-countable assumptions, such as the weights of acceptance calculated by the members of the MANET. This paper presents a “Certainty Intended Model” (CIM) for a secured MANET by introducing one or more expert nodes, together with the inclusion of various theories (such as monotone measures, belief, plausibility and evidence). These theories can be used for the characterization and modeling of various forms of uncertainty. Further, these characterizations help in quantifying the uncertainty spectrum because, as more information about the problem becomes available, we can transform from one theory to another. In this work we show how these theories and expert opinion help to identify the setbacks associated with a MANET in respect of trust management and, finally, enhance the security, reliability and performance of the MANET.
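    Of the theories mentioned, belief and plausibility from evidence theory are easy to make concrete. A minimal sketch with a hypothetical mass assignment over node trust levels (the frame and masses are invented for illustration):

    ```python
    # Dempster-Shafer sketch: a mass function over subsets of a frame of
    # discernment, here the possible trust levels of a node as judged by
    # (hypothetical) expert nodes.
    frame = frozenset({"trusted", "suspicious", "malicious"})
    mass = {
        frozenset({"trusted"}): 0.5,
        frozenset({"trusted", "suspicious"}): 0.3,
        frame: 0.2,                      # mass on the whole frame = ignorance
    }

    def belief(hypothesis):
        # Sum of masses of all focal sets wholly contained in the hypothesis
        return sum(m for s, m in mass.items() if s <= hypothesis)

    def plausibility(hypothesis):
        # Sum of masses of all focal sets that intersect the hypothesis
        return sum(m for s, m in mass.items() if s & hypothesis)

    h = frozenset({"trusted"})
    print(belief(h), plausibility(h))   # interval [Bel, Pl] brackets uncertainty
    ```

    The gap between belief and plausibility is the uncertainty spectrum the abstract refers to; as more evidence arrives, the interval narrows and the representation can collapse toward an ordinary probability.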

  8. Crop model improvement reduces the uncertainty of the response to temperature of multi-model ensembles

    DEFF Research Database (Denmark)

    Maiorano, Andrea; Martre, Pierre; Asseng, Senthold

    2017-01-01

    To improve climate change impact estimates and to quantify their uncertainty, multi-model ensembles (MMEs) have been suggested. Model improvements can improve the accuracy of simulations and reduce the uncertainty of climate change impact assessments. Furthermore, they can reduce the number of mo...

  9. Model parameter uncertainty analysis for an annual field-scale phosphorus loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  10. Model parameter uncertainty analysis for annual field-scale P loss model

    Science.gov (United States)

    Phosphorous (P) loss models are important tools for developing and evaluating conservation practices aimed at reducing P losses from agricultural fields. All P loss models, however, have an inherent amount of uncertainty associated with them. In this study, we conducted an uncertainty analysis with ...

  11. Improving uncertainty estimation in urban hydrological modeling by statistically describing bias

    Directory of Open Access Journals (Sweden)

    D. Del Giudice

    2013-04-01

    Full Text Available Hydrodynamic models are useful tools for urban water management. Unfortunately, it is still challenging to obtain accurate results and plausible uncertainty estimates when using these models. In particular, with the currently applied statistical techniques, flow predictions are usually overconfident and biased. In this study, we present a flexible and computationally efficient methodology (i) to obtain more reliable hydrological simulations in terms of coverage of validation data by the uncertainty bands, and (ii) to separate prediction uncertainty into its components. Our approach acknowledges that urban drainage predictions are biased. This is mostly due to input errors and structural deficits of the model. We address this issue by describing model bias in a Bayesian framework. The bias becomes an autoregressive term additional to white measurement noise, the only error type accounted for in traditional uncertainty analysis in urban hydrology. To allow for bigger discrepancies during wet weather, we make the variance of the bias dependent on the input (rainfall) and/or output (runoff) of the system. Specifically, we present a structured approach to select, among five variants, the optimal bias description for a given urban or natural case study. We tested the methodology in a small monitored stormwater system described by means of a parsimonious model. Our results clearly show that flow simulations are much more reliable when bias is accounted for than when it is neglected. Furthermore, our probabilistic predictions can discriminate between three uncertainty contributions: parametric uncertainty, bias (due to input and structural errors), and measurement errors. In our case study, the best performing bias description was the output-dependent bias using a log-sinh transformation of data and model results.
The limitations of the framework presented are some ambiguity due to the subjective choice of priors for bias parameters and its inability to directly
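    The error model described above, an autoregressive bias with input-dependent variance on top of white measurement noise, can be simulated in a few lines (all coefficients and the toy runoff model are assumptions, not the paper's calibrated values):

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Observed output = model output + autoregressive bias (input-dependent
    # variance) + white measurement noise.
    n = 500
    rain = rng.gamma(0.8, 4.0, n) * (rng.uniform(size=n) < 0.3)   # intermittent rain
    q_model = 2.0 + 0.5 * rain                                    # toy runoff model

    phi, sigma_b0, kappa, sigma_e = 0.9, 0.2, 0.05, 0.1
    bias = np.zeros(n)
    for t in range(1, n):
        # Bias variance grows with rainfall input: bigger errors in wet weather
        sigma_b = sigma_b0 + kappa * rain[t]
        bias[t] = phi * bias[t - 1] + sigma_b * np.sqrt(1 - phi**2) * rng.standard_normal()
    q_obs = q_model + bias + sigma_e * rng.standard_normal(n)

    # The total residual decomposes into a correlated bias part and white noise
    resid = q_obs - q_model
    print(resid.var(), bias.var(), sigma_e**2)
    ```

    Inference then runs this logic in reverse: given `q_obs`, the Bayesian framework apportions the residual between the autoregressive bias (input and structural errors) and the white measurement noise.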

  12. IDENTIFICATION ERROR BOUNDS AND ASYMPTOTIC DISTRIBUTIONS FOR SYSTEMS WITH STRUCTURAL UNCERTAINTIES

    Institute of Scientific and Technical Information of China (English)

    Gang George YIN; Shaobai KAN; Le Yi WANG

    2006-01-01

    This work is concerned with the identification of systems that are subject not only to measurement noises, but also to structural uncertainties such as unmodeled dynamics, sensor nonlinear mismatch, and observation bias. Identification errors are analyzed for their dependence on these structural uncertainties. Asymptotic distributions of scaled sequences of estimation errors are derived.

  13. Incorporating rainfall uncertainty in a SWAT model: the river Zenne basin (Belgium) case study

    Science.gov (United States)

    Tolessa Leta, Olkeba; Nossent, Jiri; van Griensven, Ann; Bauwens, Willy

    2013-04-01

    The European Union Water Framework Directive (EU-WFD) called on its member countries to achieve a good ecological status for all inland and coastal water bodies by 2015. According to recent studies, the river Zenne (Belgium) is far from this objective. Therefore, an interuniversity and multidisciplinary project, "Towards a Good Ecological Status in the river Zenne (GESZ)", was launched to evaluate the effects of wastewater management plans on the river. In this project, different models have been developed and integrated using the Open Modelling Interface (OpenMI). The hydrologic, semi-distributed Soil and Water Assessment Tool (SWAT) is used as one of the components in the integrated modelling chain in order to model the upland catchment processes. The assessment of the uncertainty of SWAT is an essential aspect of the decision-making process, in order to design robust management strategies that take the predicted uncertainties into account. Model uncertainty stems from uncertainties in the model parameters, the input data (e.g., rainfall), the calibration data (e.g., stream flows) and the model structure itself. The objective of this paper is to assess the first three sources of uncertainty in a SWAT model of the river Zenne basin. For the assessment of rainfall measurement uncertainty, we first identified independent rainfall periods, based on the daily precipitation and stream flow observations, using the Water Engineering Time Series PROcessing tool (WETSPRO). Secondly, we assigned a rainfall multiplier parameter to each of the independent rainfall periods, which serves as a multiplicative input error corruption. Finally, we treated these multipliers as latent parameters in the model optimization and uncertainty analysis (UA). For parameter uncertainty assessment, due to the high number of parameters of the SWAT model, we first screened out its most sensitive parameters using the Latin Hypercube One-factor-At-a-Time (LH-OAT) technique
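    The LH-OAT screening technique combines Latin Hypercube coverage of the parameter space with one-at-a-time perturbations around each sample point, ranking parameters by their mean relative effect. A minimal sketch on a stand-in model with a known sensitivity ordering (the model, dimensions and perturbation fraction are assumptions, not the SWAT setup):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def model(p):
        # Stand-in for a SWAT run: output depends strongly on p0, weakly on p2
        return 5.0 * p[0] + 2.0 * p[1] ** 2 + 0.1 * p[2]

    n_par, n_points, frac = 3, 20, 0.05
    # Latin Hypercube sample in [0, 1]^n_par: one stratum per point and parameter
    lhs = (rng.permuted(np.tile(np.arange(n_points), (n_par, 1)), axis=1).T
           + rng.uniform(size=(n_points, n_par))) / n_points

    effects = np.zeros(n_par)
    for p in lhs:
        base = model(p)
        for j in range(n_par):          # one-at-a-time loop around each LH point
            q = p.copy()
            q[j] += frac
            effects[j] += abs(model(q) - base) / abs(base)
    effects /= n_points
    print(np.argsort(effects)[::-1])    # parameters ranked most to least sensitive
    ```

    Only the top-ranked parameters would then be carried into the expensive optimization and uncertainty analysis.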

  14. Uncertainty in Ecohydrological Modeling in an Arid Region Determined with Bayesian Methods.

    Directory of Open Access Journals (Sweden)

    Junjun Yang

    Full Text Available In arid regions, water resources are a key forcing factor in ecosystem circulation, and soil moisture is the critical link that constrains plant and animal life on the soil surface and underground. Simulation of soil moisture in arid ecosystems is inherently difficult due to high variability. We assessed the applicability of the process-oriented CoupModel for forecasting soil water relations in arid regions. We used vertical soil moisture profiling for model calibration. We determined that model-structural uncertainty constituted the largest error; the model did not capture the extremes of low soil moisture in the desert-oasis ecotone (DOE), particularly below 40 cm soil depth. Our results showed that the total uncertainty in soil moisture prediction was improved when input and output data, parameter value arrays, and structure errors were characterized explicitly. Bayesian analysis was applied with prior information to reduce uncertainty. The need to provide independent descriptions of uncertainty analysis (UA) in the input and output data was demonstrated. Application of soil moisture simulation in arid regions will be useful for dune-stabilization and revegetation efforts in the DOE.

  15. A sliding mode observer for hemodynamic characterization under modeling uncertainties

    KAUST Repository

    Zayane, Chadia

    2014-06-01

    This paper addresses the reconstruction of physiological states in a small region of the brain under modeling uncertainties. The misunderstood coupling between the cerebral blood volume and the oxygen extraction fraction has led to a partial knowledge of the so-called balloon model describing the hemodynamic behavior of the brain. To overcome this difficulty, a High Order Sliding Mode observer is applied to the balloon system, where the unknown coupling is considered as an internal perturbation. The effectiveness of the proposed method is illustrated through a set of synthetic data that mimic fMRI experiments.
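    A sketch of the core HOSM ingredient may help: the super-twisting (second-order sliding mode) differentiator reconstructs an unmeasured derivative while treating bounded model error as a perturbation. The gains below follow Levant's standard recommendation; the test signal and step size are assumptions for illustration, not the balloon-model observer itself:

    ```python
    import numpy as np

    # Super-twisting differentiator: estimate the derivative of a measured
    # signal y(t) exactly (in finite time) despite |y''| <= L.
    dt, T, L = 1e-4, 10.0, 1.0
    lam0, lam1 = 1.5 * np.sqrt(L), 1.1 * L      # Levant's recommended gains
    z0, z1 = 0.0, 0.0                           # estimates of y and y'
    t = np.arange(0.0, T, dt)
    y = np.sin(t)                               # measured; true derivative cos(t)
    z1_hist = np.empty(t.size)
    for k in range(t.size):
        e = z0 - y[k]
        z0 += dt * (-lam0 * np.sqrt(abs(e)) * np.sign(e) + z1)
        z1 += dt * (-lam1 * np.sign(e))         # discontinuous correction term
        z1_hist[k] = z1
    err = np.abs(z1_hist[-10000:] - np.cos(t[-10000:]))   # after convergence
    print(err.max())
    ```

    In an observer setting, the same discontinuous correction that tracks the measured state absorbs the unknown coupling as an internal perturbation, which is why only a bound on it, not its model, is needed.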

  16. Comparison of different uncertainty techniques in urban stormwater quantity and quality modelling.

    Science.gov (United States)

    Dotto, Cintia B S; Mannina, Giorgio; Kleidorfer, Manfred; Vezzaro, Luca; Henrichs, Malte; McCarthy, David T; Freni, Gabriele; Rauch, Wolfgang; Deletic, Ana

    2012-05-15

    Urban drainage models are important tools used by both practitioners and scientists in the field of stormwater management. These models are often conceptual and usually require calibration using local datasets. The quantification of the uncertainty associated with the models is a must, although it is rarely practiced. The International Working Group on Data and Models, which works under the IWA/IAHR Joint Committee on Urban Drainage, has been working on the development of a framework for defining and assessing uncertainties in the field of urban drainage modelling. A part of that work is the assessment and comparison of different techniques generally used in the uncertainty assessment of the parameters of water models. This paper compares a number of these techniques: the Generalized Likelihood Uncertainty Estimation (GLUE), the Shuffled Complex Evolution Metropolis algorithm (SCEM-UA), an approach based on a multi-objective auto-calibration (a multialgorithm, genetically adaptive multi-objective method, AMALGAM) and a Bayesian approach based on a simplified Markov Chain Monte Carlo method (implemented in the software MICA). To allow a meaningful comparison among the different uncertainty techniques, common criteria have been set for the likelihood formulation, defining the number of simulations, and the measure of uncertainty bounds. Moreover, all the uncertainty techniques were implemented for the same case study, in which the same stormwater quantity and quality model was used alongside the same dataset. The comparison results for a well-posed rainfall/runoff model showed that the four methods provide similar probability distributions of model parameters and model prediction intervals. For the ill-posed water quality model, the differences between the results were much wider, and the paper provides the specific advantages and disadvantages of each method. In relation to computational efficiency (i.e. number of iterations required to generate the probability
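The GLUE procedure compared in this study follows a simple recipe: sample parameters from a prior range, score each run with an informal likelihood (here Nash-Sutcliffe efficiency), keep "behavioural" runs above a threshold, and form likelihood-weighted prediction bounds. A minimal sketch on a hypothetical one-parameter recession model; the model, threshold, and ranges are all illustrative, not the paper's case study:

```python
import math, random

random.seed(1)
# hypothetical recession model: Q(t) = exp(-t/k), parameter k to be calibrated
times = [0.5 * i for i in range(20)]
k_true = 4.0
obs = [math.exp(-t / k_true) + random.gauss(0, 0.02) for t in times]

def nse(sim, observed):
    # Nash-Sutcliffe efficiency as an informal likelihood measure
    mean_obs = sum(observed) / len(observed)
    sse = sum((s - o) ** 2 for s, o in zip(sim, observed))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# GLUE: Monte Carlo sampling; keep "behavioural" runs above the threshold
behavioural = []
for _ in range(5000):
    k = random.uniform(1.0, 10.0)
    sim = [math.exp(-t / k) for t in times]
    score = nse(sim, obs)
    if score > 0.9:
        behavioural.append((k, score, sim))

wsum = sum(score for _, score, _ in behavioural)

def wq(vals, q):
    # weighted quantile over (value, weight) pairs sorted by value
    acc = 0.0
    for v, w in vals:
        acc += w
        if acc >= q:
            return v
    return vals[-1][0]

def bounds(i):
    # likelihood-weighted 5-95% prediction bounds at time step i
    vals = sorted((sim[i], score / wsum) for _, score, sim in behavioural)
    return wq(vals, 0.05), wq(vals, 0.95)
```

The resulting bounds express parameter uncertainty only; the choice of likelihood measure and behavioural threshold is subjective, which is precisely the point of contention between GLUE and the formal Bayesian methods compared in the paper.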

  17. Parametric uncertainties in global model simulations of black carbon column mass concentration

    Science.gov (United States)

    Pearce, Hana; Lee, Lindsay; Reddington, Carly; Carslaw, Ken; Mann, Graham

    2016-04-01

    Previous studies have deduced that the annual mean direct radiative forcing from black carbon (BC) aerosol may regionally be up to 5 W m^-2 larger than expected due to underestimation of global atmospheric BC absorption in models. We have identified the magnitude and important sources of parametric uncertainty in simulations of BC column mass concentration from a global aerosol microphysics model (GLOMAP-Mode). A variance-based uncertainty analysis of 28 parameters has been performed, based on statistical emulators trained on model output from GLOMAP-Mode. This is the largest number of uncertain model parameters to be considered in a BC uncertainty analysis to date and covers primary aerosol emissions, microphysical processes and structural parameters related to the aerosol size distribution. We will present several recommendations for further research to improve the fidelity of simulated BC. In brief, we find that the standard deviation around the simulated mean annual BC column mass concentration varies globally between 2.5 × 10^-9 g cm^-2 in remote marine regions and 1.25 × 10^-6 g cm^-2 near emission sources due to parameter uncertainty. Between 60 and 90% of the variance over source regions is due to uncertainty associated with primary BC emission fluxes, including biomass burning, fossil fuel and biofuel emissions. While the contributions to BC column uncertainty from microphysical processes, for example those related to dry and wet deposition, are increased over remote regions, we find that emissions still make an important contribution in these areas. It is likely, however, that the importance of structural model error, i.e. differences between models, is greater than parametric uncertainty. We have extended our analysis to emulate vertical BC profiles at several locations in the mid-Pacific Ocean and identify the parameters contributing to uncertainty in the vertical distribution of black carbon at these locations. We will present preliminary comparisons of
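Variance-based analysis of the kind described above can be illustrated with the Saltelli pick-and-freeze estimator of first-order Sobol indices. The additive toy model and sample sizes below are assumptions for illustration only; for y = 3·x1 + 2·x2 + x3 with independent uniform inputs, the exact indices are 9/14, 4/14 and 1/14:

```python
import random

random.seed(0)
a = [3.0, 2.0, 1.0]   # toy additive model y = 3*x1 + 2*x2 + x3, xi ~ U(0,1)
d = len(a)

def f(x):
    return sum(ai * xi for ai, xi in zip(a, x))

N = 20000
A = [[random.random() for _ in range(d)] for _ in range(N)]
B = [[random.random() for _ in range(d)] for _ in range(N)]
fA = [f(x) for x in A]
fB = [f(x) for x in B]
both = fA + fB
mean = sum(both) / (2 * N)
var = sum((y - mean) ** 2 for y in both) / (2 * N - 1)

# Saltelli pick-and-freeze estimator of S_i = V(E[Y|X_i]) / V(Y):
# A_B^i takes all columns from A except column i, which comes from B
S = []
for i in range(d):
    fABi = [f(x[:i] + [B[j][i]] + x[i + 1:]) for j, x in enumerate(A)]
    S.append(sum(fB[j] * (fABi[j] - fA[j]) for j in range(N)) / (N * var))
```

In the study itself the expensive model is replaced by trained emulators, so the many thousands of evaluations this estimator needs become affordable; the estimator is unchanged.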

  18. Cascading uncertainties in flood inundation models to uncertain estimates of damage and loss

    Science.gov (United States)

    Fewtrell, Timothy; Michel, Gero; Ntelekos, Alexandros; Bates, Paul

    2010-05-01

    extents. Perturbations to characteristic depth/damage curves for residential, industrial and commercial structures are used to assess the variability in vulnerability information of insurance portfolios. As a result, it is possible to assess the relative magnitudes and consequently, determine whether reducing the uncertainties in inundation modeling or gaining a better understanding of portfolio vulnerability is more necessary for understanding flood risk.

  19. Assessment of Solution Uncertainties in Single-Column Modeling Frameworks.

    Science.gov (United States)

    Hack, James J.; Pedretti, John A.

    2000-01-01

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.

  20. Handling uncertainty and networked structure in robot control

    CERN Document Server

    Tamás, Levente

    2015-01-01

    This book focuses on two challenges posed in robot control by the increasing adoption of robots in the everyday human environment: uncertainty and networked communication. Part I of the book describes learning control to address environmental uncertainty. Part II discusses state estimation, active sensing, and complex scenario perception to tackle sensing uncertainty. Part III completes the book with control of networked robots and multi-robot teams. Each chapter features in-depth technical coverage and case studies highlighting the applicability of the techniques, with real robots or in simulation. Platforms include mobile ground, aerial, and underwater robots, as well as humanoid robots and robot arms. Source code and experimental data are available at http://extras.springer.com. The text gathers contributions from academic and industry experts, and offers a valuable resource for researchers or graduate students in robot control and perception. It also benefits researchers in related areas, such as computer...

  1. Quantifying uncertainty, variability and likelihood for ordinary differential equation models

    LENUS (Irish Health Repository)

    Weisse, Andrea Y

    2010-10-28

    Abstract Background In many applications, ordinary differential equation (ODE) models are subject to uncertainty or variability in initial conditions and parameters. Both uncertainty and variability can be quantified in terms of a probability density function on the state and parameter space. Results The partial differential equation that describes the evolution of this probability density function has a form that is particularly amenable to application of the well-known method of characteristics. The value of the density at some point in time is directly accessible by the solution of the original ODE extended by a single extra dimension (for the value of the density). This leads to simple methods for studying uncertainty, variability and likelihood, with significant advantages over more traditional Monte Carlo and related approaches, especially when studying regions with low probability. Conclusions While such approaches based on the method of characteristics are common practice in other disciplines, their advantages for the study of biological systems have so far remained unrecognized. Several examples illustrate performance and accuracy of the approach and its limitations.
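The core idea, extending the ODE by one extra dimension that carries the density along each characteristic via d(log ρ)/dt = -div f, can be checked on a linear ODE where the answer is known in closed form. The model x' = -a·x and all parameter values below are illustrative:

```python
import math

# ODE x' = f(x) = -a*x with initial condition X0 ~ N(0,1).
# Along a characteristic the density obeys d(log rho)/dt = -div f = a,
# so a single extra state variable transports the density with the trajectory.
a, x0 = 0.7, 0.8
rho0 = math.exp(-x0 ** 2 / 2.0) / math.sqrt(2.0 * math.pi)  # N(0,1) pdf at x0

dt, steps = 1e-4, 20000          # integrate to T = 2
x, logrho = x0, math.log(rho0)
for _ in range(steps):
    x += dt * (-a * x)           # original ODE
    logrho += dt * a             # the single extra "density" dimension

# exact solution: X(T) = exp(-a*T) * X0 ~ N(0, exp(-2*a*T)); compare pdfs
sd = math.exp(-a * dt * steps)
exact = math.exp(-x ** 2 / (2.0 * sd ** 2)) / (sd * math.sqrt(2.0 * math.pi))
```

One trajectory yields the density at one point; unlike Monte Carlo, the accuracy does not degrade in low-probability regions, which is the advantage the paper emphasizes.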

  2. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    Energy Technology Data Exchange (ETDEWEB)

    Estep, Donald [Colorado State Univ., Fort Collins, CO (United States); El-Azab, Anter [Florida State Univ., Tallahassee, FL (United States); Pernice, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States); Peterson, John W. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Polyakov, Peter [Univ. of Wyoming, Laramie, WY (United States); Tavener, Simon [Colorado State Univ., Fort Collins, CO (United States); Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States); Univ. of Utah, Salt Lake City, UT (United States)

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  3. Reliability- and performance-based robust design optimization of MEMS structures considering technological uncertainties

    Science.gov (United States)

    Martowicz, Adam; Uhl, Tadeusz

    2012-10-01

    The paper discusses the applicability of a reliability- and performance-based multi-criteria robust design optimization technique for micro-electromechanical systems, considering their technological uncertainties. Nowadays, micro-devices are in widespread use, especially in the automotive industry, taking advantage of combining the mechanical structure and the electronic control circuit on one board. Their frequent use motivates the elaboration of virtual prototyping tools that can be applied in design optimization with the introduction of technological uncertainties and reliability. The authors present a procedure for the optimization of micro-devices, which is based on the theory of reliability-based robust design optimization. This takes into consideration the performance of a micro-device and its reliability assessed by means of uncertainty analysis. The procedure assumes that, for each checked design configuration, the assessment of uncertainty propagation is performed with the meta-modeling technique. The described procedure is illustrated with an example of the optimization carried out for a finite element model of a micro-mirror. The multi-physics approach allowed the introduction of several physical phenomena to correctly model the electrostatic actuation and the squeezing effect present between electrodes. The optimization was preceded by sensitivity analysis to establish the design and uncertain domains. The genetic algorithms fulfilled the defined optimization task effectively. The best discovered individuals are characterized by a minimized value of the multi-criteria objective function, simultaneously satisfying the constraint on material strength. The restriction of the maximum equivalent stresses was introduced with the conditionally formulated objective function with a penalty component. The results obtained were successfully verified with a global uniform search through the input design domain.

  4. Values and uncertainties in the predictions of global climate models.

    Science.gov (United States)

    Winsberg, Eric

    2012-06-01

    Over the last several years, there has been an explosion of interest and attention devoted to the problem of Uncertainty Quantification (UQ) in climate science, that is, to giving quantitative estimates of the degree of uncertainty associated with the predictions of global and regional climate models. The technical challenges associated with this project are formidable, and so the statistical community has understandably devoted itself primarily to overcoming them. But even as these technical challenges are being met, a number of persistent conceptual difficulties remain. So why is UQ so important in climate science? UQ, I would like to argue, is first and foremost a tool for communicating knowledge from experts to policy makers in a way that is meant to be free from the influence of social and ethical values. But the standard ways of using probabilities to separate ethical and social values from scientific practice cannot be applied in a great deal of climate modeling, because the roles of values in creating the models cannot be discerned after the fact: the models are too complex and the result of too much distributed epistemic labor. I argue, therefore, that typical approaches for handling ethical/social values in science do not work well here.

  5. Selection of Representative Models for Decision Analysis Under Uncertainty

    Science.gov (United States)

    Meira, Luis A. A.; Coelho, Guilherme P.; Santos, Antonio Alberto S.; Schiozer, Denis J.

    2016-03-01

    The decision-making process in oil fields includes a step of risk analysis associated with the uncertainties present in the variables of the problem. Such uncertainties lead to hundreds, even thousands, of possible scenarios that are supposed to be analyzed so an effective production strategy can be selected. Given this high number of scenarios, a technique to reduce this set to a smaller, feasible subset of representative scenarios is imperative. The selected scenarios must be representative of the original set and also free of optimistic and pessimistic bias. This paper proposes an assisted methodology to identify representative models in oil fields. To do so, first a mathematical function was developed to model the representativeness of a subset of models with respect to the full set that characterizes the problem. Then, an optimization tool was implemented to identify the representative models of any problem, considering not only the cross-plots of the main output variables, but also the risk curves and the probability distribution of the attribute-levels of the problem. The proposed technique was applied to two benchmark cases and the results, evaluated by experts in the field, indicate that the obtained solutions are richer than those identified by previously adopted manual approaches. The program bytecode is available upon request.

  6. Sustainable infrastructure system modeling under uncertainties and dynamics

    Science.gov (United States)

    Huang, Yongxi

    potential risks caused by feedstock seasonality and demand uncertainty. Facility spatiality, time variation of feedstock yields, and demand uncertainty are integrated into a two-stage stochastic programming (SP) framework. In the study of Transitional Energy System Modeling under Uncertainty, a multistage stochastic dynamic programming is established to optimize the process of building and operating fuel production facilities during the transition. Dynamics due to the evolving technologies and societal changes and uncertainty due to demand fluctuations are the major issues to be addressed.
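The two-stage stochastic programming framework mentioned above can be illustrated with a scenario-based capacity problem: the first stage fixes capacity before uncertainty resolves, the second stage pays a recourse penalty for unmet demand. The costs, demand distribution, and line-search solution method are all illustrative assumptions; the newsvendor critical fractile provides an independent check:

```python
import random

random.seed(3)
# first-stage decision: plant capacity c, built at 5 per unit;
# second-stage recourse: unmet demand covered at a penalty of 20 per unit
build_cost, penalty = 5.0, 20.0
demands = [max(0.0, random.gauss(100.0, 20.0)) for _ in range(4000)]  # scenarios

def expected_cost(c):
    # expectation of the second-stage (recourse) cost over all scenarios
    recourse = sum(penalty * max(0.0, d - c) for d in demands) / len(demands)
    return build_cost * c + recourse

# scenario-based two-stage SP, solved by line search on the first-stage variable
grid = [60.0 + 0.5 * i for i in range(200)]
c_star = min(grid, key=expected_cost)

# sanity check: the newsvendor critical fractile predicts the optimum,
# c* = (1 - build_cost/penalty) quantile of demand
q = sorted(demands)[int(len(demands) * (1 - build_cost / penalty)) - 1]
```

Real facility-location variants add integer siting decisions and solve the deterministic equivalent with an LP/MIP solver, but the structure, first-stage commitment plus expected recourse, is the same.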

  7. Robust Performance of Systems with Structured Uncertainties in State Space

    DEFF Research Database (Denmark)

    Zhou, Kemin; Khargonekar, Pramod P.; Stoustrup, Jakob

    1995-01-01

    This paper considers robust performance analysis and state feedback design for systems with time-varying parameter uncertainties. The notion of a strongly robust performance criterion is introduced, and its applications in robust performance analysis and synthesis for nominally linear systems...

  8. Model requirements for decision support under uncertainty in data scarce dynamic deltas

    NARCIS (Netherlands)

    Haasnoot, Marjolijn; van Deursen, W.P.A.; Kwakkel, J. H.; Middelkoop, H.

    2016-01-01

    There is a long tradition of model-based decision support in water management. The consideration of deep uncertainty, however, changes the requirements imposed on models. In the face of deep uncertainty, models are used to explore many uncertainties and the decision space across multiple outcomes o

  9. Modelling of physical properties - databases, uncertainties and predictive power

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Physical and thermodynamic properties in the form of raw data or estimated values for pure compounds and mixtures are important pre-requisites for performing tasks such as process design, simulation and optimization; computer aided molecular/mixture (product) design; and product-process analysis... While use of experimentally measured values of the needed properties is desirable in these tasks, the experimental data of the properties of interest may not be available or may not be measurable in many cases. Therefore, property models that are reliable, predictive and easy to use are necessary... However, which models should be used to provide reliable estimates of the required properties? And how much measured data is necessary to regress the model parameters? How can predictive capabilities be ensured in the developed models? Also, as it is necessary to know the associated uncertainties...

  10. System convergence in transport models: algorithms efficiency and output uncertainty

    DEFF Research Database (Denmark)

    Rich, Jeppe; Nielsen, Otto Anker

    2015-01-01

    much in the literature. The paper first investigates several variants of the Method of Successive Averages (MSA) by simulation experiments on a toy network. It is found that the simulation experiments produce support for a weighted MSA approach. The weighted MSA approach is then analysed on a large scale in the Danish National Transport Model (DNTM). It is revealed that system convergence requires that either demand or supply is without random noise, but not both. In that case, if MSA is applied to the model output with random noise, it will converge effectively as the random effects are gradually dampened in the MSA process. In connection with the DNTM it is shown that MSA works well when applied to travel-time averaging, whereas trip averaging is generally infected by random noise resulting from the assignment model. The latter implies that the minimum uncertainty in the final model output is dictated...
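The Method of Successive Averages damps simulation noise by down-weighting each new noisy evaluation. A minimal sketch on a hypothetical fixed-point map observed with noise; the map, noise level, and 1/k step rule are illustrative (the weighted variant supported by the paper uses larger weights on recent iterations):

```python
import random

random.seed(5)
# toy supply/demand fixed point: g(x) = 0.5*x + 2, equilibrium x* = 4,
# but g can only be evaluated through a noisy "assignment" run g(x) + noise
def g(x):
    return 0.5 * x + 2.0

x_plain, x_msa = 0.0, 0.0
tail_plain, tail_msa = [], []
for k in range(1, 20001):
    noise = random.gauss(0.0, 0.5)
    x_plain = g(x_plain) + noise                      # naive iteration keeps the noise
    x_msa += (1.0 / k) * (g(x_msa) + noise - x_msa)   # MSA: step size 1/k
    if k > 19000:                                      # record the final stretch
        tail_plain.append(x_plain)
        tail_msa.append(x_msa)
```

The naive iteration keeps fluctuating with the full noise amplitude, while the MSA iterate settles on the equilibrium, the dampening of random effects that the abstract describes.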

  11. Opinion: the use of natural hazard modeling for decision making under uncertainty

    Institute of Scientific and Technical Information of China (English)

    David E Calkin; Mike Mentis

    2015-01-01

    Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and to create structured evaluation criteria for complex environmental models. However, to our knowledge there has been less focus on the conditions where decision makers can confidently rely on results from these models. In this review we propose a preliminary set of conditions necessary for the appropriate application of modeled results to natural hazard decision making and provide relevant examples within US wildfire management programs.

  12. Improved methodology for developing cost uncertainty models for naval vessels

    OpenAIRE

    Brown, Cinda L.

    2008-01-01

    The purpose of this thesis is to analyze the probabilistic cost model currently in use by NAVSEA 05C to predict cost uncertainty in naval vessel construction and to develop a method that better predicts the ultimate cost risk. The data used to develop the improved approach is collected from analysis of the CG(X) class ship by NAVSEA 05C. The NAVSEA 05C cost risk factors are reviewed and analyzed to determine if different factors are better cost predictors. The impact of data elicitation, t...

  13. Modelling with uncertainties: The role of the fission barrier

    Directory of Open Access Journals (Sweden)

    Lü Hongliang

    2013-12-01

    Full Text Available Fission is the dominant decay channel of super-heavy elements formed in heavy-ion collisions. The probability of synthesizing heavy or super-heavy nuclei in fusion-evaporation reactions is thus very sensitive to the height of their fission barriers. This contribution first addresses the influence of theoretical uncertainty on excitation functions. Our second aim is to investigate the inverse problem, i.e., what information about the fission barriers can be extracted from excitation functions? For this purpose, Bayesian methods have been used with a simplified toy model.

  14. The uncertainty of modeled soil carbon stock change for Finland

    Science.gov (United States)

    Lehtonen, Aleksi; Heikkinen, Juha

    2013-04-01

    Countries should report soil carbon stock changes of forests under the Kyoto Protocol. Under the Kyoto Protocol one can omit reporting of a carbon pool by verifying that the pool is not a source of carbon, which is especially tempting for the soil pool. However, verifying that the soils of a nation are not a source of carbon in a given year seems to be nearly impossible. The Yasso07 model was parametrized against various decomposition data using the MCMC method. Soil carbon changes in Finland between 1972 and 2011 were simulated with the Yasso07 model using litter input data derived from the National Forest Inventory (NFI) and fellings time series. The uncertainties of biomass models, litter turnover rates, NFI sampling and the Yasso07 model were propagated with Monte Carlo simulations. Due to the biomass estimation methods, the uncertainties of the various litter input sources (e.g. living trees, natural mortality and fellings) correlate strongly with each other. We show how the original covariance matrices can be combined analytically, greatly reducing the number of simulated components. While doing the simulations we found that proper handling of correlations may be even more essential than accurate estimates of standard errors. As a preliminary result, we found that both Southern and Northern Finland were soil carbon sinks, with coefficients of variation (CV) varying between 10% and 25% when the model was driven with long-term constant weather data. When we applied annual weather data, soils were both sinks and sources of carbon and CVs varied from 10% to 90%. This implies that the success of soil carbon sink verification depends on the weather data applied with the models. Due to this fact, the IPCC should provide clear guidance on the weather data applied with soil carbon models and also on soil carbon sink verification. In UNFCCC reporting, carbon sinks of forest biomass have typically been averaged over five years; a similar period for soil model weather data would be logical.
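Monte Carlo propagation with correlated inputs, as needed for the correlated litter-input sources above, can be sketched by sampling from a combined covariance matrix through its Cholesky factor. The two-source covariance, the steady-state stock model, and all numbers are illustrative assumptions, not the Yasso07 setup:

```python
import math, random

random.seed(4)
# two correlated litter-input sources (e.g. living trees and fellings) whose
# covariances have been combined analytically into a single 2x2 matrix
mu = [5.0, 2.0]                       # mean annual litter inputs (toy units)
cov = [[1.0, 0.4],                    # sd1 = 1.0, sd2 = 0.5, correlation 0.8
       [0.4, 0.25]]

# Cholesky factor of the 2x2 covariance: cov = L L^T
l11 = math.sqrt(cov[0][0])
l21 = cov[1][0] / l11
l22 = math.sqrt(cov[1][1] - l21 ** 2)

k_dec = 0.05                          # first-order decomposition rate
stocks, pairs = [], []
for _ in range(20000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    x1 = mu[0] + l11 * z1             # correlated draws: x = mu + L z
    x2 = mu[1] + l21 * z1 + l22 * z2
    pairs.append((x1, x2))
    stocks.append((x1 + x2) / k_dec)  # steady-state soil C stock = input / k

mean_stock = sum(stocks) / len(stocks)
cv = (sum((s - mean_stock) ** 2 for s in stocks)
      / len(stocks)) ** 0.5 / mean_stock
```

With a positive correlation of 0.8 the input variances add with a large covariance term, so ignoring the correlation would understate the stock CV markedly, which is why the abstract stresses proper handling of correlations over precise standard errors.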

  15. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functi......Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume...... uncertainty. This aspect is evident particularly for stretches of the network with a high number of competing routes. Model sensitivity was also tested for BPR parameter uncertainty combined with link capacity uncertainty. The resultant increase in model sensitivity demonstrates even further the importance...

  16. Uncertainty in Earthquake Source Imaging Due to Variations in Source Time Function and Earth Structure

    KAUST Repository

    Razafindrakoto, H. N. T.

    2014-03-25

    One way to improve the accuracy and reliability of kinematic earthquake source imaging is to investigate the origins of uncertainty and to minimize their effects. The difficulties in kinematic source inversion arise from the nonlinearity of the problem, nonunique choices in the parameterization, and observational errors. We analyze in particular the uncertainty related to the choice of the source time function (STF) and the variability in Earth structure. We consider a synthetic data set generated from a spontaneous dynamic rupture calculation. Using Bayesian inference, we map the solution space of peak slip rate, rupture time, and rise time to characterize the kinematic rupture in terms of posterior density functions. Our test to investigate the effect of the choice of STF reveals that all three tested STFs (isosceles triangle, regularized Yoffe with acceleration time of 0.1 and 0.3 s) retrieve the patch of high slip and slip rate around the hypocenter. However, the use of an isosceles triangle as STF artificially accelerates the rupture to propagate faster than the target solution. It additionally generates an artificial linear correlation between rupture onset time and rise time. These appear to compensate for the dynamic source effects that are not included in the symmetric triangular STF. The exact rise time for the tested STFs is difficult to resolve due to the small amount of radiated seismic moment in the tail of the STF. To highlight the effect of Earth structure variability, we perform inversions including the uncertainty in the wavespeed only, and variability in both wavespeed and layer depth. We find that little difference is noticeable between the resulting rupture model uncertainties from these two parameterizations. Both significantly broaden the posterior densities and cause faster rupture propagation, particularly near the hypocenter, due to the major velocity change at the depth where the fault is located.

  17. Bayesian methods for model choice and propagation of model uncertainty in groundwater transport modeling

    Science.gov (United States)

    Mendes, B. S.; Draper, D.

    2008-12-01

    The issue of model uncertainty and model choice is central in any groundwater modeling effort [Neuman and Wierenga, 2003]; among the several approaches to the problem we favour using Bayesian statistics because it is a method that integrates in a natural way uncertainties (arising from any source) and experimental data. In this work, we experiment with several Bayesian approaches to model choice, focusing primarily on demonstrating the usefulness of the Reversible Jump Markov Chain Monte Carlo (RJMCMC) simulation method [Green, 1995]; this is an extension of the now-common MCMC methods. Standard MCMC techniques approximate posterior distributions for quantities of interest, often by creating a random walk in parameter space; RJMCMC allows the random walk to take place between parameter spaces with different dimensionalities. This fact allows us to explore state spaces that are associated with different deterministic models for experimental data. Our work is exploratory in nature; we restrict our study to comparing two simple transport models applied to a data set gathered to estimate the breakthrough curve for a tracer compound in groundwater. One model has a mean surface based on a simple advection-dispersion differential equation; the second model's mean surface is also governed by a differential equation but in two dimensions. We focus on artificial data sets (in which truth is known) to see if model identification is done correctly, but we also address the issues of over- and under-parameterization, and we compare RJMCMC's performance with other traditional methods for model selection and propagation of model uncertainty, including Bayesian model averaging, BIC and DIC. References: Neuman and Wierenga (2003). A Comprehensive Strategy of Hydrogeologic Modeling and Uncertainty Analysis for Nuclear Facilities and Sites. NUREG/CR-6805, Division of Systems Analysis and Regulatory Effectiveness, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission.
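A compact illustration of Bayesian model choice by MCMC on a joint (model indicator, parameter) space. Because the two toy models below share a single parameter of the same dimension, the indicator flip needs no dimension matching; full RJMCMC adds the Jacobian and proposal-density terms required for dimension-changing jumps. The data, priors, and proposal scales are all illustrative assumptions:

```python
import math, random

random.seed(11)
# synthetic data generated from model M2: y = theta * x + eps
xs = [0.1 * i for i in range(1, 21)]
theta_true, sigma = 1.5, 0.2
ys = [theta_true * x + random.gauss(0, sigma) for x in xs]

def loglik(m, th):
    # M1: constant mean y = theta;  M2: linear mean y = theta * x
    mu = [th] * len(xs) if m == 1 else [th * x for x in xs]
    return sum(-0.5 * ((y - u) / sigma) ** 2 for y, u in zip(ys, mu))

def logprior(th):
    return -0.5 * (th / 10.0) ** 2     # theta ~ N(0, 10^2); constants cancel

m, th, count_m2, iters = 1, 0.0, 0, 20000
for _ in range(iters):
    # within-model random-walk Metropolis move on theta
    thp = th + random.gauss(0, 0.2)
    if math.log(random.random()) < (loglik(m, thp) + logprior(thp)
                                    - loglik(m, th) - logprior(th)):
        th = thp
    # between-model move: flip the model indicator, keeping theta
    mp = 2 if m == 1 else 1
    if math.log(random.random()) < loglik(mp, th) - loglik(m, th):
        m = mp
    count_m2 += (m == 2)

p_m2 = count_m2 / iters    # posterior probability of model M2
```

The fraction of iterations the chain spends in each model estimates the posterior model probability, the same quantity RJMCMC delivers for models of genuinely different dimensionality.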

  18. Uncertainties in modeling hazardous gas releases for emergency response

    Directory of Open Access Journals (Sweden)

    Kathrin Baumann-Stanzer

    2011-02-01

    Full Text Available In case of an accidental release of toxic gases the emergency responders need fast information about the affected area and the maximum impact. Hazard distances calculated with the models MET, ALOHA, BREEZE, TRACE and SAMS for scenarios with chlorine, ammonia and butane releases are compared in this study. The variations of the model results are measures of the uncertainties in source estimation and dispersion calculation. Model runs for different wind speeds, atmospheric stability and roughness lengths indicate the model sensitivity to these input parameters. In-situ measurements at two urban near-traffic sites are compared to results of the Integrated Nowcasting through Comprehensive Analysis (INCA) in order to quantify uncertainties in the meteorological input. The hazard zone estimates from the models vary up to a factor of 4 due to different input requirements as well as due to different internal model assumptions. None of the models is found to be 'more conservative' than the others in all scenarios. INCA wind speeds are correlated to in-situ observations at two urban sites in Vienna with a correlation coefficient of 0.89. The standard deviations of the normal error distribution are 0.8 m s^-1 in wind speed, on the scale of 50 degrees in wind direction, up to 4°C in air temperature and up to 10% in relative humidity. The observed air temperature and humidity are well reproduced by INCA with correlation coefficients of 0.96 to 0.99. INCA is therefore found to give a good representation of the local meteorological conditions. Besides real-time data, the INCA short-range forecast for the following hours may support the action planning of the first responders.

  19. Sensitivity and predictive uncertainty of the ACASA model at a spruce forest site

    Directory of Open Access Journals (Sweden)

    K. Staudt

    2010-06-01

    Full Text Available The sensitivity and predictive uncertainty of the Advanced Canopy-Atmosphere-Soil Algorithm (ACASA) was assessed by employing the Generalized Likelihood Uncertainty Estimation (GLUE) method. ACASA is a stand-scale, multi-layer soil-vegetation-atmosphere transfer model that incorporates a third-order closure method to simulate the turbulent exchange of energy and matter within and above the canopy. Fluxes simulated by the model were compared to sensible and latent heat fluxes as well as the net ecosystem exchange measured by an eddy-covariance system above the spruce canopy at the FLUXNET station Waldstein-Weidenbrunnen in the Fichtelgebirge Mountains in Germany. From each of the intensive observation periods carried out within the EGER project (ExchanGE processes in mountainous Regions) in autumn 2007 and summer 2008, five days of flux measurements were selected. A large number (20 000) of model runs using randomly generated parameter sets were performed, and goodness-of-fit measures for all fluxes were calculated for each of these runs. The 10% best model runs for each flux were used for further investigation of the sensitivity of the fluxes to parameter values and to calculate uncertainty bounds.

    A strong sensitivity of the individual fluxes to a few parameters was observed, such as the leaf area index. However, the sensitivity analysis also revealed the equifinality of many parameters in the ACASA model for the investigated periods. The analysis of two time periods, each representing different meteorological conditions, provided an insight into the seasonal variation of parameter sensitivity. The calculated uncertainty bounds demonstrated that all fluxes were well reproduced by the ACASA model. In general, uncertainty bounds encompass measured values better when these are conditioned on the respective individual flux only and not on all three fluxes concurrently. Structural weaknesses of the ACASA model concerning the soil respiration

  20. Modeling a Hybrid Microgrid Using Probabilistic Reconfiguration under System Uncertainties

    Directory of Open Access Journals (Sweden)

    Hadis Moradi

    2017-09-01

    Full Text Available A novel method for a day-ahead optimal operation of a hybrid microgrid system including fuel cells, photovoltaic arrays, a microturbine, and battery energy storage in order to fulfill the required load demand is presented in this paper. In the proposed system, the microgrid has access to the main utility grid in order to exchange power when required. Available municipal waste is utilized to produce the hydrogen required for running the fuel cells, and natural gas will be used as the backup source. In the proposed method, an energy scheduling is introduced to optimize the generating unit power outputs for the next day, as well as the power flow with the main grid, in order to minimize the operational costs and the produced greenhouse gas emissions. Renewable energy generation and electric power consumption are both intermittent and unpredictable in nature, so the uncertainty related to PV array power generation and power consumption has been considered in the next-day energy scheduling. In order to model uncertainties, some scenarios are produced according to Monte Carlo (MC) simulations, and microgrid optimal energy scheduling is analyzed under the generated scenarios. In addition, various scenarios created by MC simulations are applied in order to solve unit commitment (UC) problems. The microgrid’s day-ahead operation and emission costs are considered as the objective functions, and the particle swarm optimization algorithm is employed to solve the optimization problem. Overall, the proposed model is capable of minimizing the system costs, as well as the unfavorable influence of uncertainties on the microgrid’s profit, by generating different scenarios.
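The scenario-generation step can be sketched with Monte Carlo sampling around day-ahead forecasts. The forecast profiles, error standard deviations, and tariff below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical hourly forecast means for 24 hours (kW): PV output and load.
pv_forecast = np.clip(80 * np.sin(np.linspace(0, np.pi, 24)), 0, None)
load_forecast = 60 + 20 * np.sin(np.linspace(0, 2 * np.pi, 24) - np.pi / 2)

# Monte Carlo scenarios: Gaussian forecast errors (assumed std devs).
n_scenarios = 1000
pv = np.clip(
    rng.normal(pv_forecast, 0.15 * pv_forecast + 1e-9, size=(n_scenarios, 24)),
    0, None,
)
load = rng.normal(load_forecast, 0.10 * load_forecast, size=(n_scenarios, 24))

# Net demand the dispatchable units and the main grid must cover per scenario.
net = load - pv

# Expected cost of grid imports at a flat tariff (toy stand-in for the
# scenario-based objective evaluated by the scheduler).
price = 0.12  # $/kWh, assumed
expected_import_cost = price * np.maximum(net, 0).sum(axis=1).mean()
```

In a full implementation each scenario would feed the unit-commitment and dispatch optimization, and the objective would be averaged (or risk-weighted) over scenarios.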

  1. Modeling the uncertainty of estimating forest carbon stocks in China

    Directory of Open Access Journals (Sweden)

    T. X. Yue

    2015-12-01

    Full Text Available Earth surface systems are controlled by a combination of global and local factors, which cannot be understood without accounting for both the local and global components. The system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory is able to accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote sensing can supply spatially continuous information on forest carbon stocks, which is impossible from ground-based investigations, but its description has considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the Kriging method for spatial interpolation of ground sample plots and a satellite-observation-based approach, as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of estimating carbon stocks. The data fusion had the lowest uncertainty by using an existing method for high accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA). The estimates produced with HASM-SOA were 26.1 and 28.4 % more accurate than the satellite-based approach and spatial interpolation of the sample plots, respectively. Forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008, using the preferred HASM-SOA method.

  2. Uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model at multiple flux tower sites

    Science.gov (United States)

    Chen, Mingshi; Senay, Gabriel B.; Singh, Ramesh K.; Verdin, James P.

    2016-01-01

    Evapotranspiration (ET) is an important component of the water cycle – ET from the land surface returns approximately 60% of the global precipitation back to the atmosphere. ET also plays an important role in energy transport among the biosphere, atmosphere, and hydrosphere. Current regional to global and daily to annual ET estimation relies mainly on surface energy balance (SEB) ET models or statistical and empirical methods driven by remote sensing data and various climatological databases. These models have uncertainties due to inevitable input errors, poorly defined parameters, and inadequate model structures. The eddy covariance measurements on water, energy, and carbon fluxes at the AmeriFlux tower sites provide an opportunity to assess the ET modeling uncertainties. In this study, we focused on uncertainty analysis of the Operational Simplified Surface Energy Balance (SSEBop) model for ET estimation at multiple AmeriFlux tower sites with diverse land cover characteristics and climatic conditions. The 8-day composite 1-km MODerate resolution Imaging Spectroradiometer (MODIS) land surface temperature (LST) was used as input land surface temperature for the SSEBop algorithms. The other input data were taken from the AmeriFlux database. Results of statistical analysis indicated that the SSEBop model performed well in estimating ET with an R2 of 0.86 between estimated ET and eddy covariance measurements at 42 AmeriFlux tower sites during 2001–2007. It was encouraging to see that the best performance was observed for croplands, where R2 was 0.92 with a root mean square error of 13 mm/month. The uncertainties or random errors from input variables and parameters of the SSEBop model led to monthly ET estimates with relative errors less than 20% across multiple flux tower sites distributed across different biomes. This uncertainty of the SSEBop model lies within the error range of other SEB models, suggesting systematic error or bias of the SSEBop model is within
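The reported skill scores can be reproduced from paired monthly ET estimates and flux-tower measurements. Synthetic data below stand in for the actual SSEBop and AmeriFlux series; only the metrics themselves follow the abstract:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical monthly ET (mm/month): flux-tower "truth" plus model scatter.
et_obs = rng.uniform(20, 120, size=84)          # 7 years x 12 months
et_model = et_obs + rng.normal(0, 13, size=84)  # ~13 mm/month error, assumed

# Coefficient of determination (R^2 against the 1:1 line).
ss_res = ((et_obs - et_model) ** 2).sum()
ss_tot = ((et_obs - et_obs.mean()) ** 2).sum()
r2 = 1 - ss_res / ss_tot

# Root mean square error in mm/month.
rmse = np.sqrt(((et_obs - et_model) ** 2).mean())
```

Note that papers sometimes report R² as the squared correlation of a fitted regression rather than against the 1:1 line; the two agree only when the model is unbiased with unit slope.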

  4. Uncertainty assessment of a dominant-process catchment model of dissolved phosphorus transfer

    Science.gov (United States)

    Dupas, Rémi; Salmon-Monviola, Jordy; Beven, Keith J.; Durand, Patrick; Haygarth, Philip M.; Hollaway, Michael J.; Gascuel-Odoux, Chantal

    2016-12-01

    We developed a parsimonious topography-based hydrologic model coupled with a soil biogeochemistry sub-model in order to improve understanding and prediction of soluble reactive phosphorus (SRP) transfer in agricultural headwater catchments. The model structure aims to capture the dominant hydrological and biogeochemical processes identified from multiscale observations in a research catchment (Kervidy-Naizin, 5 km2). Groundwater fluctuations, responsible for the connection of soil SRP production zones to the stream, were simulated with a fully distributed hydrologic model at 20 m resolution. The spatial variability of the soil phosphorus content and the temporal variability of soil moisture and temperature, which had previously been identified as key controlling factors of SRP solubilization in soils, were included as part of an empirical soil biogeochemistry sub-model. The modelling approach included an analysis of the information contained in the calibration data and propagation of uncertainty in model predictions using a generalized likelihood uncertainty estimation (GLUE) "limits of acceptability" framework. Overall, the model appeared to perform well given the uncertainty in the observational data, with a Nash-Sutcliffe efficiency on daily SRP loads between 0.1 and 0.8 for acceptable models. The role of hydrological connectivity via groundwater fluctuation and the role of increased SRP solubilization following dry/hot periods were captured well. We conclude that in the absence of near-continuous monitoring, the amount of information contained in the data is limited; hence, parsimonious models are more relevant than highly parameterized models. An analysis of uncertainty in the data is recommended for model calibration in order to provide reliable predictions.
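The GLUE "limits of acceptability" idea (retain a model run only if it stays within the observational error bounds) can be sketched as follows. The load series, ±30 % error bounds, and 90 % tolerance are illustrative assumptions, not the paper's values:

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily SRP loads (kg/day) with +/-30% observational error bounds.
obs = rng.gamma(2.0, 0.05, size=200)
lower, upper = 0.7 * obs, 1.3 * obs

def acceptable(sim, lower, upper, tol=0.9):
    # A model run is retained ("acceptable") if it falls inside the
    # observational error bounds on at least `tol` of the days
    # (a relaxed limits-of-acceptability rule).
    inside = (sim >= lower) & (sim <= upper)
    return inside.mean() >= tol

# Two candidate runs: one close to the observations, one biased high.
good_run = obs * rng.uniform(0.85, 1.15, size=200)
bad_run = obs * 1.6
```

Predictions are then formed from the ensemble of acceptable runs, so the spread of the ensemble reflects both parameter and observational uncertainty.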

  5. Solvable Models on Noncommutative Spaces with Minimal Length Uncertainty Relations

    CERN Document Server

    Dey, Sanjib

    2014-01-01

    Our main focus is to explore different models in noncommutative spaces in higher dimensions. We provide a procedure to relate a three dimensional q-deformed oscillator algebra to the corresponding algebra satisfied by canonical variables describing non-commutative spaces. The representations for the corresponding operators obey algebras whose uncertainty relations lead to minimal length, areas and volumes in phase space, which are in principle natural candidates of many different approaches of quantum gravity. We study some explicit models on these types of noncommutative spaces, first by utilising perturbation theory, later in an exact manner. In many cases the operators are not Hermitian; therefore we use PT-symmetry and pseudo-Hermiticity properties, wherever applicable, to make them self-consistent. Apart from building mathematical models, we focus on the physical implications of noncommutative theories too. We construct Klauder coherent states for the perturbative and nonperturbative noncommutative ha...

  6. Incentive salience attribution under reward uncertainty: A Pavlovian model.

    Science.gov (United States)

    Anselme, Patrick

    2015-02-01

    There is a vast literature on the behavioural effects of partial reinforcement in Pavlovian conditioning. Compared with animals receiving continuous reinforcement, partially rewarded animals typically show (a) a slower development of the conditioned response (CR) early in training and (b) a higher asymptotic level of the CR later in training. This phenomenon is known as the partial reinforcement acquisition effect (PRAE). Learning models of Pavlovian conditioning fail to account for it. In accordance with the incentive salience hypothesis, it is here argued that incentive motivation (or 'wanting') plays a more direct role in controlling behaviour than does learning, and reward uncertainty is shown to have an excitatory effect on incentive motivation. The psychological origin of that effect is discussed and a computational model integrating this new interpretation is developed. Many features of CRs under partial reinforcement emerge from this model.

  7. Vibration and stress analysis in the presence of structural uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Langley, R S, E-mail: RSL21@eng.cam.ac.u [Department of Engineering, University of Cambridge, Cambridge CB2 1PZ (United Kingdom)

    2009-08-01

    At medium to high frequencies the dynamic response of a built-up engineering system, such as an automobile, can be sensitive to small random manufacturing imperfections. Ideally the statistics of the system response in the presence of these uncertainties should be computed at the design stage, but in practice this is an extremely difficult task. In this paper a brief review of the methods available for the analysis of systems with uncertainty is presented, and attention is then focused on two particular "non-parametric" methods: statistical energy analysis (SEA), and the hybrid method. The main governing equations are presented, and a number of example applications are considered, ranging from academic benchmark studies to industrial design studies.

  9. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    DEFF Research Database (Denmark)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.

    2015-01-01

    uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties....... This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions...

  10. Evaluation of habitat suitability index models by global sensitivity and uncertainty analyses: a case study for submerged aquatic vegetation.

    Science.gov (United States)

    Zajac, Zuzanna; Stith, Bradley; Bowling, Andrea C; Langtimm, Catherine A; Swain, Eric D

    2015-07-01

    Habitat suitability index (HSI) models are commonly used to predict habitat quality and species distributions and are used to develop biological surveys, assess reserve and management priorities, and anticipate possible change under different management or climate change scenarios. Important management decisions may be based on model results, often without a clear understanding of the level of uncertainty associated with model outputs. We present an integrated methodology to assess the propagation of uncertainty from both inputs and structure of the HSI models on model outputs (uncertainty analysis: UA) and relative importance of uncertain model inputs and their interactions on the model output uncertainty (global sensitivity analysis: GSA). We illustrate the GSA/UA framework using simulated hydrology input data from a hydrodynamic model representing sea level changes and HSI models for two species of submerged aquatic vegetation (SAV) in southwest Everglades National Park: Vallisneria americana (tape grass) and Halodule wrightii (shoal grass). We found considerable spatial variation in uncertainty for both species, but distributions of HSI scores still allowed discrimination of sites with good versus poor conditions. Ranking of input parameter sensitivities also varied spatially for both species, with high habitat quality sites showing higher sensitivity to different parameters than low-quality sites. HSI models may be especially useful when species distribution data are unavailable, providing means of exploiting widely available environmental datasets to model past, current, and future habitat conditions. The GSA/UA approach provides a general method for better understanding HSI model dynamics, the spatial and temporal variation in uncertainties, and the parameters that contribute most to model uncertainty. Including an uncertainty and sensitivity analysis in modeling efforts as part of the decision-making framework will result in better-informed, more robust

  12. Uncertainty analysis of statistical downscaling models using general circulation model over an international wetland

    Science.gov (United States)

    Etemadi, H.; Samadi, S.; Sharifikia, M.

    2014-06-01

    The regression-based statistical downscaling model (SDSM) is an appropriate method that is broadly used to resolve the coarse spatial resolution of general circulation models (GCMs). Nevertheless, the assessment of uncertainty propagation linked with climatic variables is essential to any climate change impact study. This study presents a procedure to characterize the uncertainty of two GCM models linked with the Long Ashton Research Station Weather Generator (LARS-WG) and SDSM in one of the most vulnerable international wetlands, namely "Shadegan" in an arid region of Southwest Iran. In the case of daily temperature, uncertainty is estimated by comparing monthly mean and variance of downscaled and observed daily data at a 95 % confidence level. Uncertainties were then evaluated by comparing monthly mean dry and wet spell lengths and their 95 % CI in daily precipitation downscaling using the 1987-2005 interval. The uncertainty results indicated that the LARS-WG is the most proficient model at reproducing various statistical characteristics of observed data within 95 % uncertainty bounds, while the SDSM model is the least capable in this respect. The results of the uncertainty analysis at three different climate stations indicated significantly different climate change responses at the 95 % CI. Finally, the range of plausible climate change projections suggested a need for the decision makers to augment their long-term wetland management plans to reduce its vulnerability to climate change impacts.

  13. Uncertainty Estimation in SiGe HBT Small-Signal Modeling

    DEFF Research Database (Denmark)

    Masood, Syed M.; Johansen, Tom Keinicke; Vidkjær, Jens;

    2005-01-01

    An uncertainty estimation and sensitivity analysis is performed on multi-step de-embedding for SiGe HBT small-signal modeling. The uncertainty estimation in combination with uncertainty model for deviation in measured S-parameters, quantifies the possible error value in de-embedded two...

  14. Uncertainty analysis in WWTP model applications: a critical discussion using an example from design

    DEFF Research Database (Denmark)

    Sin, Gürkan; Gernaey, Krist; Neumann, Marc B.

    2009-01-01

    This study focuses on uncertainty analysis of WWTP models and analyzes the issue of framing and how it affects the interpretation of uncertainty analysis results. As a case study, the prediction of uncertainty involved in model-based design of a wastewater treatment plant is studied. The Monte Ca...

  15. Feature Extraction for Structural Dynamics Model Validation

    Energy Technology Data Exchange (ETDEWEB)

    Farrar, Charles [Los Alamos National Laboratory]; Nishio, Mayuko [Yokohama University]; Hemez, Francois [Los Alamos National Laboratory]; Stull, Chris [Los Alamos National Laboratory]; Park, Gyuhae [Chonnam University]; Cornwell, Phil [Rose-Hulman Institute of Technology]; Figueiredo, Eloi [Universidade Lusófona]; Luscher, D. J. [Los Alamos National Laboratory]; Worden, Keith [University of Sheffield]

    2016-01-13

    As structural dynamics becomes increasingly non-modal, stochastic and nonlinear, finite element model-updating technology must adopt the broader notions of model validation and uncertainty quantification. For example, particular re-sampling procedures must be implemented to propagate uncertainty through a forward calculation, and non-modal features must be defined to analyze nonlinear data sets. The latter topic is the focus of this report, but first, some more general comments regarding the concept of model validation will be discussed.

  16. Equilibrium Assignment Model with Uncertainties in Traffic Demands

    Directory of Open Access Journals (Sweden)

    Aiwu Kuang

    2013-01-01

    Full Text Available In this study, we present an equilibrium traffic assignment model considering uncertainties in traffic demands. The link and route travel time distributions are derived based on the assumption that OD traffic demand follows a log-normal distribution. We postulate that travelers can acquire the variability of route travel times from past experiences and factor such variability into their route choice considerations in the form of mean route travel time. Furthermore, all travelers want to minimize their mean route travel times. We formulate the assignment problem as a variational inequality, which can be solved by a route-based heuristic solution algorithm. Some numerical studies on a small test road network are carried out to validate the proposed model and algorithm; some reasonable results are obtained.
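The effect of log-normal demand on mean route travel time can be illustrated on a single link with a BPR-type delay function. The demand parameters, free-flow time, and capacity below are assumed for illustration, and the mean of a log-normal variable is the closed form exp(mu + sigma^2/2):

```python
import numpy as np

rng = np.random.default_rng(3)

# OD demand follows a log-normal distribution (assumed parameters).
mu, sigma = np.log(1000.0), 0.25
demand = rng.lognormal(mu, sigma, size=50_000)

# Analytical mean of a log-normal variable: exp(mu + sigma^2 / 2).
mean_demand = np.exp(mu + sigma**2 / 2)

# BPR-style link travel time as a function of flow (t0 and capacity assumed).
t0, cap = 10.0, 1200.0
def travel_time(v):
    return t0 * (1 + 0.15 * (v / cap) ** 4)

# Mean travel time over demand realisations. Because the BPR curve is
# convex, this exceeds the travel time at the mean demand (Jensen's
# inequality), which is why demand uncertainty matters for route choice.
mean_tt = travel_time(demand).mean()
```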

  17. Quantifying uncertainty in LCA-modelling of waste management systems

    DEFF Research Database (Denmark)

    Clavreul, Julie; Guyonnet, D.; Christensen, Thomas Højlund

    2012-01-01

    Uncertainty analysis in LCA studies has been subject to major progress over the last years. In the context of waste management, various methods have been implemented but a systematic method for uncertainty analysis of waste-LCA studies is lacking. The objective of this paper is (1) to present...... the sources of uncertainty specifically inherent to waste-LCA studies, (2) to select and apply several methods for uncertainty analysis and (3) to develop a general framework for quantitative uncertainty assessment of LCA of waste management systems. The suggested method is a sequence of four steps combining...

  18. The low-energy structure of the nucleon-nucleon interaction: statistical versus systematic uncertainties

    Science.gov (United States)

    Navarro Pérez, R.; Amaro, J. E.; Ruiz Arriola, E.

    2016-11-01

    We analyze the low-energy nucleon-nucleon (NN) interaction by confronting statistical versus systematic uncertainties. This is carried out with the help of model potentials fitted to the Granada-2013 database where a statistically meaningful partial wave analysis comprising a total of 6713 np and pp published scattering data below 350 MeV from 1950 till 2013 has been made. We extract threshold parameter uncertainties from the coupled-channel effective range expansion up to j ≤ 5. We find that for threshold parameters systematic uncertainties are generally at least an order of magnitude larger than statistical uncertainties. Similar results are found for np phase shifts and amplitude parameters.
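The threshold parameters referred to above are the leading coefficients of the effective range expansion. In standard single-channel form (a generic sketch, not the coupled-channel version used in the paper), for partial wave l:

```latex
k^{2l+1}\cot\delta_l(k) \;=\; -\frac{1}{\alpha_l} \;+\; \frac{1}{2}\, r_l\, k^{2} \;+\; \sum_{n \geq 2} v_{n,l}\, k^{2n}
```

where α_l is the scattering length, r_l the effective range, and the v_{n,l} are the higher-order shape parameters whose statistical and systematic uncertainties are being compared.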

  19. The Effects of Uncertainty in Speed-Flow Curve Parameters on a Large-Scale Model

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2014-01-01

    Uncertainty is inherent in transport models and prevents the use of a deterministic approach when traffic is modeled. Quantifying uncertainty thus becomes an indispensable step to produce a more informative and reliable output of transport models. In traffic assignment models, volume-delay functi...

  20. The climate dependence of the terrestrial carbon cycle; including parameter and structural uncertainties

    Directory of Open Access Journals (Sweden)

    M. J. Smith

    2012-10-01

    Full Text Available The feedback between climate and the terrestrial carbon cycle will be a key determinant of the dynamics of the Earth System over the coming decades and centuries. However, Earth System Model projections of the terrestrial carbon-balance vary widely over these timescales. This is largely due to differences in their carbon cycle models. A major goal in biogeosciences is therefore to improve understanding of the terrestrial carbon cycle to enable better constrained projections. Essential to achieving this goal will be assessing the empirical support for alternative models of component processes, identifying key uncertainties and inconsistencies, and ultimately identifying the models that are most consistent with empirical evidence. To begin meeting these requirements we data-constrained all parameters of all component processes within a global terrestrial carbon model. Our goals were to assess the climate dependencies obtained for different component processes when all parameters have been inferred from empirical data, assess whether these were consistent with current knowledge and understanding, assess the importance of different data sets and the model structure for inferring those dependencies, assess the predictive accuracy of the model, and to identify a methodology by which alternative component models could be compared within the same framework in future. Although formulated as differential equations describing carbon fluxes through plant and soil pools, the model was fitted assuming the carbon pools were in states of dynamic equilibrium (input rates equal output rates). Thus, the parameterised model is of the equilibrium terrestrial carbon cycle. All but 2 of the 12 component processes of the model were inferred to have strong climate dependencies, although it was not possible to data-constrain all parameters, indicating some potentially redundant details. Similar climate dependencies were obtained for most processes whether inferred
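The dynamic-equilibrium fitting assumption can be written for a single carbon pool C with input flux I and a climate-dependent turnover rate k(T, W), where T is temperature and W moisture (a one-pool sketch of the assumption, not the full multi-pool model):

```latex
\frac{dC}{dt} \;=\; I - k(T, W)\,C \;=\; 0
\quad\Longrightarrow\quad
C^{*} \;=\; \frac{I}{k(T, W)}
```

Setting the derivative to zero turns the differential equations into algebraic relations between pool sizes and climate-dependent rates, which is what allows all parameters to be constrained from equilibrium observations.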

  1. Quantile uncertainty and value-at-risk model risk.

    Science.gov (United States)

    Alexander, Carol; Sarabia, José María

    2012-08-01

    This article develops a methodology for quantifying model risk in quantile risk estimates. The application of quantile estimates to risk assessment has become common practice in many disciplines, including hydrology, climate change, statistical process control, insurance and actuarial science, and the uncertainty surrounding these estimates has long been recognized. Our work is particularly important in finance, where quantile estimates (called Value-at-Risk) have been the cornerstone of banking risk management since the mid 1980s. A recent amendment to the Basel II Accord recommends additional market risk capital to cover all sources of "model risk" in the estimation of these quantiles. We provide a novel and elegant framework whereby quantile estimates are adjusted for model risk, relative to a benchmark which represents the state of knowledge of the authority that is responsible for model risk. A simulation experiment in which the degree of model risk is controlled illustrates how to quantify Value-at-Risk model risk and compute the required regulatory capital add-on for banks. An empirical example based on real data shows how the methodology can be put into practice, using only two time series (daily Value-at-Risk and daily profit and loss) from a large bank. We conclude with a discussion of potential applications to nonfinancial risks.
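The basic quantities involved (an empirical Value-at-Risk quantile and an uncertainty-based capital add-on) can be sketched as below. The P&L distribution and the bootstrap add-on rule are illustrative assumptions, not the paper's benchmark methodology:

```python
import numpy as np

rng = np.random.default_rng(11)

# Hypothetical daily P&L series (in $m); heavy-ish tails via Student's t.
pnl = 0.5 * rng.standard_t(df=5, size=2000)

# 1-day 99% Value-at-Risk: the negated 1% quantile of the P&L distribution.
alpha = 0.01
var_99 = -np.quantile(pnl, alpha)

# A crude model-risk add-on: widen VaR by the sampling uncertainty of the
# quantile, estimated here by bootstrap resampling of the P&L series.
boot = np.array([
    -np.quantile(rng.choice(pnl, size=pnl.size, replace=True), alpha)
    for _ in range(500)
])
var_with_addon = var_99 + 1.645 * boot.std()
```

The paper's adjustment is relative to a benchmark model chosen by the regulator; the bootstrap here only captures estimation risk, one of several components of model risk.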

  2. Addressing Conceptual Model Uncertainty in the Evaluation of Model Prediction Errors

    Science.gov (United States)

    Carrera, J.; Pool, M.

    2014-12-01

    Model predictions are uncertain because of errors in model parameters, future forcing terms, and model concepts. The latter remain the largest and most difficult to assess source of uncertainty in long-term model predictions. We first review existing methods to evaluate conceptual model uncertainty. We argue that they are highly sensitive to the ingenuity of the modeler, in the sense that they rely on the modeler's ability to propose alternative model concepts. Worse, we find that the standard practice of stochastic methods leads to poor, potentially biased and often too optimistic, estimation of actual model errors. This is bad news because stochastic methods are purported to properly represent uncertainty. We contend that the problem does not lie in the stochastic approach itself, but in the way it is applied. Specifically, stochastic inversion methodologies, which demand quantitative information, tend to ignore geological understanding, which is conceptually rich. We illustrate some of these problems with the application to the Mar del Plata aquifer, where extensive data are available for nearly a century. Geologically based models, where spatial variability is handled through zonation, yield calibration fits similar to geostatistically based models, but much better predictions. In fact, the appearance of the stochastic T fields is similar to the geologically based models only in areas with a high density of data. We take this finding to illustrate the ability of stochastic models to accommodate many data, but also, ironically, their inability to address conceptual model uncertainty. In fact, stochastic model realizations tend to be too close to the "most likely" one (i.e., they do not really realize the full conceptual uncertainty). The second part of the presentation is devoted to arguing that acknowledging model uncertainty may lead to qualitatively different decisions than just working with "most likely" model predictions. Therefore, efforts should concentrate on

  3. Uncertainty quantification in structural health monitoring: Applications on cultural heritage buildings

    Science.gov (United States)

    Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio

    2016-01-01

    In recent decades, the need for effective seismic protection and vulnerability reduction of cultural heritage buildings and sites has driven a growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can be successfully implemented in some cases as an alternative to interventions, or to control the medium- and long-term effectiveness of already applied strengthening solutions. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and to control the response unless a clear worsening or damaging process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of on-going monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from collected data. In parallel to the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied in order to reduce uncertainties and exploit monitoring results for an effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform the continuous real-time treatment of static data and the identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter out the environmental effects and thermal cycles from the extracted features.
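
    The environmental filtering mentioned at the end can be illustrated with the simplest possible statistical model: an ordinary least-squares fit of a monitored natural frequency against temperature, whose residual is what a damage indicator would track. The data below are synthetic; real SHM pipelines use richer regression or PCA models.

```python
# Synthetic monitoring data: ambient temperature (deg C) and identified frequency (Hz).
temps = [5.0, 10.0, 15.0, 20.0, 25.0]
freqs = [2.10, 2.08, 2.06, 2.04, 2.02]

# Ordinary least squares: freq ~ intercept + slope * temp.
n = len(temps)
mt = sum(temps) / n
mf = sum(freqs) / n
slope = (sum((t - mt) * (f - mf) for t, f in zip(temps, freqs))
         / sum((t - mt) ** 2 for t in temps))
intercept = mf - slope * mt

# Residuals: the temperature-compensated feature a damage detector would monitor.
residuals = [f - (intercept + slope * t) for t, f in zip(temps, freqs)]
```

    With the thermal trend removed, a persistent shift in the residuals flags a structural change rather than a seasonal effect.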

  4. Carbon accounting and economic model uncertainty of emissions from biofuels-induced land use change.

    Science.gov (United States)

    Plevin, Richard J; Beckman, Jayson; Golub, Alla A; Witcover, Julie; O'Hare, Michael

    2015-03-03

    Few of the numerous published studies of the emissions from biofuels-induced "indirect" land use change (ILUC) attempt to propagate and quantify uncertainty, and those that have done so have restricted their analysis to a portion of the modeling systems used. In this study, we pair a global, computable general equilibrium model with a model of greenhouse gas emissions from land-use change to quantify the parametric uncertainty in the paired modeling system's estimates of greenhouse gas emissions from ILUC induced by expanded production of three biofuels. We find that for the three fuel systems examined--US corn ethanol, Brazilian sugar cane ethanol, and US soybean biodiesel--95% of the results occurred within ±20 g CO2e MJ(-1) of the mean (coefficient of variation of 20-45%), with economic model parameters related to crop yield and the productivity of newly converted cropland (from forestry and pasture) contributing most of the variance in estimated ILUC emissions intensity. Although the experiments performed here allow us to characterize parametric uncertainty, changes to the model structure have the potential to shift the mean by tens of grams of CO2e per megajoule and further broaden distributions for ILUC emission intensities.
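
    The reported spread (95% of results within a band around the mean, coefficient of variation of 20-45%) corresponds to simple summary statistics over Monte Carlo draws. A sketch with synthetic numbers, not the study's actual distribution:

```python
import random
import statistics

random.seed(42)
# Synthetic stand-in for Monte Carlo ILUC emission intensities (g CO2e/MJ).
sample = [random.gauss(30.0, 9.0) for _ in range(10_000)]

mean = statistics.fmean(sample)
cv = statistics.stdev(sample) / mean            # coefficient of variation

s = sorted(sample)
lo, hi = s[int(0.025 * len(s))], s[int(0.975 * len(s)) - 1]   # central 95% interval
```

    Variance contributions of individual parameters (e.g., crop yield elasticities) would then be assessed by correlating each sampled input with the output, conditional on the model structure held fixed.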

  5. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; van den Berg, Stéphanie Martine

    2017-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  6. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; Berg, van den Stephanie M.

    2016-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  7. Assessing Model Selection Uncertainty Using a Bootstrap Approach: An Update

    NARCIS (Netherlands)

    Lubke, Gitta H.; Campbell, Ian; McArtor, Dan; Miller, Patrick; Luningham, Justin; Berg, van den Stephanie M.

    2017-01-01

    Model comparisons in the behavioral sciences often aim at selecting the model that best describes the structure in the population. Model selection is usually based on fit indexes such as Akaike’s information criterion (AIC) or Bayesian information criterion (BIC), and inference is done based on the

  8. Aeroelastic Uncertainty Quantification Studies Using the S4T Wind Tunnel Model

    Science.gov (United States)

    Nikbay, Melike; Heeg, Jennifer

    2017-01-01

    This paper originates from the joint efforts of an aeroelastic study team in the Applied Vehicle Technology Panel of the NATO Science and Technology Organization, Task Group AVT-191, titled "Application of Sensitivity Analysis and Uncertainty Quantification to Military Vehicle Design." We present aeroelastic uncertainty quantification studies using the Semi-Span Supersonic Transport (S4T) wind tunnel model at the NASA Langley Research Center. The aeroelastic study team decided to treat both structural and aerodynamic input parameters as uncertain and represent them as samples drawn from statistical distributions, propagating them through aeroelastic analysis frameworks. Uncertainty quantification processes require many function evaluations to assess the impact of variations in numerous parameters on the vehicle characteristics, rapidly increasing the computational time requirement relative to that required to assess a system deterministically. The increased computational time is particularly prohibitive if high-fidelity analyses are employed. As a remedy, the Istanbul Technical University team employed an Euler solver in an aeroelastic analysis framework, and implemented reduced order modeling with Polynomial Chaos Expansion and Proper Orthogonal Decomposition to perform the uncertainty propagation. The NASA team chose to reduce the prohibitive computational time by employing linear solution processes. The NASA team also focused on determining input sample distributions.

  9. Parameter and model uncertainty in a life-table model for fine particles (PM2.5): a statistical modeling study

    Directory of Open Access Journals (Sweden)

    Jantunen Matti J

    2007-08-01

    Full Text Available Abstract Background The estimation of health impacts often involves uncertain input variables and assumptions which have to be incorporated into the model structure. These uncertainties may have significant effects on the results obtained with the model, and, thus, on decision making. Fine particles (PM2.5) are believed to cause major health impacts, and, consequently, uncertainties in their health impact assessment have clear relevance to policy-making. We studied the effects of various uncertain input variables by building a life-table model for fine particles. Methods Life-expectancy of the Helsinki metropolitan area population and the change in life-expectancy due to fine particle exposures were predicted using a life-table model. A number of parameter and model uncertainties were estimated. Sensitivity analysis for input variables was performed by calculating rank-order correlations between input and output variables. The studied model uncertainties were (i) plausibility of mortality outcomes and (ii) lag, and the parameter uncertainties were (iii) exposure-response coefficients for different mortality outcomes, and (iv) exposure estimates for different age groups. The monetary value of the years-of-life-lost was predicted to compare the relative importance of the uncertainties related to monetary valuation with the health effect uncertainties. Results The magnitude of the health effects costs depended mostly on the discount rate, the exposure-response coefficient, and the plausibility of cardiopulmonary mortality. Other mortality outcomes (lung cancer, other non-accidental and infant mortality) and lag had only minor impact on the output. The results highlight the importance of the uncertainties associated with cardiopulmonary mortality in the fine particle impact assessment when compared with other uncertainties. Conclusion When estimating life-expectancy, the estimates used for cardiopulmonary exposure
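
    The sensitivity-analysis step (rank-order correlations between sampled inputs and the model output) can be sketched without external libraries. The input names and the toy health-impact function below are hypothetical, not the study's life-table model:

```python
import random

def ranks(xs):
    """Rank values 1..n (no tie handling; adequate for continuous samples)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks.
    For untied ranks, var(rank_x) == var(rank_y), so cov/var suffices."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    m = (n + 1) / 2
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)
    return cov / var

random.seed(0)
# Hypothetical Monte Carlo inputs: the exposure-response coefficient dominates, lag barely matters.
coef = [random.uniform(0.5, 1.5) for _ in range(2000)]
lag = [random.uniform(0.0, 5.0) for _ in range(2000)]
impact = [c * 100 + 0.5 * l + random.gauss(0, 5) for c, l in zip(coef, lag)]

rho_coef = spearman(coef, impact)   # large: output is driven by this input
rho_lag = spearman(lag, impact)     # near zero: weak influence
```

    Ranking inputs by |rho| reproduces the kind of importance ordering the study reports (exposure-response coefficient high, lag low).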

  10. Impact of AMS-02 Measurements on Reducing GCR Model Uncertainties

    Science.gov (United States)

    Slaba, T. C.; O'Neill, P. M.; Golge, S.; Norbury, J. W.

    2015-01-01

    For vehicle design, shield optimization, mission planning, and astronaut risk assessment, the exposure from galactic cosmic rays (GCR) poses a significant and complex problem both in low Earth orbit and in deep space. To address this problem, various computational tools have been developed to quantify the exposure and risk in a wide range of scenarios. Generally, the tool used to describe the ambient GCR environment provides the input into subsequent computational tools and is therefore a critical component of end-to-end procedures. Over the past few years, several researchers have independently and very carefully compared some of the widely used GCR models to more rigorously characterize model differences and quantify uncertainties. All of the GCR models studied rely heavily on calibrating to available near-Earth measurements of GCR particle energy spectra, typically over restricted energy regions and short time periods. In this work, we first review recent sensitivity studies quantifying the ions and energies in the ambient GCR environment of greatest importance to exposure quantities behind shielding. Currently available measurements used to calibrate and validate GCR models are also summarized within this context. It is shown that the AMS-02 measurements will fill a critically important gap in the measurement database. The emergence of AMS-02 measurements also provides a unique opportunity to validate existing models against measurements that were not used to calibrate free parameters in the empirical descriptions. Discussion is given regarding rigorous approaches to implement the independent validation efforts, followed by recalibration of empirical parameters.

  11. Approximate Uncertainty Modeling in Risk Analysis with Vine Copulas.

    Science.gov (United States)

    Bedford, Tim; Daneshkhah, Alireza; Wilson, Kevin J

    2016-04-01

    Many applications of risk analysis require us to jointly model multiple uncertain quantities. Bayesian networks and copulas are two common approaches to modeling joint uncertainties with probability distributions. This article focuses on new methodologies for copulas by developing the work of Cooke, Bedford, Kurowicka, and others on vines as a way of constructing higher dimensional distributions that do not suffer from some of the restrictions of alternatives such as the multivariate Gaussian copula. The article provides a fundamental approximation result, demonstrating that we can approximate any density as closely as we like using vines. It further operationalizes this result by showing how minimum information copulas can be used to provide parametric classes of copulas that have such good levels of approximation. We extend previous approaches using vines by considering nonconstant conditional dependencies, which are particularly relevant in financial risk modeling. We discuss how such models may be quantified, in terms of expert judgment or by fitting data, and illustrate the approach by modeling two financial data sets.
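
    A vine builds a high-dimensional distribution out of bivariate pair-copulas. The sketch below samples from a single Gaussian pair-copula (uniform margins, dependence controlled by rho), which is the elementary building block of a vine, not a full vine construction:

```python
import math
import random

def gaussian_copula_pair(rho, n, seed=1):
    """Sample (u, v) pairs from a bivariate Gaussian copula.
    Margins are uniform on [0, 1]; dependence is set by the latent normal correlation rho."""
    rng = random.Random(seed)

    def phi(z):  # standard normal CDF
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    out = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0, 1)
        out.append((phi(z1), phi(z2)))
    return out

uv = gaussian_copula_pair(rho=0.8, n=5000)
u = [p[0] for p in uv]
v = [p[1] for p in uv]

# Sample correlation of the uniforms (for rho=0.8 it is close to 0.79, not 0.8).
mu = sum(u) / len(u)
mv = sum(v) / len(v)
corr = (sum((a - mu) * (b - mv) for a, b in zip(u, v))
        / math.sqrt(sum((a - mu) ** 2 for a in u) * sum((b - mv) ** 2 for b in v)))
```

    In a vine, such pair-copulas are stacked over a tree structure, with conditional copulas on the higher trees; the article's extension lets those conditional dependencies vary rather than stay constant.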

  12. Effects of climate model interdependency on the uncertainty quantification of extreme rainfall projections

    DEFF Research Database (Denmark)

    Sunyer Pinya, Maria Antonia; Madsen, H.; Rosbjerg, Dan;

    Changes in rainfall extremes under climate change conditions are subject to numerous uncertainties. One of the most important uncertainties arises from the inherent uncertainty in climate models. In recent years, many efforts have been made in creating large multi-model ensembles of both Regional Climate Models (RCMs) and General Circulation Models (GCMs). These multi-model ensembles provide the information needed to estimate probabilistic climate change projections. Several probabilistic methods have been suggested. One common assumption in most of these methods is that the climate models ... of accounting for the climate model interdependency when estimating the uncertainty of climate change projections.

  13. Feedback versus uncertainty

    NARCIS (Netherlands)

    Van Nooyen, R.R.P.; Hrachowitz, M.; Kolechkina, A.G.

    2014-01-01

    Even without uncertainty about the model structure or parameters, the output of a hydrological model run still contains several sources of uncertainty. These are: measurement errors affecting the input, the transition from continuous time and space to discrete time and space, which causes loss of in

  14. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison

    Energy Technology Data Exchange (ETDEWEB)

    Prestele, Reinhard [Environmental Geography Group, Department of Earth Sciences, Vrije Universiteit Amsterdam, De Boelelaan 1087 1081 HV Amsterdam The Netherlands; Alexander, Peter [School of GeoSciences, University of Edinburgh, Drummond Street Edinburgh EH89XP UK; Rounsevell, Mark D. A. [School of GeoSciences, University of Edinburgh, Drummond Street Edinburgh EH89XP UK; Arneth, Almut [Department Atmospheric Environmental Research (IMK-IFU), Karlsruhe Institute of Technology, Kreuzeckbahnstr. 19 82467 Garmisch-Partenkirchen Germany; Calvin, Katherine [Joint Global Change Research Institute, Pacific Northwest National Laboratory, College Park MD 20740 USA; Doelman, Jonathan [PBL Netherlands Environmental Assessment Agency, P.O. Box 303 3720 AH Bilthoven The Netherlands; Eitelberg, David A. [Environmental Geography Group, Department of Earth Sciences, Vrije Universiteit Amsterdam, De Boelelaan 1087 1081 HV Amsterdam The Netherlands; Engström, Kerstin [Department of Geography and Ecosystem Science, Lund University, Sölvegatan 12 Lund Sweden; Fujimori, Shinichiro [Center for Social and Environmental Systems Research, National Institute for Environmental Studies, 16-2 Onogawa Tsukuba Ibaraki 305-8506 Japan; Hasegawa, Tomoko [Center for Social and Environmental Systems Research, National Institute for Environmental Studies, 16-2 Onogawa Tsukuba Ibaraki 305-8506 Japan; Havlik, Petr [Ecosystem Services and Management Program, International Institute for Applied Systems Analysis, A-2361 Laxenburg Austria; Humpenöder, Florian [Potsdam Institute for Climate Impact Research (PIK), P.O. Box 60 12 03 14412 Potsdam Germany; Jain, Atul K. 
[Department of Atmospheric Sciences, University of Illinois, Urbana IL 61801 USA; Krisztin, Tamás [Ecosystem Services and Management Program, International Institute for Applied Systems Analysis, A-2361 Laxenburg Austria; Kyle, Page [Joint Global Change Research Institute, Pacific Northwest National Laboratory, College Park MD 20740 USA; Meiyappan, Prasanth [Department of Atmospheric Sciences, University of Illinois, Urbana IL 61801 USA; Popp, Alexander [Potsdam Institute for Climate Impact Research (PIK), P.O. Box 60 12 03 14412 Potsdam Germany; Sands, Ronald D. [Resource and Rural Economics Division, Economic Research Service, US Department of Agriculture, Washington DC 20250 USA; Schaldach, Rüdiger [Center for Environmental Systems Research, University of Kassel, Wilhelmshöher Allee 47 D-34109 Kassel Germany; Schüngel, Jan [Center for Environmental Systems Research, University of Kassel, Wilhelmshöher Allee 47 D-34109 Kassel Germany; Stehfest, Elke [PBL Netherlands Environmental Assessment Agency, P.O. Box 303 3720 AH Bilthoven The Netherlands; Tabeau, Andrzej [LEI, Wageningen University and Research Centre, P.O. Box 29703 2502 LS The Hague The Netherlands; Van Meijl, Hans [LEI, Wageningen University and Research Centre, P.O. Box 29703 2502 LS The Hague The Netherlands; Van Vliet, Jasper [Environmental Geography Group, Department of Earth Sciences, Vrije Universiteit Amsterdam, De Boelelaan 1087 1081 HV Amsterdam The Netherlands; Verburg, Peter H. [Environmental Geography Group, Department of Earth Sciences, Vrije Universiteit Amsterdam, De Boelelaan 1087 1081 HV Amsterdam The Netherlands; Swiss Federal Research Institute WSL, Zürcherstrasse 111 CH-8903 Birmensdorf Switzerland

    2016-06-08

    Model-based global projections of future land use and land cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socio-economic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g. boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process as well as improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches and many studies ignore the uncertainty in LULC projections in assessments of
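
    The attribution of uncertainty components via analysis of variance can be sketched in its simplest one-way form, grouping simulations by model and computing the between-group share of total variance. The values below are invented for illustration; the study uses a regression-based attribution over input data, model structure, scenario and a residual:

```python
from statistics import fmean

# Hypothetical projected cropland change (Mha) from simulations tagged by model.
sims = [("A", 10.0), ("A", 12.0), ("B", 20.0), ("B", 22.0), ("C", 30.0), ("C", 28.0)]

grand = fmean(v for _, v in sims)
total_ss = sum((v - grand) ** 2 for _, v in sims)

groups = {}
for m, v in sims:
    groups.setdefault(m, []).append(v)

# Between-group (model structure) sum of squares vs the total.
between_ss = sum(len(vs) * (fmean(vs) - grand) ** 2 for vs in groups.values())
structure_share = between_ss / total_ss   # fraction of variance attributed to model choice
```

    A share near 1 would indicate, as in the hotspots the study identifies, that disagreement among models (rather than within-model spread) dominates the projection uncertainty for that region and LULC type.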

  15. Hotspots of uncertainty in land-use and land-cover change projections: a global-scale model comparison.

    Science.gov (United States)

    Prestele, Reinhard; Alexander, Peter; Rounsevell, Mark D A; Arneth, Almut; Calvin, Katherine; Doelman, Jonathan; Eitelberg, David A; Engström, Kerstin; Fujimori, Shinichiro; Hasegawa, Tomoko; Havlik, Petr; Humpenöder, Florian; Jain, Atul K; Krisztin, Tamás; Kyle, Page; Meiyappan, Prasanth; Popp, Alexander; Sands, Ronald D; Schaldach, Rüdiger; Schüngel, Jan; Stehfest, Elke; Tabeau, Andrzej; Van Meijl, Hans; Van Vliet, Jasper; Verburg, Peter H

    2016-12-01

    Model-based global projections of future land-use and land-cover (LULC) change are frequently used in environmental assessments to study the impact of LULC change on environmental services and to provide decision support for policy. These projections are characterized by a high uncertainty in terms of quantity and allocation of projected changes, which can severely impact the results of environmental assessments. In this study, we identify hotspots of uncertainty, based on 43 simulations from 11 global-scale LULC change models representing a wide range of assumptions of future biophysical and socioeconomic conditions. We attribute components of uncertainty to input data, model structure, scenario storyline and a residual term, based on a regression analysis and analysis of variance. From this diverse set of models and scenarios, we find that the uncertainty varies, depending on the region and the LULC type under consideration. Hotspots of uncertainty appear mainly at the edges of globally important biomes (e.g., boreal and tropical forests). Our results indicate that an important source of uncertainty in forest and pasture areas originates from different input data applied in the models. Cropland, in contrast, is more consistent among the starting conditions, while variation in the projections gradually increases over time due to diverse scenario assumptions and different modeling approaches. Comparisons at the grid cell level indicate that disagreement is mainly related to LULC type definitions and the individual model allocation schemes. We conclude that improving the quality and consistency of observational data utilized in the modeling process and improving the allocation mechanisms of LULC change models remain important challenges. Current LULC representation in environmental assessments might miss the uncertainty arising from the diversity of LULC change modeling approaches, and many studies ignore the uncertainty in LULC projections in assessments of LULC

  16. Influence of Uncertainties on the Dynamic Buckling Loads of Structures Liable to Asymmetric Postbuckling Behavior

    Directory of Open Access Journals (Sweden)

    Paulo B. Gonçalves

    2008-01-01

    Full Text Available Structural systems liable to asymmetric bifurcation usually become unstable at static load levels lower than the linear buckling load of the perfect structure. This is mainly due to the imperfections present in real structures. The imperfection sensitivity of structures under static loading is well studied in the literature, but little is known about the sensitivity of these structures under dynamic loads. The aim of the present work is to study the behavior of an archetypal model of a harmonically forced structure, which exhibits, under increasing static load, asymmetric bifurcation. First, the integrity of the system under static load is investigated in terms of the evolution of the safe basin of attraction. Then, the stability boundaries of the harmonically excited structure are obtained, considering different loading processes. The bifurcations connected with these boundaries are identified and their influence on the evolution of safe basins is investigated. Then, a parametric analysis is conducted to investigate the influence of uncertainties in system parameters and random perturbations of the forcing on the dynamic buckling load. Finally, a safe lower bound for the buckling load, obtained by the application of the Melnikov criterion, is proposed, which compares well with the scatter of buckling loads obtained numerically.

  17. Understanding uncertainty in process-based hydrological models

    Science.gov (United States)

    Clark, M. P.; Kavetski, D.; Slater, A. G.; Newman, A. J.; Marks, D. G.; Landry, C.; Lundquist, J. D.; Rupp, D. E.; Nijssen, B.

    2013-12-01

    Building an environmental model requires making a series of decisions regarding the appropriate representation of natural processes. While some of these decisions can already be based on well-established physical understanding, gaps in our current understanding of environmental dynamics, combined with incomplete knowledge of properties and boundary conditions of most environmental systems, make many important modeling decisions far more ambiguous. There is consequently little agreement regarding what a 'correct' model structure is, especially at relatively larger spatial scales such as catchments and beyond. In current practice, faced with such a range of decisions, different modelers will generally make different modeling decisions, often on an ad hoc basis, based on their balancing of process understanding, the data available to evaluate the model, the purpose of the modeling exercise, and their familiarity with or investment in an existing model infrastructure. This presentation describes development and application of multiple-hypothesis models to evaluate process-based hydrologic models. Our numerical model uses robust solutions of the hydrology and thermodynamic governing equations as the structural core, and incorporates multiple options to represent the impact of different modeling decisions, including multiple options for model parameterizations (e.g., below-canopy wind speed, thermal conductivity, storage and transmission of liquid water through soil, etc.), as well as multiple options for model architecture, that is, the coupling and organization of different model components (e.g., representations of sub-grid variability and hydrologic connectivity, coupling with groundwater, etc.). 
Application of this modeling framework across a collection of different research basins demonstrates that differences among model parameterizations are often overwhelmed by differences among equally-plausible model parameter sets, while differences in model architecture lead

  18. Treatment of precipitation uncertainty in rainfall-runoff modelling: a fuzzy set approach

    Science.gov (United States)

    Maskey, Shreedhar; Guinot, Vincent; Price, Roland K.

    2004-09-01

    The uncertainty in forecasted precipitation remains a major source of uncertainty in real-time flood forecasting. Precipitation uncertainty consists of uncertainty in (i) the magnitude, (ii) the temporal distribution, and (iii) the spatial distribution of the precipitation. This paper presents a methodology for propagating the precipitation uncertainty through a deterministic rainfall-runoff-routing model for flood forecasting. It uses fuzzy set theory combined with genetic algorithms. The uncertainty due to the unknown temporal distribution of the precipitation is represented by disaggregating the precipitation into subperiods. The methodology based on fuzzy set theory is particularly useful where a probabilistic forecast of precipitation is not available. A catchment model of the Klodzko valley (Poland) built with HEC-1 and HEC-HMS was used for the application. The results showed that the output uncertainty due to the uncertain temporal distribution of precipitation can be significantly dominant over the uncertainty due to the uncertain quantity of precipitation.
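
    The fuzzy-set treatment of precipitation magnitude can be illustrated with alpha-cut interval propagation through a monotone toy runoff function. The toy function below stands in for the HEC-1/HEC-HMS catchment model; names and numbers are hypothetical:

```python
def alpha_cut(tri, alpha):
    """Interval of a triangular fuzzy number (low, mode, high) at membership level alpha."""
    low, mode, high = tri
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def runoff(p, c=0.5):
    """Toy runoff function, monotone in precipitation, so interval endpoints map to endpoints."""
    return c * p

# Fuzzy precipitation forecast (mm): "around 50, certainly between 30 and 80".
precip = (30.0, 50.0, 80.0)

# Propagate each alpha-cut interval through the model.
cuts = {a / 4: tuple(runoff(p) for p in alpha_cut(precip, a / 4)) for a in range(5)}
# cuts[1.0] collapses to the crisp estimate; cuts[0.0] spans the full support.
```

    For a non-monotone model, each alpha-cut would instead require an optimization over the input interval, which is where the paper's genetic algorithms come in.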

  19. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I

    CERN Document Server

    Bautista, Manuel A.; Quinet, Pascal; Dunn, Jay; Gull, Theodore R.; Kallman, Timothy R.; Mendoza, Claudio

    2013-01-01

    We present a method for computing uncertainties in spectral models, i.e. level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II].
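
    As a one-line special case of propagating atomic-data uncertainties into a spectral diagnostic, consider first-order error propagation through an emission-line ratio. This scalar sketch stands in for the paper's coupled linear systems over all level populations; the numbers are made up:

```python
import math

def line_ratio_uncertainty(e1, s1, e2, s2):
    """First-order propagation of independent uncertainties through r = e1/e2:
    (sigma_r / r)^2 = (s1/e1)^2 + (s2/e2)^2."""
    r = e1 / e2
    sigma_r = r * math.sqrt((s1 / e1) ** 2 + (s2 / e2) ** 2)
    return r, sigma_r

# Hypothetical line emissivities with 10% and 5% uncertainties inherited from A-values.
ratio, sigma = line_ratio_uncertainty(4.0, 0.4, 2.0, 0.1)
```

    The paper's contribution is to make the emissivity uncertainties themselves outputs of a coupled linear system over the level populations, rather than treating them as independent as this sketch does.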

  20. Uncertainties in Atomic Data and Their Propagation Through Spectral Models. I.

    Science.gov (United States)

    Bautista, M. A.; Fivet, V.; Quinet, P.; Dunn, J.; Gull, T. R.; Kallman, T. R.; Mendoza, C.

    2013-01-01

    We present a method for computing uncertainties in spectral models, i.e., level populations, line emissivities, and emission line ratios, based upon the propagation of uncertainties originating from atomic data. We provide analytic expressions, in the form of linear sets of algebraic equations, for the coupled uncertainties among all levels. These equations can be solved efficiently for any set of physical conditions and uncertainties in the atomic data. We illustrate our method applied to spectral models of O III and Fe II and discuss the impact of the uncertainties on atomic systems under different physical conditions. As to intrinsic uncertainties in theoretical atomic data, we propose that these uncertainties can be estimated from the dispersion in the results from various independent calculations. This technique provides excellent results for the uncertainties in A-values of forbidden transitions in [Fe II]. Key words: atomic data - atomic processes - line: formation - methods: data analysis - molecular data - molecular processes - techniques: spectroscopic

  1. Using the Community Land Model to Assess Uncertainty in Basin Scale GRACE-Based Groundwater Estimates

    Science.gov (United States)

    Swenson, S. C.; Lawrence, D. M.

    2015-12-01

    One method for interpreting the variability in total water storage observed by GRACE is to partition the integrated GRACE measurement into its component storage reservoirs based on information provided by hydrological models. Such models, often designed to be used in coupled Earth System models, simulate the stocks and fluxes of moisture through the land surface and subsurface. One application of this method attempts to isolate groundwater changes by removing modeled surface water, snow, and soil moisture changes from GRACE total water storage estimates. Human impacts on groundwater variability can be estimated by further removing model estimates of climate-driven groundwater changes. Errors in modeled water storage components directly affect the residual groundwater estimates. Here we examine the influence of model structure and process representation on soil moisture and groundwater uncertainty using the Community Land Model, with a particular focus on basins in the western U.S.
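
    The partitioning described above is arithmetic: groundwater is whatever remains of the GRACE total after the modeled components are subtracted. A sketch with made-up anomaly values (not real GRACE or CLM output):

```python
def residual_groundwater(tws, surface_water, snow, soil_moisture):
    """Groundwater anomaly as the residual of GRACE total water storage after
    removing model-simulated components (all in cm equivalent water height)."""
    return tws - surface_water - snow - soil_moisture

# Illustrative monthly anomalies (cm).
gw = residual_groundwater(tws=-5.0, surface_water=-1.0, snow=0.5, soil_moisture=-2.0)
```

    Because the relation is a plain difference, any error in a modeled term maps one-to-one into the groundwater residual, which is why model structure matters so directly here.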

  2. Uncertainty in urban flood damage assessment due to urban drainage modelling and depth-damage curve estimation.

    Science.gov (United States)

    Freni, G; La Loggia, G; Notaro, V

    2010-01-01

    Due to the increased occurrence of flooding events in urban areas, many procedures for flood damage quantification have been defined in recent decades. The lack of large databases in most cases is overcome by combining the output of urban drainage models and damage curves linking flooding to expected damage. The application of advanced hydraulic models as diagnostic, design and decision-making support tools has become a standard practice in hydraulic research and application. Flooding damage functions are usually evaluated by a priori estimation of potential damage (based on the value of exposed goods) or by interpolating real damage data (recorded during historical flooding events). Hydraulic models have undergone continuous advancements, pushed forward by increasing computer capacity. The details of the flooding propagation process on the surface and the details of the interconnections between underground and surface drainage systems have been studied extensively in recent years, resulting in progressively more reliable models. The same level of advancement has not been reached with regard to damage curves, for which improvements are highly connected to data availability; this remains the main bottleneck in expected flooding damage estimation. Such functions are usually affected by significant uncertainty, intrinsically related to the collected data and to the simplified structure of the adopted functional relationships. The present paper aimed to evaluate this uncertainty by comparing the intrinsic uncertainty connected to the construction of the depth-damage function to the hydraulic model uncertainty. In this way, the paper sought to evaluate the role of hydraulic model detail level in the wider context of flood damage estimation. This paper demonstrated that the use of detailed hydraulic models might not be justified because of the higher computational cost and the significant uncertainty in damage estimation curves. This uncertainty occurs mainly

  3. What factor generates greater uncertainty in predicting carbon flux for North America: climate characterization or model choice?

    Science.gov (United States)

    Dungan, J.; Wang, W.; Micaelis, A.; Nemani, R.

    2008-12-01

    Numerous efforts have begun to characterize a variety of sources of uncertainty in carbon flux estimates from both forward-modeling and inverse modeling approaches. One source of uncertainty is structural, created by the variety of approaches taken to select and characterize the most important biogeochemical processes. To begin to explore this structural uncertainty, we have used an ensemble of well-known models including CASA (Potter et al. (1993), version 2003.04.29), LPJ (Sitch et al. (2003), version 3.1.1-0.9.02), and BGC (White et al. (2000), version 5.0) with a consistent set of inputs for the period 1982-2006 for North America. Initially, the ensemble was run using input climate data interpolated from maximum, minimum and dew-point temperatures, precipitation, vapor pressure deficit, and incident daily solar radiation at stations from the National Climate Data Center's Global Summary of the Day, incorporating on average about 1900 stations. NCDC's Cooperative Summary of the Day data, available over the United States only, yielded a combined data set of approximately 9000 stations that was then used for the ensemble runs. The combined data set resulted in a significantly wetter surface than with the sparser set, resulting in noticeably larger gross primary production (GPP) estimates by models in the ensemble. Mexico and Canada remain significantly undersampled. Uncertainty due to the choice of a relatively sparse or dense station network was smaller than the structural uncertainty due to model choice.

  4. Uncertainty Quantification for Complex RF-structures Using the State-space Concatenation Approach

    CERN Document Server

    Heller, Johann; Schmidt, Christian; Van Rienen, Ursula

    2015-01-01

    as well as to employ robust optimizations, a so-called uncertainty quantification (UQ) is applied. For large and complex structures such computations are heavily demanding and cannot be carried out using standard brute-force approaches. In this paper, we propose a combination of established techniques to perform UQ for long and complex structures, where the uncertainty is located only in parts of the structure. As exemplary structure, we investigate the third-harmonic cavity, which is being used at the FLASH accelerator at DESY, assuming an uncertain...

  5. Hydrological model parameter dimensionality is a weak measure of prediction uncertainty

    Directory of Open Access Journals (Sweden)

    S. Pande

    2015-04-01

    Full Text Available This paper shows that the instability of a hydrological system representation in response to different pieces of information, and the associated prediction uncertainty, is a function of model complexity. After demonstrating the connection between unstable model representation and model complexity, complexity is analyzed in a step-by-step manner. This is done by measuring the differences between simulations of a model under different realizations of input forcings. Algorithms are then suggested to estimate model complexity. Model complexities of two model structures, SAC-SMA (Sacramento Soil Moisture Accounting) and its simplified version SIXPAR (Six Parameter Model), are computed on resampled input data sets from basins that span the continental US. The model complexities for SIXPAR are estimated for various parameter ranges. It is shown that the complexity of SIXPAR increases with lower storage capacity and/or higher recession coefficients. Thus it is argued that a conceptually simple model structure, such as SIXPAR, can be more complex than an intuitively more complex model structure, such as SAC-SMA, for certain parameter ranges. We therefore contend that the magnitudes of feasible model parameters influence the complexity of the model selection problem just as parameter dimensionality (number of parameters) does, and that parameter dimensionality is an incomplete indicator of the stability of hydrological model selection and prediction problems.

  6. Uncertainty based modeling of rainfall-runoff: Combined differential evolution adaptive Metropolis (DREAM) and K-means clustering

    Science.gov (United States)

    Zahmatkesh, Zahra; Karamouz, Mohammad; Nazif, Sara

    2015-09-01

    Simulation of the rainfall-runoff process in urban areas is of great importance considering the consequences and damages of extreme runoff events and floods. The first issue in flood hazard analysis is rainfall simulation. Large scale climate signals have proved effective in rainfall simulation and prediction. In this study, an integrated scheme is developed for rainfall-runoff modeling considering different sources of uncertainty. This scheme includes three main steps: rainfall forecasting, rainfall-runoff simulation and future runoff prediction. In the first step, data driven models are developed and used to forecast rainfall using large scale climate signals as rainfall predictors. Because different sources of uncertainty strongly affect the output of hydrologic models, in the second step the uncertainty associated with input data, model parameters and model structure is incorporated in rainfall-runoff modeling and simulation. Three rainfall-runoff simulation models are developed for consideration of model conceptual (structural) uncertainty in real time runoff forecasting. To analyze the uncertainty of the model structure, streamflows generated by alternative rainfall-runoff models are combined through a weighting method based on K-means clustering. Model parameter and input uncertainty are investigated using an adaptive Markov Chain Monte Carlo method. Finally, the calibrated rainfall-runoff models are driven using the forecasted rainfall to predict future runoff for the watershed. The proposed scheme is employed in the case study of the Bronx River watershed, New York City. Results of the uncertainty analysis reveal that simultaneous estimation of model parameters and input uncertainty significantly changes the probability distribution of the model parameters. It is also observed that by combining the outputs of the hydrological models using the proposed clustering scheme, the accuracy of runoff simulation in the
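The combination step can be sketched in simplified form. The paper's scheme clusters model outputs with K-means; the toy version below is an assumption-laden stand-in that instead weights each rainfall-runoff model by its inverse RMSE against observations and blends the simulated hydrographs (all hydrograph values are invented).

```python
def rmse(sim, obs):
    # Root mean square error between a simulated and an observed hydrograph
    return (sum((s - o) ** 2 for s, o in zip(sim, obs)) / len(obs)) ** 0.5

def inverse_error_weights(simulations, obs):
    # Better-performing models (lower RMSE) receive larger weights;
    # weights are normalised to sum to one.
    inv = [1.0 / rmse(sim, obs) for sim in simulations]
    total = sum(inv)
    return [w / total for w in inv]

def combine(simulations, weights):
    # Weighted average of the ensemble members at each time step
    return [sum(w * sim[t] for w, sim in zip(weights, simulations))
            for t in range(len(simulations[0]))]

obs = [10.0, 12.0, 15.0, 11.0]
sims = [[9.5, 12.5, 14.0, 11.5],   # model close to observations
        [6.0, 18.0, 20.0, 5.0]]    # model far from observations
w = inverse_error_weights(sims, obs)
blended = combine(sims, w)
```

The blended hydrograph is pulled toward the better model, which is the basic idea behind any performance-based multi-model combination.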

  7. Effect of Baseflow Separation on Uncertainty of Hydrological Modeling in the Xinanjiang Model

    Directory of Open Access Journals (Sweden)

    Kairong Lin

    2014-01-01

    Full Text Available Based on the idea of inputting more available useful information for evaluation to gain less uncertainty, this study focuses on how well the uncertainty can be reduced by considering the baseflow estimation information obtained from the smoothed minima method (SMM). The Xinanjiang model and the generalized likelihood uncertainty estimation (GLUE) method with the shuffled complex evolution Metropolis (SCEM-UA) sampling algorithm were used for hydrological modeling and uncertainty analysis, respectively. The Jiangkou basin, located in the upper reaches of the Hanjiang River, was selected as the case study. It was found that the number and standard deviation of behavioral parameter sets both decreased when the threshold value for the baseflow efficiency index increased, and that high Nash-Sutcliffe efficiency coefficients corresponded well with high baseflow efficiency coefficients. The results also showed that the uncertainty interval width decreased significantly, while the containing ratio did not decrease by much, and the simulated runoff with the behavioral parameter sets fit the observed runoff better, when the threshold for the baseflow efficiency index was taken into consideration. These results imply that using baseflow estimation information can reduce the uncertainty in hydrological modeling to some degree and yield more reasonable prediction bounds.
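The GLUE procedure used here reduces to a short sketch: sample parameter sets from a prior range, score each simulation with a likelihood measure (the Nash-Sutcliffe efficiency below), and retain only the "behavioral" sets above a threshold. The one-parameter runoff model, parameter range, and data are invented for illustration.

```python
import random

def nash_sutcliffe(sim, obs):
    # NS = 1 - sum((sim - obs)^2) / sum((obs - mean(obs))^2)
    mean_obs = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for s, o in zip(sim, obs))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - num / den

def toy_model(k, rainfall):
    # Hypothetical one-parameter runoff model: runoff = k * rainfall
    return [k * r for r in rainfall]

def glue(obs, rainfall, n=2000, threshold=0.7, seed=0):
    rng = random.Random(seed)
    behavioral = []
    for _ in range(n):
        k = rng.uniform(0.1, 1.0)              # prior parameter range
        ns = nash_sutcliffe(toy_model(k, rainfall), obs)
        if ns >= threshold:                    # keep only behavioral sets
            behavioral.append((k, ns))
    return behavioral

rainfall = [5.0, 20.0, 12.0, 3.0, 30.0]
obs = [2.6, 10.1, 6.0, 1.4, 15.2]              # consistent with k near 0.5
kept = glue(obs, rainfall)
```

Prediction bounds then come from the spread of simulations produced by the retained parameter sets; raising the threshold (or adding a second criterion such as a baseflow efficiency index, as in the paper) shrinks the behavioral set and narrows the bounds.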

  8. A Random Matrix Approach for Quantifying Model-Form Uncertainties in Turbulence Modeling

    CERN Document Server

    Xiao, Heng; Ghanem, Roger G

    2016-01-01

    With the ever-increasing use of Reynolds-Averaged Navier--Stokes (RANS) simulations in mission-critical applications, the quantification of model-form uncertainty in RANS models has attracted attention in the turbulence modeling community. Recently, a physics-based, nonparametric approach for quantifying model-form uncertainty in RANS simulations has been proposed, where Reynolds stresses are projected to physically meaningful dimensions and perturbations are introduced only in the physically realizable limits. However, a challenge associated with this approach is to assess the amount of information introduced in the prior distribution and to avoid imposing unwarranted constraints. In this work we propose a random matrix approach for quantifying model-form uncertainties in RANS simulations with the realizability of the Reynolds stress guaranteed. Furthermore, the maximum entropy principle is used to identify the probability distribution that satisfies the constraints from available information but without int...

  9. Impact of uncertainty description on assimilating hydraulic head in the MIKE SHE distributed hydrological model

    Science.gov (United States)

    Zhang, Donghua; Madsen, Henrik; Ridler, Marc E.; Refsgaard, Jens C.; Jensen, Karsten H.

    2015-12-01

    The ensemble Kalman filter (EnKF) is a popular data assimilation (DA) technique that has been extensively used in environmental sciences for combining complementary information from model predictions and observations. One of the major challenges in EnKF applications is the description of model uncertainty. In most hydrological EnKF applications, an ad hoc model uncertainty is defined with the aim of avoiding a collapse of the filter. The present work provides a systematic assessment of model uncertainty in DA applications based on combinations of forcing, model parameters, and state uncertainties. This is tested in a case where groundwater hydraulic heads are assimilated into a distributed and integrated catchment-scale model of the Karup catchment in Denmark. A series of synthetic data assimilation experiments are carried out to analyse the impact of different model uncertainty assumptions on the feasibility and efficiency of the assimilation. The synthetic data used in the assimilation study makes it possible to diagnose model uncertainty assumptions statistically. Besides the model uncertainty, other factors such as observation error, observation locations, and ensemble size are also analysed with respect to performance and sensitivity. Results show that inappropriate definition of model uncertainty can greatly degrade the assimilation performance, and an appropriate combination of different model uncertainty sources is advised.
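The EnKF analysis step at the heart of such assimilation can be sketched for a single directly observed state. The code below is a minimal scalar illustration with invented numbers, not the MIKE SHE setup: each ensemble member is nudged toward a perturbed copy of the observation using the sample-variance Kalman gain.

```python
import random

def enkf_update(ensemble, obs, obs_sd, rng):
    # ensemble: list of scalar states (e.g. hydraulic head at one well)
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_sd ** 2)   # Kalman gain for a direct observation
    # Perturbed-observation update: each member gets its own noisy copy
    # of the observation, which preserves the ensemble spread statistics.
    return [x + gain * (rng.gauss(obs, obs_sd) - x) for x in ensemble]

rng = random.Random(42)
prior = [rng.gauss(50.0, 2.0) for _ in range(100)]   # prior heads, metres
posterior = enkf_update(prior, obs=53.0, obs_sd=0.5, rng=rng)
post_mean = sum(posterior) / len(posterior)
```

The gain, and hence how far the ensemble moves toward the data, is set by the ratio of model to observation uncertainty, which is exactly why the model uncertainty definition studied in the paper matters so much: an overconfident (too tight) ensemble yields a small gain and the filter ignores the observations.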

  10. Uncertainty and Variation of Vibration in Lightweight Structures

    DEFF Research Database (Denmark)

    Dickow, Kristoffer Ahrens

    2012-01-01

    Multi-family dwellings and offices built from lightweight materials are becoming a cost-efficient and environmentally friendly alternative to traditional heavy structures...

  11. Evaluation of Parameter Uncertainty Reduction in Groundwater Flow Modeling Using Multiple Environmental Tracers

    Science.gov (United States)

    Arnold, B. W.; Gardner, P.

    2013-12-01

    years is similar to the range of transport times (hundreds to thousands of years) in the heterogeneous synthetic aquifer domain. The slightly higher uncertainty range for the case using all of the environmental tracers simultaneously is probably due to structural errors in the model introduced by the pilot point regularization scheme. It is concluded that maximum information and uncertainty reduction for constraining a groundwater flow model is obtained using an environmental tracer whose half-life is well matched to the range of transport times through the groundwater flow system. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  12. Uncertainty modeling dedicated to professor Boris Kovalerchuk on his anniversary

    CERN Document Server

    2017-01-01

    This book commemorates the 65th birthday of Dr. Boris Kovalerchuk, and reflects many of the research areas covered by his work. It focuses on data processing under uncertainty, especially fuzzy data processing, when uncertainty comes from the imprecision of expert opinions. The book includes 17 authoritative contributions by leading experts.

  13. A performance weighting procedure for GCMs based on explicit probabilistic models and accounting for observation uncertainty

    Science.gov (United States)

    Renard, Benjamin; Vidal, Jean-Philippe

    2016-04-01

    In recent years, the climate modeling community has put a lot of effort into releasing the outputs of multimodel experiments for use by the wider scientific community. In such experiments, several structurally distinct GCMs are run using the same observed forcings (for the historical period) or the same projected forcings (for the future period). In addition, several members are produced for a single given model structure, by running each GCM with slightly different initial conditions. This multiplicity of GCM outputs offers many opportunities in terms of uncertainty quantification or GCM comparisons. In this presentation, we propose a new procedure to weight GCMs according to their ability to reproduce the observed climate. Such weights can be used to combine the outputs of several models in a way that rewards good-performing models and discards poorly-performing ones. The proposed procedure has the following main properties: 1. It is based on explicit probabilistic models describing the time series produced by the GCMs and the corresponding historical observations, 2. It can use several members whenever available, 3. It accounts for the uncertainty in observations, 4. It assigns a weight to each GCM (all weights summing up to one), 5. It can also assign a weight to the "H0 hypothesis" that all GCMs in the multimodel ensemble are not compatible with observations. The application of the weighting procedure is illustrated with several case studies including synthetic experiments, simple cases where the target GCM output is a simple univariate variable and more realistic cases where the target GCM output is a multivariate and/or a spatial variable. These case studies illustrate the generality of the procedure which can be applied in a wide range of situations, as long as the analyst is prepared to make an explicit probabilistic assumption on the target variable. Moreover, these case studies highlight several interesting properties of the weighting procedure. In

  14. Assessment of model behavior and acceptable forcing data uncertainty in the context of land surface soil moisture estimation

    Science.gov (United States)

    Dumedah, Gift; Walker, Jeffrey P.

    2017-03-01

    The sources of uncertainty in land surface models are numerous and varied, from inaccuracies in forcing data to uncertainties in model structure and parameterizations. The majority of these uncertainties are strongly tied to the overall makeup of the model, but the input forcing data set is independent, with its accuracy usually defined by the monitoring or observation system. The impact of input forcing data on model estimation accuracy has been collectively acknowledged to be significant, yet its quantification and the level of uncertainty that is acceptable in the context of the land surface model to obtain a competitive estimation remain mostly unknown. A better understanding is needed of how models respond to input forcing data and what changes in these forcing variables can be accommodated without deteriorating the optimal estimation of the model. As a result, this study determines the level of forcing data uncertainty that is acceptable in the Joint UK Land Environment Simulator (JULES) to competitively estimate soil moisture in the Yanco area in south-eastern Australia. The study employs hydro-genomic mapping to examine the temporal evolution of model decision variables from an archive of values obtained from soil moisture data assimilation. The data assimilation (DA) was undertaken using the advanced Evolutionary Data Assimilation. Our findings show that the input forcing data have a significant impact on model output: 35% in root mean square error (RMSE) for the 5 cm depth of soil moisture and 15% in RMSE for the 15 cm depth. This specific quantification is crucial to illustrate the significance of the input forcing data spread. The acceptable uncertainty determined based on the dominant pathway has been validated and shown to be reliable for all forcing variables, so as to provide optimal soil moisture. These findings are crucial for DA in order to account for uncertainties that are meaningful from the model standpoint.
Moreover, our results point to a proper

  15. The role of uncertainty in supply chains under dynamic modeling

    Directory of Open Access Journals (Sweden)

    M. Fera

    2017-01-01

    Full Text Available The uncertainty in the supply chains (SCs) of manufacturing and services firms is going to be, over the coming decades, more important for the companies that are called to compete in a new globalized economy. Risky situations for manufacturing are considered in trying to identify the optimal positioning of the order penetration point (OPP), i.e. the best level to which information about the client's order penetrates back through the several supply chain (SC) phases: engineering, procurement, production and distribution. This work aims at defining a system dynamics model to assess the competitiveness coming from positioning the order at different SC locations. A Taguchi analysis has been implemented to create a decision map for identifying possible strategic decisions under different scenarios and with alternatives for the order location in the SC levels. Centralized and decentralized strategies for SC integration are discussed. In the model proposed, the location of the OPP is influenced by demand variation, production time, stock-outs and stock amount. The results of this research are as follows: (i) customer-oriented strategies are preferable under high volatility of demand, (ii) production-focused strategies are suggested when the probability of stock-outs is high, (iii) no specific location is preferable if a centralized control architecture is implemented, (iv) centralization requires cooperation among partners to achieve the SC optimum point, (v) the producer must not prefer the OPP location at the retailer level when the general strategy is focused on a decentralized approach.

  16. Evaluation of Spatial Uncertainties In Modeling of Cadastral Systems

    Science.gov (United States)

    Fathi, Morteza; Teymurian, Farideh

    2013-04-01

    Cadastre plays an essential role in sustainable development, especially in developing countries like Iran. A well-developed Cadastre results in transparency of the estate tax system, transparency of estate data, a reduction of actions before the courts, and effective management of estates, natural resources and the environment. A multipurpose Cadastre, through the gathering of other related data, has a vital role in civil, economic and social programs and projects. Iran has been performing Cadastre for many years, but success in this program is subject to correct geometric and descriptive data of estates. Since there are various sources of data with different accuracy and precision in Iran, some difficulties and uncertainties exist in modeling the geometric part of the Cadastre, such as inconsistency between the data in deeds and the Cadastral map, which causes trouble in the execution of the Cadastre and results in the loss of national and natural resources and the rights of the nation. At present there is no uniform and effective technical method for resolving such conflicts. This article describes various aspects of such conflicts in the geometric part of the Cadastre and suggests a solution through some modeling tools of GIS.

  17. Modeling Uncertainty of Directed Movement via Markov Chains

    Directory of Open Access Journals (Sweden)

    YIN Zhangcai

    2015-10-01

    Full Text Available Probabilistic time geography (PTG) is suggested as an extension of (classical) time geography, in order to express by probability the uncertainty of an agent being located at an accessible position. This may provide a quantitative basis for finding the most likely location of an agent. In recent years, PTG based on the normal distribution or the Brownian bridge has been proposed; its variance, however, is either unrelated to the agent's speed or diverges as the speed increases, so these models struggle to combine applicability and stability. In this paper, a new method is proposed to model PTG based on Markov chains. Firstly, a bidirectionally conditioned Markov chain is modeled, whose limit, when the moving speed is large enough, can be regarded as the Brownian bridge, and which thus has the property of numerical stability. Then, the directed movement is mapped to Markov chains. The essential part is to build the step length, the state space and the transition matrix of the Markov chain according to the space-time position and speed information of the directed movement, so that the Markov chain is related to the movement speed. Finally, by continuously calculating the probability distribution of the directed movement at any time with the Markov chain, the probability of the agent being located at an accessible position can be obtained. Experimental results show that the variance based on Markov chains is not only related to speed, but also tends towards stability as the agent's maximum speed increases.
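The probability propagation over the chain's state space can be sketched as repeated application of the transition matrix. The three-state example below uses an invented transition matrix biased toward higher-index states (the "directed" movement) and evolves the location distribution step by step.

```python
def propagate(p, P, steps):
    # Evolve a probability row vector p through `steps` applications of
    # the transition matrix P: p_{t+1}[j] = sum_i p_t[i] * P[i][j].
    for _ in range(steps):
        p = [sum(p[i] * P[i][j] for i in range(len(p)))
             for j in range(len(p))]
    return p

# States 0..2 along a path; each row sums to one and movement is
# biased toward higher-index states (rightward drift).
P = [[0.2, 0.7, 0.1],
     [0.1, 0.3, 0.6],
     [0.0, 0.1, 0.9]]
p0 = [1.0, 0.0, 0.0]          # agent starts at state 0 with certainty
p5 = propagate(p0, P, 5)
```

Each entry of the resulting vector is the probability of finding the agent at that position after five steps, which is exactly the quantity the PTG model reads off its chain.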

  18. Fine Structure Constant, Domain Walls, and Generalized Uncertainty Principle in the Universe

    Directory of Open Access Journals (Sweden)

    Luigi Tedesco

    2011-01-01

    Full Text Available We study the corrections to the fine structure constant from the generalized uncertainty principle in the spacetime of a domain wall. We also calculate the corrections to the standard formula to the energy of the electron in the hydrogen atom to the ground state, in the case of spacetime of a domain wall and generalized uncertainty principle. The results generalize the cases known in literature.

  19. Power system transient stability simulation under uncertainty based on Taylor model arithmetic

    Institute of Scientific and Technical Information of China (English)

    Shouxiang WANG; Zhijie ZHENG; Chengshan WANG

    2009-01-01

    Taylor model arithmetic is introduced to deal with uncertainty. The uncertainty of model parameters is described by Taylor models and each variable in the functions is replaced with a Taylor model (TM). Thus, time domain simulation under uncertainty is transformed into the integration of TM-based differential equations. In this paper, the Taylor series method is employed to compute the differential equations; moreover, power system time domain simulation under uncertainty based on the Taylor model method is presented. This method allows a rigorous estimation of the influence of either form of uncertainty and needs only one simulation. It is computationally fast compared with the Monte Carlo method, which is another technique for uncertainty analysis. The proposed method has been tested on the 39-bus New England system. The test results illustrate the effectiveness and practical value of the approach by comparison with the results of Monte Carlo simulation and traditional time domain simulation.
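The contrast drawn with Monte Carlo can be illustrated with a scalar stand-in: a first-order Taylor (sensitivity) bound on one integration step of a toy system, checked against Monte Carlo samples. This is a simplified analogue of the idea, not the rigorous Taylor model arithmetic of the paper; the system and numbers are invented.

```python
import random

DT = 0.1

def step(k, x):
    # One explicit-Euler step of the toy dynamic system dx/dt = -k*x
    return x + DT * (-k * x)

def taylor_bound(k0, dk, x0):
    # First-order Taylor estimate of the output interval when k is
    # uncertain within [k0-dk, k0+dk]. Exact here because step() is
    # linear in k; in general a remainder interval would be needed.
    h = 1e-6
    dfdk = (step(k0 + h, x0) - step(k0 - h, x0)) / (2 * h)
    y0 = step(k0, x0)
    return y0 - abs(dfdk) * dk, y0 + abs(dfdk) * dk

# One deterministic evaluation gives the bound...
lo, hi = taylor_bound(k0=0.5, dk=0.1, x0=1.0)
# ...whereas Monte Carlo needs many repeated simulations.
rng = random.Random(3)
mc = [step(rng.uniform(0.4, 0.6), 1.0) for _ in range(1000)]
```

The single Taylor evaluation encloses all 1000 Monte Carlo outcomes, which is the computational advantage the abstract points to.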

  20. Predicting Consumer Biomass, Size-Structure, Production, Catch Potential, Responses to Fishing and Associated Uncertainties in the World's Marine Ecosystems.

    Science.gov (United States)

    Jennings, Simon; Collingridge, Kate

    2015-01-01

    Existing estimates of fish and consumer biomass in the world's oceans are disparate. This creates uncertainty about the roles of fish and other consumers in biogeochemical cycles and ecosystem processes, the extent of human and environmental impacts and fishery potential. We develop and use a size-based macroecological model to assess the effects of parameter uncertainty on predicted consumer biomass, production and distribution. Resulting uncertainty is large (e.g. median global biomass 4.9 billion tonnes for consumers weighing 1 g to 1000 kg; 50% uncertainty intervals of 2 to 10.4 billion tonnes; 90% uncertainty intervals of 0.3 to 26.1 billion tonnes) and driven primarily by uncertainty in trophic transfer efficiency and its relationship with predator-prey body mass ratios. Even the upper uncertainty intervals for global predictions of consumer biomass demonstrate the remarkable scarcity of marine consumers, with less than one part in 30 million by volume of the global oceans comprising tissue of macroscopic animals. Thus the apparently high densities of marine life seen in surface and coastal waters and frequently visited abundance hotspots will likely give many in society a false impression of the abundance of marine animals. Unexploited baseline biomass predictions from the simple macroecological model were used to calibrate a more complex size- and trait-based model to estimate fisheries yield and impacts. Yields are highly dependent on baseline biomass and fisheries selectivity. Predicted global sustainable fisheries yield increases ≈4 fold when smaller individuals (biomass and production estimates, which have yet to be achieved with complex models, and will therefore help to highlight priorities for future research and data collection. However, the focus on simple model structures and global processes means that non-phytoplankton primary production and several groups, structures and processes of ecological and conservation interest are not represented

  1. Statistical quantification of the uncertainty in transmissibility feature for structural condition binary classification

    Science.gov (United States)

    Mao, Zhu; Todd, Michael

    2011-04-01

    Transmissibility-related features are one class of indicators used to detect structural defects, especially because of their sensitivity to local changes. In this paper, we consider a SIMO identification model and regard the change in transmissibility as a feature indicating damage occurrence. Both the inherent randomness in the system identification process and noise contamination (or other types of measurement/sampling/quantization variability) are included as sources of transmissibility uncertainty. The uncertainty quantification is necessary to classify the measurements into either undamaged or damaged (binary) conditions with a better understanding of Type I/II trade-offs. A sensitivity study is presented in this paper, in which Receiver Operating Characteristic (ROC) curves for individual frequency lines are given for different damage levels and extraneous noise levels, and the Area Under Curve (AUC) is evaluated as the key performance metric across the entire frequency domain. The paper concludes that regions near resonance have the best hypothesis test performance in terms of sensitivity and specificity.
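The AUC metric used above has a simple rank interpretation: the probability that a randomly chosen damaged-case feature value exceeds a randomly chosen undamaged one. A brute-force sketch, with invented feature values standing in for transmissibility changes:

```python
def roc_auc(undamaged, damaged):
    # Mann-Whitney (rank-sum) formulation of the area under the ROC
    # curve: count pairs where the damaged score exceeds the undamaged
    # one, with ties counting half.
    wins = sum(1.0 if d > u else 0.5 if d == u else 0.0
               for u in undamaged for d in damaged)
    return wins / (len(undamaged) * len(damaged))

undamaged = [0.10, 0.12, 0.15, 0.11, 0.13]   # feature under no damage
damaged = [0.14, 0.18, 0.22, 0.16, 0.19]     # feature shifted by damage
auc = roc_auc(undamaged, damaged)
```

An AUC of 0.5 means the feature cannot separate the two conditions at all, while 1.0 means perfect separation at some threshold; scanning this value across frequency lines is what identifies the near-resonance regions as the best performers.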

  2. Pathways from uncertainty to anxiety: An evaluation of a hierarchical model of trait and disorder-specific intolerance of uncertainty on anxiety disorder symptoms.

    Science.gov (United States)

    Shihata, Sarah; McEvoy, Peter M; Mullan, Barbara A

    2017-01-01

    Uncertainty is central to anxiety-related pathology and intolerance of uncertainty (IU) appears to be a transdiagnostic risk and maintaining factor. The aim of the present study was to evaluate a hierarchical model to identify the unique contributions of trait and disorder-specific IU (i.e., uncertainty specific to generalised anxiety disorder, social anxiety, obsessive compulsive disorder, and panic disorder) to disorder-specific symptoms, beyond other disorder-specific cognitive vulnerabilities (i.e., negative metacognitive beliefs, fear of negative evaluation, inflated responsibility, and agoraphobic cognitions, respectively). Participants (N=506) completed a battery of online questionnaires. Structural equation modelling was used to evaluate model fit, as well as direct and indirect pathways. Trait and disorder-specific IU were significantly associated with multiple cognitive vulnerability factors and disorder symptoms. Indirect effects between trait IU and symptoms were observed through disorder-specific IU and cognitive vulnerabilities. The relative contribution of trait IU and disorder-specific IU to symptoms varied and theoretical and clinical implications are highlighted. Limitations include the cross-sectional design and reliance on self-report. Avenues for further research include a need for replication and extension of the model in different samples and using experimental and multi-method research methods.

  3. Quantification of the impact of precipitation spatial distribution uncertainty on predictive uncertainty of a snowmelt runoff model

    Science.gov (United States)

    Jacquin, A. P.

    2012-04-01

    This study is intended to quantify the impact of uncertainty about precipitation spatial distribution on the predictive uncertainty of a snowmelt runoff model. This problem is especially relevant in mountain catchments with a sparse precipitation observation network and relatively short precipitation records. The model analysed is a conceptual watershed model operating at a monthly time step. The model divides the catchment into five elevation zones, where the fifth zone corresponds to the catchment's glaciers. Precipitation amounts at each elevation zone i are estimated as the product of the observed precipitation at a station and a precipitation factor FPi. If other precipitation data are not available, these precipitation factors must be adjusted during the calibration process and are thus seen as parameters of the model. In the case of the fifth zone, glaciers are seen as an inexhaustible source of water that melts when the snow cover is depleted. The catchment case study is Aconcagua River at Chacabuquito, located in the Andean region of Central Chile. The model's predictive uncertainty is measured in terms of the output variance of the mean squared error of the Box-Cox transformed discharge, the relative volumetric error, and the weighted average of snow water equivalent in the elevation zones at the end of the simulation period. Sobol's variance decomposition (SVD) method is used for assessing the impact of precipitation spatial distribution, represented by the precipitation factors FPi, on the model's predictive uncertainty. In the SVD method, the first order effect of a parameter (or group of parameters) indicates the fraction of predictive uncertainty that could be reduced if the true value of this parameter (or group) was known. 
Similarly, the total effect of a parameter (or group) measures the fraction of predictive uncertainty that would remain if the true value of this parameter (or group) was unknown, but all the remaining model parameters could be fixed
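The first-order and total effects described above can be estimated with standard pick-freeze Monte Carlo estimators. The sketch below is not the paper's code: it uses a made-up two-parameter toy model standing in for the runoff model, with Saltelli's first-order estimator and Jansen's total-effect estimator.

```python
import numpy as np

# Toy stand-in for the runoff model: Y = X0 + 2*X1^2, Xi ~ U(0, 1).
def model(x):
    return x[:, 0] + 2.0 * x[:, 1] ** 2

rng = np.random.default_rng(42)
n, d = 200_000, 2
A = rng.uniform(0.0, 1.0, (n, d))   # two independent sample matrices
B = rng.uniform(0.0, 1.0, (n, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

S, ST = [], []
for i in range(d):
    AB = A.copy()
    AB[:, i] = B[:, i]                              # "freeze" all inputs except Xi
    fAB = model(AB)
    S.append(np.mean(fB * (fAB - fA)) / var_y)      # first-order index (Saltelli)
    ST.append(0.5 * np.mean((fA - fAB) ** 2) / var_y)  # total effect (Jansen)

print([round(s, 2) for s in S], [round(t, 2) for t in ST])
```

For this additive-plus-quadratic toy model the indices sum to roughly one and the second input dominates; in the paper's setting the inputs would be the precipitation factors FPi and the output one of the predictive-uncertainty measures.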

  4. Combination of anti-optimization and fuzzy-set-based analysis for structural optimization under uncertainty

    Directory of Open Access Journals (Sweden)

    J. Fang

    1998-01-01

    An approach to the optimum design of structures, in which uncertainties with a fuzzy nature in the magnitude of the loads are considered, is proposed in this study. The optimization process under fuzzy loads is transformed into a fuzzy optimization problem based on the notion of Werner's maximizing set, by defining membership functions of the objective function and constraints. In this paper, Werner's maximizing set is defined using the results obtained by first conducting an optimization through anti-optimization modeling of the uncertain loads. An example of a ten-bar truss is used to illustrate the present optimization process. The results are compared with those yielded by other optimization methods.

  5. Uncertainty Assessment in Long Term Urban Drainage Modelling

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    the probability of system failures (defined as either flooding or surcharge of manholes or combined sewer overflow); (2) an application of the Generalized Likelihood Uncertainty Estimation methodology in which an event based stochastic calibration is performed; and (3) long term Monte Carlo simulations...... with the purpose of estimating the uncertainties on the extreme event statistics of maximum water levels and combined sewer overflow volumes in drainage systems. The thesis concludes that the uncertainties on both maximum water levels and combined sewer overflow volumes are considerable, especially on the large...

  6. Incorporating Fuzzy Systems Modeling and Possibility Theory in Hydrogeological Uncertainty Analysis

    Science.gov (United States)

    Faybishenko, B.

    2008-12-01

    Hydrogeological predictions are subject to numerous uncertainties, including the development of conceptual, mathematical, and numerical models, as well as determination of their parameters. Stochastic simulations of hydrogeological systems and the associated uncertainty analysis are usually based on the assumption that the data characterizing spatial and temporal variations of hydrogeological processes are random, and the output uncertainty is quantified using a probability distribution. However, hydrogeological systems are often characterized by imprecise, vague, inconsistent, incomplete or subjective information. One of the modern approaches to modeling and uncertainty quantification of such systems is based on using a combination of statistical and fuzzy-logic uncertainty analyses. The aims of this presentation are to: (1) present evidence of fuzziness in developing conceptual hydrogeological models, and (2) give examples of the integration of the statistical and fuzzy-logic analyses in modeling and assessing both aleatoric uncertainties (e.g., caused by vagueness in assessing the subsurface system heterogeneities of fractured-porous media) and epistemic uncertainties (e.g., caused by the selection of different simulation models) involved in hydrogeological modeling. The author will discuss several case studies illustrating the application of fuzzy modeling for assessing the water balance and water travel time in unsaturated-saturated media. These examples will include the evaluation of associated uncertainties using the main concepts of possibility theory, a comparison between the uncertainty evaluation using probability and possibility theories, and a transformation of probability distributions into possibility distributions (and vice versa) for modeling hydrogeological processes.
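A common building block in fuzzy uncertainty analyses of this kind is alpha-cut propagation: each membership level alpha defines an interval of inputs, and for a monotone model the output interval comes from evaluating the interval endpoints. A minimal sketch follows; the travel-time model and the velocity numbers are invented for illustration, not taken from the presentation.

```python
import numpy as np

# Triangular fuzzy number (a, m, b): support [a, b], core {m}.
def alpha_cut(a, m, b, alpha):
    return (a + alpha * (m - a), b - alpha * (b - m))

# Hypothetical example: travel time T = L / v for a fuzzy velocity v [m/day].
L = 100.0                 # path length, assumed known exactly
v = (0.5, 1.0, 2.0)       # fuzzy velocity, triangular membership

alphas = np.linspace(0.0, 1.0, 5)
cuts = []
for alpha in alphas:
    lo, hi = alpha_cut(*v, alpha)
    # T is monotone decreasing in v, so the interval endpoints swap.
    cuts.append((L / hi, L / lo, alpha))

for t_lo, t_hi, alpha in cuts:
    print(f"alpha={alpha:.2f}: T in [{t_lo:.0f}, {t_hi:.0f}] days")
```

At alpha = 1 the cut collapses to the single most plausible value; at alpha = 0 it spans the full support, which is how possibility distributions encode "vague" bounds without a probability model.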

  7. Exploring uncertainty in glacier mass balance modelling with Monte Carlo simulation

    NARCIS (Netherlands)

    Machguth, H.; Purves, R.S.; Oerlemans, J.; Hoelzle, M.; Paul, F.

    2008-01-01

    By means of Monte Carlo simulations we calculated uncertainty in modelled cumulative mass balance over 400 days at one particular point on the tongue of Morteratsch Glacier, Switzerland, using a glacier energy balance model of intermediate complexity. Before uncertainty assessment, the model was tuned ...

  8. Estimating the magnitude of prediction uncertainties for field-scale P loss models

    Science.gov (United States)

    Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, an uncertainty analysis for the Annual P Loss Estima...

  9. Parameter uncertainty analysis for the annual phosphorus loss estimator (APLE) model

    Science.gov (United States)

    Technical abstract: Models are often used to predict phosphorus (P) loss from agricultural fields. While it is commonly recognized that model predictions are inherently uncertain, few studies have addressed prediction uncertainties using P loss models. In this study, we conduct an uncertainty analys...

  11. GCR Environmental Models III: GCR Model Validation and Propagated Uncertainties in Effective Dose

    Science.gov (United States)

    Slaba, Tony C.; Xu, Xiaojing; Blattnig, Steve R.; Norman, Ryan B.

    2014-01-01

    This is the last of three papers focused on quantifying the uncertainty associated with galactic cosmic rays (GCR) models used for space radiation shielding applications. In the first paper, it was found that GCR ions with Z>2 and boundary energy below 500 MeV/nucleon induce less than 5% of the total effective dose behind shielding. This is an important finding since GCR model development and validation have been heavily biased toward Advanced Composition Explorer/Cosmic Ray Isotope Spectrometer measurements below 500 MeV/nucleon. Weights were also developed that quantify the relative contribution of defined GCR energy and charge groups to effective dose behind shielding. In the second paper, it was shown that these weights could be used to efficiently propagate GCR model uncertainties into effective dose behind shielding. In this work, uncertainties are quantified for a few commonly used GCR models. A validation metric is developed that accounts for measurement uncertainty, and the metric is coupled to the fast uncertainty propagation method. For this work, the Badhwar-O'Neill (BON) 2010 and 2011 and the Matthia GCR models are compared to an extensive measurement database. It is shown that BON2011 systematically overestimates heavy ion fluxes in the range 0.5-4 GeV/nucleon. BON2010 and BON2011 also show moderate and large errors in reproducing past solar activity near the 2000 solar maximum and 2010 solar minimum. It is found that all three models induce relative errors in effective dose in the interval [-20%, 20%] at a 68% confidence level. The BON2010 and Matthia models are found to have similar overall uncertainty estimates and are preferred for space radiation shielding applications.

  12. Model Uncertainty and Test of a Segmented Mirror Telescope

    Science.gov (United States)

    2014-03-01

    course of this thesis effort. I would also like to thank Dr. Alan Jennings, who provided valuable background knowledge and support for all facets of... includes mass, spring, laminate, and rigid elements in addition to the structural elements. The spatial resolution is very high, resulting in 3.26... laminate face with isogrid back and support joints. To reduce the complexity of the mirror, a common modeling technique is to model uniform honeycomb

  13. Modelling sensitivity and uncertainty in a LCA model for waste management systems - EASETECH

    DEFF Research Database (Denmark)

    Damgaard, Anders; Clavreul, Julie; Baumeister, Hubert

    2013-01-01

    In the new model, EASETECH, developed for LCA modelling of waste management systems, a general approach for sensitivity and uncertainty assessment for waste management studies has been implemented. First, general contribution analysis is done through a regular interpretation of inventory and impact...

  14. Disentangling the uncertainty of hydrologic drought characteristics in a multi-model century-long experiment in continental river basins

    Science.gov (United States)

    Samaniego, Luis; Kumar, Rohini; Pechlivanidis, Illias; Breuer, Lutz; Wortmann, Michel; Vetter, Tobias; Flörke, Martina; Chamorro, Alejandro; Schäfer, David; Shah, Harsh; Zeng, Xiaofan

    2016-04-01

    The quantification of the predictive uncertainty in hydrologic models and its attribution to its main sources is of particular interest in climate change studies. In recent years, a number of studies have been aimed at assessing the ability of hydrologic models (HMs) to reproduce extreme hydrologic events. Disentangling the overall uncertainty of streamflow, including its derived low-flow characteristics, into individual contributions, stemming from forcings and model structure, has also been studied. Based on recent literature, it can be stated that there is a controversy with respect to which source is the largest (e.g., Teng et al. 2012, Bosshard et al. 2013, Prudhomme et al. 2014). Very little has been done to estimate the relative impact of the parametric uncertainty of the HMs with respect to the overall uncertainty of low-flow characteristics. The ISI-MIP2 project provides a unique opportunity to understand the propagation of forcing and model structure uncertainties into century-long time series of drought characteristics. This project defines a consistent framework to deal with compatible initial conditions for the HMs and a set of standardized historical and future forcings. Moreover, the ensemble of hydrologic model predictions varies across a broad range of climate scenarios and regions. To achieve this goal, we use six preconditioned hydrologic models (HYPE or HBV, mHM, SWIM, VIC, and WaterGAP3) set up in seven large continental river basins: Amazon, Blue Nile, Ganges, Niger, Mississippi, Rhine, and Yellow. These models are forced with bias-corrected outputs of five CMIP5 general circulation models (GCM) under four extreme representative concentration pathway (RCP) scenarios (i.e. 2.6, 4.5, 6.0, and 8.5 W m-2) for the period 1971-2099. Simulated streamflow is transformed into a monthly runoff index (RI) to analyze the attribution of the GCM and HM uncertainty into drought magnitude and duration over time. Uncertainty contributions are investigated

  15. Greenhouse Gas Source Attribution: Measurements Modeling and Uncertainty Quantification

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Sargsyan, Khachik [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); van Bloemen Waanders, Bart Gustaaf [Sandia National Lab. (SNL-CA), Livermore, CA (United States); LaFranchi, Brian W. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ivey, Mark D. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Schrader, Paul E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Michelsen, Hope A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Bambha, Ray P. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2014-09-01

    In this project we have developed atmospheric measurement capabilities and a suite of atmospheric modeling and analysis tools that are well suited for verifying emissions of greenhouse gases (GHGs) on an urban-through-regional scale. We have for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate atmospheric CO2. This will allow for the examination of regional-scale transport and distribution of CO2 along with air pollutants traditionally studied using CMAQ at relatively high spatial and temporal resolution with the goal of leveraging emissions verification efforts for both air quality and climate. We have developed a bias-enhanced Bayesian inference approach that can remedy the well-known problem of transport model errors in atmospheric CO2 inversions. We have tested the approach using data and model outputs from the TransCom3 global CO2 inversion comparison project. We have also performed two prototyping studies on inversion approaches in the generalized convection-diffusion context. One of these studies employed Polynomial Chaos Expansion to accelerate the evaluation of a regional transport model and enable efficient Markov Chain Monte Carlo sampling of the posterior for Bayesian inference. The other approach uses deterministic inversion of a convection-diffusion-reaction system in the presence of uncertainty. These approaches should, in principle, be applicable to realistic atmospheric problems with moderate adaptation. We outline a regional greenhouse gas source inference system that integrates (1) two approaches of atmospheric dispersion simulation and (2) a class of Bayesian inference and uncertainty quantification algorithms. We use two different and complementary approaches to simulate atmospheric dispersion. Specifically, we use a Eulerian chemical transport model CMAQ and a Lagrangian Particle Dispersion Model - FLEXPART-WRF. These two models share the same WRF
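The core of such a source-inference system — Bayesian estimation of a source strength from concentration data — can be illustrated with a toy linear forward model and a random-walk Metropolis sampler. This is a sketch only: the transport "footprint" values, noise level, and flat prior are invented, and the real system uses CMAQ/FLEXPART-WRF transport rather than a fixed vector.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical linear forward model: concentration = s * g + noise,
# where g is a known transport footprint and s the unknown source strength.
g = np.array([0.2, 0.5, 1.0, 0.7])
s_true, sigma = 3.0, 0.1
y = s_true * g + rng.normal(0.0, sigma, g.size)

def log_post(s):  # Gaussian likelihood, flat prior on s
    return -0.5 * np.sum((y - s * g) ** 2) / sigma**2

# Random-walk Metropolis sampling of the posterior
s, chain = 1.0, []
for _ in range(20_000):
    prop = s + rng.normal(0.0, 0.2)
    if np.log(rng.uniform()) < log_post(prop) - log_post(s):
        s = prop
    chain.append(s)
post = np.array(chain[5_000:])   # discard burn-in

print(round(post.mean(), 2), round(post.std(), 3))
```

The posterior mean recovers a value near the true source strength, and the posterior spread quantifies how strongly the (noisy) observations constrain it; the project's bias-enhanced formulation additionally models systematic transport error, which this sketch omits.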

  16. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The objective for the Phase II effort will be to develop a comprehensive, efficient, and flexible uncertainty quantification (UQ) framework implemented within a...

  17. Impact on Model Uncertainty of Diabatization in Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2014-01-01

    This work provides uncertainty and sensitivity analysis of design of conventional and heat integrated distillation columns using Monte Carlo simulations. Selected uncertain parameters are relative volatility, heat of vaporization, the overall heat transfer coefficient, tray hold-up, and adiabat ...

  18. Statistical approach for uncertainty quantification of experimental modal model parameters

    DEFF Research Database (Denmark)

    Luczak, M.; Peeters, B.; Kahsin, M.

    2014-01-01

    estimates obtained from vibration experiments. Modal testing results are influenced by numerous factors introducing uncertainty to the measurement results. Different experimental techniques applied to the same test item or testing numerous nominally identical specimens yields different test results...

  19. Quantification of Uncertainties in Integrated Spacecraft System Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed effort is to investigate a novel uncertainty quantification (UQ) approach based on non-intrusive polynomial chaos (NIPC) for computationally efficient...

  20. An Efficient Deterministic Approach to Model-based Prediction Uncertainty

    Data.gov (United States)

    National Aeronautics and Space Administration — Prognostics deals with the prediction of the end of life (EOL) of a system. EOL is a random variable, due to the presence of process noise and uncertainty in the...

  1. 'spup' - an R package for uncertainty propagation analysis in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2017-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability or the ability to deal with case studies involving spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally. Selected visualization methods that are understandable by non-experts with limited background in
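The Latin hypercube sampling mentioned above can be sketched without any package: pair one random draw per stratum in each dimension, then map the uniforms through inverse CDFs of the input distributions before running them through the model. This is an illustration of the general technique in Python, not spup's R API, and the input distributions and toy model are invented.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(7)

def latin_hypercube(n, d, rng):
    # One point per stratum in each dimension, strata randomly paired.
    strata = np.column_stack([rng.permutation(n) for _ in range(d)])
    return (strata + rng.uniform(size=(n, d))) / n

n = 1_000
u = latin_hypercube(n, 2, rng)

# Map uniforms to the assumed input distributions via inverse CDFs:
# slope ~ N(2, 0.1), intercept ~ N(5, 0.5)  (hypothetical uncertainty models).
nd = NormalDist()
slope = 2.0 + 0.1 * np.array([nd.inv_cdf(v) for v in u[:, 0]])
icept = 5.0 + 0.5 * np.array([nd.inv_cdf(v) for v in u[:, 1]])

x = 10.0                 # a fixed model input
y = slope * x + icept    # MC ensemble of model predictions
print(round(y.mean(), 1), round(y.std(), 2))
```

Because each marginal is stratified, the ensemble mean and spread converge with far fewer runs than plain random sampling, which is why packages like spup offer it for expensive environmental models.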

  2. 'spup' - an R package for uncertainty propagation in spatial environmental modelling

    Science.gov (United States)

    Sawicka, Kasia; Heuvelink, Gerard

    2016-04-01

    Computer models have become a crucial tool in engineering and environmental sciences for simulating the behaviour of complex static and dynamic systems. However, while many models are deterministic, the uncertainty in their predictions needs to be estimated before they are used for decision support. Currently, advances in uncertainty propagation and assessment have been paralleled by a growing number of software tools for uncertainty analysis, but none has gained recognition for universal applicability, including case studies with spatial models and spatial model inputs. Due to the growing popularity and applicability of the open source R programming language we undertook a project to develop an R package that facilitates uncertainty propagation analysis in spatial environmental modelling. In particular, the 'spup' package provides functions for examining the uncertainty propagation starting from input data and model parameters, via the environmental model onto model predictions. The functions include uncertainty model specification, stochastic simulation and propagation of uncertainty using Monte Carlo (MC) techniques, as well as several uncertainty visualization functions. Uncertain environmental variables are represented in the package as objects whose attribute values may be uncertain and described by probability distributions. Both numerical and categorical data types are handled. Spatial auto-correlation within an attribute and cross-correlation between attributes are also accommodated. For uncertainty propagation the package has implemented the MC approach with efficient sampling algorithms, i.e. stratified random sampling and Latin hypercube sampling. The design includes facilitation of parallel computing to speed up MC computation. The MC realizations may be used as an input to the environmental models called from R, or externally.
Selected static and interactive visualization methods that are understandable by non-experts with limited background in

  3. Parameters-related uncertainty in modeling sugar cane yield with an agro-Land Surface Model

    Science.gov (United States)

    Valade, A.; Ciais, P.; Vuichard, N.; Viovy, N.; Ruget, F.; Gabrielle, B.

    2012-12-01

    Agro-Land Surface Models (agro-LSM) have been developed from the coupling of specific crop models and large-scale generic vegetation models. They aim at accounting for the spatial distribution and variability of energy, water and carbon fluxes within the soil-vegetation-atmosphere continuum, with a particular emphasis on how crop phenology and agricultural management practice influence the turbulent fluxes exchanged with the atmosphere, and the underlying water and carbon pools. A part of the uncertainty in these models is related to the many parameters included in the models' equations. In this study, we quantify the parameter-based uncertainty in the simulation of sugar cane biomass production with the agro-LSM ORCHIDEE-STICS, using a multi-regional approach with data from sites in Australia, La Reunion, and Brazil. First, the main source of uncertainty for the output variables NPP, GPP, and sensible heat flux (SH) is determined through a screening of the main parameters of the model on a multi-site basis, leading to the selection of a subset of most sensitive parameters causing most of the uncertainty. In a second step, a sensitivity analysis is carried out on the parameters selected from the screening analysis at a regional scale. For this, a Monte-Carlo sampling method associated with the calculation of Partial Ranked Correlation Coefficients is used. First, we quantify the sensitivity of the output variables to individual input parameters on a regional scale for two regions of intensive sugar cane cultivation in Australia and Brazil. Then, we quantify the overall uncertainty in the simulation's outputs propagated from the uncertainty in the input parameters. Seven parameters are identified by the screening procedure as driving most of the uncertainty in the agro-LSM ORCHIDEE-STICS model output at all sites. These parameters control photosynthesis (optimal temperature of photosynthesis, optimal carboxylation rate), radiation interception (extinction coefficient), root

  4. Uncertainty analysis of a structural-acoustic problem using imprecise probabilities based on p-box representations

    Science.gov (United States)

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Beer, Michael

    2016-12-01

    Imprecise probabilities can capture epistemic uncertainty, which reflects limited available knowledge so that a precise probabilistic model cannot be established. In this paper, the parameters of a structural-acoustic problem are represented with the aid of p-boxes to capture epistemic uncertainty in the model. To perform the necessary analysis of the structural-acoustic problem with p-boxes, a first-order matrix decomposition perturbation method (FMDPM) for interval analysis is proposed, and an efficient interval Monte Carlo method based on FMDPM is derived. In the implementation of the efficient interval Monte Carlo method based on FMDPM, constant matrices are obtained, first, through an uncertain parameter extraction on the basis of the matrix decomposition technique. Then, these constant matrices are employed to perform multiple interval analyses by using the first-order perturbation method. A numerical example is provided to illustrate the feasibility and effectiveness of the presented approach.
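An interval Monte Carlo step of the kind described above can be sketched as follows: each random draw is pushed through the lower and upper bounding CDFs of a p-box, producing an interval realization of the input, and a monotone model maps it to an interval output. The numbers and the scalar response model below are invented for illustration; this is not the paper's FMDPM formulation, which operates on system matrices.

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)
nd = NormalDist()

# p-box: stiffness k ~ Normal(mu, 0.05) with imprecise mean mu in [0.9, 1.1].
mu_lo, mu_hi, sd = 0.9, 1.1, 0.05

def response(k):          # monotone toy model: displacement ~ 1/k
    return 1.0 / k

n = 20_000
u = rng.uniform(1e-9, 1 - 1e-9, n)          # common random numbers
z = np.array([nd.inv_cdf(v) for v in u])
k_lo, k_hi = mu_lo + sd * z, mu_hi + sd * z  # interval realization per sample
# response is decreasing in k, so the interval endpoints swap:
y_lo, y_hi = response(k_hi), response(k_lo)

print(round(y_lo.mean(), 3), round(y_hi.mean(), 3))
```

The two sample means bound the expected response: epistemic imprecision in the mean stiffness shows up as an interval of statistics rather than a single number, which is exactly what the p-box representation is meant to preserve.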

  5. Event based uncertainty assessment in urban drainage modelling, applying the GLUE methodology

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Beven, K.J.; Jensen, Jacob Birk

    2008-01-01

    In the present paper an uncertainty analysis on an application of the commercial urban drainage model MOUSE is conducted. Applying the Generalized Likelihood Uncertainty Estimation (GLUE) methodology the model is conditioned on observation time series from two flow gauges as well as the occurrence...... if the uncertainty analysis is unambiguous. It is shown that the GLUE methodology is very applicable in uncertainty analysis of this application of an urban drainage model, although it was shown to be quite difficult to get good fits of the whole time series....
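The GLUE procedure itself is simple to sketch: sample parameter sets, score each simulation against observations with a likelihood measure, keep the "behavioural" sets above a threshold, and summarize predictions over that set. Below is a toy version with a one-parameter linear-reservoir model, invented for illustration; the study conditioned the MOUSE model on flow-gauge data, and the quantile bounds here are unweighted for brevity.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(50)

# Toy linear-reservoir recession model with recession constant k [days].
def simulate(k):
    return np.exp(-t / k)

# Synthetic "observations" from k = 12 plus measurement noise.
obs = simulate(12.0) + rng.normal(0.0, 0.02, t.size)

# GLUE: sample parameters, score with a likelihood measure (here NSE),
# and keep behavioural sets above a threshold.
k_samples = rng.uniform(2.0, 30.0, 5_000)
sims = np.array([simulate(k) for k in k_samples])
nse = 1 - np.sum((sims - obs) ** 2, axis=1) / np.sum((obs - obs.mean()) ** 2)
behavioural = nse > 0.9

w = nse[behavioural] / nse[behavioural].sum()   # GLUE likelihood weights
mean_pred = w @ sims[behavioural]               # weighted mean prediction
lo, hi = [np.quantile(sims[behavioural], q, axis=0) for q in (0.05, 0.95)]
print(behavioural.sum(), round(float(k_samples[np.argmax(nse)]), 1))
```

The spread of the behavioural parameter sets, rather than a single calibrated optimum, is what carries the uncertainty bounds on simulated water levels and overflow volumes in a GLUE analysis.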

  6. Understanding uncertainties in model-based predictions of Aedes aegypti population dynamics.

    Directory of Open Access Journals (Sweden)

    Chonggang Xu

    2010-09-01

    Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model. This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: (1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and (2) uncertainty due to environmental and demographic stochasticity. Our results show that for pupal density and for female adult density at the community level, respectively, the 95% prediction confidence interval ranges from 1000 to 3000 and from 700 to 5,000 individuals. The two parameters contributing most to the uncertainties in predicted population densities at both individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e. metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (larger than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level. This is the first systematic uncertainty analysis of a detailed Ae. aegypti population dynamics model and provides an approach for

  7. Understanding uncertainties in model-based predictions of Aedes aegypti population dynamics.

    Science.gov (United States)

    Xu, Chonggang; Legros, Mathieu; Gould, Fred; Lloyd, Alun L

    2010-09-28

    Aedes aegypti is one of the most important mosquito vectors of human disease. The development of spatial models for Ae. aegypti provides a promising start toward model-guided vector control and risk assessment, but this will only be possible if models make reliable predictions. The reliability of model predictions is affected by specific sources of uncertainty in the model. This study quantifies uncertainties in the predicted mosquito population dynamics at the community level (a cluster of 612 houses) and the individual-house level based on Skeeter Buster, a spatial model of Ae. aegypti, for the city of Iquitos, Peru. The study considers two types of uncertainty: 1) uncertainty in the estimates of 67 parameters that describe mosquito biology and life history, and 2) uncertainty due to environmental and demographic stochasticity. Our results show that for pupal density and for female adult density at the community level, respectively, the 95% prediction confidence interval ranges from 1000 to 3000 and from 700 to 5,000 individuals. The two parameters contributing most to the uncertainties in predicted population densities at both individual-house and community levels are the female adult survival rate and a coefficient determining weight loss due to energy used in metabolism at the larval stage (i.e. metabolic weight loss). Compared to parametric uncertainty, stochastic uncertainty is relatively low for population density predictions at the community level (less than 5% of the overall uncertainty) but is substantially higher for predictions at the individual-house level (larger than 40% of the overall uncertainty). Uncertainty in mosquito spatial dispersal has little effect on population density predictions at the community level but is important for the prediction of spatial clustering at the individual-house level. This is the first systematic uncertainty analysis of a detailed Ae. 
aegypti population dynamics model and provides an approach for identifying those

  8. Sources of uncertainties in modelling black carbon at the global scale

    OpenAIRE

    2010-01-01

    Our understanding of the global black carbon (BC) cycle is essentially qualitative due to uncertainties in our knowledge of its properties. This work investigates two sources of uncertainty in modelling black carbon: those due to the use of different schemes for BC ageing and its removal rate in the global Transport-Chemistry model TM5, and those due to the uncertainties in the definition and quantification of the observations, which propagate through to both the emission inventories and the...

  9. The climate dependence of the terrestrial carbon cycle, including parameter and structural uncertainties

    Directory of Open Access Journals (Sweden)

    M. J. Smith

    2013-01-01

    processes, whether inferred individually from their corresponding data sets or using the full terrestrial carbon model and all available data sets, indicating a strong overall consistency in the information provided by different data sets under the assumed model formulation. A notable exception was plant mortality, in which qualitatively different climate dependencies were inferred depending on the model formulation and data sets used, highlighting this component as the major structural uncertainty in the model. All but two component processes predicted empirical data better than a null model in which no climate dependency was assumed. Equilibrium plant carbon was predicted especially well (explaining around 70% of the variation in the withheld evaluation data). We discuss the advantages of our approach in relation to advancing our understanding of the carbon cycle and enabling Earth System Models to make better constrained projections.

  10. Uncertainties in predicting rice yield by current crop models under a wide range of climatic conditions

    NARCIS (Netherlands)

    Li, T.; Hasegawa, T.; Yin, X.; Zhu, Y.; Boote, K.; Adam, M.; Bregaglio, S.; Buis, S.; Confalonieri, R.; Fumoto, T.; Gaydon, D.; Marcaida III, M.; Nakagawa, H.; Oriol, P.; Ruane, A.C.; Ruget, F.; Singh, B.; Singh, U.; Tang, L.; Yoshida, H.; Zhang, Z.; Bouman, B.

    2015-01-01

    Predicting rice (Oryza sativa) productivity under future climates is important for global food security. Ecophysiological crop models in combination with climate model outputs are commonly used in yield prediction, but uncertainties associated with crop models remain largely unquantified. We evaluated ...

  11. Opinion: the use of natural hazard modeling for decision making under uncertainty

    Directory of Open Access Journals (Sweden)

    David E Calkin

    2015-04-01

    Decision making to mitigate the effects of natural hazards is a complex undertaking fraught with uncertainty. Models to describe risks associated with natural hazards have proliferated in recent years. Concurrently, there is a growing body of work focused on developing best practices for natural hazard modeling and on creating structured evaluation criteria for complex environmental models. However, to our knowledge there has been less focus on the conditions under which decision makers can confidently rely on results from these models. In this review we propose a preliminary set of conditions necessary for the appropriate application of modeled results to natural hazard decision making and provide relevant examples within US wildfire management programs.

  12. Comparing the effects of climate and impact model uncertainty on climate impacts estimates for grain maize

    Science.gov (United States)

    Holzkämper, Annelie; Honti, Mark; Fuhrer, Jürg

    2015-04-01

    Crop models are commonly applied to estimate impacts of projected climate change and to anticipate suitable adaptation measures. Thereby, uncertainties from global climate models, regional climate models, and impact models cascade down to impact estimates. It is essential to quantify and understand ...